I don’t know Cyber Dopamine’s credentials and don’t care enough to look.
But this has come up a lot in the past. Most of the time, the difference between the OSes (assuming you turn off the most blatant of things) is noise: it would disappear with a large enough sample size, and it is comparable to the variation you see when all the software is fixed but you compare two CPUs of the same model, from the same batch, with slight manufacturing variance. Gamers Nexus and Level1Techs have both discussed this to varying degrees when going over their testing methodologies.
But if you are just doing the first run or two? THAT is where the very noticeable differences come in. Because shader precaching is a thing. At a super high level: shaders are small GPU programs that have to be compiled for your specific GPU and driver before they can run, and precaching means doing that compilation ahead of time instead of in the middle of gameplay. That is a simplified summary but it gets the point across.
Some games have an obnoxious 1-10 minute wait the first time you boot them up because they are building that precache in the background. Others say “fuck it, we’ll do it live” and you take a performance hit the first time you load an area or asset… which can be REALLY substantial on that first boot, when EVERYTHING is being loaded for the first time.
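The two strategies above can be sketched in a few lines of toy Python (nothing here is a real engine API; “compilation” is faked with a sleep):

```python
import time

def compile_shader(shader_id):
    """Stand-in for real shader compilation: slow, but only needed once."""
    time.sleep(0.001)  # pretend this is expensive
    return f"binary-for-{shader_id}"

class Renderer:
    def __init__(self, precompile=None):
        # Precache strategy: pay the full cost up front (the 1-10 minute wait).
        self.cache = {s: compile_shader(s) for s in (precompile or [])}

    def draw(self, shader_id):
        # "Do it live" strategy: first use of an uncached shader hitches,
        # which is why the first boot (everything uncached) is the worst.
        if shader_id not in self.cache:
            self.cache[shader_id] = compile_shader(shader_id)
        return self.cache[shader_id]
```

Either way every shader gets compiled exactly once; the only question is whether you pay at boot or mid-frame.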
But, because Linux tends to run games through (time to piss more people off) an emulator like Wine/Proton, it is already going through a translation layer even before the shader math. So that is why Steam will often push “good enough” defaults to Linux clients every day and there is a setting to even optimize those for your system in the background while you whack it to your favorite onlyfans account.
But… that also gives Linux a pretty big advantage on that first boot on games that are doing it live. Because… we aren’t doing it live.
It might have been Phoronix that actually did a deep dive on this? But the various blog sites love to either run their own numbers, or find a youtuber who did, not take this into account, and get those juicy clicks.
because Linux tends to run games through (time to piss more people off) an emulator like Wine/Proton
It’s not that this will piss people off, it’s that it’s factually inaccurate. Wine Is Not (an) Emulator. It’s right there in the name. It’s a compatibility layer. You aren’t emulating Windows.
(But this nitpicking doesn’t have any impact on your point overall regarding shader pre-caching.)
While I agree that more extensive tests would be interesting, there are quite a few inaccuracies in your argument that make me disagree with your conclusion.
Firstly, Steam’s shader pre-caching does not work like that. There are no “good enough” defaults being generated every day. The caches you may receive have been generated and uploaded by other players with the same compatible system configuration (I think GPU and driver version, but there might be other factors I’m not aware of). So you only receive shader caches when someone else with a compatible system has uploaded them already. Furthermore, these caches aren’t necessarily complete. As games often compile shaders on the fly and not everyone plays every bit of content in every game, local compilation still happens pretty regularly.
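That sharing model can be sketched as a simple keyed lookup (the key fields here are my assumption, as noted above; Valve hasn’t published the exact scheme):

```python
# Toy model of Steam's shared shader pre-caching: caches are keyed by
# system configuration, and you only get a download when a compatible
# player has already uploaded one. Key = (gpu, driver) is an assumption.

UPLOADED_CACHES = {
    ("RDNA2", "mesa-24.1"): {"shader_a", "shader_b"},  # from other players
}

def fetch_shared_cache(gpu, driver):
    # No compatible upload yet -> empty cache, everything compiles locally.
    return UPLOADED_CACHES.get((gpu, driver), set())

def shaders_to_compile_locally(gpu, driver, needed):
    # Whatever the downloaded cache doesn't cover still compiles on the fly,
    # since not everyone has played every bit of content.
    return needed - fetch_shared_cache(gpu, driver)
```

So even with a cache hit, a shader nobody uploaded yet still stutters locally the first time you reach it.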
Secondly, shader compilation is by no means limited to the first launch of the game. When the game itself doesn’t have a dedicated (and comprehensive) shader compilation step at startup, shaders may have to be compiled every time you access content in a game that you haven’t seen before. Sure, this is most noticeable at the start of the game, but it might also happen when the final boss enters its sparkly third phase, punching your stuttering, shader-compiling butt into the next wall (which might also require further shader compilation).
Lastly, I disagree with the general assessment that these advantages are somehow invalid because they might be limited to shader compilation stutters. When I download a new game, I want it to run properly. The shared precompiled shaders are a great solution for a bunch of stuttering, and they are only available on Linux because they actually require the compatibility layers. Microsoft hasn’t implemented something similar for their new shiny handheld. So I think it’s absolutely legitimate to call them out on their unoptimized device. I think, for the end user, it’s really just semantics to discuss whether the issue at its core exists because of Linux or because of Windows or because of caching or because of some unknown deity who just really likes Gaben. The fact is: from the data we have so far, it looks like Bazzite is a better experience on a handheld Microsoft co-developed when compared to Windows. And that’s both very sad (because I’d really like to have some serious Steam Deck competition) and really funny (because seeing a very big corporation’s gaming division fumble everything they do makes me giggle).
… Also, Wine is not an emulator, but I think you know that. ;)
In some situations shader pre-caching makes things worse rather than better: for example, on my machine Borderlands 2 would take 10 minutes to update shaders at pretty much every game start, even though I have a Gbps Internet connection.
Eventually it turned out that you really shouldn’t be running “Latest Proton” with it, as any update to Proton would trigger a full update of the shader cache (or worse, local generation, which on my machine took hours). Of course, information about that shit was nowhere to be found, nor was the default configuration of that game under Linux set up to run with a specific Proton version.
Switching shader pre-caching off also solved the problem. But to avoid the situation you described, where “shader translation at the time the shader is first used” causes unexpected slowdowns at bad times, once I figured out that it was “Latest Proton” triggering the full shader cache downloads, I switched everything over to shader pre-caching with a specific, fixed Proton version.
All this to say that the way Steam does shader pre-caching isn’t a silver bullet - for some games it makes them near unplayable by default until you figure out the specific configuration changes needed (and, with many minutes at best before each game start actually succeeds, trial and error is a very slow and frustrating way to figure out what’s going on and how to fix it).
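The failure mode here can be sketched as a cache whose key includes the Proton version, so “Latest Proton” resolving to a new version after every update is a guaranteed miss (names are illustrative, not Steam’s actual internals):

```python
# Toy model: if the Proton version is part of the shader cache key, then
# "Latest Proton" invalidates the whole cache on every Proton update,
# while a pinned version keeps hitting the same key.

class ShaderCache:
    def __init__(self):
        self.store = {}

    def get_or_rebuild(self, game, proton_version):
        key = (game, proton_version)
        if key not in self.store:
            # Cache miss: the 10-minute download (or hours-long local build).
            self.store[key] = f"full cache for {key}"
            return True   # rebuild happened
        return False      # cache hit, fast start
```

With a pinned version you rebuild once and then hit; with “Latest” every Proton release is a fresh miss.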
not everyone plays every bit of content in every game, local compilation still happens pretty regularly.
Yes. Which is why I emphasized that this mostly comes into play on the first boot of a game and would go away VERY rapidly if you ran multiple collection passes.
It is not limited to the first launch but it is most noticeable on it.
The shared precompiled shaders are a great solution for a bunch of stuttering, and they are only available on Linux because they actually require the compatibility layers.
Funny enough, MS has been talking about (and implementing?) that for their Xbox store games or whatever. Which is actually a REALLY good approach if you ask me.
The fact is: from the data we have so far, it looks like Bazzite is a better experience on a handheld Microsoft co-developed when compared to Windows.
No, we don’t. Which is my point.
Again, I would love to see a rigorous methodology that argues this (and if I am not familiar with Cyber Dopamine’s game, please, someone link me to their methodology). But I have a scientific background: I can cherry pick whatever you pay me to cherry pick.
And, quite honestly? If the performance is mostly noise outside of that first boot after a major patch? That might be awesome if I am paying 80 bucks to play a 4 hour CoD campaign once. That is less than meaningless if I am looking at something like Crusader Kings 3 where I will log hundreds of hours before the next one.
Which gets back to… what is the point of looking at benchmarks? Is it to feel happy that your team won? Okay. You do you. Or is it to make informed decisions?
And as someone who would really prefer Linux (Proton or otherwise) to be a first class citizen? Actively misleading people is just going to hurt, even in the medium run.
… Also, Wine is not an emulator, but I think you know that. ;)
My inner pedant thinks that it is worth actually looking at the definition of an emulator rather than the title of software. But mostly it is just a good way to weed out the people who were never going to engage in a good faith discussion :)
The Wine part is more or less based on the end goal of the software. Traditional game emulation is software that tries to mimic the hardware in question. Wine focuses on trying to be Windows, rather than the physical PC hardware itself, so the type of thing you’re trying to emulate is fundamentally different.
Emulation in general is functionally a compatibility layer. Wine sits at the highest level of compatibility layer because it’s not trying to re-create what the hardware is doing (since the hardware is the same on both sides).
Wine is on the same level as TeknoParrot (TeknoParrot is a compatibility layer for arcade machines that use x86-based PCs for hardware), vs MAME, which emulates non-x86-based arcade machines.
It gets mega pedantic because of how closely emulation and compatibility layers are intertwined, but they aren’t two fully overlapping circles: a compatibility layer can use emulation but doesn’t require it.
I just like defining the difference between the two simply by the goal: emulation targets the machine to then get to the software, whereas compatibility layers target the software directly, and will target hardware only when necessary.
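That goal-based distinction can be written down as a tiny toy classification (the entries just encode the examples above; this is my framing, not an authoritative definition):

```python
# Toy taxonomy: classify each tool by what it reimplements and whether it
# has to emulate a different CPU architecture to run the software.

TOOLS = {
    "Wine":        {"reimplements": "Windows APIs",           "emulates_cpu": False},
    "TeknoParrot": {"reimplements": "arcade OS/APIs on x86",  "emulates_cpu": False},
    "MAME":        {"reimplements": "arcade hardware itself", "emulates_cpu": True},
}

def is_emulator(name):
    # Goal-based definition: emulation targets the machine to get to the
    # software; compatibility layers target the software directly.
    return TOOLS[name]["emulates_cpu"]
```

By this framing, MAME is an emulator while Wine and TeknoParrot are compatibility layers, even though all three exist to run software that was never built for your system.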
Those quicker sleep and resume times speak for themselves though
But they also have nothing to do with moment to moment performance. And I would argue that people, generally, aren’t pulling out a Steam Deck or a 1000 USD Xbox to play a quick hand of Balatro while in line at the bank. I love that waking up my Bazzite box under my TV actually takes less time than the Nvidia Shield I watch YouTube on. I am not going to pretend that there is a meaningful difference between a 20 second boot and a 2 minute boot if I am sitting down to get angry at Silksong again. Same with my Steam Deck, where I usually push the power button and then put the case back in my bag on a plane, or off to the side while I get cozy at home.
It’s similar to the initial OS install. I will never stop being enraged when I have to deal with Cortana and all that bullshit to install Windows on a box. I positively love that most Linux distros are like 3 clicks and then a two minute wait if I am not doing weird partition shenanigans. But that is more a plus to the OS and not a reason I would upend my digital existence.
Whereas this article is very specifically about moment to moment framerate.