While I agree that more extensive tests would be interesting, there are quite a few inaccuracies in your argument that make me disagree with your conclusion.
Firstly, Steam’s shader pre-caching does not work like that. There are no “good enough” defaults being generated every day. The caches you may receive have been generated and uploaded by other players with a compatible system configuration (I think GPU and driver version, but there might be other factors I’m not aware of). So you only receive shader caches when someone else with a compatible system has uploaded them already. Furthermore, these caches aren’t necessarily complete. As games often compile shaders on the fly and not everyone plays every bit of content in every game, local compilation still happens pretty regularly.
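To make the distribution model concrete, here is a toy sketch of the matching logic described above. This is purely illustrative, not Steam’s actual implementation: the cache key (GPU plus driver version) and the shader names are invented, and the real system surely has more factors.

```python
# Toy model of shared shader-cache distribution: a cache is only offered
# to you if another player with a compatible configuration (here modelled
# as GPU + driver version) has already uploaded one, and even then it may
# not cover every shader the game will ever need.

shared_caches = {
    # (gpu, driver_version) -> set of shader IDs other players uploaded
    ("RDNA2", "23.1"): {"terrain", "water"},
}

def fetch_precompiled(gpu: str, driver: str) -> set[str]:
    """Return whatever compatible players have uploaded, possibly nothing."""
    return shared_caches.get((gpu, driver), set())

def shaders_to_compile_locally(needed: set[str], gpu: str, driver: str) -> set[str]:
    """Anything the shared cache doesn't cover still compiles on your machine."""
    return needed - fetch_precompiled(gpu, driver)

needed = {"terrain", "water", "final_boss_phase3"}
# A compatible upload exists, but it is incomplete:
print(shaders_to_compile_locally(needed, "RDNA2", "23.1"))  # {'final_boss_phase3'}
# No compatible upload at all: everything compiles locally.
print(sorted(shaders_to_compile_locally(needed, "RDNA2", "24.0")))
```

The key point the model captures: a shared cache being available is conditional on someone else having produced it, and partial coverage means local compilation never fully goes away.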
Secondly, shader compilation is by no means limited to the first launch of the game. When the game itself doesn’t have a dedicated (and comprehensive) shader compilation step at startup, shaders may have to be compiled every time you access content in a game that you haven’t seen before. Sure, this is most noticeable at the start of the game, but it might also happen when the final boss enters its sparkly third phase, punching your stuttering, shader-compiling butt into the next wall (which might also require further shader compilation).
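The “first use pays the cost” behaviour described here is essentially memoization, which a minimal sketch can show (timings and shader names invented for illustration):

```python
# Illustrative sketch: without a comprehensive pre-compilation step at
# startup, a shader is compiled the first time its content appears on
# screen, which can be mid-boss-fight rather than at launch.

compiled: dict[str, str] = {}  # pipeline cache: shader id -> compiled blob

def draw(shader_id: str) -> str:
    """First use of a shader pays the compile cost (the stutter);
    every later use hits the in-memory cache."""
    if shader_id not in compiled:
        compiled[shader_id] = f"compiled({shader_id})"  # expensive in reality
        return "stutter"
    return "smooth"

print(draw("intro_area"))            # stutter: first time this shader is seen
print(draw("intro_area"))            # smooth
print(draw("boss_phase3_sparkles"))  # stutter, possibly hours into the game
```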
Lastly, I disagree with the general assessment that these advantages are somehow invalid because they might be limited to shader compilation stutters. When I download a new game, I want it to run properly. The shared precompiled shaders are a great solution for a bunch of stuttering, and they are only available on Linux because they actually require the compatibility layers. Microsoft hasn’t implemented something similar for their new shiny handheld. So I think it’s absolutely legitimate to call them out on their unoptimized device. I think, for the end user, it’s really just semantics to discuss whether the issue at its core exists because of Linux or because of Windows or because of caching or because of some unknown deity who just really likes Gaben. The fact is: from the data we have so far, it looks like Bazzite is a better experience than Windows on a handheld that Microsoft co-developed. And that’s both very sad (because I’d really like to have some serious Steam Deck competition) and really funny (because seeing a very big corporation’s gaming division fumble everything they do makes me giggle).
… Also, Wine is not an emulator, but I think you know that. ;)
In some situations shader pre-caching makes things worse rather than better: for example, on my machine Borderlands 2 would take 10 minutes to update shaders at pretty much every game start, even though I have a Gbps Internet connection.
Eventually it turned out that you really shouldn’t be running “Latest Proton” with it, as any update to Proton would trigger a full update of the shader cache (or worse, local generation, which on my machine took hours). Of course, information about that was nowhere to be found, nor was the default configuration of that game under Linux set up to just run it with a specific Proton version.
Switching shader pre-caching off also solved the problem. But to avoid the situation you described, where “shader translation at the time the shader is first used” causes unexpected slowdowns at bad times, once I had figured out that “Latest Proton” was what triggered the full shader cache downloads, I switched it all over to shader pre-caching with a specific, fixed Proton version.
All this to say that the way Steam does shader pre-caching isn’t a silver bullet: for some games it makes them near unplayable by default until you figure out the specific configuration changes needed (and, with many minutes at best before each game start actually succeeds, trial and error is a very slow and frustrating way to work out what’s going on and how to fix it).
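The invalidation behaviour described in this comment is consistent with the cache being keyed on the Proton version, though that keying is my assumption, not something stated by Valve in this thread. A hypothetical sketch of that failure mode:

```python
# Hypothetical model of why "Latest Proton" kept triggering full cache
# refreshes: if the Proton version is part of the cache key (an assumption
# for illustration), every Proton update invalidates the whole cache,
# while a pinned version only pays the cost once.

downloaded: set[tuple] = set()  # caches already fetched on this machine

def cache_key(game: str, gpu: str, proton: str) -> tuple:
    return (game, gpu, proton)

def needs_full_refresh(game: str, gpu: str, proton: str) -> bool:
    """True when no valid cache exists for this exact key yet."""
    key = cache_key(game, gpu, proton)
    if key in downloaded:
        return False
    downloaded.add(key)
    return True

# "Latest Proton" tracks upstream, so the key changes on every update:
assert needs_full_refresh("Borderlands 2", "RDNA2", "9.0-1") is True
assert needs_full_refresh("Borderlands 2", "RDNA2", "9.0-2") is True  # refresh again

# A pinned version only pays the download/compile cost once:
assert needs_full_refresh("Borderlands 2", "RDNA2", "8.0-5") is True
assert needs_full_refresh("Borderlands 2", "RDNA2", "8.0-5") is False  # cache stays valid
```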
not everyone plays every bit of content in every game, local compilation still happens pretty regularly.
Yes. Which is why I emphasized that this mostly comes into play on the first boot of a game and would go away VERY rapidly if you ran multiple collections.
It is not limited to the first launch but it is most noticeable on it.
The shared precompiled shaders are a great solution for a bunch of stuttering, and they are only available on Linux because they actually require the compatibility layers.
Funnily enough, MS have been talking about (and implementing?) that for their Xbox store games or whatever. Which is actually a REALLY good approach if you ask me.
The fact is: from the data we have so far, it looks like Bazzite is a better experience on a handheld Microsoft co-developed when compared to Windows.
No, we don’t. Which is my point.
Again, I would love to see a rigorous methodology that argues this (and if I am not familiar with Cyber Dopamine’s game, please, someone link me to their methodology). But I have a scientific background: I can cherry pick whatever you pay me to cherry pick.
And, quite honestly? If the performance is mostly noise outside of that first boot after a major patch? That might be awesome if I am paying 80 bucks to play a 4 hour CoD campaign once. That is less than meaningless if I am looking at something like Crusader Kings 3 where I will log hundreds of hours before the next one.
Which gets back to… what is the point of looking at benchmarks? Is it to feel happy that your team won? Okay. You do you. Or is it to make informed decisions?
And as someone who would really prefer Linux (Proton or otherwise) to be a first class citizen? Actively misleading people is just going to hurt, even in the medium run.
… Also, Wine is not an emulator, but I think you know that. ;)
My inner pedant thinks that it is worth actually looking at the definition of an emulator rather than the title of software. But mostly it is just a good way to weed out the people who were never going to engage in a good faith discussion :)
On the Wine part, it’s more or less based on the end goal of the software. Traditional game emulation is software that tries to mimic the hardware in question. Wine focuses on trying to be Windows, rather than the physical PC hardware itself, so the type of thing you’re trying to emulate is fundamentally different.
Emulation in general is functionally a compatibility layer. Wine sits at the highest level of compatibility layer because it’s not trying to recreate what the hardware is doing (since the hardware is the same).
Wine is on the same level as TeknoParrot (TeknoParrot is a compatibility layer for arcade machines that use x86-based PCs for hardware), vs MAME, which emulates non-x86-based arcade machines.
It gets mega pedantic because of how closely emulation and compatibility layers are intertwined, but they aren’t two overlapping circles: a compatibility layer can use emulation but doesn’t require it.
I just like defining the difference between the two simply by the goal: emulation targets the machine to then get to the software, whereas compatibility layers target the software directly, and will target hardware only when necessary.
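That goal-based distinction can be sketched as a toy contrast. This is grossly simplified and everything here is invented for illustration (the instruction set, and the native mapping of the Win32-style call), but it shows the two different targets: the machine versus the API.

```python
# Toy contrast: an emulator interprets the guest machine's instructions,
# while a compatibility layer reimplements the OS-level API and lets the
# real CPU run the program's code directly.

def emulate(instructions: list[tuple[str, int]]) -> int:
    """Emulator: model the *machine*. Fetch/decode/execute each guest op
    of a made-up two-instruction architecture."""
    acc = 0
    for op, arg in instructions:
        if op == "ADD":
            acc += arg
        elif op == "SUB":
            acc -= arg
    return acc

def compat_call(api_name: str, *args: str) -> str:
    """Compatibility layer: translate an *API call* (a Win32-style name,
    mapping invented here) into a native equivalent. No machine modelled."""
    native = {"MessageBoxA": lambda text: f"[native dialog] {text}"}
    return native[api_name](*args)

assert emulate([("ADD", 5), ("SUB", 2)]) == 3
assert compat_call("MessageBoxA", "hello") == "[native dialog] hello"
```

The emulator has to care what the hardware would have done with every instruction; the compatibility layer only has to care what the software asked for.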