• ieatpwns@lemmy.world · 72 points · 23 hours ago

    You mean an OS not filled with spyware bloat runs faster than one that is filled with spyware bloat? That’s wild

  • Qwel@sopuli.xyz · 14 points · 18 hours ago

    They tested exactly two games

    I mean, Tom’s Hardware didn’t test anything; the YouTuber they are writing about tested Hogwarts Legacy and Kingdom Come: Deliverance 2

    I will assert that there are more than two games on the market

  • Ek-Hou-Van-Braai@piefed.social · 27 points · 23 hours ago

    I’d love to see the difference it makes to battery life.

    I switched my gaming laptop to Linux and the battery life more than doubled (while not gaming)

    • Dudewitbow@lemmy.zip · 14 points · 23 hours ago

      For heavy games, it’ll likely not change much.

      However, with light loads and low-clock gaming (e.g. sitting at 2D clocks while playing something like Dead Cells), the CPU governor on Linux is far more efficient. Microsoft is pushing that game-to-game optimization onto the handheld makers when it should really be them doing it.
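
      To see it for yourself: the active governor is exposed in sysfs. A minimal sketch in Python, assuming the standard cpufreq paths (your kernel may expose them differently):

      ```python
      from pathlib import Path

      # Each cpufreq "policy" exposes the active governor: the logic that
      # decides how aggressively clocks ramp up and down under light load.
      for policy in sorted(Path("/sys/devices/system/cpu/cpufreq").glob("policy*")):
          governor = (policy / "scaling_governor").read_text().strip()
          available = (policy / "scaling_available_governors").read_text().split()
          print(f"{policy.name}: {governor} (available: {available})")
      ```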

      • Die4Ever@retrolemmy.com · 12 points · edited · 21 hours ago

        For heavy games, it’ll likely not change much.

        Unless maybe you cap the fps.

        Example from the article: Hogwarts Legacy needed 35W to reach 60fps on Windows, but only 17W to reach 62fps on Linux.

        If you cap the fps at 60, you also let the system rest whenever it renders a frame in less than 16.6ms, saving more power in calm moments or while menuing/managing inventory/reading quest logs (most games don’t have intense 3D graphics in their menus).
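
        For scale, those numbers work out to roughly 60/35 ≈ 1.7 fps per watt on Windows versus 62/17 ≈ 3.6 on Linux. The frame-cap idea itself is simple enough to sketch; illustrative Python only, real limiters live in the game or compositor:

        ```python
        import time

        TARGET_FPS = 60
        FRAME_BUDGET = 1.0 / TARGET_FPS  # ~16.6 ms per frame at 60 fps

        def run_capped(render_frame):
            """Render frames, sleeping away whatever is left of each frame's budget."""
            while True:
                start = time.perf_counter()
                render_frame()  # the actual work; menus finish this very quickly
                elapsed = time.perf_counter() - start
                if elapsed < FRAME_BUDGET:
                    # Leftover budget becomes idle time, i.e. power saved.
                    time.sleep(FRAME_BUDGET - elapsed)
        ```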

    • Dnb@lemmy.dbzer0.com · 1 point · 18 hours ago

      It’s a bad test.

      Watch the video: the TDP and the clock speeds for CPU and GPU are much higher on Linux, which accounts for the differences.

      So battery life would be much better on Windows, since it’s running far more conservatively.

  • Dettweiler@lemmy.dbzer0.com · 17 points · 23 hours ago

    My whole-ass PC got a huge performance increase going from W10 to Arch (SteamOS), and that’s even with using Proton/Wine.

  • mesa@piefed.social · 9 points · 23 hours ago

    Interesting! Were the games tested on Proton as well? Because if so, that’s even more impressive.

    • nyankas@lemmy.ml · 16 points · 23 hours ago

      I don’t think Kingdom Come: Deliverance 2 has a native Linux version, so yes, these results are achieved using Proton.

      This is both a remarkable achievement for Valve, CodeWeavers, and everyone involved in the development of these compatibility tools, and a pretty damning showcase of Windows’ performance, even when configured as a somewhat debloated, special handheld version.

      • Aceticon@lemmy.dbzer0.com · 1 point · 2 hours ago

        In all fairness, opening a fucking text editor on Linux is way more performant than doing it on Windows 11, because the overhead of the OS and its standard service applications is so much lower on Linux than on Windows 11.

        Part of what makes the achievement of everybody working on all those compatibility tools so great is that they didn’t slack at all in making them as performant as possible, even though the lower overhead of the OS itself would likely have delivered a win even if they had slacked a bit.

  • NuXCOM_90Percent@lemmy.zip · 5 points · edited · 23 hours ago

    I don’t know Cyber Dopamine’s credentials and don’t care enough to look.

    But this has come up a lot in the past. And most of the time the difference between the OSes (assuming you turn off the most blatant of things) is noise that would disappear if your sample size were large enough; it’s comparable to the kind of variation you see when all software is fixed but you have two different CPUs of the same model from the same batch with slight manufacturing variance. Gamers Nexus and Level1Techs have both discussed this to varying degrees when going over their testing methodologies.
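
    To put numbers on “noise”: if run-to-run variance is a few percent, a handful of runs can’t resolve a 1-2% real difference. A toy simulation (made-up distributions, purely illustrative):

    ```python
    import random
    import statistics

    random.seed(42)

    def bench(mean_fps: float, noise_pct: float, runs: int) -> list[float]:
        # Made-up model: each run is the true mean plus Gaussian jitter.
        return [random.gauss(mean_fps, mean_fps * noise_pct / 100) for _ in range(runs)]

    os_a = bench(60.0, noise_pct=3.0, runs=5)
    os_b = bench(61.0, noise_pct=3.0, runs=5)  # truly ~1.7% faster

    for name, fps in (("OS A", os_a), ("OS B", os_b)):
        print(f"{name}: mean {statistics.mean(fps):.1f}, stdev {statistics.stdev(fps):.1f}")
    # With 5 runs and ~3% jitter, the real 1-2% gap is easily buried;
    # you need a much larger sample for the means to separate reliably.
    ```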

    But if you are just doing the first run or two? THAT is where the very noticeable differences come in. Because shader precaching is a thing. At a super high level, it is the idea that every computer is unique and there are tweakable “knobs” that can improve performance when rendering things. Shader Precaching generally refers to testing and setting those knobs ahead of time to maximize performance. That is a VERY incorrect summary but it gets the point across.

    Some games have an obnoxious 1-10 minute wait the first time you boot it up because they are, in the background, running through all of that to build that precache. Others say “fuck it, we’ll do it live” and you have a performance hit the first time you load an area or asset… that can be REALLY substantial on that first boot when EVERYTHING is being loaded for the first time.
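
    In toy form, the difference is just when the expensive compile happens. A hypothetical sketch (names invented; no real engine or driver is this simple):

    ```python
    import hashlib

    compiled = {}  # cache: shader source -> "compiled" blob

    def compile_shader(source: str) -> bytes:
        # Stand-in for the expensive, hardware-specific compile step.
        return hashlib.sha256(source.encode()).digest()

    def precache(all_shaders):
        """The 'obnoxious wait on first boot' approach: pay the cost up front."""
        for src in all_shaders:
            compiled[src] = compile_shader(src)

    def get_shader(source: str) -> bytes:
        """The 'we'll do it live' approach: compile on first use, i.e. stutter mid-game."""
        if source not in compiled:
            compiled[source] = compile_shader(source)
        return compiled[source]
    ```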

    But, because Linux tends to run games through (time to piss more people off) an emulator like Wine/Proton, it is already generally doing a translation layer even before the shader math. So that is why Steam will often push “good enough” defaults to Linux clients every day, and there is even a setting to optimize those for your system in the background while you whack it to your favorite OnlyFans account.

    But… that also gives Linux a pretty big advantage on that first boot on games that are doing it live. Because… we aren’t doing it live.

    It might have been Phoronix that actually did a deep dive on this? But the various blog sites love to either run their own numbers or find a YouTuber who did, not take this into account, and get those juicy clicks.

    • TheRealKuni@piefed.social · 11 points · 22 hours ago

      because Linux tends to run games through (time to piss more people off) an emulator like Wine/Proton

      It’s not that this will piss people off; it’s that it’s factually inaccurate. Wine Is Not (an) Emulator. It’s right there in the name. It’s a compatibility layer. You aren’t emulating Windows.

      (But this nitpicking doesn’t have any impact on your point overall regarding shader pre-caching.)

    • nyankas@lemmy.ml · 4 points · 21 hours ago

      While I agree that more extensive tests would be interesting, there are quite a few inaccuracies in your argument that make me disagree with your conclusion.

      Firstly, Steam’s shader pre-caching does not work like that. There are no “good enough” defaults being generated every day. The caches you may receive have been generated and uploaded by other players with the same compatible system configuration (I think GPU and driver version, but there might be other factors I’m not aware of). So you only receive shader caches when someone else with a compatible system has uploaded them already. Furthermore, these caches aren’t necessarily complete. As games often compile shaders on the fly and not everyone plays every bit of content in every game, local compilation still happens pretty regularly. (There’s a toy sketch of how that matching might look at the end of this comment.)

      Secondly, shader compilation is by no means limited to the first launch of the game. When the game itself doesn’t have a dedicated (and comprehensive) shader compilation step at startup, shaders may have to be compiled every time you access content in a game that you haven’t seen before. Sure, this is most noticeable at the start of the game, but it might also happen when the final boss enters its sparkly third phase, punching your stuttering, shader-compiling butt into the next wall (which might also require further shader compilation).

      Lastly, I disagree with the general assessment that these advantages are somehow invalid because they might be limited to shader compilation stutters. When I download a new game, I want it to run properly. The shared precompiled shaders are a great solution for a bunch of stuttering, and they are only available on Linux because they actually require the compatibility layers. Microsoft hasn’t implemented something similar for their new shiny handheld. So I think it’s absolutely legitimate to call them out on their unoptimized device. I think, for the end user, it’s really just semantics to discuss whether the issue at its core exists because of Linux or because of Windows or because of caching or because of some unknown deity who just really likes Gaben. The fact is: from the data we have so far, it looks like Bazzite is a better experience than Windows on a handheld Microsoft co-developed. And that’s both very sad (because I’d really like to have some serious Steam Deck competition) and really funny (because seeing a very big corporation’s gaming division fumble everything they do makes me giggle).

      … Also, Wine is not an emulator, but I think you know that. ;)
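
      P.S. Conceptually, the sharing I mentioned is just a lookup keyed on whatever makes a cache reusable. A toy sketch (the key fields are my guess; Valve’s actual matching criteria aren’t public):

      ```python
      from dataclasses import dataclass
      from typing import Optional

      @dataclass(frozen=True)
      class CacheKey:
          # Illustrative fields only, not Valve's real schema.
          app_id: int
          gpu_model: str
          driver_version: str

      shared_caches: dict = {}  # CacheKey -> precompiled shader blob

      def fetch_precompiled(key: CacheKey) -> Optional[bytes]:
          # Hit: someone with a compatible system already uploaded a cache.
          # Miss: fall back to local, on-the-fly compilation (the stutters).
          return shared_caches.get(key)
      ```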

      • Aceticon@lemmy.dbzer0.com · 1 point · 1 hour ago

        In some situations shader pre-caching makes things worse rather than better: for example, on my machine Borderlands 2 would take 10 minutes to update shaders at pretty much every game start, even though I have a Gbps Internet connection.

        Eventually it turned out that you really shouldn’t be running “Latest Proton” with it, as any update to Proton would trigger a full update of the shader cache (or worse, local generation, which on my machine took hours). Of course, information about that shit was nowhere to be found, nor was the default configuration of that game under Linux set up to just run the game with a specific Proton version.

        Switching shader pre-caching off also solved the problem, but to avoid the situation you described, where shader translation at the time a shader is first used causes unexpected slowdowns at a bad time, once I figured out that it was “Latest Proton” triggering the full shader cache downloads, I switched it all to shader pre-caching with a specific, fixed Proton version.

        All this to say that the way Steam does shader pre-caching isn’t a silver bullet: for some games it makes them near unplayable by default until you figure out the specific configuration changes needed (and, with at best many minutes before each game start actually succeeds, trial and error is a very slow and frustrating way to figure out what’s going on and how to fix it).

      • NuXCOM_90Percent@lemmy.zip · 3 points · 21 hours ago

        not everyone plays every bit of content in every game, local compilation still happens pretty regularly.

        Yes. Which is why I emphasized that this mostly comes into play on the first boot of a game and would go away VERY rapidly if you ran multiple collections.

        It is not limited to the first launch but it is most noticeable on it.

        The shared precompiled shaders are a great solution for a bunch of stuttering, and they are only available on Linux because they actually require the compatibility layers.

        Funny enough, MS has been talking about (and implementing?) that for their Xbox store games or whatever. Which is actually a REALLY good approach, if you ask me.

        The fact is: from the data we have so far, it looks like Bazzite is a better experience than Windows on a handheld Microsoft co-developed.

        No, we don’t. Which is my point.

        Again, I would love to see a rigorous methodology that argues this (and if I’m just not familiar with Cyber Dopamine’s game, please, someone link me to their methodology). But I have a scientific background: I can cherry-pick whatever you pay me to cherry-pick.

        And, quite honestly? If the performance difference is mostly noise outside of that first boot after a major patch? That might be awesome if I am paying 80 bucks to play a 4-hour CoD campaign once. That is less than meaningless if I am looking at something like Crusader Kings 3, where I will log hundreds of hours before the next one.

        Which gets back to… what is the point of looking at benchmarks? Is it to feel happy that your team won? Okay. You do you. Or is it to make informed decisions?

        And as someone who would really prefer Linux (Proton or otherwise) to be a first-class citizen? Actively misleading people is just going to hurt even in the medium run.
        … Also, Wine is not an emulator, but I think you know that. ;)

        My inner pedant thinks that it is worth actually looking at the definition of an emulator rather than the title of the software. But mostly it is just a good way to weed out the people who were never going to engage in a good-faith discussion :)

        • Dudewitbow@lemmy.zip · 1 point · 16 hours ago

          On the Wine part: it’s more or less based on the end goal of the software. Traditional game emulation is software that tries to mimic the hardware in question. Wine focuses on trying to be Windows, rather than the physical PC hardware itself, so the type of thing you’re trying to emulate is fundamentally different.

          Emulation in general is functionally a compatibility layer. Wine sits at the highest level of compatibility layer because it’s not trying to recreate what the hardware is doing (as the hardware is the same on both sides).

          Wine is at the same level as TeknoParrot (a compatibility layer for arcade machines that use x86-based PCs for hardware), versus MAME, which emulates non-x86-based arcade machines.

          It gets mega pedantic because of how closely emulation and compatibility layers are intertwined, but they aren’t the same circle on the Venn diagram: a compatibility layer can use emulation, but doesn’t require it.

          I just like defining the difference between the two simply by the goal: emulation targets the machine to then get to the software, whereas compatibility layers target the software directly and will target hardware only when necessary.
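
          A toy way to see the distinction: a compatibility layer re-implements the API the program calls, straight on top of the host, and no hardware is modeled anywhere. Hypothetical sketch, nothing like Wine’s actual code:

          ```python
          import os

          # Real Win32 access-mask constants, but the translation below is a toy.
          GENERIC_READ = 0x80000000
          GENERIC_WRITE = 0x40000000

          def CreateFileA(path: str, desired_access: int) -> int:
              """Toy stand-in for the Win32 call, translated straight to POSIX open().

              The program's x86 code runs natively; only the API is translated.
              An emulator would instead interpret machine code to mimic foreign hardware.
              """
              if desired_access & GENERIC_WRITE:
                  flags = os.O_RDWR | os.O_CREAT
              elif desired_access & GENERIC_READ:
                  flags = os.O_RDONLY
              else:
                  raise ValueError("unsupported access mask")
              return os.open(path, flags)
          ```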

      • NuXCOM_90Percent@lemmy.zip · 3 points · edited · 22 hours ago

        But they also have nothing to do with moment-to-moment performance. And I would argue that people, generally, aren’t pulling out a Steam Deck or a 1000 USD Xbox to play a quick hand of Balatro while in line at the bank. I love that waking up my Bazzite box under my TV actually takes less time than the Nvidia Shield I watch YouTube on. I am not going to pretend that there is a meaningful difference between a 20-second boot and a 2-minute boot if I am sitting down to get angry at Silksong again. Same with my Steam Deck, where I usually push the power button and then put the case back in my bag on a plane, or off to the side while I get cozy at home.

        It’s similar to the initial OS install. I will never stop being enraged when I have to deal with Cortana and all that bullshit to install Windows on a box. I positively love that most Linux distros are like 3 clicks and then a two-minute wait if I am not doing weird partition shenanigans. But that is more a plus for the OS and not a reason I would upend my digital existence.

        Whereas this article is very specifically about moment-to-moment framerate.