• EnsignRedshirt [he/him]@hexbear.net · 17 points · 1 month ago

    I don’t like the “hardware is becoming more expensive so gaming is going to die” argument because it assumes that we need bleeding edge hardware to play games. The gaming hardware treadmill was only ever designed to make people spend more money on increasingly-diminishing gains instead of getting the most out of the hardware everyone already has.

    We’ve long since passed the point of diminishing returns on graphical performance. AAA games released today don’t look that much better than they did 5 years ago, and AAA is no longer the main driver of gameplay innovation. Many of the best games of the last few years don’t require high-performance hardware, leaning more on art, music, writing, controls, unique gameplay elements, etc.

    Point being, I think that hardware becoming more expensive just means that people will opt for cheaper hardware, or hold onto their existing hardware longer, which will force developers to design games that can run on mid-range hardware. There is no reason that gaming needs ever-increasing hardware performance to deliver great experiences. The gaming industry is in for a rough go as it adapts to very different circumstances, but gaming is going to be fine.

    • hotspur [he/him]@hexbear.net · 12 points · 1 month ago

      It’s hilarious: I spent a bucket of cash on a high-end graphics card a year and a half ago to upgrade, and have proceeded to play almost exclusively games that I could have run on my early-2000s computer. I feel silly about it, but basically I’m just saying you’re correct: you mostly don’t need the super cards for most stuff. You can certainly get by in most cases with a second- or third-tier card.

      • EnsignRedshirt [he/him]@hexbear.net · 4 points · 1 month ago

        I’m in a similar boat. Haven’t upgraded in 5 years or so and I haven’t felt the need. It would be nice to have smoother framerates or whatever, but there’s nothing I want to play that won’t run well enough on my machine. Even if I do want to upgrade, I don’t feel the pull towards the high-end hardware. I may just get the Steambox when it comes out and call that good enough.

        • hotspur [he/him]@hexbear.net · 2 points · 1 month ago

          No, you’re right. That was one of the reasons I did it: I had changed to a 4K screen, and unlike most things, that actually does use the graphics card more. But I suspect the games I’ve been playing would still be fine on my old card.

    • 9to5 [any, comrade/them]@hexbear.net · 8 points · 1 month ago (edited)

      You’re not exactly wrong. Cyberpunk came out in 2020, and now in 2026 it’s still one of the best-looking games (especially with a bit of modding). Truthfully, there are VERY few games that push the envelope graphically these days. The only upcoming releases I can think of are GTA 6 and Witcher 4, which are probably going to be pretty hardware-hungry if you want to play them on high settings. The funny thing is that Witcher 3 came out in 2015! And I think that’s still a solid baseline for high-end graphics. As you said, the returns on better and better graphics keep getting smaller.

      • EnsignRedshirt [he/him]@hexbear.net · 7 points · 1 month ago

        Cyberpunk is the clearest inflection point in the diminishing returns cycle. It still looks great, runs better today than it did when it came out, and there’s nothing that’s come out since that looks like anything but a very minor incremental improvement. If you have a machine that can play Cyberpunk, you can play pretty much anything worth playing today, and Cyberpunk has been out for over 5 years now.

        AAA gaming is cooked, not because hardware is getting too expensive, but because expensive hardware is becoming irrelevant. If developers can make good games for low-end PCs, and gamers are happy to buy those games, then there’s no problem here. The minimum hardware technology required for gaming may have become commodified.

        • 7bicycles [he/him]@hexbear.net · 2 points · 1 month ago

          I always differentiate this in my head between the “earliest proven example” and the general industry trend, and also between actual zero improvement and diminishing returns. When I think about it that way, it goes back much further.

          For the general industry trend, 2019 is the point of actual zero improvement. Things like CONTROL, Metro Exodus, and the RE2Make look entirely indistinguishable from things released today. But then so do MGS V and even Ground Zeroes, all the way back from a decade ago.

          The point of diminishing returns sets in by 2007 with Crysis, though. Sure, it was such a resource hog that nobody could run it at full fidelity, but that’s laughable now, and every bell and whistle added since doesn’t really do anything. Everything from 2019 could look like Crysis and there would be no difference at all. By 2010 the industry had basically caught up: Bad Company 2 doesn’t even look meaningfully different from 6, and Red Dead Redemption 2 manages to differentiate itself from 1 mostly by using the better part of a decade to get horse balls.

          • EnsignRedshirt [he/him]@hexbear.net · 2 points · 1 month ago

            You’re totally right, it was Crysis. By the 2010s all games were basically on the same level, and we didn’t see any real qualitative leaps. Not at all coincidentally, that’s also when we started to see lo-fi indie games find mainstream success.

            • chgxvjh [he/him, comrade/them]@hexbear.net · 2 points · 1 month ago (edited)

              By 2010 people were already talking about the end of Moore’s law, and indeed we only got ~100 times more transistors in top-of-the-line graphics processors compared to 2007; it should have been 512 times according to Moore’s law.

              Price went up 10x, or 6x when you account for CPI inflation, when you compare the 8800GT to the RTX5090.
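
              The arithmetic above can be sanity-checked with a few lines of Python. The ~100x observed transistor growth and the 10x nominal price ratio are the figures from this comment; the CPI factor is a rough assumption, not an official number:

```python
# Moore's law: transistor count doubles roughly every 2 years.
years = 2025 - 2007                  # 8800GT era -> RTX 5090 era
expected_growth = 2 ** (years / 2)   # 2^9 = 512x predicted
observed_growth = 100                # ~100x, per the comment above
print(expected_growth)                        # 512.0
print(expected_growth / observed_growth)      # ~5x shortfall vs prediction

# Price check: 10x nominal, deflated by rough 2007->2025 US CPI (assumption).
nominal_price_ratio = 10
cpi_factor = 1.55                    # approximate cumulative inflation
print(round(nominal_price_ratio / cpi_factor, 1))   # ~6.5x in real terms
```

              So the comment’s “should have been 512x” and “10x nominal, ~6x real” both hold up under the stated doubling period and a mid-1.5x inflation factor.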

              A fat chunk of that performance improvement went towards increasing display resolution. Personally I’m good with Full HD gaming on desktop, but I really wouldn’t want to go back to SD or VGA resolutions.