[Image: Nvidia’s promo screenshot from the official DLSS blog post, showing Grace’s AI-filtered face]

I’m completely speechless. This looks so terrible I thought it was a joke, but apparently Nvidia released these demos to impress people. DLSS 5 runs the entire game through an AI filter, making every character look like they’ve been run through an ultra-realistic beauty filter.

The photo above is used as the promo image for the official blog post by the way. It completely ignores artistic intent and makes Grace’s face look “sexier” because apparently that’s what realism looks like now.

I wouldn’t be so baffled if this were some experimental setting they were testing, but they’re advertising it as the next generation of DLSS. As in, this is their image of what the future of gaming should be. A massive F U to every artist in the industry. Well done, Nvidia.

  • joelfromaus@aussie.zone · 5 points · 21 hours ago

    First off: downvoted for a lukewarm opinion? Come on, Lemmy, be better.

    I’ve thought about this subject a lot, and I think it boils down to whether someone was raised on movies (typically 24 fps) or video games (typically 60 fps).

    For me, movies look like a jittery mess. I have two TVs; the motion smoothing on one is very good, but I’ve never been able to get it just right on the other. They’re the same brand of TV, just a decade apart.
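
    For anyone wondering why 24 fps looks jittery on a 60 Hz panel in the first place: 60 doesn’t divide evenly by 24, so without interpolation the TV has to hold alternate film frames for 3 and 2 refresh cycles (3:2 pulldown), and motion stutters. A rough back-of-the-envelope sketch of the arithmetic, not any TV’s actual logic:

    ```python
    # 3:2 pulldown: 24 fps film frames mapped onto a 60 Hz panel.
    # Each film frame "should" be on screen for 1/24 s (~41.7 ms), but the
    # panel can only hold a frame for whole multiples of 1/60 s, so hold
    # times alternate between 3 and 2 refresh cycles.
    REFRESH_S = 1 / 60  # one 60 Hz refresh cycle, in seconds

    for frame, cycles in enumerate([3, 2, 3, 2]):
        print(f"film frame {frame}: {cycles} cycles = {cycles * REFRESH_S * 1000:.1f} ms")
    # Prints 50.0 / 33.3 / 50.0 / 33.3 ms instead of a uniform 41.7 ms --
    # that alternation is the judder that motion smoothing tries to hide.
    ```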

    • brucethemoose@lemmy.world · 3 points · 21 hours ago (edited)

      Yeah, the ASICs in newer TVs are crazy powerful, and crazy good at it. They’re nothing like what you’d find in a phone or even a PC, and even a one-generation jump for our Sony TVs was an improvement.

      That’s what I was trying to emphasize. I think interpolation on old TVs, and maybe early versions of SVP, left a bad taste in people’s mouths. Kind of like fake HDR.
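
      To make the difference concrete: the crudest interpolation just blends adjacent frames, which turns anything in motion into a double exposure. A minimal sketch of that naive approach (the modern ASICs estimate motion vectors instead, which is the hard part):

      ```python
      # Naive frame interpolation: average two frames to synthesize the
      # in-between one. No motion estimation, so moving objects "ghost".
      import numpy as np

      def blend_midframe(a: np.ndarray, b: np.ndarray) -> np.ndarray:
          """Synthesize a middle frame by plain averaging."""
          return ((a.astype(np.float32) + b.astype(np.float32)) / 2).astype(np.uint8)

      # A bright pixel that moves from column 1 to column 3 between frames:
      a = np.zeros((1, 5), dtype=np.uint8); a[0, 1] = 255
      b = np.zeros((1, 5), dtype=np.uint8); b[0, 3] = 255
      print(blend_midframe(a, b))  # [[  0 127   0 127   0]]
      # Two half-bright ghosts instead of one pixel at column 2;
      # motion-compensated interpolation would place it at column 2.
      ```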


      …But I also think there’s a lot more sentiment against any kind of “processing” since the rise of AI slop.

      As an example I often cite, there was this old TV show I helped touch up for a “fan” release, a long time ago. One small component in a very long pipeline was a GAN upscaler… It worked fine. The original TV release was broken as hell, and people loved the improvement.
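
      For the curious, a single super-resolution stage in a restoration pipeline looks roughly like this. Purely an illustration using OpenCV’s dnn_superres module (needs opencv-contrib) with an ESPCN model standing in; the filenames are made up, and the upscaler we actually used was a GAN, which slots into a pipeline the same way:

      ```python
      # One stage of a frame-restoration pipeline: upscale each frame with
      # a pretrained super-resolution model. Filenames are hypothetical.
      import cv2

      sr = cv2.dnn_superres.DnnSuperResImpl_create()
      sr.readModel("ESPCN_x2.pb")  # pretrained 2x ESPCN weights (example file)
      sr.setModel("espcn", 2)      # model name and scale factor

      frame = cv2.imread("ep01_frame0001.png")       # decoded source frame
      upscaled = sr.upsample(frame)                  # 2x upscaled frame
      cv2.imwrite("ep01_frame0001_2x.png", upscaled)
      ```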

      Fast forward many years: I mention that this was used in the “remaster” still floating around, and the same subreddit goes ballistic. They literally did not believe me, or cooed about the “flaws” of the original, or called it slop, said it was against the rules, and wanted me banned.

      And I suspect frame interpolation and resolution scaling in other contexts get tossed in that same bucket. Not that I blame anyone. AI does suck.

      • joelfromaus@aussie.zone · 2 points · 21 hours ago

        Funnily enough, it’s actually the older of my two TVs that does it well. I think that marks a noticeable drop in product quality for that particular manufacturer. So it’s still the same idea, that worse hardware gives bad results, but it’s not limited to the age of the TV, just its component quality.

        • brucethemoose@lemmy.world · 2 points · 21 hours ago (edited)

          Oh yeah, definitely. Product lines enshittify.

          I just mean that, generally, if you compare a 2014 TV and a 2025 one, the experience of the old one is likely not representative of the new.