[Promo image from Nvidia’s official DLSS blog post]

I’m completely speechless. This looks so terrible I thought it was a joke, but apparently Nvidia released these demos to impress people. DLSS 5 runs the entire game through an AI filter, making every character look like it’s been run through an ultra-realistic beauty filter.

The photo above is the promo image from the official blog post, by the way. It completely ignores artistic intent and makes Grace’s face look “sexier,” because apparently that’s what realism looks like now.

I wouldn’t be so baffled if this were some experimental setting they were testing, but they’re advertising it as next-gen DLSS. As in, this is their image of what the future of gaming should be. A massive F U to every artist in the industry. Well done, Nvidia.

  • brucethemoose@lemmy.world · 18 hours ago

    Hard disagree on motion interpolation. Bad interpolation looks awful, of course, but when it’s good, it’s like night and day to my eyes, and every TV I’ve ever used can disable it.

    Sometimes you can’t disable “jitter reduction” or whatever that’s branded as, but that’s not the same thing.
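
    To make that concrete, here’s a toy sketch (my own illustration, not anything a TV actually runs) of the naive “blend” interpolation that gives the technique a bad name, as opposed to the motion-compensated interpolation the good ASICs do:

    ```python
    # Toy sketch of the crudest form of frame interpolation: a straight
    # 50/50 blend of two frames. Real TV ASICs do motion-compensated
    # interpolation instead; blending is the baseline that looks bad,
    # because moving edges ghost instead of moving.
    import numpy as np

    def blend_midframe(frame_a: np.ndarray, frame_b: np.ndarray) -> np.ndarray:
        """Fake an in-between frame by averaging two real ones."""
        mixed = (frame_a.astype(np.uint16) + frame_b.astype(np.uint16)) // 2
        return mixed.astype(np.uint8)

    # Two dummy 1080p RGB frames: all-black and all-white.
    a = np.zeros((1080, 1920, 3), dtype=np.uint8)
    b = np.full((1080, 1920, 3), 255, dtype=np.uint8)
    print(blend_midframe(a, b)[0, 0])  # [127 127 127]: a gray smear, not motion
    ```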

    • joelfromaus@aussie.zone · 14 hours ago

      First off: downvoted for a lukewarm opinion? Come on, Lemmy, be better.

      I’ve thought about this subject a lot, and I think it boils down to whether someone was raised on movies (specifically 24 fps) or video games (specifically 60 fps).

      For me, movies look like a jittery mess. I have two TVs; the motion smoothing on one is very good, but I’ve never been able to get it just right on the other. They’re the same brand of TV, just a decade apart.
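
      For what it’s worth, part of why 24 fps reads as jittery on a 60 Hz panel is the uneven frame-repeat cadence. A quick back-of-envelope sketch (my own illustration, not anything from a TV’s firmware):

      ```python
      # 60 Hz / 24 fps = 2.5 refreshes per film frame, so without
      # interpolation each frame is held for an uneven 3 or 2 panel
      # refreshes ("3:2 pulldown"), which shows up as judder in pans.
      import math

      def pulldown_cadence(film_fps: int = 24, panel_hz: int = 60, frames: int = 8):
          edges = [math.floor(i * panel_hz / film_fps) for i in range(frames + 1)]
          return [edges[i + 1] - edges[i] for i in range(frames)]

      print(pulldown_cadence())        # [2, 3, 2, 3, 2, 3, 2, 3] -> uneven holds
      print(pulldown_cadence(30, 60))  # [2, 2, 2, 2, 2, 2, 2, 2] -> even holds
      ```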

      • brucethemoose@lemmy.world · 14 hours ago

        Yeah, the ASICs in newer TVs are crazy powerful, and crazy good at it. They’re nothing like what you’d find in a phone or even a PC, and even a one-generation jump for our Sony TVs was an improvement.

        That’s what I was trying to emphasize. I think interpolation on old TVs, and maybe early versions of SVP, left a bad taste in people’s mouths. Kind of like fake HDR.

        …But I also think there’s a lot more sentiment against any kind of “processing” since the rise of AI slop.

        As an example I often cite: a long time ago, I helped touch up an old TV show for a “fan” release. One small component in a very long pipeline was a GAN upscaler… It worked fine. The original TV release was broken as hell, and people loved the improvement.

        Fast forward many years, and when I mention that this was used in the “remaster” still floating around, the same subreddit goes ballistic. They literally did not believe me, or cooed about the “flaws” of the original, or called it slop, said it was against the rules, and wanted me banned.

        And I suspect frame interpolation and resolution scaling in other contexts get tossed in that same bucket. Not that I blame anyone. AI does suck.
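
        Purely for illustration, this is roughly the shape of that one stage. Pillow’s Lanczos resampling stands in for the GAN model so the sketch stays self-contained, and the directory names are made up:

        ```python
        # Shape of one upscale stage in a frame-by-frame restoration
        # pipeline. Pillow's Lanczos resampling stands in for the GAN
        # model here; the directory names are placeholders.
        from pathlib import Path
        from PIL import Image

        def upscale_stage(src_dir: Path, dst_dir: Path, scale: int = 2) -> None:
            dst_dir.mkdir(parents=True, exist_ok=True)
            for frame in sorted(src_dir.glob("*.png")):
                img = Image.open(frame)
                size = (img.width * scale, img.height * scale)
                up = img.resize(size, Image.LANCZOS)
                up.save(dst_dir / frame.name)  # a GAN stage would run inference here

        upscale_stage(Path("frames_extracted"), Path("frames_upscaled"))
        ```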

        • joelfromaus@aussie.zone · 14 hours ago

          Funny enough, it’s actually the older of my two TVs that does it well. I think that marks a noticeable drop in product quality for that particular manufacturer. So it’s still the same idea, that worse hardware gives bad results, but it’s not limited to the age of the TV, just its component quality.

          • brucethemoose@lemmy.world · 13 hours ago

            Oh yeah, definitely. Product lines enshittify.

            I just mean that, generally, if you look at a 2014 TV and a 2025 one, the experience of the old one is likely not representative of the new.

    • gravitas_deficiency@sh.itjust.works · 17 hours ago

      It’s great for sports. And some sitcoms. And maybe news (but why are you even watching cable news these days?). That’s it.

      Persistence of vision serves a real purpose in cinematography. “Optimizing” it away is very literally a corruption of the art and a betrayal of the director’s and cameraman’s skill and intent.

      I’ll stick with my vintage 2010 Philips 55” plasma, thank you very much.

    • bridgeenjoyer@sh.itjust.works · 18 hours ago

      Yeah, sorry, I’m not into high-def TV myself. It looks awful unless all you watch is sports and brand-new Marvel movies (hard no).

      You may think you’re disabling it, until you compare it with another TV that actually does zero processing. Night and day.

      Same effect as me thinking, “huh, I guess the lag on my flat screen isn’t too bad for gaming,” then plugging into my CRT and, holy snap, the clarity and response precision. (Clarifying: this is with old and new consoles. Obviously anything with an analog output into a new TV is horrible without an upscaler, but even with a RetroTINK 2X it still sucks. You need to spend over $700 to make it look decent enough.)

      People don’t know what they took from us.
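
      As a rough illustration of where flat-screen lag comes from: every processing stage that buffers a whole frame before output adds one frame time of latency. The stages and buffer counts below are made-up, illustrative numbers, not measurements of any real TV:

      ```python
      # Each stage that buffers a full frame adds one frame time of lag.
      # The stage list and counts are illustrative, not measurements; a
      # CRT draws the signal as it arrives and buffers nothing.
      def frame_time_ms(refresh_hz: float) -> float:
          return 1000.0 / refresh_hz

      buffered_frames = {"deinterlacer": 1, "scaler": 1, "motion smoothing": 2}
      for stage, n in buffered_frames.items():
          print(f"{stage}: ~{n * frame_time_ms(60):.1f} ms")
      total = sum(buffered_frames.values()) * frame_time_ms(60)
      print(f"total added at 60 Hz: ~{total:.1f} ms")  # ~66.7 ms vs. a CRT
      ```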

      • brucethemoose@lemmy.world · 17 hours ago

        I have. I A/B test it all the time. I pause and pixel peep.

        And I don’t watch any sports, nor any marvel movies.

        “huh, I guess the lag on my flat screen isn’t too bad for gaming”

        I’ve had CRTs. And I have one of those “zero latency” overclocked LCD monitors with no internal scaler. As much as I like them, they feel sluggish compared to something newer.

        “Yeah, sorry, I’m not into high-def TV myself.”

        In that case, I suspect you haven’t tried it on more modern displays, or when it’s baked into transcoded footage with one of the better filters.

        Yes, it looks awful and artifacty when processed by older LCDs. But it looks really good these days.
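
        If anyone wants to try the baked-in route, FFmpeg’s minterpolate filter in motion-compensated (mci) mode is one freely available option; I’m citing it only as an example, not claiming it’s one of the best filters out there. A minimal sketch, assuming ffmpeg is on your PATH, with placeholder file names:

        ```python
        # Bake motion interpolation into a transcode with FFmpeg's
        # minterpolate filter in motion-compensated (mci) mode.
        # Assumes the ffmpeg binary is on PATH; file names are placeholders.
        import subprocess

        subprocess.run(
            [
                "ffmpeg", "-i", "input_24fps.mkv",
                "-vf", "minterpolate=fps=60:mi_mode=mci:mc_mode=aobmc:me_mode=bidir",
                "-c:a", "copy",
                "output_60fps.mkv",
            ],
            check=True,
        )
        ```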

        • bridgeenjoyer@sh.itjust.works · 16 hours ago

          Yeah, I’m not one to pay a lot for TVs. I’d like an OLED, but with those prices, I really have no need for it for gaming, and the TV I have is fine for normal watching.

          Also, isn’t it crazy how it’s taken this long for a display to be as good as a CRT (blacks- and response-time-wise)? Kind of the same thing with audio: how bad digital sounded originally, and how we’re only now fixing that with great DACs. Humans got it right the first time with tube amps and CRTs! Not to mention they’re repairable.

          • brucethemoose@lemmy.world · 15 hours ago

            “I’d like an OLED, but with those prices, I really have no need for it for gaming, and the TV I have is fine for normal watching.”

            That is entirely fair. Electronics are all crazy expensive, really.

            Yeah, LCDs went from bad to “mixed” and stayed that way for a long time. Granted, some things like absolute sharpness are not great on a CRT, but still.