• fuckwit_mcbumcrumble@lemmy.world

    That particular person appears to believe that upscalers, such as DLSS and FSR, are “basic features”

    Do people really like DLSS/FSR so much that they consider it a basic feature? I can’t stand the look of it, and I’d rather just actually run at a lower resolution since it ends up looking better.

    • qwertyqwertyqwerty

      “Like” it? No, but it runs way better, and if you are using a high-resolution display, the quality upscaling modes are pretty decent in most games unless you are pixel peeping. I’d rather get 90+ fps with a ~5 percent decrease in visual quality from FSR3/DLSS3 than ~45 fps at native resolution.

      • fuckwit_mcbumcrumble@lemmy.world

        My qualm is all of the visual artifacting I see. Maybe it’s just the games I play, but there are some pretty bad graphical glitches that bother me, and the frame timing seems off or something, because it makes the game feel less smooth. Part of the smoothness issue is probably the relatively weak CPU in my laptop, but even on my desktop the frame pacing doesn’t feel the same as native.

        • Coelacanth@feddit.nu

           I think preferring a lower-than-native resolution over DLSS as a blanket statement is a bit of a wild take, but there can definitely be problems like artifacts, especially in certain games. As an example, I’m playing RDR2 at the moment and the TAA (which is forced to High with DLSS) is poorly implemented and causes flickering, which is definitely annoying. I played Alan Wake 2 on an older laptop that barely ran it and I definitely noticed artifacting from DLSS there, though in fairness I was demanding a lot from that machine by forcing it to run AW2.

           Frame time will of course be impacted, so if you’re playing something really fast and twitchy you should probably stay away from DLSS. It’s also less of an issue if you don’t enable Frame Generation. Finally, the input lag from both DLSS and Frame Generation seems to scale with your baseline FPS: using it to try to reach 60+ FPS will usually mean some input lag, while using it when you’re already at ~60 FPS to get 80-100 or whatever means less noticeable input lag.
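
           To put rough numbers on that last point, here's a quick illustrative sketch (the "one extra base frame of delay" for Frame Generation is an assumption for illustration, not a measurement):

           ```python
           # Illustrative sketch: why Frame Generation input lag feels worse at a low
           # baseline framerate. Assumes interpolation holds roughly one extra "real"
           # frame before presenting, which is a simplification.

           def base_frame_time_ms(fps: float) -> float:
               return 1000.0 / fps

           def framegen_added_latency_ms(base_fps: float) -> float:
               # Assumption: ~1 extra base frame of delay for interpolation.
               return base_frame_time_ms(base_fps)

           for base_fps in (30, 45, 60, 90):
               added = framegen_added_latency_ms(base_fps)
               print(f"baseline {base_fps:>2} fps -> ~{added:.0f} ms extra input lag")
           ```

           The lower your starting framerate, the bigger that penalty, which is why it feels a lot better when you're already near 60 FPS before turning it on.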

          • Whirlybird@aussie.zone

             Finally, the input lag from both DLSS and Frame Generation seems to scale with your baseline FPS: using it to try to reach 60+ FPS will usually mean some input lag, while using it when you’re already at ~60 FPS to get 80-100 or whatever means less noticeable input lag.

            In most cases DLSS actually reduces your input lag because you’re getting a higher framerate. Not sure what you’re talking about.

            https://youtu.be/osLDDl3HLQQ?t=219
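
             The arithmetic behind that is straightforward (the framerates below are made-up examples, and this is upscaling only, no Frame Generation):

             ```python
             # Hypothetical example numbers: DLSS upscaling raises the framerate, and
             # input lag roughly tracks frame time, so it tends to go down, not up.

             def frame_time_ms(fps: float) -> float:
                 return 1000.0 / fps

             native_fps, dlss_fps = 45, 72  # assumed framerates for illustration
             print(f"native ~{native_fps} fps: {frame_time_ms(native_fps):.1f} ms per frame")
             print(f"DLSS   ~{dlss_fps} fps: {frame_time_ms(dlss_fps):.1f} ms per frame")
             ```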

                • Coelacanth@feddit.nu

                   I don’t think frame generation is outright crap; it’s still free frames, it’s just only really useful when you’re already at a solid frame rate.

                  • Whirlybird@aussie.zone

                     It’s not really free frames though. It’s no different to the motion interpolation on TVs that makes everything look like a soap opera. The game is still playing at the lower framerate, and the disconnect between your input responding at one framerate and what you’re seeing appearing to run at another is going to feel “off”.
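
                     A rough sketch of that disconnect, assuming one generated frame is inserted between each pair of rendered frames (the numbers are purely illustrative):

                     ```python
                     # Illustrative timeline: the presented framerate doubles, but input is
                     # only sampled on the rendered frames, which is the "off" feeling above.

                     base_fps = 40
                     base_dt = 1000.0 / base_fps  # ms between rendered (input-sampling) frames

                     for i in range(4):
                         t = i * base_dt
                         print(f"{t:6.1f} ms  rendered  (samples your input)")
                         print(f"{t + base_dt / 2:6.1f} ms  generated (no input sampled)")
                     ```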

        • zipzoopaboop@lemmy.world

           I’ve had a horrible experience with FSR, but with DLSS I haven’t noticed a single issue and I always turn it on.

        • Whirlybird@aussie.zone

           DLSS has nothing to do with the frame timing. DLSS also has very few, if any, visible “graphical glitches”.

    • Whirlybird@aussie.zone

      Do people really like DLSS/FSR so much that they consider it a basic feature?

      Absolutely, at least DLSS. DLSS is a game changer and a godsend. It can actually look better than native resolution while giving you massive performance increases. At worst it looks basically the same as native while still giving you a massive performance boost. I’ve got a Ryzen 7/3070 PC and I’ll use DLSS everywhere it’s available, as it’s basically just free frames.

      The same can’t be said for FSR, however; it’s trash.

      I can’t stand the look of it and I’d rather just actually run at a lower resolution since it ends up looking better.

      DLSS rarely, if ever, looks worse.

      • Coelacanth@feddit.nu

        The wildest thing is combining DLSS with DLDSR if you’re running, say, a 1440p system like I am. Set your monitor to the 1.78x DLDSR resolution, run your game at 3413x1920, and enable DLSS Quality. In the end you render at 2275x1280 but end up with way better image quality than native, and the upscaling + downsampling ends up being a great anti-aliasing method since it sorts out a lot of the bad TAA blur.
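
        For anyone who wants to sanity-check those numbers, here's the arithmetic as a quick sketch (assuming DLDSR factors multiply total pixel count and DLSS Quality renders at roughly 2/3 of the output resolution per axis):

        ```python
        # Resolution arithmetic for the DLDSR 1.78x + DLSS Quality combo described
        # above, on a 2560x1440 monitor. Assumptions: DLDSR factors scale total pixel
        # count; DLSS Quality renders at ~2/3 of the output resolution on each axis.

        native_w, native_h = 2560, 1440    # 1440p monitor
        output_w, output_h = 3413, 1920    # DLDSR 1.78x target resolution

        pixel_factor = (output_w * output_h) / (native_w * native_h)
        print(f"DLDSR pixel-count factor: {pixel_factor:.2f}x")          # ~1.78x

        quality_scale = 2 / 3              # per-axis render scale for DLSS Quality
        render_w = round(output_w * quality_scale)
        render_h = round(output_h * quality_scale)
        print(f"DLSS Quality render resolution: {render_w}x{render_h}")  # 2275x1280
        ```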