Trying it out in Shadows of Doubt right now, it took performance from an unstable 25-31 fps to 61-71 fps with it set to performance mode and 2x FPS. I don’t really notice any input lag.

It’s not on the Decky store yet, so you have to download the extension zip manually.

Here’s the extension’s GitHub page with full instructions and details.

Basically you’ll:

  1. Install the plugin. Once it’s on the Decky store you can install it from there, but in the meantime do this:

    • Download the .zip from the release page
    • In Game Mode, go to the settings cog in the top right of the Decky Loader tab
    • Enable Developer Options
    • In the new Developer tab, select “Install from zip”
    • Choose the “Lossless Scaling.zip” file you downloaded (likely in the Downloads folder)
    • If it does not show up, you may need to restart your device
  2. Purchase and install Lossless Scaling from Steam

  3. Open the plugin from the Decky menu

  4. Click “Install lsfg-vk” to automatically set up the compatibility layer

  5. Configure settings using the plugin’s UI controls:

    • Enable/disable LSFG
    • Set the FPS multiplier (2-4). Note: the higher the multiplier, the greater the input lag
    • Enable performance mode - reduces GPU load, which can sometimes significantly increase FPS gains
    • Adjust flow scale (0.25-1.0)
    • Toggle HDR mode
    • Toggle immediate mode (disable vsync)
  6. Apply launch commands to the game you want to use frame generation with:

    • Option 1 (Recommended): ~/lsfg %COMMAND% - uses your plugin configuration
    • Option 2: manual environment variables like ENABLE_LSFG=1 LSFG_MULTIPLIER=2 %COMMAND% (both options are sketched below)
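
For reference, here’s roughly what those look like in the game’s Properties → Launch Options. This is just a sketch using the two variables named above; the plugin’s other toggles (performance mode, flow scale, etc.) presumably map to their own variables - check the lsfg-vk README for the exact names:

  # Option 1 (recommended): the wrapper script the plugin installs,
  # which reads the settings you chose in the plugin UI
  ~/lsfg %COMMAND%

  # Option 2: set the environment variables by hand instead
  ENABLE_LSFG=1 LSFG_MULTIPLIER=2 %COMMAND%
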
  • xthexder@l.sw0.com · 32 minutes ago

    I was confused about what it meant by “Lossless” since it’s frame gen… there’s no compression or anything to lose; it’s starting from nothing.

    As far as I can tell it means nothing, it’s just the branding for the “Lossless Scaling” tool on Steam. There’s no new lossless algorithm involved here.

  • morgan423@lemmy.world · 2 hours ago

    So I’m 0-1 so far (Samsung-screened SD OLED). Tried Baldur’s Gate 3 with a large variety of settings, it either crashed upon boot or booted with no video.

    I know it’s a DX11 game so it rarely agrees with tools like this, but I was hoping, lol. If I try anything else, I’ll edit this same post so as not to take over the thread.

  • Drasglaf@sh.itjust.works · 3 hours ago

    I’ve tried it with 2 games on my Legion Go with CachyOS.

    Heaven’s Vault: it launched without forcing a particular version of Proton, but the plugin did nothing; same framerate with it on or off.

    Tacoma: I had to force GE-Proton for the game to run. The plugin did nothing; same framerate with it on or off.

    And yes, I’ve followed the instructions and put “~/lsfg %COMMAND%” in the game’s launch options. Not sure if I’m doing something wrong, or if it just doesn’t work with every game.

    • morgan423@lemmy.world · 2 hours ago

      Since it didn’t crash you, but didn’t seem to kick in, check that you’re not running fullscreen in your in-game settings. Go windowed or borderless instead.

      I have used it a handful of times on Windows and that was always a prereq to get it to do anything.
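
      Also worth double-checking that the wrapper the launch option points at actually exists and is executable (assuming the plugin installs it at ~/lsfg, as in the instructions above):

        # does the wrapper script exist, and is it executable?
        ls -l ~/lsfg

        # if it exists but isn’t executable, fix that
        chmod +x ~/lsfg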

  • morgan423@lemmy.world · 5 hours ago

    Thanks for the heads up!

    I have a couple of RPGs where I have zero concern about input lag, but they could definitely use a frame boost. I’m going to give this a try today. 😀

    • Fubarberry@sopuli.xyzOPM · 10 hours ago (edited)

      Different framegen techs have different requirements. Some, like DLSS and the newer FSR, require specific GPU hardware; some have to be built into the game specifically. Lossless is great because it works on most hardware and with most games.

      My understanding is that it works as part of the Vulkan pipeline, but I don’t have enough knowledge in that area to answer more accurately than that. This article discusses what the dev of lsfg-vk had to do to get lossless framegen working on Linux, and it gives some insight into how it works.
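
      If you want to poke at it yourself, and assuming lsfg-vk hooks in as a standard Vulkan layer (my assumption, not something the article states in those words), the stock Vulkan tooling can show whether the loader can see it:

        # implicit Vulkan layers are registered via JSON manifests;
        # user-level ones typically live here
        ls ~/.local/share/vulkan/implicit_layer.d/

        # vulkaninfo lists every layer the loader knows about
        vulkaninfo | grep -i lsfg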

    • kadu@lemmy.world · 18 hours ago

      Lossless Scaling’s implementation runs on leftover shader compute units, so it’s hardware-agnostic, but heavily limited in terms of latency and interpolation quality.

  • kadu@lemmy.world · 18 hours ago

    I love how, when Nvidia announced frame generation, Lemmy and Reddit were full of threads about how it was horrible and useless and the latency was too high.

    Then AMD and Lossless Scaling launch their own versions, with significantly higher latency, and suddenly it’s “great, you don’t even notice the artifacts, the latency isn’t that bad, trust me guys.”

    • Yttra@lemmy.world · 7 hours ago

      No, they’re still all awful, lossless scaling especially, but:

      a) Nvidia simps have fewer excuses to support Corpo #1 over Corpo #2 now and

      b) people using lossless scaling often have either weaker hardware or lower standards in the first place

      • kadu@lemmy.world · 7 hours ago

        No, they’re still all awful

        I don’t disagree, but that’s not my point at all

    • miss phant@lemmy.blahaj.zone · 14 hours ago (edited)

      Having it available as a technology is great; what sucks is how Nvidia marketed their new 50 series on the basis of triple fake frames due to a lack of actual hardware improvements. They literally claimed 5070 = 4090 performance.

        • miss phant@lemmy.blahaj.zone · 7 hours ago

          Single frame generation (2x FPS) was on the 4000 series, but the 5000 series added multi frame gen (3x/4x FPS).

          [image: Nvidia’s 5070 vs 4090 comparison]

          That’s how they justified the 5070 matching the 4090.

          • kadu@lemmy.world · 7 hours ago

            So what? I never said anything about multi frame generation.

            I’m talking about reception at launch on Reddit and Lemmy. This entire image is completely irrelevant to the point being made.

    • Fubarberry@sopuli.xyzOPM · 18 hours ago

      It depends on the game, the framegen tech, and your base fps.

      It can be a great way to squeeze more performance out of a game in some circumstances, but it’s a big problem when games like MH:Wilds rely on it to meet an acceptable fps at all.

      • kadu@lemmy.world · 18 hours ago

        I’m commenting on the contrast in attitude towards the feature when it’s made by Nvidia versus Lemmy’s beloved AMD, not really on whether the feature is actually useful.

        • MentalEdge@sopuli.xyz · 16 hours ago

          I don’t know about anyone else, but the reason I say stuff like “fake frames, real flames” about Nvidia is that they include framegen in their performance stats.

          As if it’s a tech that boosts actual performance. They use it to lie.

          When they say “this card is twice as powerful as last gen,” what they really mean is that it’s exactly the same, but with 2x framegen turned on. Never mind that there’s no reason you couldn’t do the same on the last-gen card.