• @RightHandOfIkaros@lemmy.world

    Saved you a click:

    Primarily, texture sizes have increased, texture counts have increased, audio quality has increased, and the number of audio files in a game has also typically increased.

    It’s not really a deadlines or optimization problem. Lossy compression always costs some fidelity, and many developers choose to compress as little as possible in order to keep fidelity as high as possible. Since RAM and storage capacities have increased, compressing everything at a heavy cost to fidelity is no longer the obvious tradeoff it once was. Developers don’t have to choose between voicing an entire game with nearly unintelligible voice compression or only voicing important cutscenes. They can voice the entire game with minimal compression at the cost of a bigger install size, which costs the developer nothing.
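    As a rough sketch of that tradeoff (all numbers below are illustrative assumptions, not figures from any particular game), here is what a few audio bitrates work out to for a dialogue-heavy title:

    ```python
    # Back-of-the-envelope install-size cost of voiced dialogue (illustrative numbers).
    HOURS_OF_DIALOGUE = 30                    # assume a dialogue-heavy RPG
    SECONDS = HOURS_OF_DIALOGUE * 3600

    bitrates_kbps = {
        "uncompressed 16-bit/48 kHz stereo PCM": 1536,
        "lightly compressed (~256 kbps)": 256,
        "heavily compressed (~32 kbps)": 32,
    }

    for label, kbps in bitrates_kbps.items():
        size_gb = kbps * 1000 / 8 * SECONDS / 1e9   # kbit/s -> bytes/s -> total GB
        print(f"{label}: ~{size_gb:.1f} GB")
    # -> roughly 20.7 GB vs 3.5 GB vs 0.4 GB
    ```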

    • Neshura

      It sort of is an optimization problem though, because excess textures and audio files could be split off into their own optional DLC packages (see the Age of Empires II high-res texture DLC and Steam’s language selection feature).

      The really big problem is people being saddled with 4K textures on 1080p monitors and 20 audio tracks for different languages when they only need one.
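      A minimal sketch of that idea (hypothetical pack names and sizes, not any real store’s API): only download the packs a given player actually needs.

      ```python
      # Pick asset packs based on the player's language and display (hypothetical packs).
      ALL_PACKS = {
          "base":        {"size_gb": 25, "required": True},
          "textures_4k": {"size_gb": 40, "required": False},
          "audio_en":    {"size_gb": 3,  "required": False},
          "audio_de":    {"size_gb": 3,  "required": False},
          "audio_ja":    {"size_gb": 3,  "required": False},
      }

      def packs_to_install(language: str, native_height: int) -> list[str]:
          selected = [name for name, pack in ALL_PACKS.items() if pack["required"]]
          selected.append(f"audio_{language}")
          if native_height >= 1440:            # only pull 4K textures for high-res displays
              selected.append("textures_4k")
          return selected

      print(packs_to_install("en", 1080))      # ['base', 'audio_en']
      print(packs_to_install("de", 2160))      # ['base', 'audio_de', 'textures_4k']
      ```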

      • @RightHandOfIkaros@lemmy.world

        I agree about the audio files for languages the player never uses, but 4K textures at a 1080p rendering resolution are not a problem.

        Texture map size depends primarily on how the UV maps of models make use of the texture, and on how close the camera gets to the objects using that texture on average. A large wall texture will show more noticeable detail at 4K than a distant tree in the skybox. The detail will be visible on the wall whether the player plays at 720p or 8K, depending on how close the camera gets to it. You may be fine with environments looking like they were made for the Nintendo 64’s 4 KB of texture RAM, but 1080p players still gain a massive improvement in graphics quality from 4K textures.
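        A quick texel-density check makes the point (wall size and camera distance below are made-up, illustrative numbers):

        ```python
        # How many texels of height does a wall texture need so it still looks sharp
        # at the closest the camera gets, even on a 1080p screen? (Illustrative numbers.)
        SCREEN_HEIGHT_PX = 1080
        WALL_HEIGHT_M = 4.0        # the whole wall is mapped onto one texture
        CLOSEST_VIEW_M = 1.0       # at closest approach, ~1 m of wall fills the screen height

        # To avoid visible blur, the texels in view should at least match the on-screen pixels.
        texels_needed = WALL_HEIGHT_M / CLOSEST_VIEW_M * SCREEN_HEIGHT_PX
        print(f"~{texels_needed:.0f} texels of height needed")   # ~4320 -> 4K territory, well past 1K/2K
        ```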

        • snooggums

          Being unable to skip installing (or later uninstall) the 4K textures tied to ultra/very high settings you will never use, so they just clutter up your storage space, is a problem. If they aren’t installed, the highest settings can simply be disabled until they are.

          A skybox using a 4K texture on low is fine; we are talking about the textures that are only used when the settings are set to 4K or ultra or whatever.
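          A minimal sketch of that suggestion (hypothetical on-disk layout, not any particular engine’s API): the Ultra option only appears once the optional hi-res pack is actually installed.

          ```python
          # Hide the Ultra texture setting when the optional hi-res pack
          # (hypothetical path below) isn't installed.
          from pathlib import Path

          HIRES_PACK = Path("game/packs/textures_4k")   # assumed optional-download location

          def available_texture_settings() -> list[str]:
              settings = ["Low", "Medium", "High"]
              if HIRES_PACK.exists():                   # Ultra only shows up if the pack is on disk
                  settings.append("Ultra")
              return settings

          print(available_texture_settings())
          ```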

      • @winterayars@sh.itjust.works

        And even lossy compression is not inherently bad. AAC is completely indistinguishable from lossless for most people and hardware setups, and very close even when it’s not, yet it uses a fraction of the space. (Not a comment on game dev practices, more a comment on compression.)
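        Rough numbers for a single 3-minute track (illustrative, and assuming 256 kbps AAC is transparent to the listener):

        ```python
        # Size of a 3-minute stereo track: uncompressed PCM vs a 256 kbps AAC encode.
        TRACK_SECONDS = 180
        PCM_KBPS = 16 * 48_000 * 2 / 1000     # 16-bit, 48 kHz, stereo -> 1536 kbps
        AAC_KBPS = 256                        # commonly treated as transparent

        pcm_mb = PCM_KBPS * 1000 / 8 * TRACK_SECONDS / 1e6
        aac_mb = AAC_KBPS * 1000 / 8 * TRACK_SECONDS / 1e6
        print(f"PCM: ~{pcm_mb:.0f} MB, AAC: ~{aac_mb:.0f} MB ({aac_mb / pcm_mb:.0%} of the size)")
        # -> PCM: ~35 MB, AAC: ~6 MB (17% of the size)
        ```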

        • @Katana314@lemmy.world

          I think I’ve been told that AAC takes just enough CPU to decode that developers don’t want it. Even that assessment could be wrong.

    • @tetris11@lemmy.ml

      This will get better as NN/AI chips become the norm in gaming: compression gains, on-the-fly generation of textures, voice generation when needed, etc.

      I envision a future dev using rough, shitty textures to conceptualise a game, and then an NN bringing it to life at runtime.

      You might even be able to load your own NN interpreter to make the world more cartoony, or change the intended setting entirely, or unlock the NSFW filter on the vanilla interpreter.