• Diplomjodler@lemmy.world
    link
    fedilink
    arrow-up
    200
    ·
    7 months ago

    Planned obsolescence is one of the major engines that keep our current system of oligarchic hypercapitalism alive. Won’t anybody think of the poor oligarchs?!?

  • huginn@feddit.it
    link
    fedilink
    arrow-up
    145
    ·
    7 months ago

    Resources are just way cheaper than developers.

    It’s a lot cheaper to have double the ram than it is to pay for someone to optimize your code.

    And if you’re working with code that requires that level of resource optimization, you’ll invariably end up with low-level libraries that are hard to maintain.

    … But fuck the Always on internet connection and DRM for sure.

    • rbn@sopuli.xyz
      link
      fedilink
      arrow-up
      114
      ·
      7 months ago

      If you consider only the RAM on the developers’ PCs, maybe. If you factor in thousands of customer PCs, then optimizing the code outperforms hardware upgrades pretty fast. And if millions have to buy new hardware because of a new Windows feature, that’s pretty disastrous from a sustainability point of view.

      • huginn@feddit.it
        link
        fedilink
        arrow-up
        18
        ·
        7 months ago

        Last time I checked - your personal computer wasn’t a company cost.

        Until it is, nothing changes - and to be totally frank, the last thing I want is to be on a corporate machine at home.

        • CosmicTurtle0@lemmy.dbzer0.com
          link
          fedilink
          English
          arrow-up
          10
          ·
          7 months ago

          When I was last looking for a fully remote job, a lot of companies gave you a “technology allowance” every few years where they give you money to buy a computer/laptop. You could buy whatever you wanted but you had that fixed allowance. The computer belonged to you and you connected to their virtual desktops for work.

          Honestly, I see more companies going in this direction. My work laptop has an i7 and 16GB of RAM. All I do is use Chrome.

          • huginn@feddit.it
            link
            fedilink
            arrow-up
            10
            ·
            7 months ago

            It’d be nice to have that - yeah. My company issued me a laptop that only had 16gb of RAM to try and build Android projects.

            Idk if you know Gradle builds, but a multi-module project regularly consumes 20+ GB of RAM during a build. Even though the cost difference would have paid for itself in productivity gains within a month, it took 6 months and a lot of fighting to get a 32 GB laptop.

            My builds immediately went from 8-15 minutes down to 1-4.

            • CosmicTurtle0@lemmy.dbzer0.com
              link
              fedilink
              English
              arrow-up
              1
              ·
              7 months ago

              I always felt that this is where cloud computing should be. If you’re not building all the time, then 32GB is overkill.

              I know most editing and rendering of TV shows happens on someone’s computer and not in the cloud, but wouldn’t it be more efficient to push the work to the cloud, where you can create instances with a ton of RAM?

              I have to believe this is a thing. If it isn’t, someone should take my idea and then give me a slice.

              • huginn@feddit.it
                link
                fedilink
                arrow-up
                5
                ·
                7 months ago

                It’s how big orgs like Google do it, sure. Working there I had 192gb of ram on my cloudtop.

                That’s not exactly reducing the total spend on dev RAM though - quite the opposite. It’s giving devs access to more RAM than you could ever fit in their machines.

                But you can’t have it both ways: you can’t bitch and moan about “always on internet connections” and simultaneously push for an always on internet connected IDE to do your builds.

                I want to be able to work offline whenever I need to. That’s not possible if my resource starved terminal requires an Internet connection to run.

                Ram is dirt cheap and only getting cheaper.

          • Ardyssian@sh.itjust.works
            link
            fedilink
            arrow-up
            3
            ·
            7 months ago

            Alternatively they could just use Windows VDI and give you a card + card reader that allows Remote Desktop Connection to avoid this hardware cost, like what my company is doing. Sigh

            • cmnybo@discuss.tchncs.de
              link
              fedilink
              English
              arrow-up
              2
              ·
              7 months ago

              If the job is fully remote, then the workers could be living on the other side of the country. Using remote desktop with 100ms of latency is not fun.

        • trollbearpig@lemmy.world
          link
          fedilink
          arrow-up
          5
          ·
          edit-2
          7 months ago

          Or maybe you could actually read the comment you are replying to instead of being so confrontational? They are literally making the same point you are making, except somehow you sound dismissive, like we just need to take it.

          In case you missed it, they were literally saying that because the real cost of running software (like the AI Recall bullshit) is externalized to consumers, companies don’t give a shit about fixing this. Literally the same thing you are saying. And this means that we all, as a society, are just wasting a fuck ton of resources. But capitalism is so efficient hahaha.

          But come on man, you really think that the only option is for us to run corporate machines in our homes? I don’t know if I should feel sorry about your lack of imagination, or if you are trying to strawman us here. I’m going to assume lack of imagination, don’t assume malice and all that.

          That’s what simple legislation could do. For example, let’s say I buy a cellphone/computer, then buy an app/program for that device, and the device has the required specifications to run the software. The company that sold me that software should be obligated by law to give me a version of the software that runs on my machine forever. This is not a lot to ask for; it’s literally how software worked before the internet.

          But now, under the cover of security and convenience, this is all out the window. Each new Windows/macOS/iOS/Android/Adobe/fucking anything update asks for more and more hardware while delivering little to no meaningful new functionality. So we need to keep upgrading and upgrading, and spending and spending.

          But this is not a given; we can do better with very small sacrifices.

      • kibiz0r@midwest.social
        link
        fedilink
        English
        arrow-up
        4
        ·
        7 months ago

        As a developer, my default definition of “slow” is whether it’s slow on my machine. Not ideal, but chimp brain do chimp brain things. My eyes see my own screen all day, not yours.

    • 2xsaiko@discuss.tchncs.de
      link
      fedilink
      arrow-up
      6
      ·
      7 months ago

      You can also build a chair out of shitty plywood that falls apart when someone who weighs a bit more sits on it, instead of quality cut wood. I mean, fine if you want to make a bad product but then you’re making a bad product.

      • huginn@feddit.it
        link
        fedilink
        arrow-up
        8
        ·
        edit-2
        7 months ago

        Resource optimization has nothing to do with product quality. Really good experiences can ship with shitty resource consumption, and really bad experiences can be blisteringly fast and well optimized.

        The reason programmers work in increasingly abstract languages is to do more with less effort at the cost of less efficient resource utilization.

        Rollercoaster Tycoon was written in assembly. Slay the Spire was written in Java. They’re both excellent games.

        • 2xsaiko@discuss.tchncs.de
          link
          fedilink
          arrow-up
          3
          ·
          7 months ago

          Yeah, I don’t really have a problem with games except for the stuff added on purpose just to make the user experience worse like DRM. I was more thinking about trends like using Electron for desktop development.

    • mPony@lemmy.world
      link
      fedilink
      arrow-up
      4
      ·
      7 months ago

      It’s a lot cheaper to have double the ram

      yeah a lot cheaper to force someone else to buy double the RAM. No thanks.

      • huginn@feddit.it
        link
        fedilink
        arrow-up
        5
        ·
        7 months ago

        Companies don’t pay for your 2x RAM and it doesn’t slow down their user acquisition so they don’t care.

      • huginn@feddit.it
        link
        fedilink
        arrow-up
        4
        ·
        7 months ago

        Companies own the code you write.

        It’s not your code if you’re working for a corp - it’s theirs.

          • huginn@feddit.it
            link
            fedilink
            arrow-up
            3
            ·
            7 months ago

            Psychopath

            Just because you don’t own something doesn’t mean you should trash it.

            First you insist that companies don’t own the code then you say if you don’t own it you don’t have to care.

            God I hope I never work with an idiot like you.

            • jaybone@lemmy.world
              link
              fedilink
              arrow-up
              3
              ·
              7 months ago

              I don’t think I’m saying either of those things. I’m saying the opposite actually.

              You seem to be suggesting that even though you are responsible for writing code, the company should hire someone else to optimize it for you.

              • huginn@feddit.it
                link
                fedilink
                arrow-up
                2
                ·
                7 months ago

                Resources are just way cheaper than developers.

                It’s a lot cheaper to have double the ram than it is to pay for someone to optimize your code.

                I don’t see where you’re reading that idea.

                It’s a lot cheaper to double the ram ergo you do not have to pay someone to optimize your code.

                Where are you getting this bizarre inverse from?

  • FlapJackFlapper@lemm.ee
    link
    fedilink
    arrow-up
    102
    ·
    7 months ago

    Reminds me of a funny story I heard Tom Petty once tell. Apparently, he had a buddy with a POS car and a crappy stereo, and Tom insisted that all his records had to be mixed and mastered not so that they sounded great on the studio’s million-dollar equipment, but so that they sounded great in his friend’s car.

    • Flying Squid@lemmy.world
      link
      fedilink
      arrow-up
      42
      ·
      7 months ago

      That’s how my professors instructed me to mix. To make it sound as good on shitty speakers as possible and also sound good on expensive systems.

    • tfw_no_toiletpaper@lemmy.world
      link
      fedilink
      arrow-up
      37
      ·
      edit-2
      7 months ago

      Reminds me of the ass audio mixing in movies, where it’s only enjoyable in a 7.1 cinema or your rich friend’s home theater, but not on your own setup.

      • corsicanguppy@lemmy.ca
        link
        fedilink
        arrow-up
        20
        ·
        7 months ago

        It seems we’ve lost sight of reality there.

        As we don’t intend to attend much cinema any more, I hope they bring back essentially a Dolby Noise Switch for movies. I don’t want to sacrifice too much, but booming noise followed by what comes out as whispered dialogue really cheapens the experience.

        I hope they can find a process that gives us back a sound track for the sub-17:7 sound system.

        • glitchdx@lemmy.world
          link
          fedilink
          arrow-up
          8
          ·
          7 months ago

          Dynamic Range Compression. VLC player has it, possibly under a different name though. Set it up on my theater pc, and I almost don’t need subtitles anymore.

        • cmnybo@discuss.tchncs.de
          link
          fedilink
          English
          arrow-up
          4
          ·
          7 months ago

          They could add more audio tracks for different systems. Blurays support multiple audio tracks and they are almost never full.

          • terminhell@lemmy.dbzer0.com
            link
            fedilink
            arrow-up
            3
            ·
            7 months ago

            I’ve always wanted to try putting something like a guitar compressor pedal in the audio chain just to normalize the peaks. My wife will find something to watch, but ends up spending half the time adjusting the volume, or just turning on subtitles.
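
            The software version of that pedal is only a few lines of Web Audio code, for what it’s worth - a rough sketch, with numbers you’d tune by ear rather than anything I’d vouch for:

                // Route whatever element is playing the movie through a compressor so
                // explosions get squashed and whispered dialogue stays audible.
                const ctx = new AudioContext();
                const video = document.querySelector("video")!;      // the element playing the film
                const source = ctx.createMediaElementSource(video);

                const comp = ctx.createDynamicsCompressor();
                comp.threshold.value = -35; // dB level where compression kicks in (guess, tune to taste)
                comp.ratio.value = 12;      // squash hard above the threshold
                comp.knee.value = 20;       // soften the transition into compression
                comp.attack.value = 0.003;  // react quickly to sudden peaks
                comp.release.value = 0.25;  // let go slowly so it doesn't pump

                source.connect(comp).connect(ctx.destination);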

              • terminhell@lemmy.dbzer0.com
                link
                fedilink
                arrow-up
                2
                ·
                7 months ago

                I have a much simpler setup though. Just a ‘smart’ TV and a sound bar I paid about $200 for so nothing fancy.

                Not actually looking for advice, just a thought experiment of quick, easy and cheap fixes.

    • Lorindól@sopuli.xyz
      link
      fedilink
      arrow-up
      29
      ·
      7 months ago

      I had the exact same approach back in the late ’90s. My friends had several band projects, and when they were mixing their demos, I insisted that if the mixes sounded good on a standard car stereo, they’d sound good anywhere.

    • mPony@lemmy.world
      link
      fedilink
      arrow-up
      11
      ·
      7 months ago

      This is still a perfectly sound method.

      Getting the music you made in your own DAW to sound good on your home speakers is almost easy. Getting it to not suck on shitty speakers? That’s an art.

    • magikmw@lemm.ee
      link
      fedilink
      arrow-up
      4
      ·
      edit-2
      7 months ago

      Then again my 2016 stock yaris had the best sound I ever heard anywhere.

  • puchaczyk@lemmy.blahaj.zone
    link
    fedilink
    arrow-up
    82
    ·
    7 months ago

    Most of the abstractions, frameworks, “bloats”, etc. are there to make development easier and therefore cheaper, but to run such software you need more and more expensive hardware. In a way it is just pushing some of the development costs onto the consumer.

    • UnderpantsWeevil@lemmy.world
      link
      fedilink
      arrow-up
      29
      ·
      7 months ago

      Most of the abstractions, frameworks, “bloats”, etc. are there to make development easier and therefore cheaper

      That’s true to an extent. But I’ve been on the back side of this kind of development, and the frameworks can quickly become their own arcane, esoteric beasts. One guy implements the “quick and easy” framework (with 16 GB of bloat) and then fucks off to do other things without letting anyone else know how best to use it. Then the half-dozen coders who come in behind have no idea how to do anything and end up making these bizarre hacks and spaghetti-code patches to do what the framework was already doing, but slower and worse.

      The end result is a program that needs top of the line hardware to execute an oversized pile of javascripts.

    • lolcatnip@reddthat.com
      link
      fedilink
      English
      arrow-up
      5
      ·
      7 months ago

      If the software is much more expensive to develop, most of it just won’t exist at all. You can get the same effect by just not using software you feel is bloated.

    • Gladaed@feddit.de
      link
      fedilink
      arrow-up
      5
      ·
      7 months ago

      But this does not necessarily mean the consumer pays more. Buying a current machine and having access to affordable software seems like a good deal.

      • uis@lemm.ee
        link
        fedilink
        arrow-up
        10
        ·
        7 months ago

        Capitalism makes it work only in one direction. Something became cheaper? Profits go up. Something became more expensive? Prices go up.

  • jpeps@lemmy.world
    link
    fedilink
    arrow-up
    66
    ·
    7 months ago

    Reminds me of the UK’s Government Digital Service, which wants to digitise government processes but also has a responsibility to keep those services as accessible and streamlined as possible, so that even a homeless person using a £10 phone on a 2G data service still has an acceptable experience.

    An example: here they painstakingly remove jQuery (most modern frameworks are way too big) from the site and shave 32 kB off the page weight.
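
    Most of what jQuery gets used for on a site like that has had built-in DOM equivalents for years, so dropping it is mostly mechanical work. A rough before/after sketch (the selector and class names here are made up, not taken from the GOV.UK codebase):

        // Before: pulling in all of jQuery for a class toggle and a click handler
        // $('.js-accordion').addClass('ready');
        // $('.js-accordion').on('click', function () { $(this).toggleClass('open'); });

        // After: the same behaviour with built-in DOM APIs and zero dependencies
        document.querySelectorAll<HTMLElement>('.js-accordion').forEach((el) => {
          el.classList.add('ready');
          el.addEventListener('click', () => el.classList.toggle('open'));
        });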

      • ameancow@lemmy.world
        link
        fedilink
        arrow-up
        20
        ·
        7 months ago

        Hasn’t been linked to reddit yet probably.

        Getting away from reddit has shown me that there are unspoiled places in the digital world out there, communities of people who actually care about the topic and not about performative posturing and internet attention.

        • mPony@lemmy.world
          link
          fedilink
          arrow-up
          9
          ·
          7 months ago

          a) don’t let in anyone who acts like a petulant child
          b) give adults an outlet for occasional outbursts that would make them sound like petulant children

    • lolcatnip@reddthat.com
      link
      fedilink
      English
      arrow-up
      6
      ·
      edit-2
      7 months ago

      At a certain point it makes more sense to subsidize better low-end hardware than to make every web site usable on a 20 year old flip phone. I’d argue that if saving 32 kB is considered a big win, you’re well past that point. Get that homeless guy a £50 phone and quit wasting the time of a bunch of engineers who make more than that in an hour.

      • uis@lemm.ee
        link
        fedilink
        arrow-up
        22
        ·
        7 months ago

        Get that homeless guy a home.

        Also, if you are in a basement/mountains/middle of Siberia, waiting for 32 kB takes quite some time.

        • lolcatnip@reddthat.com
          link
          fedilink
          English
          arrow-up
          5
          ·
          edit-2
          7 months ago

          I’m all for ending homelessness, but that’s really a different problem than we were discussing. I’m pretty confident jQuery isn’t stopping anyone from being housed.

          Anyway, there’s no way you’re gonna convince me 32 kB is a lot of data. It’s just not. Even the slowest 3G connections can download that much in half a second. Just the text of this thread is probably more than 32 kB. If you can’t download that much data, you only technically have internet service.
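
          Rough numbers, assuming nominal link rates rather than anything measured:

              // How long 32 kB takes on a slow link (nominal rates, not real-world measurements)
              const payloadBits = 32 * 1024 * 8;   // 32 kB ≈ 262,144 bits
              const umtsBps = 384_000;             // classic baseline UMTS rate, 384 kbps
              const edgeBps = 120_000;             // a decent 2G/EDGE connection
              console.log(payloadBits / umtsBps);  // ≈ 0.68 s
              console.log(payloadBits / edgeBps);  // ≈ 2.2 s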

          • uis@lemm.ee
            link
            fedilink
            arrow-up
            8
            ·
            7 months ago

            Even the slowest 3G connections can download that much in half a second.

            Even 3G is not always available, and even 3G is sometimes slower than 2G.

            32 kB here, 32 kB there, and boom - you have Bitbucket.

            • lolcatnip@reddthat.com
              link
              fedilink
              English
              arrow-up
              1
              ·
              edit-2
              7 months ago

              At least in the US, the reason 3G isn’t available is that it has been phased out, as has 2G. You may as well complain about how slow it is to send data with smoke signals, because 4G is table stakes for an internet-capable device now.

              • uis@lemm.ee
                link
                fedilink
                arrow-up
                6
                ·
                edit-2
                7 months ago

                US? The US is a wild place. A lot of people are still on ADSL, but 2G and 3G equipment is thrown away and the answer is “lol, your problem, buy a new phone”. I wouldn’t be surprised if there are a lot of places where the internet is less stable than on a train going through a tunnel under a mountain in the middle of Siberia. Which means no internet.

                I wonder what happens to the internet connection in rural areas of the USSA, since you suddenly started talking about it.

                In Europe (or at least in my part of Europe) there are places where mobile internet is overloaded, like the subway system and the city center, and places where mobile internet is very unstable, like my house in a suburban area and, again, trains.

                And, as I mentioned, Bitbucket. It struggles to load even on an average PC.

              • bluewing@lemm.ee
                link
                fedilink
                arrow-up
                2
                ·
                7 months ago

                Where I live, even 4G isn’t all that reliable. Making a phone call is mostly impossible, and a text message is hit or miss unless I’m in a town or along a major road. This is due to terrain and the general lack of towers. 4G is spotty at best, with many areas having no service at all. And I ain’t never going to live long enough to ever see 5G out here.

                But I would agree that while 32 kB is pretty minor for an internet connection, things have a way of stacking. 32 kB here, another 32 kB there, and pretty soon things can get ‘heavy’ to use.

      • uis@lemm.ee
        link
        fedilink
        arrow-up
        3
        ·
        7 months ago

        Also, the engineers already had the tech debt of updating to a new jQuery version, which can result in a lot of weird bugs, so removing it achieved two goals at once.

        And a £50 phone probably IS their target device.

    • Aux@lemmy.world
      link
      fedilink
      arrow-up
      2
      ·
      7 months ago

      The issue with UK services is that they are all fucking random and plenty of sections don’t work. There are billions of logins and bugs, and sometimes you just get redirected to some bloody nightmare portal from the 1990s. And EU citizens couldn’t log into the HMRC portal for years after Brexit, what a fucking joke! And all they do is spend time removing jQuery, good fucking job!

  • Magister@lemmy.world
    link
    fedilink
    arrow-up
    51
    ·
    7 months ago

    When you see what ONE coder was able to do in the 80s, with 64K of RAM, on a 4MHz CPU, and in assembly, it’s quite incredible. I miss my Amstrad CPC6128 and all its good games.

    • prole@sh.itjust.works
      link
      fedilink
      English
      arrow-up
      42
      ·
      edit-2
      7 months ago

      Still happens.

      Animal Well was coded by one guy, and it was ~35 MB on release (I think it’s above 100 MB at this point after a few updates, but still). The game is massive and pretty complex. And it’s the size of an SNES ROM.

      Dwarf Fortress has to be one of the most complex simulations ever created, developed by two brothers and given out for free for several decades. Prior to adding actual graphics, DF was ~100 MB, and the Steam version is still remarkably compact.

      I am consistently amazed by people’s ingenuity with this stuff.

      • Blackmist@feddit.uk
        link
        fedilink
        English
        arrow-up
        17
        ·
        7 months ago

        SNES ROMs were actually around 4MB. People always spoke about them being 32 Meg or whatever, but they meant megabits.

        I did like Animal Well, but gave up after looking at one of the bunny solutions and deciding I didn’t have the patience for that.

        I think most of the size of games is just graphics and audio. I think the code for most games is pretty small, but for some godforsaken reason it’s really important that they include incredibly detailed doorknobs and 50 hours of high quality speech for a dozen languages in raw format.

        • uis@lemm.ee
          link
          fedilink
          arrow-up
          5
          ·
          7 months ago

          I think most of the size of games is just graphics and audio. I think the code for most games is pretty small, but for some godforsaken reason it’s really important that they include incredibly detailed doorknobs and 50 hours of high quality speech for a dozen languages in raw format.

          True. Even Xonotic - an open-source game - has a very small game engine, but the game logic and assets (maps, textures, lightmaps) are a gig. And it’s the same with AltCraft - small engine, but the Minecraft assets are huge.

    • AnUnusualRelic@lemmy.world
      link
      fedilink
      English
      arrow-up
      7
      ·
      7 months ago

      When you see what they did in the ’60s and ’70s, where they ran an entire country’s social security system on a mainframe with a whopping 16K of memory (I’m not sure if it was 4 or 16, but it doesn’t make that much difference).

  • thefartographer@lemm.ee
    link
    fedilink
    arrow-up
    51
    ·
    7 months ago

    When my dad died suddenly in 2015 and I cleared out his office at his job, I spun down his Win95 machine that he’d been using for essential coding and testing. My father was that programmer—the one who directly spoke to a limited number of clients and stakeholders because he had a tendency to ask people if they were stupid.

    • theangryseal@lemmy.world
      link
      fedilink
      arrow-up
      41
      ·
      7 months ago

      Your dad sounds like the childhood hero of mine who got me into computers.

      Severe ADHD prevented me from ever learning to code, but I became damn good at repairs and things and just general understanding of computers because he was available to ask questions at almost any time.

      He went to school auctions every year and got me a pile of hardware to learn from. He never asked for anything in exchange. All around great guy.

      I heard him on the phone a few times dealing with the people who he worked with though. Good god he was mean. I couldn’t imagine him being that way with me ever, but he was brutal when it came to work and money.

      A dude called him one time while I was sitting there, he listened for a few minutes and he said, “I’ve got a 14 year old kid here, he’s been doing this stuff for about 2 years. I’m gonna let him walk you through this for the 10th fucking time because you’re a goddamn idiot and feeling like a fool when you hang up the phone with a grown man isn’t teaching you any lessons. Maybe get a pen for this one because if I have to remind that a child walked you through it last time, I’m not going to be so fucking friendly.” I was so nervous, apologized multiple times, when I was finished walking him through it he took the phone and said, “now don’t you feel stupid? 25 years and this kid just schooled you.”

      He told me, “you gotta be real with idiots or they’ll bother you with stupid problems every single day of your life.”

      I wish that lesson had stuck haha, it just wasn’t in me to be mean. As a result, a hobby that I was passionate about all of my life is something I avoid like the plague now. People ruined it for me by bothering me constantly.

      • Baggie@lemmy.zip
        link
        fedilink
        arrow-up
        19
        ·
        7 months ago

        I think it’s nice of you not to be mean. The industry turned me a bit mean as a defence against people constantly shoveling more work onto me. Try to protect it if you can! I miss my lack of mean dearly.

        • theangryseal@lemmy.world
          link
          fedilink
          arrow-up
          13
          ·
          7 months ago

          I seriously have a boiling hatred for computers now because I couldn’t even be a little bit mean. I’ve snapped a few times when people blamed me for problems years after I worked on their stuff, but mostly I just got trampled on and robbed at every turn because I didn’t want to upset anyone.

          By the time I was mean enough to demand payment and things like that, I already hated it.

          My daughter is passionate about computers, so nowadays if I so much as want to tweak something a little bit I let her do it unless she don’t want to. I don’t want to burn her out too.

          • Baggie@lemmy.zip
            link
            fedilink
            arrow-up
            3
            ·
            7 months ago

            That sucks my dude, it sounds like some really shitty people ruined something you liked. So far I’ve found that the only way to protect yourself against that stuff is to set healthy boundaries. It doesn’t have to be rude, but unfortunately some people see it that way. It’s a rough time.

  • linearchaos@lemmy.world
    link
    fedilink
    English
    arrow-up
    49
    ·
    7 months ago

    Doesn’t really matter what your developers run on, you need your QA to be running on trash hardware.

    We can even cut out the middleman and optimize Unity and Unreal to run on crap.

    • meliaesc@lemmy.world
      link
      fedilink
      arrow-up
      20
      ·
      7 months ago

      Joke’s on you, my corporate job has crippled the Macs they gave us so much that EVERYONE has trash hardware!

  • manicdave@feddit.uk
    link
    fedilink
    arrow-up
    46
    ·
    7 months ago

    I can think of a few game franchises that wouldn’t have trashed their reputation if they’d had an internal rule like “if it doesn’t play on 50% of the machines in Steam’s hardware survey, it’s not going out”.

    • UnderpantsWeevil@lemmy.world
      link
      fedilink
      arrow-up
      25
      ·
      7 months ago

      I think it’s given us a big wave of “Return to pixelated tradition” style games. When you see 16-bit sprites in the teaser, you can feel reasonably confident your computer will run it.

      • manicdave@feddit.uk
        link
        fedilink
        arrow-up
        29
        ·
        7 months ago

        I don’t mind if indie devs try something experimental that melts your computer. Like beamNG needs a decent computer but the target audience kinda knows about that sort of stuff.

        The problem is with games like Cities: Skylines 2. Most people buying that game probably don’t even know how much RAM they have; it shouldn’t be unplayable on a mid-range PC.

      • Joe Cool@lemmy.ml
        link
        fedilink
        arrow-up
        7
        ·
        7 months ago

        Unless they use Unreal Engine and don’t know what they are doing. It can be pixely and run like ass.

        Octopath Traveler was the last UE based game that really ran well that I can remember.

        • UnderpantsWeevil@lemmy.world
          link
          fedilink
          arrow-up
          2
          ·
          7 months ago

          Was that game any good? The mobile version had a Gacha mechanic that scared me off, but it otherwise looked like a really smooth SNES style JRPG.

          • Joe Cool@lemmy.ml
            link
            fedilink
            arrow-up
            2
            ·
            edit-2
            7 months ago

            I liked it. It’s not particularly hard or deep but the mechanics are nice, battles and encounters don’t overstay their welcome. The interwoven story was done quite nicely. I played it on PC. It runs great on Proton and Windows 7.

            I haven’t played the second one, but reviews consider it an improvement. Now that they removed Denuvo I might get it.

  • yamanii@lemmy.world
    link
    fedilink
    arrow-up
    44
    ·
    7 months ago

    I knew someone who refused to upgrade the programmers’ workstations precisely because it would have been a big leap in performance compared to what their customers ran the software on. Needless to say, the program was very fast even on weaker hardware.

    • Aux@lemmy.world
      link
      fedilink
      arrow-up
      10
      ·
      7 months ago

      Steam is full of shorter games with worse graphics made by indie devs. Guess what? No one gives a shit! Because no one needs crappy games from the 1980s.

        • Aux@lemmy.world
          link
          fedilink
          arrow-up
          1
          ·
          7 months ago

          This is data from 2020 https://www.statista.com/statistics/1356730/steam-indie-game-revenue-genre/ but I believe it’s still pretty relevant.

          In short, indie games are the games no one is playing. A tiny fraction of indies bring more than $200k, while AA and AAA bring in millions and hundreds of millions.

          The idea that we need shitty games with piss poor graphics is just plain wrong. What we need are high res games to enjoy 4K experiences.

          • yistdaj@pawb.social
            link
            fedilink
            arrow-up
            2
            ·
            7 months ago

            I think this is a false dichotomy and an over-simplistic view of the game industry. Remember, there are far more indie games than AAA, so of course they’re going to earn less, there are more to choose from. Plus, if an indie game does too well, it often stops being indie. Most of the money for AAA games is from the same few people paying thousands of dollars in many small purchases too.

            Anecdotally, most people’s favourite games are, or at least started off as indie games. However, most people’s least favourite are going to be indie as well. I think the thing with indie games is that they vary a lot, often exploring things that many publishers simply aren’t willing to. This allows them to find and fill a niche perfectly that a publisher can never fill. The main thing is that people see this and start making their own indie games, leading to market saturation pretty quickly.

            Plus, the vast majority of people still don’t have 4K monitors. It may be the future, but you seem to think that’s where we are now when we just aren’t.

            • Aux@lemmy.world
              link
              fedilink
              arrow-up
              1
              ·
              7 months ago

              The topic is about low-end indie games specifically. Thus games which started as indie, like Valheim or No Man’s Sky, don’t fit the bill. The point is that no one gives a shit about low-end games apart from a few niche fans. Everyone likes high-end-looking games with loads of stuff to do in them.

      • yistdaj@pawb.social
        link
        fedilink
        arrow-up
        6
        ·
        edit-2
        7 months ago

        I quite like many games with “poor” graphics. Perhaps not exclusively, but you’re seriously missing out if you only go for realistic-looking or detailed games. Give a few of those indie games a try, you might be surprised.

        Edit: Oh, and terminal games are cool! Usually not very performant though.

        • Aux@lemmy.world
          link
          fedilink
          arrow-up
          1
          ·
          7 months ago

          Nah, sorry, I’m not playing pixelated crap on my dual-screen 4K setup. That was cool back in the 1990s on my Sega Mega Drive, but I outgrew that a long time ago. Fuck, I’m old…

      • The Menemen!@lemmy.world
        link
        fedilink
        arrow-up
        6
        ·
        edit-2
        7 months ago

        Some of them are really successful. Many people care. Others don’t.

        Here are the current Steam charts. Many indie games, a few with really low specs. Banana only needs 30 MB of RAM. Seems to be a great game. Honestly now, why are 50k people playing that “game” currently?

        But back on topic: yes, AAA games are more successful and earn much more money, but claiming “no one cares about indie” is stupid when so many people play games like Rust, Stardew Valley, Prison Architect, Terraria, RimWorld, Valheim, The Forest, …

        • Aux@lemmy.world
          link
          fedilink
          arrow-up
          1
          ·
          7 months ago

          Games like Valheim and Rust are not some pixel-art games that will run on a 2 GB system with an integrated GPU; they’re pretty much AA games with an AA-level publisher behind them. Yes, there are some real exceptions, like Stardew Valley, but you can count them on your fingers. They don’t even make up 1% of the Steam catalogue. Thus my point still stands - no one cares about low-end games.

          • Timecircleline@sh.itjust.works
            link
            fedilink
            arrow-up
            2
            ·
            7 months ago

            Even if they don’t make up 1% of Steam’s catalogue (though I doubt that figure), they have had real impact.

            It’s ok to not like them personally, but to say no one cares is disingenuous. Balatro doesn’t require much in terms of hardware but is having a real moment right now. Stardew Valley has been killing it for 8 years. People have thousands of hours in Rimworld.

            Indie games are also great for community and modders.

  • WolfLink@lemmy.ml
    link
    fedilink
    arrow-up
    31
    ·
    7 months ago

    The ideal is “plays fine at lowest graphics settings on old hardware” while having “high graphics settings” that look fantastic but require top-of-the-line hardware to play reasonably.

    Generally this is almost impossible to achieve.

  • Alteon@lemmy.world
    link
    fedilink
    arrow-up
    27
    ·
    7 months ago

    This is like the definition of a “conservative”. Progress shouldn’t happen because they’re not ready for it. They are comfortable with what they use and are upset that other people are moving ahead with new things. New things shouldn’t be allowed.

    Most games have the ability to downscale so that people like this can still play. We don’t stop all progress just because some people aren’t comfortable with it. You learn to adjust or catch up.

    • englislanguage@lemmy.sdf.org
      link
      fedilink
      arrow-up
      15
      ·
      7 months ago

      More “conservative” in terms of preserving the planet’s resources.

      Almost no consumer application needs gigabytes of RAM, as long as the programming team is interested in/incentivized to write quality software.

      • NekuSoul@lemmy.nekusoul.de
        link
        fedilink
        arrow-up
        10
        ·
        7 months ago

        I think the examples given are just poorly chosen. When it comes to regular applications and DRM, then yes, that’s ridiculous.

        On the other hand, when it comes to gaming, then yes, give me all the raytracing and visible pores on NPCs. Most modern games also scale down well enough that it’s not a problem to have those features.

      • Alteon@lemmy.world
        link
        fedilink
        arrow-up
        6
        ·
        7 months ago

        “Limitations foster creativity.”

        100% agree. But there’s no reason to limit innovation just because some people can’t take advantage of it. Likewise, we shouldn’t force people to constantly upgrade just to have access to something; however, there should be a limit to this. 20 years of tech changes is huge. You could get 2 GB of RAM in most home computers back in the early-to-mid 2000s… that’s two decades ago.

        I’m still gaming on my desktop that I built 10 years ago quite comfortably.

      • Bytemeister@lemmy.world
        link
        fedilink
        Ελληνικά
        arrow-up
        5
        ·
        7 months ago

        Somebody didn’t live through the “Morrowind on Xbox” era, where “creativity” meant intentionally freezing the loading screen and rebooting your system in order to save a few KB of RAM so the cell would load.

        It also meant no automatic corpse cleanup, so the game would eventually become unplayable as entities died outside of your playable area, where you couldn’t remove them from the game, creating huge bloat in your save file.

        Not all creativity is good creativity.

      • Dudewitbow@lemmy.zip
        link
        fedilink
        arrow-up
        1
        ·
        7 months ago

        Less on general software and more on the gaming side: why target the iGPU, then? Although it’s common, even something close to a decade old would be an instant uplift performance-wise. The ones that typically run into performance problems are mostly laptop users, the segment that is the most wasteful with old hardware, since unless you own a laptop like a Framework, the user constantly replaces the entire device.

        I for one am always behind lengthening the lifetime of old hardware (hell, I just replaced a decade-old laptop recently), but there is a limit to the expectations to have. E.g., don’t expect to be catered to iGPU-wise if you willingly picked a pre-Tiger Lake iGPU. The user intentionally picked the worse graphics hardware, and catering the market to bad decisions is a bad move.

        • queermunist she/her@lemmy.ml
          link
          fedilink
          arrow-up
          3
          ·
          edit-2
          7 months ago

          I, for one, hate the way PC gamer culture has normalized hardware obsolescence. Your hobby is just for fun, you don’t really need to gobble up so much power and rare Earth minerals and ever-thinner wafers all to just throw away parts every six months.

          I have plenty of fun playing ascii roguelikes and I do not respect high performance gaming. It’s a conservationist nightmare.

          • Dudewitbow@lemmy.zip
            link
            fedilink
            arrow-up
            2
            ·
            7 months ago

            Who’s throwing away stuff every six months? Hardware cycles aren’t even remotely that short; hell, Moore’s law was never that short in its existence. And it’s not like I don’t do my fair share of preventing hardware waste (my literal job is the refurbishing and resale of computer hardware; I’m legitimately doing more than the average person, maintaining older hardware several times over). But it’s not my job to dictate what is fun and what’s not. What’s fun for you isn’t exactly everyone else’s definition of fun.

    • Übercomplicated@lemmy.ml
      link
      fedilink
      arrow-up
      2
      ·
      7 months ago

      The topic is bloatware, not games. Very different. When it comes to gaming, the hardware costs are a given (for the sake of innovation, as you put it); but when it comes to something fundamental to your computer—think of the window manager or even the operating system itself—bloat is like poison in the hardware’s veins. It is not innovation. It is simply a waste of precious resources.

      • NekuSoul@lemmy.nekusoul.de
        link
        fedilink
        arrow-up
        7
        ·
        7 months ago

        The topic is bloatware, not games.

        The original post includes two gaming examples, so it’s actually about both, which is a bit unfortunate, because as you’ve said, they’re two very different things.

        • Übercomplicated@lemmy.ml
          link
          fedilink
          arrow-up
          1
          ·
          7 months ago

          I suppose ray-tracing is rather suggestive of games, you’re right. Well, I’ll take it as an accident by the author and rest easy. Thanks for the correction!

    • PopOfAfrica@lemmy.world
      link
      fedilink
      arrow-up
      1
      ·
      7 months ago

      Honestly, we are hitting the budgetary limits of what game graphics can do, for example.

      A lot of new games look substantially worse than the Last of Us Part 2, which ran on ancient hardware.

  • IrateAnteater@sh.itjust.works
    link
    fedilink
    arrow-up
    23
    ·
    7 months ago

    I think every operating system needs to have a “do what the fuck I told you to” mode, especially when it comes to networking. I’ve come close to going full Luddite just trying to get smart home devices to connect to a non-internet-connected network (which of course you can only do through a dogshit app) and having my phone constantly try to drop that network since it has no internet.

    I get the desire to have everything be as hand-holdy as possible, but it’s really frustrating when the hand holding way doesn’t work and there is absolutely zero recourse, and even less ability to tell what went wrong.

    Then there’s my day job, where I get to deal with crappy industrial software, flaky internet connections, and really annoying things like Hyper-V occupying network ports when it’s not even open.

    • AnarchoSnowPlow@midwest.social
      link
      fedilink
      arrow-up
      3
      ·
      7 months ago

      I try not to buy any Wi-Fi smart home devices anymore. I try to stick to Z-Wave or Zigbee; Z-Wave I generally have better luck with. I even left my Nest thermostat at my old house and installed a 10+ year old Z-Wave thermostat at the new one. Way happier. I’m not relying on Google’s API to be stable anymore for Home Assistant interaction.

      • IrateAnteater@sh.itjust.works
        link
        fedilink
        arrow-up
        6
        ·
        7 months ago

        Yeah, I’d love to, but first we have to tell that to Rockwell, Siemens, Bosch, ABB, etc, etc. All the proprietary software runs on Windows. Not to mention getting my company on board when we’re already heavily into the Microsoft ecosystem at the corporate level.

        • PopOfAfrica@lemmy.world
          link
          fedilink
          arrow-up
          1
          ·
          7 months ago

          It kind of baffles me that people are still invested in Microsoft at a corporate level considering the costs associated with it.

          • PlexSheep@infosec.pub
            link
            fedilink
            arrow-up
            2
            ·
            7 months ago

            The corpos don’t really care and want someone to blame if things go wrong; that’s why they often use proprietary alternatives.

    • HStone32@lemmy.world
      link
      fedilink
      arrow-up
      3
      ·
      7 months ago

      You could make your own smart devices. You don’t even need to be well versed in embedded systems these days, either. Just use a cheap SBC.

    • Gamers_Mate@kbin.run
      link
      fedilink
      arrow-up
      16
      ·
      7 months ago

      Image description.

      The image is a screenshot of a tumblr post by user elbiotipo.

      My solution for bloatware is this: by law you should hire in every programming team someone who is Like, A Guy who has a crappy laptop with 4GB and an integrated graphics card, no scratch that, 2 GB of RAM, and a rural internet connection. And every time someone in your team proposes to add shit like NPCs with visible pores or ray tracing or all the bloatware that Windows, Adobe, etc. are doing now, they have to come back and try your project in the Guy’s laptop and answer to him. He is allowed to insult you and humilliate you if it doesn’t work in his laptop, and you should by law apologize and optimize it for him. If you try to put any kind of DRM or permanent internet connection, he is legally allowed to shoot you.

      With about 5 or 10 years of that, we will fix the world.

    • englislanguage@lemmy.sdf.org
      link
      fedilink
      arrow-up
      7
      ·
      7 months ago

      Innovation is orthogonal to code size. There’s hardly anything modern computers run that couldn’t also be done on 10-year-old computers. It’s just a question of whether the team creating your software is plugging together gigantic pieces of bloatware or actually developing a solution to a real problem.