• nanoUFO@sh.itjust.works (OP, mod) · 47 points · 1 year ago

    I remember when community servers existed and these problems were almost nonexistent, without spying.

    • Dedicated servers run by the community, with a server browser to find games/servers.
      Really the golden age of multiplayer.

      Found a nice server that runs well, is chill, and is well moderated? Add it to your favorites.
      No lobbies, well… technically the whole server was the lobby, kinda.
      No progression-unlock bullshit.
      No ranking. No waiting on matchmaking. Just play.
      No AI spying on everything you say or do.
      Maybe a “SIR, this is a Christian server, so swearing will not be tolerated” or some other warning now and then, even in games like Counter-Strike.

      Eventually, you’d get to know people, kinda like how you might start recognizing names here on lemmy.
      You’d make friends, rivals, etc.
      I miss those times.

      I got into Titanfall 2 pretty late (like last month), and waiting 10 minutes just to get into a lobby is annoying,
      as opposed to joining a server and playing non-stop on it.

      It even costs the publisher less than hosting and scaling on their own, because the community is running the servers.
      But then they can’t pull the plug to force people onto a new release.
      They can’t spy on as much shit.
      They can’t sell as much private data.
      It’s probably easier to sell microtransactions this way too.

      In a way… gaming was decentralized. I miss it.

      • Bluescluestoothpaste@sh.itjust.works · 7 points · 1 year ago

        Yeah, but there were admins spying on what you did and banning you. Quite frankly, I have much greater trust in AI admins than human admins. Not that some human admins aren’t great, but why risk it? Same as self-driving cars: as soon as they’re ready, I’m ready to never drive again.

        • nanoUFO@sh.itjust.works (OP, mod) · 7 points · 1 year ago

          You trust a billion-dollar company with no morals with your data? Isn’t that the whole reason we are on this site? Community servers are like Lemmy instances.

        • Vampiric_Luma@lemmy.ca · 3 points · 1 year ago

          What is stopping the AI from showing bias here? Humans tailor the AI, so without transparency that risk will always be inherent.

            • Vampiric_Luma@lemmy.ca · 1 point · 1 year ago

              Sure, but the mistakes aren’t the main issue; it’s that AI is just a tool and, by extension, can be abused by the humans in control. You have no idea what rules they give it or what false positives result from it.

              My primary concern here is that it’s Blizzard, who loves to gargle honey for China and is all for banning players who speak against them, that’s in charge of this AI.

              Blizzard’s previously talked about using AI to verify reports of disruptive voice chat, which is now running in most regions, though not globally. The developer says it has seen this technology “correct negative behavior immediately, with many players improving their disruptive behavior after their first warning.”

              Great, they can auto-ban players like Ng Wai Chung, I guess, for whatever they subjectively deem ‘harmful’. There’s also the looming idea that a friend can wander into my room, say something dumb, and now I’m closer to a ban because of an unrelated choice I made outside the game.

              And we definitely trust Blizzard to be good with all the audio data they get to harvest. That won’t be abused later, right?

              • Bluescluestoothpaste@sh.itjust.works · 1 point · 1 year ago

                I mean, that’s a general argument against technology. Yes, more technology means more ruthlessly efficient abuse, but ultimately you either think technology is better in the long run or you don’t. Either way, it is inevitable. Maybe in the EU they will ban those abuses, in China they won’t, and the US will find some weird compromise between the two.

      • n3er0o@lemmy.ml · 5 points · 1 year ago

        Unrelated to the topic, but wasn’t Titanfall 2 plagued by this one hacker who basically filled every lobby with bots to make the servers crash? I think I very recently heard that the issue was resolved and the player count even surpassed its launch numbers.

    • 🇰 🌀 🇱 🇦 🇳 🇦 🇰 ℹ️@yiffit.net · 5 points · edited · 1 year ago

      Active moderation isn’t spying, but using an AI is? The only reason those self-hosted community servers didn’t have problems was that they (usually) had active admins watching for bad behavior and taking action. This is merely automating that so a real human being doesn’t have to be there watching.

      • nanoUFO@sh.itjust.works (OP, mod) · 2 points · edited · 1 year ago

        This is automating something based on Blizzard’s rules, not community rules. What if people want stricter rules, looser ones, none at all, or completely different rules? Also, how many times have billion-dollar companies been caught selling customers’ private info? Too many to count.