If the decentralization of social networks continues, we will have to prepare for the eventual rise of the instance wars, where people start to fight over which instance is better, which one is weird to be on, and so on. But that’s a problem for future us.

  • jrs100000@lemmy.world
    arrow-up 83 · 1 year ago

    The big problem is going to be when someone decides to start spamming and vote-manipulating with bot-populated private instances that automatically re-spawn themselves under a new name whenever they are blacklisted. Eventually, the standard will have to move to whitelisting over blacklisting, and once that happens the whole premise of federation starts to fall apart.

    • scarabic@lemmy.world
      arrow-up 35 · 1 year ago

      It’s no harder than what we’ve already had to do with e-mail spam, and that effort has been enormously successful: 99% of spam never even reaches your spam folder, it just gets dropped entirely.

      Instances will get as much visibility as they’ve earned through successful engagement across instances. The visibility of a new instance’s posts will increase over time.

      This is why yes, there needs to be a feed algorithm. “Just show it to me chronologically” is the most naive thought, and people still have it all the time. There are so many fundamental signals that need to go into a sorting algorithm, and we’re not even talking about personalization yet.
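
      Just to make that concrete, here’s a rough sketch of the kind of non-chronological ranking I mean. The formula and numbers are completely made up for illustration, not what Lemmy or kbin actually run:

      ```python
      # Toy "hot" ranking: net score damped by age, so newer posts get a chance
      # and stale ones sink. Invented formula, purely illustrative.
      from datetime import datetime, timedelta, timezone
      from math import log10

      def hot_rank(score: int, published: datetime, now: datetime) -> float:
          hours_old = (now - published).total_seconds() / 3600
          sign = 1 if score >= 0 else -1
          return sign * log10(abs(score) + 1) / (hours_old + 2) ** 1.5

      now = datetime.now(timezone.utc)
      posts = [  # (title, score, published) - invented sample data
          ("fresh but unproven", 3, now - timedelta(hours=1)),
          ("older but well received", 250, now - timedelta(hours=12)),
      ]
      posts.sort(key=lambda p: hot_rank(p[1], p[2], now), reverse=True)
      print([p[0] for p in posts])
      ```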

      • Kaldo@kbin.social
        arrow-up 14 · 1 year ago

        E-mail spam filtering is funded by Google and other multibillion-dollar megacorporations though, and they just outright block or rate-limit unknown providers. I’d say it’s not gonna be as easy to pull off in the fediverse.

        This is why yes, there needs to be a feed algorithm. “Just show it to me chronologically” is the most naive thought

        Agreed 100%, but again, I wonder if we have enough resources to actually make it good while also keeping it free, both in terms of monetization and in terms of outside influence and bias. Twitter and others spend a lot of man-hours on this, and Mastodon still doesn’t have it either, for example; afaik it’s not even being worked on (or nobody talks about it).

        • scarabic@lemmy.world
          arrow-up 1 · 1 year ago

          The trick is to find out how to leverage the community for quality signals, and just support that with good foundations.

          Spam filtering is done by corporations but they’re not all mega tech companies like Google. A lot of it is done at the network level, too.

          DNS has also always been the prime example of a federated service that works so well we can rely on it as a public utility. Why hasn’t it been taken over by bad actors rapidly recycling their identities? It’s not because big tech has thousands of human agents monitoring it at great expense.

          • intensely_human@lemm.ee
            arrow-up 3 · 1 year ago

            how to leverage the community for quality signals

            I say we give each person one up or down vote on each piece of content. Then, people should be able to sort by the sum of those up or down votes (with up being worth +1 and down being worth -1).

            I’m not sure, but I suspect a system like that might have content moderation built into its structure.
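
            Spelled out literally, the system is about as simple as it sounds (the data shapes here are made up, just to illustrate):

            ```python
            # One +1/-1 vote per person per post, newest vote wins, rank by the sum.
            from collections import defaultdict

            votes = [  # (voter, post_id, value)
                ("alice", "post-1", +1), ("bob", "post-1", +1), ("carol", "post-1", -1),
                ("alice", "post-2", -1), ("bob", "post-2", -1),
            ]

            latest = {}  # each voter's most recent vote on each post
            for voter, post_id, value in votes:
                latest[(voter, post_id)] = value

            scores = defaultdict(int)
            for (voter, post_id), value in latest.items():
                scores[post_id] += value

            ranked = sorted(scores, key=scores.get, reverse=True)
            print(ranked)        # ['post-1', 'post-2']
            print(dict(scores))  # {'post-1': 1, 'post-2': -2}
            ```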

              • intensely_human@lemm.ee
                arrow-up 1 · 1 year ago

                Moderation itself can be gamed. A moderator who’s a bad actor can cause a lot of damage easily by “gaming” the moderation system.

                • scarabic@lemmy.world
                  arrow-up 1 · 1 year ago

                  We can keep playing this until some bad actor is pretending to be me typing this right now.

                  But this is why moderators work in teams, and why there is an admin as well. A solo mod who’s a bad actor is not going to develop a very appealing community, and whole scam shitpile instances can always be defederated.

                  • intensely_human@lemm.ee
                      arrow-up 1 · 1 year ago

                      Multiple moderators exist because more centralization means easier corruption. If you extrapolate that diffusion of power to its extreme, you arrive at crowdsourced moderation.

                      It’s true that crowdsourced moderation can be gamed, but it takes some effort, and that required effort only goes down when you concentrate power in accounts with moderating powers.

                      A moderation team is easier to corrupt than a totally decentralized voting system. That’s basically the entire argument for why we like democracy: it’s harder to corrupt a populace at large than it is to corrupt a cabal in authority. Corruption is still possible, it just takes more effort, and that’s about as good as you can get in terms of incorruptibility.

      • elboyoloco@lemmy.world
        arrow-up 16 · 1 year ago

        So I went to the website. It explains what it does, but not much how… Or maybe I’m too dumb to get it. Could you explain how the verification happens? How does this system work?

        • db0@lemmy.dbzer0.com
          arrow-up 2 · 1 year ago

          Did you read the devlog? I went into more detail there, just so I don’t have to explain everything from scratch.

          • intensely_human@lemm.ee
            arrow-up 4 · edited · 1 year ago

            Hey, one thing I learned while canvassing for a politician is that it can be really beneficial to repeat yourself when articulating a message, instead of articulating it once and then passing around copies.

            The more times you write and rewrite the same explanation the better it will get.

    • ShrimpsIsBugs@feddit.de
      arrow-up 8 · 1 year ago

      I think these problems might be solvable by auto-blacklisting instances based on their age, how their users behave, and what % of their comments and posts are flagged as spam.

        • jrs100000@lemmy.world
          arrow-up 7 · edited · 1 year ago

          That’s the problem. It would be very difficult to get a new instance off the ground unless you were an insider or had inside connections. If you have a cabal of existing admins acting as gatekeepers, you could easily keep outsiders from abusing the system, but you are also walking right back into the centralized control that federation is supposed to prevent.

        • Wander@yiffit.net
          arrow-up 7 · 1 year ago

          One thing that is feasible is for established instances to give votes from new instances a lower weight. So, no blacklisting, but until an instance has been around long enough to verify that its activity corresponds to its size and that nothing is off, its upvotes and downvotes could be ignored or given a lower weight.
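
          Something like this, maybe. Nothing like it exists today, and the 30-day ramp and names are just placeholders:

          ```python
          # Sketch of age-weighted federation votes; purely illustrative.
          from datetime import datetime, timedelta, timezone

          def vote_weight(instance_first_seen: datetime, now: datetime, ramp_days: float = 30.0) -> float:
              """Brand-new instances start near weight 0 and reach full weight after ramp_days."""
              age_days = (now - instance_first_seen).total_seconds() / 86400
              return min(max(age_days / ramp_days, 0.0), 1.0)

          def weighted_score(votes, now):
              """votes: iterable of (value, instance_first_seen), value being +1 or -1."""
              return sum(value * vote_weight(first_seen, now) for value, first_seen in votes)

          now = datetime.now(timezone.utc)
          print(weighted_score([(+1, now - timedelta(days=400)),  # established instance: full weight
                                (+1, now - timedelta(days=3))],   # 3-day-old instance: ~0.1 weight
                               now))  # -> 1.1
          ```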

        • zygo_histo_morpheus@programming.dev
          arrow-up 4 · 1 year ago

          Well, non-federated forums can grow by word of mouth and the like. Being federated does lower the barrier to entry for interacting, but it’s still possible to visit an instance the old-fashioned way. You probably still need to rely mostly on word of mouth anyway, even if you are federated.

        • ShrimpsIsBugs@feddit.de
          arrow-up 4 · 1 year ago

          Yes, age alone shouldn’t lead to getting blacklisted. But if an instance is two days old, 50+ accounts from there have already been banned on your instance for being bots, and besides that no real contributions have come from that place, it might be a candidate for auto-blacklisting.
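
          As a toy version of that rule, with the thresholds straight from this example (real ones would obviously need tuning and more signals):

          ```python
          # Blunt auto-blacklisting heuristic; thresholds are illustrative only.
          from dataclasses import dataclass

          @dataclass
          class InstanceStats:
              age_days: float
              banned_bot_accounts: int
              real_contributions: int

          def auto_blacklist_candidate(s: InstanceStats) -> bool:
              return (
                  s.age_days <= 2
                  and s.banned_bot_accounts >= 50
                  and s.real_contributions == 0
              )

          print(auto_blacklist_candidate(InstanceStats(1.5, 63, 0)))   # True
          print(auto_blacklist_candidate(InstanceStats(1.5, 10, 40)))  # False
          ```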

    • Kaldo@kbin.social
      arrow-up 5 · edited · 1 year ago

      Maybe we’ll move to a system where only upvotes from the home instance matter. After all, karma is meaningless anyway and is just used for short-term discoverability; maybe kbin1.social doesn’t care how kbin2.social votes on kbin1.social threads (or any Lemmy instance, for that matter)? If you subscribe to kbin1.social then you trust that its users will upvote their content appropriately, the same way you expect them to self-moderate appropriately. Dunno, just thinking out loud.
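
      Roughly this, if anyone ever built it (instance names from the example above, data shapes made up):

      ```python
      # "Only the home instance's votes count" - a sketch, not an existing feature.
      def local_score(home_instance: str, votes) -> int:
          """votes: iterable of (voter_instance, value); only home-instance votes are summed."""
          return sum(value for voter_instance, value in votes if voter_instance == home_instance)

      votes = [("kbin1.social", +1), ("kbin1.social", +1), ("kbin2.social", +1), ("kbin2.social", -1)]
      print(local_score("kbin1.social", votes))  # 2 - the kbin2.social votes are ignored
      ```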

    • orientalsniper@lemmy.world
      arrow-up 1 · 1 year ago

      and once that happens the whole premise of federation starts to fall apart.

      Will it? Even if we get to the point where there’s a whitelisting system, major instances will still be federated. There could even be a transitional federation for smaller instances.