Mastodon, an alternative social network to Twitter, has a serious problem with child sexual abuse material (CSAM), according to researchers from Stanford University. In just two days, the researchers found over 100 instances of known CSAM across over 325,000 posts on Mastodon, along with hundreds of posts containing CSAM-related hashtags and links pointing to CSAM trading and the grooming of minors. One Mastodon server was even taken down for a period of time because of CSAM posted to it. The researchers suggest that decentralized networks like Mastodon need more robust moderation tools and reporting mechanisms to address the prevalence of CSAM.

  • lohrun
    40 · 11 months ago

    One of the problems with the fediverse is that each server keeps its own copy of the content. It's a real worry that bad actors could push content to federated servers to get them taken down over the material they're now storing.
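
    For a concrete picture of why that happens, here's a minimal sketch of how an ActivityPub-style server ends up holding remote media. The names (handle_create_activity, CACHE_DIR) are hypothetical, not Mastodon's or Lemmy's actual code:

    ```python
    # Hypothetical sketch, not real Mastodon/Lemmy code: when a federated
    # "Create" activity arrives, the receiving server downloads and stores
    # its own local copy of every media attachment.
    import hashlib
    import pathlib
    import urllib.request

    CACHE_DIR = pathlib.Path("media_cache")  # assumed local cache location

    def handle_create_activity(activity: dict) -> list:
        """Store a local copy of each media attachment in the activity."""
        CACHE_DIR.mkdir(exist_ok=True)
        stored = []
        for attachment in activity.get("object", {}).get("attachment", []):
            url = attachment.get("url")
            if not url:
                continue
            # Name the cached file by a hash of its URL to avoid collisions.
            dest = CACHE_DIR / hashlib.sha256(url.encode()).hexdigest()
            with urllib.request.urlopen(url) as resp:
                dest.write_bytes(resp.read())
            # The receiving server now hosts bytes it never chose to accept.
            stored.append(dest)
        return stored
    ```

    Once those bytes land on your disk or in your bucket, the moderation and legal exposure come with them.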

      • @deksesuma@beehaw.org
        15 · 11 months ago

        The general idea is that if there is only one copy, taking something down means knocking that server out of service.

      • TheSaneWriter
        7 · 11 months ago

        I think so. My Lemmy instance, for example, is currently storing several gigabytes of images in my cloud buckets, but with only 4 users I'm reasonably confident it didn't all come from us.
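
        If anyone else wants to check how much they're hosting in an S3-compatible bucket, a quick boto3 tally works; the bucket name below is a placeholder:

        ```python
        # Rough tally of media stored in an S3-compatible bucket.
        # The bucket name is a placeholder; point it at your own.
        import boto3

        s3 = boto3.client("s3")
        total_bytes = 0
        for page in s3.get_paginator("list_objects_v2").paginate(Bucket="my-lemmy-media"):
            for obj in page.get("Contents", []):
                total_bytes += obj["Size"]
        print(f"{total_bytes / 1024**3:.2f} GiB of stored media")
        ```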

        • Lloir
          1 · 11 months ago

          This is why I disabled that feature on my Lemmy instance.