Mastodon, an alternative social network to Twitter, has a serious problem with child sexual abuse material, according to researchers from Stanford University. In just two days, the researchers found over 100 instances of known CSAM across more than 325,000 posts on Mastodon. They also found hundreds of posts with CSAM-related hashtags, along with links pointing to CSAM trading and the grooming of minors. One Mastodon server was even taken down for a period of time because of CSAM posted to it. The researchers suggest that decentralized networks like Mastodon need more robust moderation tools and reporting mechanisms to address the prevalence of CSAM.

    • @deksesuma@beehaw.org · 15 points · 11 months ago

      The general idea is that if there is only one copy, taking something down means knocking that server out of service.

    • TheSaneWriter · 7 points · 11 months ago

      I think so. My Lemmy instance, for example, is currently storing several gigabytes of images in my cloud buckets, but with my 4 users I’m reasonably confident it didn’t all come from us.

      • Lloir · 1 point · 11 months ago

        This is why I disabled that feature on my Lemmy instance.
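The behavior the commenters above are describing is that a federated instance typically downloads and keeps a local copy of media the first time a remote post referencing it reaches the instance, which is why an admin can end up hosting gigabytes of images that none of their own users uploaded. The sketch below is a minimal, hypothetical illustration of that cache-on-first-reference pattern; it is not Lemmy's or Mastodon's actual implementation, and the directory and function names are assumptions made for the example.

```python
import hashlib
import urllib.request
from pathlib import Path

# Hypothetical local cache directory standing in for an object-storage bucket.
CACHE_DIR = Path("media-cache")
CACHE_DIR.mkdir(exist_ok=True)

def cache_remote_media(url: str) -> Path:
    """Fetch an image referenced by a federated post and keep a local copy.

    This mirrors the pattern discussed in the thread: once a remote post is
    federated to this instance, its media is downloaded and stored locally,
    so the admin hosts bytes that originated on another server.
    """
    # Key the cached file by a hash of the source URL so repeat references reuse it.
    key = hashlib.sha256(url.encode()).hexdigest()
    local_path = CACHE_DIR / key
    if local_path.exists():
        return local_path  # already cached from an earlier federated reference
    with urllib.request.urlopen(url) as resp:
        local_path.write_bytes(resp.read())
    return local_path

# Example: caching an image attached to an incoming federated post.
# cache_remote_media("https://example.social/media/abcd1234.png")
```

Disabling that caching, as the last commenter mentions, means the instance only stores links to remote media rather than local copies, at the cost of slower loads and dependence on the originating server staying up.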