Mastodon, a decentralized alternative to Twitter, has a serious problem with child sexual abuse material (CSAM), according to researchers at Stanford University. In just two days, the researchers found over 100 instances of known CSAM across more than 325,000 posts on Mastodon, along with hundreds of posts containing CSAM-related hashtags and links pointing to CSAM trading and the grooming of minors. One Mastodon server was even taken offline for a period of time because of CSAM posted to it. The researchers suggest that decentralized networks like Mastodon need more robust moderation tools and reporting mechanisms to address the prevalence of CSAM.

  • zephyrvs@lemmy.ml

I’m referring to the CSAM-scanning systems that are outside the control of almost anyone except governments, three-letter agencies, other law enforcement, and parts of the private sector.

To be effective, these systems must be fed the hash of every file submitted to as many instances as possible, with close to no oversight or public scrutiny.
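As a rough illustration of the model I’m objecting to, here’s a minimal Python sketch with hypothetical names. Real deployments (e.g. PhotoDNA) use perceptual hashes rather than cryptographic ones, and the hash list itself is opaque to instance operators:

```python
import hashlib

# Hypothetical hash blocklist distributed by a central authority.
# Operators receive opaque digests and cannot audit what they match.
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def scan_upload(data: bytes) -> bool:
    """Hash an uploaded file and check it against the central list."""
    return hashlib.sha256(data).hexdigest() in KNOWN_HASHES

# Every file submitted to an instance would pass through this check;
# coverage across as many instances as possible is what makes it effective.
if scan_upload(b"uploaded file bytes"):
    print("flagged for review")
```

That’s the whole mechanism. The objection isn’t to the matching itself, it’s to who controls the hash list and who gets to see what’s in it.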

    Pass.

Edit: I’m not blocking you, but I noticed intermittent connectivity issues on lemmy.ml today, possibly around the time I replied.