@corb3t@lemmy.world to Technology@lemmy.ml · edited 11 months ago
Stanford researchers find Mastodon has a massive child abuse material problem (www.theverge.com)
253 points · 144 comments · cross-posted to: technology@lemmy.world, fediverse@lemmy.ml, technology@beehaw.org, technews@radiation.party
@balls_expert@lemmy.blahaj.zone · 5 points · 11 months ago
There is a database of known CSAM files and their hashes; Mastodon could check uploads against it, both at the posting step and when federating content.
Shadow-banning those users would be nice too.
@diffuselight@lemmy.world · 1 point · 11 months ago
They are talking about AI-generated images. That's the volume part.
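As a rough illustration of the hash-check approach suggested above, here is a minimal Python sketch. `KNOWN_HASHES`, `accept_upload`, and `accept_federated_media` are hypothetical names, and the hash set would have to be populated from a vetted industry list; real deployments (e.g. PhotoDNA) use perceptual rather than cryptographic hashes so that re-encoded copies still match.

```python
# Minimal sketch of hash-list filtering at upload and federation time.
# KNOWN_HASHES is a hypothetical stand-in for a vetted hash list; real
# systems use perceptual hashes (e.g. PhotoDNA), not SHA-256, because a
# cryptographic hash stops matching after any re-encoding or resize.
import hashlib

KNOWN_HASHES: set[str] = set()  # placeholder; loaded from a vetted hash list


def sha256_of(data: bytes) -> str:
    """Hex digest of the attachment bytes."""
    return hashlib.sha256(data).hexdigest()


def is_known_match(attachment: bytes) -> bool:
    """True if the attachment matches an entry in the known-hash list."""
    return sha256_of(attachment) in KNOWN_HASHES


def accept_upload(attachment: bytes) -> bool:
    # Check at the posting interface, before the media is stored.
    return not is_known_match(attachment)


def accept_federated_media(attachment: bytes) -> bool:
    # Check again when ingesting media federated from other instances.
    return not is_known_match(attachment)
```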