• @whatup@hexbear.net
    59
    2 months ago

    I think they’ll go after Telegram next. I know a lot of people use it to see uncensored news on Palestine and Ukraine, which is a big no-no in the US. There’ve been a suspiciously high number of news articles linking it to CSAM, even though Facebook is a much, much, much bigger offender.

      • @whatup@hexbear.net
        48
        edit-2
        2 months ago

        Yep. Very big, and they do a dog shit job of addressing the problem. Their underpaid content moderators pore over the worst images you can possibly imagine until their mental health is completely shot. The worst part is that this method barely makes a dent in the amount of CSAM distribution.

        https://www.theverge.com/2019/2/25/18229714/cognizant-facebook-content-moderator-interviews-trauma-working-conditions-arizona

        https://www.ft.com/content/afeb56f2-9ba5-4103-890d-91291aea4caa

        https://archive.ph/ter4Y

        • ComradeSharkfucker
          23
          2 months ago

          Their underpaid content moderators pore over the worst images you can possibly imagine until their mental health is completely shot.

          barely makes a dent

          • @whatup@hexbear.net
            23
            2 months ago

            And there’s no one for them to talk to because of how uniquely horrific these videos and images are. Therapists are only affordable to the rich. Can’t talk to family and friends without potentially traumatizing them. Even the people who interview these mods can’t print the details of their experiences because readers would complain.

            • SerLava [he/him]
              22
              2 months ago

              I heard that sometimes it’ll keep showing the same traumatic video to one person over and over and over because the bot uploader has very slightly edited it thousands of times, and Facebook forces the reviewer to watch the whole thing every time, even though they already know it’s in violation as soon as it starts.

              • @whatup@hexbear.net
                15
                edit-2
                2 months ago

                That’s so uniquely cruel in such a calculating way. It’s like they’re intentionally trying to traumatize their workers in a fucked up experiment. I don’t trust the in-house therapists Meta offers…