Not a good look for Mastodon - what can be done to automate the removal of CSAM?

  • @mindbleach@lemmy.world
    46 points · 11 months ago

    4.1 Illustrated and Computer-Generated CSAM

    Stopped reading.

    Child abuse laws “exclude anime” for the same reason animal cruelty laws “exclude lettuce.” Drawings are not children.

    Drawings are not real.

    Half the goddamn point of saying CSAM instead of CP is to make clear that Bart Simpson doesn’t count. Bart Simpson is not real. It is fundamentally impossible to violate Bart Simpson’s rights, because he doesn’t fucking exist. There is nothing to protect him from. He cannot be harmed. He is imaginary.

    This cannot be a controversial statement. Anyone who can’t distinguish fiction from real life has brain problems.

    You can’t rape someone in MS Paint. Songs about murder don’t leave a body. If you write about robbing Fort Knox, the gold is still there. We’re not about to arrest Mads Mikkelsen for eating people. It did not happen. It was not real.

    If you still want to get mad at people for jerking off to the wrong fantasies, that is an entirely different problem from photographs of child rape.

    • @balls_expert@lemmy.blahaj.zone
      7 points · edited · 11 months ago

      Okay, thanks for the clarification

      Everyone except you still very much includes drawn and AI-generated pornographic depictions of children in the basket of problematic content that should get filtered out of federated instances, so thank you very much, but I’m not sure your point changed anything.

      • @priapus@sh.itjust.works
        5 points · 11 months ago

        They are not saying it shouldn’t be defederated; they are saying that reporting it to the authorities is pointless and that treating it as CSAM is harmful.

          • What’s the point of reporting it to authorities? It’s not illegal, nor should it be, because there’s no victim, so all reporting it does is take up valuable time that could be spent tracking down actual abuse.

          • @priapus@sh.itjust.works
            2 points · 11 months ago

            Definitions of CSAM definitely do not include illustrated and simulated forms. They have no victim and therefore cannot be abuse. I agree that it should not be allowed on public platforms, which is why all instances hosting it should be defederated. Despite this, it is not illegal, so reporting it is a waste of time for you and for the authorities, who are trying to remove and prevent actual CSAM.

      • @mindbleach@lemmy.world
        3 points · 11 months ago

        If you don’t think images of actual child abuse, against actual children, are infinitely worse than some ink on paper, I don’t care about your opinion of anything.

        You can be against both. Don’t ever pretend they’re the same.

              • @mindbleach@lemmy.world
                3 points · 11 months ago

                ‘Everyone but you agrees with me!’ Bullshit.

                ‘Nobody wants this stuff that whole servers exist for.’ Self-defeating bullshit.

                ‘You just don’t understand.’ Not an argument.

                • @balls_expert@lemmy.blahaj.zone
                  2 points · edited · 11 months ago

                  Okay, the former then.

                  Let’s just think about it: how do you think it would turn out if you went outside and asked anyone about pornographic drawings of children? How long until you found someone who thinks like you, outside your internet bubble?

                  “Nobody wants this stuff that whole servers…”

                  There are servers dedicated to real child porn with real children, too. Do you think that argument has any value with that tidbit of information tacked onto it?

                  • @mindbleach@lemmy.world
                    3 points · 11 months ago

                    Ask a stranger about anything pornographic and see how it goes.

                    This is rapidly going from pointless to stupid. Suffice it to say: stop pretending drawings are ever as bad as actual child abuse.

      • @mindbleach@lemmy.world
        5 points · 11 months ago

        What does that even mean?

        There’s nothing to “cover.” They’re talking about illustrations of bad things, alongside actual photographic evidence of actual bad things actually happening. Nothing can excuse that.

        No shit they are also discussing actual CSAM alongside… drawings. That is the problem. That’s what they did wrong.

    • @DrQuint@lemmy.world
      4 points · 11 months ago

      Oh, wait, “Japanese” in the other comment, now I get it. This conversation is about AI loli porn.

      Pfft, of course that’s why no one is saying the words they mean: it suddenly becomes much harder to take that stance, since hatred of loli porn is not universal.

      • I mean, I think it’s disgusting, but I don’t think it should be illegal. I feel the same way about cigarettes, 2 girls 1 cup, and profane language. It’s absolutely not for me, but that shouldn’t make it illegal.

        As long as there’s no victim, knock yourself out with whatever disgusting, weird stuff you’re into.