• Xanza@lemm.ee · 2 days ago

    I totally agree with these guys being arrested. I want to get that out of the way first.

    But what crime did they commit? They didn’t abuse children… the images are AI generated, and the children in them do not exist. What they did is obviously disgusting and makes me want to punch them in the face repeatedly until it’s flat, but where’s the line here? If they draw pictures of non-existent children, is that also a crime?

    Does that open artists up to legal interpretation when it comes to art? Could they be put in prison for a professional painting of a child? Like, what if they did a painting of their own child in the bath or something? Sure, the content’s questionable, but it’s not exactly predatory. And if you add safeguards for these people, couldn’t predators just claim artistic expression?

    It just seems entirely unenforceable and an entire goddamn can of worms…

    • billwashere@lemmy.world · 1 day ago

      First off, I’ll say this topic is very nuanced. And as sick as any child porn is, I completely agree. This, in my gut, feels like a weird slippery slope that will somehow get used against any AI-generated images, or possibly any AI-generated content. It reminds me of those “online child protection” bills that seem on the surface like not-terrible ideas, but turn out to be horrific dystopian ones once you start thinking about them in detail.

    • Allero@lemmy.today · 2 days ago

      I actually do not agree with them being arrested.

      While I recognize the issue of identification posed in the article, I hold a strong opinion it should be tackled in another way.

      AI-generated CSAM might be a powerful tool to reduce demand for content featuring real children. If we leave it legal to watch and produce, while keeping the actual materials illegal, we can push more pedophiles toward the less harmful option: a computer-generated image produced with no children being harmed.

      By taking action against AI-generated materials, they make them as illegal as the real thing, leaving an interested party one less reason not to go to a CSAM site and watch actual children being abused, perpetuating the cycle and creating more real-world victims.

      • Dr. Moose@lemmy.world · 1 day ago

        Nah, the argument that this could grow “pedophile culture” and even encourage real activities is really not that far-fetched, and could even be true. Without very convincing studies, do you take a chance where real kids could soon suffer? And I mean the studies would have to be really convincing.

        • Allero@lemmy.today · 1 day ago

          The thing is, banning is also a consequential action.

          And based on what we know about similar behaviors, having an outlet is likely to be good.

          Here, the EU takes an approach of “banning just in case” while also ignoring the potential implications of such bans.

      • LifeInMultipleChoice@lemmy.dbzer0.com · 2 days ago

        It’s strange to me that it is referred to as CSAM. No people are involved, so no one is being sexually assaulted. It’s creepy, but to me, calling it that implies a drawing is a person.

    • sugar_in_your_tea@sh.itjust.works · 2 days ago

      Exactly, which is why I’m against your first line: I don’t want them arrested, specifically because of artistic expression. I think they’re absolutely disgusting and should stop, but they’re not harming anyone, so they shouldn’t go to jail.

      In my opinion, you should only go to jail if there’s an actual victim. Who exactly is the victim here?

    • sunbeam60 · 2 days ago

      It obviously depends on where they live and/or committed the crimes. But most countries have broad laws against anything, real or fake, that depicts CSAM.

      That’s partly because, as technology gets better, it would be easy for offenders to claim anything they’ve been caught with is AI created.

      It’s also partly because of a belief that AI-generated CSAM encourages real child abuse.

      I shan’t say whether it does - I tend to believe so but haven’t seen data to prove me right or wrong.

      Also, at the end, I think it’s simply an ethical position.