• guyrocket@kbin.social

    This will be interesting.

    How do you write legislation to stop AI nudes but not photoshopping or art? I am not at all sure it can be done. And even if it can, will it withstand a courtroom free speech test?

    • macrocarpa@lemmy.world

      I think it’s not feasible to stop or control it, for several reasons -

      1. People are motivated to consume AI porn.
      2. There is no barrier to creating it.
      3. There is no cost to create it.
      4. There are multiple generations of people who have shared the source material needed to create it.

      We joke about Rule 34, right: if you can think of it, there is porn of it. It’s now pretty straightforward to fulfil the second part of that, irrespective of the thing you thought of. Those pics of your granddad in his 20s in a navy uniform? Your high school yearbook picture? Six shots of your younger sister shared by an aunt on Facebook? Those are just as consumable by AI as Tay Tay is.

    • Grimy@lemmy.world

      You write legislation that bans all three, because there is no difference between generating, photoshopping, or drawing lewds of someone without their consent.

      Banning this at the individual level would be impossible, so instead you let the platforms that host it be sued.

      We have the technology to detect if an image is NSFW and if it includes a celebrity. Twitter is letting this happen on purpose.

      The images spread across X in particular on Wednesday night, with one hitting 45 million views before being taken down. The platform was slow to respond, with the post staying up for around 17 hours.

      It’s hard to pretend it wasn’t reported by Taylor’s fans many times during those 17 hours, or that the moderators didn’t know about the image within half an hour of it being posted.

    • Susaga@ttrpg.network

      If the image is even slightly convincing, it’s essentially just defamation with digital impersonation thrown in. Yeah, that might catch photoshop in its net, but you’d need to be a DAMN good artist to get caught in it too.

      • NoIWontPickaName@kbin.social

        So what counts as “slightly convincing”?

        What about people that happen to look like someone famous?

        What level of accuracy is necessary?

        If I label some random blonde AI-generated porn “Taylor Slow”, does that count?

        They’re both blonde, after all.