The police investigation remains open. The photo of one of the minors included a fly, the logo of Clothoff, the application presumably being used to create the images; it promotes its services with the slogan: “Undress anybody with our free service!”

  • jet@hackertalks.com · 10 points · 1 year ago

    The philosophical question becomes: if it’s AI-generated, is it really a photo of them?

    Let’s take it to an extreme: if you cut the face out of somebody’s Polaroid and paste it into a nudie magazine over the face of an actress, is that amalgam a nude photo of the person in the Polaroid?

    It’s a debate that could go either way, and I’m sure we’ll end up with an exciting legal landscape, with different countries adopting different rules.

    • taladar@feddit.de · 9 points · 1 year ago

      I suppose you could make a Ship of Theseus-like argument there too. At what point does it matter where the parts of the picture came from? Most people would probably be okay with their hairstyle being added to someone else’s picture, but what about their eyes, or their mouth? Where exactly is the line?

    • ReversalHatchery@beehaw.org · 8 points · 1 year ago

      The philosophical question becomes: if it’s AI-generated, is it really a photo of them?

      That does not matter, since people can’t tell the difference even if they wanted to.
      It is a photo of them if you can recognize them in it, especially their face.

      • jet@hackertalks.com · 7 points · 1 year ago

        What if somebody looks very similar to somebody else? Are they prevented from using their own likeness in film and media?

        Could an identical twin be forbidden from going into porn, to prevent her from besmirching the good image of her twin sister who’s a teacher?

    • RagnarokOnline@reddthat.com · 7 points · 1 year ago

      I think it comes down to the identity of the person whose head is on the body. For instance, if the eyes had a black bar covering them or if the face was blurred out, would it be as much an invasion of privacy?

      However, if the face was censored, the photo wouldn’t have the same appeal to the person who generated it. That’s the issue here.

      A cutout of a person’s head on a porn star’s picture still has a sense of falsehood to it. An AI-generated image that likely matches the subject’s body type removes a lot of that falsehood, and thus gives the image more power. Without the subject’s consent, that power is harmful.

      You’re right about the legal battles, though. I just feel bad for the people who will have their dignity compromised in the meantime. Everyone should be entitled to dignity.

    • barsoap@lemm.ee · 5 points · 1 year ago

      Objectively, it’s absolutely not: AIs don’t have X-ray eyes. The best they can do is infer a rough body shape from a clothed example, but anything beyond that is pure guesswork. The average 14-year-old is bound to be much better at undressing people with their eyes than an AI could ever be.

      Subjectively, though, of course it is. You wouldn’t say you’re not imagining the cutie two desks over nude just because the mental image isn’t really them.

    • RaivoKulli@sopuli.xyz · 5 points · 1 year ago

      In situations like this, or in cases where we have the input photo, the conclusion would be easy. But it could absolutely get iffy.

    • ParsnipWitch@feddit.de · 2 points · 1 year ago

      How about we teach people a baseline of respect for other people? Punishing behaviour like this can help show that it’s not okay to treat other people like pieces of meat.