The police investigation remains open. The photo of one of the minors included a fly: the logo of Clothoff, the application presumably being used to create the images, which promotes its services with the slogan: “Undress anybody with our free service!”
Making unauthorized nude images of other people, probably. The service did advertise, “undress anyone”.
The philosophical question becomes: if it’s AI-generated, is it really a photo of them?
Let’s take it to an extreme. If you cut the face out of somebody’s Polaroid and paste it into a nudie magazine over the face of an actress, is that amalgam a nude photo of the person in the Polaroid?
It’s a debate that could go either way, and I’m sure we will have an exciting legal landscape as different countries adopt different rules.
I suppose you could make a Ship of Theseus-like argument there too. At what point does it matter where the parts of the picture came from? Most people would probably be okay with their hairstyle being added to someone else’s picture, but what about their eyes, their mouth, … Where exactly is the line?
Exactly. A bunch of litigators are going to get very rich debating this.
That doesn’t matter, as people can’t tell the difference even if they wanted to.
It is a photo of them if you can recognize them in it, especially their face.
What if there’s somebody who looks very similar to somebody else? Are they prevented from using their likeness in film and media?
Could an identical twin be forbidden from going into porn, to prevent her from besmirching the good image of her twin sister who’s a teacher?
They don’t look similar intentionally. But editing images is done very much intentionally.
I think it comes down to the identity of the person whose head is on the body. For instance, if the eyes had a black bar covering them or if the face was blurred out, would it be as much an invasion of privacy?
However, if the face was censored, the photo wouldn’t have the same appeal to the person who generated it. That’s the issue here.
A cutout of a person’s head on a porn star’s picture still has a sense of falsehood to it. An AI-generated image that’s likely similar to the subject’s body type removes a lot of that falsehood, and thus gives the image more power. Without the subject’s consent, this power is harmful.
You’re right about the legal battles, though. I just feel bad for the people who will have their dignity compromised in the meantime. Everyone should be entitled to dignity.
Objectively, it’s absolutely not. AIs don’t have X-ray eyes. The best they could do is infer a rough body shape from a clothed example, but anything beyond that is pure guesswork. The average 14-year-old is bound to be much better at undressing people with their eyes than an AI could ever be.
Subjectively, though, of course it is. You don’t stop imagining the cutie two desks over nude just because the body you’re picturing isn’t really theirs.
In situations like this, or in cases where we have the input photo, the conclusion would be easy. But it absolutely could get iffy.
How about we teach people some baseline of respect towards other people? Punishing behaviour like that can help show that it’s not okay to treat other people like pieces of meat.
I’m pretty sure nude pictures of minors are already illegal.
I’m not sure if AI-made ones count yet.
You go ahead and make AI generated kiddie porn and we’ll find out.
I’m fairly sure there are legal cases about it, so no need to encourage anyone to make kiddie porn…
Then wtf are you confused about? Lol
I’m confused why anyone would encourage others to make AI kiddie porn. Weird as fuck dude
You’re questioning if it’s even illegal to do.
I simply pointed out that of course it is illegal.
Now you agree that it is illegal. Ok, my point has been made.
I’m not so sure. Drawn kiddie porn is legal in a lot of places. AI stuff might be the same, especially since there aren’t many laws about AI imagery to begin with.
Ah the legal precedent of calling a weirdo a weirdo. I’m not gonna make kiddie porn for you, legal or not.
Yeah