- cross-posted to:
- ghazi@lemmy.blahaj.zone
‘Nudify’ Apps That Use AI to ‘Undress’ Women in Photos Are Soaring in Popularity::It’s part of a worrying trend of non-consensual “deepfake” pornography being developed and distributed because of advances in artificial intelligence.
I use an ad blocker and haven’t seen these. Perhaps a link to the best ones could be shared here for better understanding of what the article is talking about?
That’s disgusting, where are these nude photo sites so I can avoid them? There’s so MANY, but which one?!
Sus question lmfao
These things have been around since the onset of deepfakes, and honestly, if you take a couple of seconds to look, you'll find them. It's a massive issue, and the content is everywhere.
deleted by creator
We’re talking specifically about AI-enhanced fakes, not the old-school Photoshop fakes – they’re two completely different beasts
Different only in construction. Why they exist and what they are is older than photography.
No, I disagree. Before, you could tell a fake from a mile away, but deepfakes bring it to a whole new level of creepy because they can be EXTREMELY convincing
That is a quality improvement, not a shift in nature.
Or maybe an accessibility improvement. You don’t need to practice creating your own works of art over many years anymore, or have enough money to commission a master artist. The AI artists are good enough and work for cheap.
The difference is that we can now do video. In principle that was possible before too, but it was also a hell of a lot of work. Making it look real hasn’t been a problem since before Photoshop; if anything, people get sloppy with AI, partly because what feels like 99% of people who use it don’t have an artistic bone in their body.
I’m not saying that it’s a shift in nature? All I’ve been saying is:
A) tools to create realistic nudes have been publicly available ever since deepfakes became a thing
B) deepfakes are worse than traditional photoshopped nudes because (as you put it, a quality improvement) they’re more convincing and therefore can have more detrimental effects
There was a brief period between now and the invention of photography when that was true. For thousands of years before that it was possible to create a visual representation of anything you imagine without any hint that it wasn’t something real. Makes me wonder if there were similar controversies about drawings or paintings.
Careful with asking such things, because the boundary to crime seems blurry.
I don’t think there is any crime.
It’s identical to drawing a nude picture of someone.
It’s what the courts think, and right now, it’s not clear what the enforceable laws are here. There’s a very real chance people who do this will end up in jail.
I believe prosecutors are already filing cases about this. The next year will decide the fate of these AI-generated deepfakes and the memories behind them.
And you are sure that ‘someone’ is of legal age, of course. Not blaming you, but does everybody always know that ‘someone’ is of legal age? Just an example to get you thinking.
I don’t know if it’s illegal to create naked drawings of people who are underage.
It’s not
Depends on where you live. It’s not legal in the UK, for example. In the US it can even break down at the state level, although there’s lots of debate over whether states are able to enforce their laws. “Obscene” speech is not protected under free speech; the argument would be whether or not the naked drawings have artistic merit.
I’m not a lawyer, but I do know that people in the US have gone to prison for possessing naked images of fictional children and it’s on the books as illegal in many other countries.
I don’t understand either.