A South Korean man has been sentenced to jail for using artificial intelligence to generate exploitative images of children, the first case of its kind in the country as courts around the world encounter the use of new technologies in creating abusive sexual content.
(Apologies if I use the wrong terminology here, I’m not an AI expert, just have a fact to share)
The really fucked part is that at least Google has scraped a whole lot of CSAM, as well as things like ISIS execution vids, and they have all this stuff stored and use it to do things like train the algorithms for AIs. They refuse to delete this material, claiming that they just find the stuff and aren’t responsible for what it is.
Getting an AI image generator to produce CSAM means it knows what to show. So why is the individual in jail and not the tech bros?
That’s a fundamental misunderstanding of how diffusion models work. These models extract concepts and can effortlessly combine them into new images.
If it learns woman + crown = queen
and queen - woman + man = king
it is able to combine any such concept together
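The word-vector analogy above can be sketched in a few lines. To be clear, the vectors here are made-up toy values for illustration, not embeddings from any real model:

```python
# Toy sketch of "concept arithmetic": each axis stands for one
# learned concept, and adding/subtracting vectors composes concepts
# the model never saw paired together in training.
royal = [1.0, 0.0, 0.0]
woman = [0.0, 1.0, 0.0]
man   = [0.0, 0.0, 1.0]

def add(a, b):
    return [x + y for x, y in zip(a, b)]

def sub(a, b):
    return [x - y for x, y in zip(a, b)]

queen = add(woman, royal)            # woman + crown -> queen
king  = add(sub(queen, woman), man)  # queen - woman + man -> king

print(king)  # [1.0, 0.0, 1.0], i.e. the same vector as royal + man
```

Real models learn these directions in a much higher-dimensional space, but the principle is the same: the composition falls out of the arithmetic, so no training example of the combined concept is needed.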
As Stability has noted, any model that has both the concept of “naked” and the concept of “child” in it can be used like this. They tried to remove “naked” for Stable Diffusion 2, and nobody used it.
Nobody trained these models on CSAM, and the problem is a dilemma in the same way a knife is a dilemma. We all know a malicious person can use a knife for murder, including of children. Yet society has decided that knives have sufficient other uses that we still allow their sale pretty much everywhere.
“This can be used by pedophiles” is already used as an argument to ban cryptography… I wonder if someone will apply it to generative AI.
Depends how profitable it is.
If it can replace workers, no; if it threatens the big players like Disney, yes.
Editing this reply to say that I was in fact right and did not have any fundamental misunderstanding of anything. The database in question here is called LAION and contains roughly 6 billion images scraped from the web, including CSAM images.
Thanks for that. As I said, I’m not big into how AI works, so not surprised I got that wrong. The databases of everything that has come across the clear web are still there though and are available for use by people with access.
What are you referring to by “the database of everything that has come across the clear web”?
See this new article. The image database they looked into is called LAION. There are others too, of course. I don’t mean Google crawlers; I mean image databases used for training image generators.
https://www.independent.co.uk/news/ap-study-developers-thorn-canada-b2467386.html
NSA servers? jkjk, kinda
I think they mean Google’s web-crawler index, but I don’t think that the index works that way… well, on the other hand, they do cache some stuff.
Here you go bud, no misunderstanding at all. The image generators are trained on CSAM, as I said.
https://www.independent.co.uk/news/ap-study-developers-thorn-canada-b2467386.html
Not necessarily. Part of AI is blending different concepts. An AI trained on images of regular children and of nude adults should, in principle, be able to produce underage nudity. This is a side effect of the intelligence in the AI.