Is it CSAM if it was produced by AI?
In this case, yes. Visually indistinguishable from a photo is considered CSAM. We don’t need any new laws about AI to get these assholes. Revenge porn laws and federal CSAM statutes will do.
Removed by mod
Nothing about your comment addressed why it should be treated differently if it’s AI-generated but visually indistinguishable.
Removed by mod
Just passing through; I have no strong opinions on the matter, nor is it something I wish to do deep-dive research on.
Just wanted to point out that your original comment was indeed just a threat that did nothing to address OP’s argument.
Removed by mod
Removed by mod
Removed by mod
Removed by mod
Removed by mod
If they can plant AI CSAM in my computer, they can also plant “real” CSAM in my computer. Your point doesn’t make any sense.
It should be considered illegal if it was used to harm or sexually abuse a child, which in this case it was.
Whether it should be classed as CSAM or as something separate, I tend to think something separate, along the lines of a revenge-porn law, one that still distinguishes between this and, say, a girl whose uncle groomed and sexually abused her while filming it. While this is awful, it can be (and often seems to be) the product of foolish youth, rather than the offender and everyone involved being very sick, dangerous, actually violent adult pedophiles victimizing children.
Consider the following:
1. An underage girl takes a picture of her own genitals. This is unfortunately classified under the unhelpful and harmful term “child porn,” and she can be charged and registered as a sex offender, but it’s not CSAM and shouldn’t be considered illegal material or a crime (though it is, because the West has a vile fixation on puritanism, which hurts survivors of childhood sexual trauma as well as adults).
2. An underage girl takes a picture of her genitals and sends it to her boyfriend. Again, this shouldn’t be CSAM (though she may unfortunately be charged similarly): she consented, and we can assume there wasn’t any unreasonable level of coercion. What it is, unfortunately, is bound up with certain notions of puritanism that are very American.
3. Continuing from 2, the boyfriend shares the picture with other boys. Now it’s potentially CSAM, or at the least revenge porn of a child, since she didn’t consent and it could be used to harm her; but punishment has to be modulated by the fact that the offender is likely a child himself and not fully able to comprehend his actions.
4. An underage boy cuts out a photo of an underage girl he likes, only her face and head, glues it atop a picture of a naked porn actress, maybe a petite one, and uses it for his own purposes in private. Not something I think should be classed as CSAM.
5. An underage boy uses AI to do the same as above, but more believably. Again, I think it’s kind of creepy, but if he keeps it to himself and doesn’t show anyone or spread it around, it’s just youthful weirdness, though he really probably shouldn’t have easy access to those tools.
6. An underage boy uses AI to do the same as 4–5, but this time he spreads it around, defaming the girl. She or her friends find out, people say mean things about her, and she has to go to school with a bunch of people who are looking at, and pleasuring themselves to, fake but realistic images of her, against her consent, which is violating and makes one feel unsafe. Worse, she’ll probably be bullied over it: mean comments, being called the s-word, etc.
Kids are weird and do dumb things. Unfortunately, boys in our culture especially have a propensity to do things that hurt girls, far more than the inverse, to the point that it’s not really even worth talking about girls being creepy or sexually abusive toward peer-aged boys in adolescence and young adulthood. To address this, though, you need to address patriarchy and misogyny on a cultural level: teach boys empathy and respect for girls and women, and frankly do away with all the abusive pornography, super prevalent and popular, that encourages and perpetuates abusive actions and mentalities toward women and girls. This will never happen in the US, however, because the country is structurally opposed to doing any such thing. It also couldn’t hurt to peel back the stigma and shame around sexuality and nudity in the US, which stem from its reactionary Christian culture, but again, I don’t think that will ever happen in the US as it exists, not this century anyway.
Obviously I’m not getting into adults doing this here, as that doesn’t need to be discussed: it’s wrong, plain and simple.
Bottom line, I think companies need to be strongly compelled to quickly remove revenge-porn-type material, which this definitely is, regardless of the victim’s age (though children can’t cope with this kind of thing as well as adults, so the risk of suicide or other self-harm is much higher, and it should be treated as higher priority). It’s abusive and unacceptable, and companies should fear the credit card companies coming down on them hard and destroying them if they don’t aggressively remove it, ban it, and report those sharing it. It should be driven off the clear web once reported; there should be an image-hash dataset like the one used for CSAM (but separate) for such material, and major services should use it to stop the spread, roughly as sketched below.
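To make the hash-dataset idea concrete, here is a minimal sketch of upload-time matching against a shared set of known image hashes. This is purely illustrative: the `known_hashes.txt` file and the function names are made up, and the plain SHA-256 used here only catches byte-identical files, whereas real systems use perceptual hashes such as PhotoDNA or PDQ so that re-encoded or slightly altered copies still match.

```python
# Minimal illustrative sketch of upload-time hash matching against a
# shared dataset of known abusive images. File name and function names
# are hypothetical. SHA-256 only matches byte-identical files; real
# deployments use perceptual hashes (e.g. PhotoDNA, PDQ) so that
# re-encoded or slightly altered copies still match.
import hashlib


def load_known_hashes(path: str) -> set[str]:
    """Load the shared dataset: one lowercase hex digest per line."""
    with open(path) as f:
        return {line.strip().lower() for line in f if line.strip()}


def image_hash(image_bytes: bytes) -> str:
    """Hash an uploaded image's raw bytes."""
    return hashlib.sha256(image_bytes).hexdigest()


def should_block(image_bytes: bytes, known_hashes: set[str]) -> bool:
    """True if the upload matches the known-image dataset."""
    return image_hash(image_bytes) in known_hashes


# Illustrative use at upload time:
# known = load_known_hashes("known_hashes.txt")   # hypothetical path
# if should_block(upload_bytes, known):
#     block_upload_and_report(upload_bytes)       # hypothetical hook
```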
I think it’s best not to defend kiddie porn, unless you have a Republican senator in your pocket.
Did you reply to the wrong person or do you just have reading comprehension issues?
I believe in the US, for all intents and purposes, it is, especially if it was sourced from a minor; you don’t really have an argument against that one.
Removed by mod
I’m not sure where you’re going with that? I would argue that yes, it is, as it’s sexual material of a child, with that child’s face on it, explicitly made for the purpose of defaming her. So I would say it sexually abused a child.
But you could also be taking the stance of “AI trains on adult porn and is merely recreating child porn; no child was actually harmed in the process.” As I’ve said above, I disagree with that, especially in this particular circumstance.
Apologies if it’s just my reading comprehension being shit
Removed by mod
Removed by mod
Removed by mod
Removed by mod
Removed by mod
Removed by mod
Removed by mod
Removed by mod
Removed by mod
Removed by mod
attack the argument, not the person
Removed by mod
Then don’t defend them? You’re trying to tell everyone that what is described in the article above, about a child who had PHOTOREALISTIC pictures made of her, isn’t CSAM.
It is. Deleting the comments of everyone who disagrees with you will not change that, and if anything, it WILL make you seem even more like the bad guy.
Practice what you preach. Read the thread again: what do you think “say something about you” means?
It would be material of, and/or containing, child sexual abuse.
Is it material that may encourage people to sexually abuse a child?
It’s actually not clear that viewing such material leads a person to commit in-person abuse.
Providing non-harmful ways to access that content may lead to less abuse, since the content they seek would no longer come from abuse, reducing demand for abusive material.
That being said, this instance isn’t completely fabricated, and its further release is harmful, as it involves a real person and will have an emotional impact.
There are other instances where the material was completely fabricated, and the courts still ruled it was CSAM and handed down convictions.
There have been, yes, but that doesn’t mean it’s the right ruling. The law varies on that by jurisdiction as well, because it is a murky area.
Edit: in the USA it might not even be illegal unless there was intent to distribute. By the statute’s own terms, the law does not make all fictional child pornography illegal, only that found to be obscene or lacking in serious value. The mere possession of such images is not a violation of the law unless it can be proven that they were transmitted through a common carrier, such as the mail or the Internet, transported across state lines, or of an amount that showed intent to distribute.
So fictional material generated with a local AI and never distributed may be okay federally in the USA.
Serious value? How does one legally argue that their AI-generated child porn stash has “serious value” so that they don’t get incarcerated?
Laws are weird.
Have the AI try to recreate existing CP already deemed to have serious value, then enter all the prompts/variations leading up to the closest match as an exhibit.
Edit: I should add, don’t try this at home, they’ll still probably say it has no value and throw you in jail.
Prison*
deleted by creator
Removed by mod
Any sex act involving an adult and a child/minor is abusive by its very nature.
Is that the definition of CSAM?