While the media posted by the influencer has been removed, numerous text interactions from his followers with the deleted posts remain on the platform. Some of those replies mention that a child depicted in the photos was as young as one and a half years old.
To make matters worse, the image appears to have been on the platform for several days before being removed. Lucre even described the image in detail in a separate tweet, noting that it had been taken from a video. The video in question involved the abuse of three children, one of whom was reportedly strangled to death after the filming.
Source
Source
The CSAM was left up for FOUR DAYS (July 22-26) before he was even suspended. Then they let him “delete” it… and reinstated him. People commented during those 4 days DESCRIBING THE IMAGES.
What the FUCK, please tell me this is worth a visit from the FBI, getting removed from the App Store, some massive GDPR violation, fucking something. How is this story not bigger news?
To put this in perspective, 4 graphic CSAM images that an account with 500,000 followers posted were left up on Twitter for 4 days. The person who posted the images was suspended for less than a day.
Content warning: I deliberately avoid providing much more detail than “it was clearly CSAM,” but I do mention the overall tweet contents and pretext.
I remember this from when it happened and unfortunately did see the text portion and thumbnail from the original tweet.
He did it under the pretext of reporting on the arrest of a person involved in the video and in large-scale CSAM production. It started in standard news-report style, listing the name, age and arrest details of someone taken into custody. Initially it looked like the usual alt-right tweet about “look at how paedophilia is rampant and the world is sinful!”.
The guy describes himself as “chief trumpster,” a “breaker of narratives” and a journalist. He claimed the details of the CSAM were provided by the Dutch police. He then described the title and detailed events of a CSAM video in the tweet. Unfortunately for me, those details were below the tweet fold, so I had no idea it was going there until I expanded it.
The tweet’s image attachment or link-unfurl thumbnail was a frame from the video itself. It was an otherwise-SFW image of the adult abuser who was being talked about. Unfortunately I didn’t realise what the thumbnail was until after I had expanded the tweet text. I actually thought it was an OpenGraph error at first.
Even in the context of “reporting shocking content,” the tweet was way over the line and went from 0 to 100 in a few words. I did not need the details of the CSAM; nobody except the police and courts does. The video title alone was over the line.
Musk phrasing this as another “I was told” decision is just him knowingly deflecting responsibility.
Thank you for the additional context. I’m sorry you had to be exposed to that.
Because it was one image and it wasn’t showing anything explicit. That’s how I understood it from reading more about this story.
deleted by creator
Dude, he posted it to highlight how evil it is. Maybe it was a stupid thing to do but that’s the extent of it.
deleted by creator
You’re evil
Do I want to know what CSAM stands for?
Child sexual abuse material, unfortunately.
deleted by creator
I didn’t use the term porn at all…
deleted by creator
You spent half your response “correcting” me for using the term porn, which I did not use. Respond to the OP for that. That’s all I’m saying.
Yep. Didn’t want to know that.
Unfortunately, it’s a term everyone should know. It replaces the label “child porn,” because while that is universally understood to be horrible, it isn’t “porn”; it’s evidence of child sexual abuse. Hence “child sexual abuse material.”
Is it pronounced KaZAM! ?