Didn’t Apple try to introduce this and get a ton of flak from all sorts of privacy “experts”? They then scrapped their plans, did they not? How is this any better/different? Any sort of “backdoor” into encryption means that the encryption is compromised. The US already went through this fight back in 2014. Feels like déjà vu all over again.
@generalpotato Ish. I read the technical write-up and they actually came up with a very clever privacy-focused way of scanning for child porn.
First, only photos were scanned and only if they were stored in iCloud.
Then, only hashes of the photos were collected, never the photos themselves.
Those hashes were matched against hashes of known child porn images, images which had to be present in the databases of multiple non-governmental organizations; so if an image was only in the database of, say, the National Center for Missing and Exploited Children, or only in the database of China’s equivalent, its hash couldn’t be used. That requirement would make it much harder for a dictator to slip in a hash to look for dissidents, because getting an image into enough separate databases is substantially more difficult.
Even then, an Apple employee would only verify that actual child porn was being stored in iCloud after 20 separate images had been flagged; there’s a rough sketch of the general matching-and-threshold idea below. (The odds of an innocent person even making it to this stage incorrectly were estimated to be something like one false positive a year, I think, because of all of the safeguards Apple had planned.)
Only after an Apple employee confirmed the existence of child porn would the iCloud account be frozen and the relevant non-governmental organizations alerted.
Honestly, I have a better chance of getting a handjob from Natalie Portman in the next 24 hours than an innocent person being incorrectly reported to any government authority.
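If it helps to picture those matching-and-threshold steps, here’s a rough sketch of the general idea in Python. This is not Apple’s code and it glosses over a lot: the real proposal used a perceptual hash (NeuralHash) plus cryptographic machinery (private set intersection, threshold secret sharing) so that sub-threshold matches were never visible to anyone, Apple included. The database contents, threshold value, and function names below are made up purely for illustration.

```python
# Illustrative sketch only; not Apple's implementation.
import hashlib

# Hypothetical hash databases from two independent organizations.
# Only hashes that appear in BOTH databases are eligible for matching.
DATABASE_A = {hashlib.sha256(img).hexdigest() for img in (b"known image 1", b"known image 2", b"known image 3")}
DATABASE_B = {hashlib.sha256(img).hexdigest() for img in (b"known image 2", b"known image 3", b"known image 4")}
ELIGIBLE_HASHES = DATABASE_A & DATABASE_B  # set intersection: must be in both

REVIEW_THRESHOLD = 20  # number of matches before any human review, per the post above


def hash_photo(photo_bytes: bytes) -> str:
    """Stand-in for the real perceptual hash; here it's just SHA-256 of the raw bytes."""
    return hashlib.sha256(photo_bytes).hexdigest()


def count_matches(icloud_photos: list[bytes]) -> int:
    """Count how many of the user's photos match the eligible hash set."""
    return sum(1 for photo in icloud_photos if hash_photo(photo) in ELIGIBLE_HASHES)


def should_escalate_to_human_review(icloud_photos: list[bytes]) -> bool:
    """Nothing gets escalated until the match count crosses the threshold."""
    return count_matches(icloud_photos) >= REVIEW_THRESHOLD
```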
Haha! Thanks for the excellent write-up. Yes, I recall Apple handling CSAM this way and going out of its way to try to convince users it was still a good idea, but it still faced a lot of criticism for it.
I doubt this bill will be as thorough, which is why I was posing the question I asked. Apple could technically comply using some of the work it did, but it’s sort of moot if things are end-to-end encrypted.
Great write-up! I tried searching but came up short; do you have a link to the technical documentation?
From a technical perspective, how much would an image need to be changed before the hash no longer matched? I’ve heard of people including junk .txt files in repacked and zipped pirated games, movies, etc., so that they aren’t automatically flagged for removal from file sharing sites.
I am not a technical expert by any means, and I don’t even use Apple products, so this is just curiosity.
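To the hash question: with a plain cryptographic hash, changing even a single byte of the input produces a completely different digest, which is exactly why the junk-.txt trick defeats naive exact-hash matching on file-sharing sites. It’s also why Apple’s proposal used a perceptual hash (NeuralHash) designed to survive simple transformations like resizing and recompression; how much editing it took to actually break a match was one of the points people argued about. A tiny illustration of the exact-hash behaviour (plain SHA-256 here, nothing to do with NeuralHash, and the example bytes are invented):

```python
# Demo: a cryptographic hash changes completely when any byte of the input changes.
import hashlib

original = b"contents of a repacked game archive"
tweaked = original + b"\n<junk .txt file appended>"  # tiny, meaningless addition

print(hashlib.sha256(original).hexdigest())
print(hashlib.sha256(tweaked).hexdigest())
# The two digests are unrelated, so a blocklist of exact hashes no longer matches.
```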
It would have worked and it would have protected privacy, but most people don’t understand the difference between having a hash of known CSAM on your phone for comparison purposes and having actual CSAM on your phone, and it freaked people out.
I understand the difference and I’m still uncomfortable with it, not because of the proximity to CSAM but because I don’t like the precedent of anyone scanning my encrypted messages. Give them an inch, etc.
I’m assuming it’s either Apple not wanting to be told to do it, or them having “learned their lesson” and no longer supporting it; they seem to be leaning quite heavily into privacy.
Apple wants money for spying on their users; this bill would compel them to do that without the secret money they’re getting now, so they’re against it.
Nah, Apple is one of the few companies around that is big on privacy and uses privacy as a differentiator for its products. Look at some of the other responses; it’s more complex than them just wanting money. They already make a boatload of it.