Apple is being sued over its decision not to implement a system that would have scanned iCloud photos for child sexual abuse material (CSAM).
The irony is that the Apple CSAM detection system was as good as we could make it at the time, with multiple steps to protect people from accidental positives.
But, as usual, I think I was the only one who actually read the paper and didn’t go “REEEE muh privacy!!!” after seeing the headline.
I think I was the only one who actually read the paper and didn’t go “REEEE muh privacy!!!” after seeing the headline.
Did you also read the difference in how Apple was trying to go about it and how literally everyone else was going about it?
Apple wanted to scan your files on your device, which is a huge privacy issue and a huge slippery slope (and a backdoor built in).
The rest of the industry scans files after they have left your private device and are sitting on the provider’s own servers. So your privacy is protected there, and there’s no backdoor built in.
Apple just had a fit and declared that if they can’t backdoor and scan your files on your own device, then they won’t try anything at all, not even the basics. They could simply follow everyone else’s lead and scan iCloud files server-side, but they refuse to do that. That was the difference.
There was no “huge privacy issue”. First of all: you could turn off the local scanning by turning off iCloud sync - and with sync on, the images would’ve been sent to the cloud, where they’d get scanned anyway. That’s it, nothing else; nobody at Apple would’ve touched a single super-private file on your device.
The local scanning required MULTIPLE matches (where n > 3; they didn’t say the exact number, for obvious reasons) against known, human-verified CSAM. The hash database is the one that would’ve been loaded from iCloud if you had sync turned on, and it’s the same database all the big cloud providers already use for legal reasons. Some run other algorithms on top - at least Microsoft had an is_penis classifier that shut down a German dude’s entire Live account because his own kid’s pics were on OneDrive.
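To make that mechanism concrete, here’s a minimal sketch of how I read the matching step; the hash function, database contents, threshold value, and names are placeholders I made up, not Apple’s. The real design used NeuralHash plus a private set intersection protocol, so the device itself never even learns which photos matched - this collapses all of that into a plain set lookup just to show the threshold gate and the “only photos headed to iCloud” scope.

```python
# Toy sketch of on-device threshold matching. Placeholders only: this is not
# NeuralHash, not the real database, and not the real (undisclosed) threshold.
import hashlib

def perceptual_hash(image_bytes: bytes) -> str:
    # Stand-in for a perceptual hash; a real one survives resizing and
    # recompression, a cryptographic hash like SHA-256 does not.
    return hashlib.sha256(image_bytes).hexdigest()

KNOWN_CSAM_HASHES: set[str] = set()   # human-verified hashes delivered to the device
MATCH_THRESHOLD = 30                  # illustrative value only

def crosses_threshold(photos_queued_for_icloud: list[bytes]) -> bool:
    """Only photos queued for iCloud upload are ever hashed; nothing can be
    flagged until the number of database matches reaches the threshold."""
    matches = sum(
        1 for img in photos_queued_for_icloud
        if perceptual_hash(img) in KNOWN_CSAM_HASHES
    )
    return matches >= MATCH_THRESHOLD
```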
After the MULTIPLE matches (you can’t get flagged by “accidentally” having one image on your phone, nor would pics of your kids in the pool trigger anything), a human reviewer would have had enough data to decrypt just those images and see a “reduced-resolution facsimile” (can’t remember the exact term) of the offending photos. This is where all the brainpower spent on crafting false matches would’ve ended up: you would’ve had to create multiple collisions with known CP images that also look enough like actual CP for the human reviewer to make an erroneous call, multiple times, before anything was triggered.
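The “nobody can look until there are multiple matches” part wasn’t just policy, either - the paper describes a threshold secret-sharing scheme where each matching photo contributes one share of the key protecting those reduced-resolution copies, and fewer shares than the threshold reveal nothing. Here’s a toy Shamir-style sketch of that idea; the prime, threshold, and share counts are made up, and this is nowhere near Apple’s actual construction, it just shows why n-1 matches get a reviewer nowhere.

```python
# Toy Shamir threshold secret sharing: the reviewer's decryption key can only
# be reconstructed once at least THRESHOLD shares (one per matching photo)
# exist. All parameters are illustrative, not Apple's actual scheme.
import random

PRIME = 2**127 - 1      # field size for the toy example
THRESHOLD = 5           # shares needed to reconstruct (illustrative)

def make_shares(secret: int, n_shares: int) -> list[tuple[int, int]]:
    # Random polynomial of degree THRESHOLD-1 with the secret as constant term.
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(THRESHOLD - 1)]
    def f(x: int) -> int:
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, n_shares + 1)]

def reconstruct(shares: list[tuple[int, int]]) -> int:
    # Lagrange interpolation at x = 0; needs at least THRESHOLD genuine shares.
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

key = random.randrange(PRIME)
shares = make_shares(key, n_shares=12)
assert reconstruct(shares[:THRESHOLD]) == key       # enough matches: key recovered
assert reconstruct(shares[:THRESHOLD - 1]) != key   # too few: garbage (except with negligible probability)
```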
If after that the human decided that yep, that’s some fucked up shit, the authorities would’ve been contacted.
Yes, a Bad Government could’ve forced Apple to add other stuff to the database. (They can do that right now to ALL major cloud storage providers, BTW.) But do you really think people wouldn’t have been watching for changes in the cloud-downloaded database and noticed anything suspicious immediately?
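And that kind of watching is cheap. As a purely hypothetical sketch (the file path and the reference digest are invented; I don’t know exactly how the downloaded database would have been exposed on disk), anyone could fingerprint the blob their device received and compare it with what everyone else reports seeing:

```python
# Hypothetical audit sketch: fingerprint the locally downloaded hash database
# and compare it against a digest crowd-sourced from other users. The path and
# the reference digest below are invented for illustration.
import hashlib
from pathlib import Path

DB_PATH = Path("/path/to/downloaded/hash_database.bin")   # hypothetical location
CROWD_REPORTED_DIGEST = "0" * 64                           # placeholder SHA-256 hex digest

local_digest = hashlib.sha256(DB_PATH.read_bytes()).hexdigest()
if local_digest != CROWD_REPORTED_DIGEST:
    print("My copy differs from the one everyone else got - something changed.")
```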
Also, according to the paper, the probability of a false match was 1 in 1 trillion accounts - and this wasn’t disputed even by the most hardcore activists, btw.
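For a sense of why a figure that small is even plausible, here’s a back-of-the-envelope check with made-up inputs - the per-image false-match rate, library size, and threshold below are my assumptions, not Apple’s published parameters. The point is just that requiring many independent false matches on one account makes the combined probability collapse:

```python
# Back-of-the-envelope: chance that one account accumulates enough *false*
# matches to cross the threshold. All numbers are assumptions for illustration.
from math import comb

p = 1e-6          # assumed per-image false-match rate against the database
n = 100_000       # assumed number of photos uploaded by one account
threshold = 30    # assumed match threshold

# Binomial upper tail P(X >= threshold); consecutive terms shrink so fast
# that the first fifty dominate the sum.
term = comb(n, threshold) * p**threshold * (1 - p)**(n - threshold)
p_account_flagged = 0.0
for k in range(threshold, threshold + 50):
    p_account_flagged += term
    term *= (n - k) / (k + 1) * p / (1 - p)   # ratio of term k+1 to term k

print(f"{p_account_flagged:.2e}")   # astronomically smaller than one in a trillion here
```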
tl;dr If you already upload your stuff to the cloud (like iOS does automatically) the only thing that would’ve changed is that nobody would’ve had a legit reason to peep at your photos in the cloud “for the children”. But if you’ve got cloud upload off anyway, nothing would’ve changed. So I still don’t understand the fervour people had over this - the only reason I can think of is not understanding how it worked.
You either don’t understand, or you refuse to acknowledge, that this is a back door into your device: Apple is actively scanning your files, meaning your device is now compromised.
Or are you shilling for anti-privacy?
My device, my files. I don’t want your scanning.
What’s so hard to grok about that unless you are anti-privacy?
So I still don’t understand the fervour people had over this - the only reason I can think of is not understanding how it worked.
Or that it was a built-in backdoor running on your device.
The difference is that what happens on your own device should be in your control. Once data leaves your device, it’s not in your control anymore - and that’s where the entire issue was. It doesn’t matter whether I toggle a switch to allow upload or not; the fact that the scanning was happening on my device was the issue.