- Joined: Apr 13, 2016
- Last Login: Jul 27, 2022, 10:36am EDT
- Comments: 222
Comment 2 recs
OK, first off, please don't put words in my mouth.
I literally said they are looking at a hash of the image, so I don't know why you felt the need to explain that to me.
Rightfully so: Apple does not want CSAM content on their servers. No company does. It is why Google and others have been scanning for these types of images for a long time. Beyond the obvious case of catching people like this, there have been reports for some time now that Apple is working on even stronger end-to-end encryption for the entire iCloud stack but has run into roadblocks over issues like this.
In my opinion, CSAM scanning is a necessary evil. However, it depends on how the implementation is set up. Most of the backlash came from fears that Apple would catch you taking pictures of your own child, or that the system could be used against memes or the like, which fundamentally it could not, since it only matches hashes of already-known images.
Regarding the false-positive component, that was solved by the human review layer, which kicks in only after a hash is matched (multiple matches, if I remember correctly). A deliberately crafted collision image would not be CSAM itself and would easily be marked as not child porn during that review.
The solution from Apple here is one of two things:
1. Don't do CSAM scanning at all and just let people store (and share) those photos on their servers.
2. Implement a local solution that only takes effect once you try to upload the image to iCloud, that looks only at a hash of the image, and that doesn't try to use machine learning to identify images.
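The flow described in option 2 can be sketched roughly like this. To be clear, this is a minimal illustration and not Apple's actual implementation: the threshold value and function names are placeholders, and SHA-256 stands in for the perceptual NeuralHash Apple actually used.

```python
import hashlib

MATCH_THRESHOLD = 3  # placeholder; only multiple matches trigger review


def image_hash(data: bytes) -> str:
    # Simplification: SHA-256 stands in for a perceptual hash here.
    return hashlib.sha256(data).hexdigest()


def should_flag_for_review(pending_uploads, known_hashes,
                           threshold=MATCH_THRESHOLD):
    # Only images queued for iCloud upload are checked, and nothing is
    # surfaced for human review until the match count crosses the threshold.
    matches = sum(1 for img in pending_uploads
                  if image_hash(img) in known_hashes)
    return matches >= threshold
```

The point of the threshold gate is exactly the false-positive argument above: a single stray match surfaces nothing, and whatever does cross the threshold still goes to human review.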
Great, Edward Snowden was against it; just because he has had some great opinions doesn't mean he is right about everything.
In this particular case I feel like CSAM scanning is a necessary evil, and their implementation of it was the best way to accomplish it while respecting users' privacy.
That being said, my response was to the idea that Google not doing CSAM scanning on devices is somehow an argument that Google actually respects anyone's privacy. While their scanning may not be on-device, they do not encrypt images on their servers, meaning they can do whatever they want with them once they're there, including running these scans with techniques worse than just comparing a hash.
Comment 1 reply, 9 recs
Do some research: Google has been scanning any uploaded images for CSAM for a while.
Google's approach is even worse, since they are attempting to actually look at the photo or video to determine if it is CSAM, instead of just looking at a hash of it. Look at their research into detecting drawings.
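For context, the "hash" in question is a perceptual hash (Apple's NeuralHash), not a plain checksum: visually similar images produce similar hashes, so known images can still be matched after re-encoding or resizing, without any attempt to classify what the image depicts. A toy average-hash sketch illustrates the idea; it is purely illustrative and far simpler than any production system.

```python
def average_hash(pixels):
    # pixels: 2D list of grayscale values, standing in for a tiny
    # downscaled version of an image.
    flat = [p for row in pixels for p in row]
    avg = sum(flat) / len(flat)
    # One bit per pixel: is it brighter than the image's average?
    return tuple(1 if p >= avg else 0 for p in flat)


def hamming(a, b):
    # Number of differing bits; small distance = visually similar.
    return sum(x != y for x, y in zip(a, b))
```

A slightly re-compressed copy of an image lands at (or near) Hamming distance 0 from the original, while an unrelated image does not, which is all a match against a known-image database needs.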
Apple may have been doing this on-device, but it only mattered if the content was uploaded to iCloud, meaning the outcome was exactly the same.
Apple caught a lot of flak because of a lack of understanding of how this worked. But the reality is they implemented CSAM scanning in the most privacy-respecting way possible.
Comment 2 replies, 26 recs
Annnnnd my continued policy of never doing anything that is even remotely sensitive on any platform except Apple continues.
I am very glad I invested in the HomeKit ecosystem and that all of my cameras are part of HomeKit Secure Video (as well as having blocked internet access to the cameras on my router).
Apple may not be perfect, but once again they have shown that they give a shit. End-to-end video encryption should be the default for all of these companies.
But since when has Google ever cared about privacy?