When Apple announced changes it plans to make to iOS devices in an effort to help curb child abuse by finding child sexual abuse material (CSAM), parts of its plan generated backlash.
First, it’s rolling out an update to Search and the Siri voice assistant on iOS 15, watchOS 8, iPadOS 15, and macOS Monterey. When a user searches for topics related to child sexual abuse, Apple will redirect them to resources for reporting CSAM or getting help for an attraction to such content.
But it’s Apple’s two other CSAM plans that have drawn criticism. One update will add a parental control option to Messages, sending an alert to parents if a child age 12 or younger views or sends sexually explicit pictures, and obscuring such images for any user under 18.
The one that’s proven most controversial is Apple’s plan to scan on-device images to find CSAM before the images are uploaded to iCloud and, in the case of a potential match, report them to Apple’s moderators, who can then turn the images over to the National Center for Missing and Exploited Children (NCMEC). While Apple says the feature will protect users while allowing the company to find illegal content, many Apple critics and privacy advocates say the provision is essentially a security backdoor, an apparent contradiction of Apple’s long-professed commitment to user privacy.
To stay up to speed on the latest news about Apple’s CSAM protection plans, follow our storystream, which we’ll update whenever there’s a new development. If you need a starting point, check out our explainer here.
Dec 7, 2022, 7:02 PM UTC · Richard Lawler
Apple drops controversial plans for child sexual abuse imagery scanning
A plan unveiled last year for client-side scanning of iCloud Photos to detect imagery of abuse has been abandoned as Apple focuses on end-to-end encryption and other ways to protect children.
Aug 18, 2021, 2:13 PM UTC · Russell Brandom
Apple says collision in child-abuse hashing system is not a concern
Researchers produced a collision in Apple’s new algorithm, but the company says the finding was expected
Aug 13, 2021, 8:25 PM UTC · Adi Robertson
Apple says its iCloud scanning will rely on multiple child safety groups to address privacy fears
It’s supposed to stop governments from abusing the system
Aug 10, 2021, 9:43 PM UTC · Adi Robertson
Apple’s controversial new child protection features, explained
Apple says its system is secure — its critics say the opposite
Aug 10, 2021, 5:44 PM UTC · Nilay Patel
Here’s why Apple’s new child safety features are so controversial
Encryption and consumer privacy experts break down Apple’s plan for child safety
Aug 9, 2021, 10:49 AM UTC · Jon Porter
Apple pushes back against child abuse scanning concerns in new FAQ
‘We will not accede to any government’s request to expand it’
Aug 7, 2021, 7:55 PM UTC · Mitchell Clark
WhatsApp lead and other tech experts fire back at Apple’s Child Safety plan
A new open letter asked Apple to reconsider the changes
Aug 5, 2021, 8:02 PM UTC · Russell Brandom and Richard Lawler
Apple reveals new efforts to fight child abuse imagery
A new hashing system will be limited to images on iCloud photos
Aug 5, 2021, 4:55 PM UTC · Jay Peters
Apple will scan photos stored on iPhones and iCloud for child abuse imagery
The feature will roll out in the US first