Apple scrubs controversial CSAM detection feature from webpage but says plans haven’t changed

The feature is delayed, not canceled, Apple says

Illustration by Alex Castro / The Verge

Apple has updated a webpage on its child safety features to remove all references to the controversial child sexual abuse material (CSAM) detection feature first announced in August. The change, which was spotted by MacRumors, appears to have taken place some time between December 10th and December 13th. But despite the change to its website, the company says its plans for the feature haven’t changed.

Two of the three safety features, which were released earlier this week with iOS 15.2, are still present on the page, titled “Expanded Protections for Children.” However, references to the more controversial CSAM detection feature, whose launch was delayed following backlash from privacy advocates, have been removed.

When reached for comment, Apple spokesperson Shane Bauer said that the company’s position hasn’t changed since September, when it first announced it would delay the launch of the CSAM detection feature. “Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features,” the company’s September statement read.

Crucially, Apple’s statement does not say the feature has been canceled entirely. Documents outlining how the functionality works are still live on Apple’s site.

Apple’s CSAM detection feature was controversial when it was announced because it involves taking hashes of iCloud Photos and comparing them to a database of hashes of known child sexual abuse imagery. Apple claims this approach allows it to report users to the authorities if they’re known to be uploading child abuse imagery without compromising the privacy of its customers more generally. It also says the encryption of user data is not affected and that the analysis runs on-device.
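To give a rough sense of what hash matching means here, the sketch below shows a heavily simplified version in Swift. It is not Apple’s implementation: Apple’s published design uses a perceptual hash (NeuralHash) and cryptographic safety vouchers rather than a plain lookup, and every name and function in this example is hypothetical.

```swift
import Foundation
import CryptoKit

// Conceptual sketch only. Apple's actual system uses a perceptual hash
// (NeuralHash) plus cryptographic protocols, not a plain SHA-256 lookup;
// all names here are illustrative.

/// Hypothetical placeholder for the database of known-image hashes that,
/// in Apple's design, ships with the operating system in blinded form.
func loadKnownHashDatabase() -> Set<String> {
    return []
}

let knownHashes = loadKnownHashDatabase()

/// Hash a photo's raw bytes on-device and check membership against the
/// known-hash database.
func matchesKnownImage(_ photoData: Data) -> Bool {
    let digest = SHA256.hash(data: photoData)
    let hexDigest = digest.map { String(format: "%02x", $0) }.joined()
    return knownHashes.contains(hexDigest)
}
```

Per Apple’s technical summary, the real system also withholds match results from Apple until an account crosses a threshold number of matches, which a simple per-photo lookup like this does not capture.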

But critics argue that Apple’s system risks undermining Apple’s end-to-end encryption. Some referred to the system as a “backdoor” that governments around the world might strong-arm Apple into expanding to include content beyond CSAM. For its part, Apple has said that it will “not accede to any government’s request to expand it” beyond CSAM.

While the CSAM detection feature has yet to receive a new launch date, Apple has gone on to release two of the other child-protection features it announced in August. One is designed to warn children when they receive images containing nudity in Messages, while the second provides additional information when searching for terms related to child exploitation through Siri, Spotlight, or Safari Search. Both rolled out with iOS 15.2, which was released earlier this week and which appears to have prompted Apple to update its webpage.