Apple delays controversial child protection features after privacy outcry

The changes were supposed to roll out later this year

Illustration by Alex Castro / The Verge

Apple is delaying its child protection features announced last month, including a controversial feature that would scan users’ photos for child sexual abuse material (CSAM), following intense criticism that the changes could diminish user privacy. The changes had been scheduled to roll out later this year.

“Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material,” Apple said in a statement to The Verge. “Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”

It’s a major reversal

Apple’s original press release about the changes, which were intended to reduce the proliferation of CSAM, now carries a similar statement at the top of the page. That release detailed three major changes in the works. One change to Search and Siri would point users to resources for preventing CSAM if they searched for information related to it.

The other two changes came under more significant scrutiny. One would alert parents when their kids received or sent sexually explicit photos and would blur those images on the child’s device. The other would have scanned images stored in a user’s iCloud Photos for CSAM and reported them to Apple moderators, who could then refer the reports to the National Center for Missing and Exploited Children, or NCMEC.

Apple detailed the iCloud Photos scanning system at length to make the case that it didn’t weaken user privacy. In short, the system would scan photos stored in iCloud Photos on your iOS device and compare them against a database of known CSAM image hashes from NCMEC and other child safety organizations.
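
The general idea is a perceptual-hash lookup against a known database, with a threshold of matches required before anything is flagged for human review. The Python sketch below illustrates only that general concept: the toy average_hash function, the KNOWN_CSAM_HASHES set, and the MATCH_THRESHOLD value are all hypothetical stand-ins, and Apple’s actual system relies on its own NeuralHash algorithm plus cryptographic safeguards (such as private set intersection) that this simplification omits entirely.

```python
# Illustrative sketch only; not Apple's system. A toy perceptual "average
# hash" is matched against a hypothetical database of known image hashes,
# and nothing is flagged until a match threshold is reached.
from PIL import Image

KNOWN_CSAM_HASHES: set[int] = set()  # hypothetical stand-in for the hash database
MATCH_THRESHOLD = 30                 # hypothetical: one hit alone never triggers review

def average_hash(path: str, size: int = 8) -> int:
    """Toy perceptual hash: downscale, grayscale, compare each pixel to the mean."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def library_exceeds_threshold(paths: list[str]) -> bool:
    """Count how many photos match the known-hash database and report
    whether the total crosses the review threshold."""
    matches = sum(1 for p in paths if average_hash(p) in KNOWN_CSAM_HASHES)
    return matches >= MATCH_THRESHOLD
```

Unlike this sketch, Apple’s design was built so the device never learns which photos matched and Apple learns nothing unless the threshold is crossed; the plain set lookup here is only a conceptual placeholder for that cryptographic matching step.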

Still, many privacy and security experts heavily criticized the plan, arguing that the new system could have created an on-device surveillance apparatus and that it violated the trust users had placed in Apple to protect on-device privacy.

The Electronic Frontier Foundation said in an August 5th statement that the new system, however well-intended, would “break key promises of the messenger’s encryption itself and open the door to broader abuses.”

“Apple is compromising the phone that you and I own and operate,” said Ben Thompson at Stratechery in his own criticism, “without any of us having a say in the matter.”