
WhatsApp lead and other tech experts fire back at Apple’s Child Safety plan

A new open letter asked Apple to reconsider the changes

Illustration by Alex Castro / The Verge

The chorus of voices expressing concern and dismay over Apple’s new Child Safety measures grew louder over the weekend, as an open letter with more than 4,000 signatures made the rounds online. The Apple Privacy Letter asked the iPhone maker to “reconsider its technology rollout,” lest it undo “decades of work by technologists, academics and policy advocates” on privacy-preserving measures.

Apple’s plan, which it announced on Thursday, involves taking hashes of images uploaded to iCloud and comparing them to a database that contains hashes of known CSAM images. According to Apple, this allows it to keep user data encrypted and run the analysis on-device while still allowing it to report users to the authorities if they’re found to be sharing child abuse imagery. Another prong of Apple’s Child Safety strategy involves optionally warning parents if their child under 13 years old sends or views photos containing sexually explicit content. An internal memo at Apple acknowledged that people would be “worried about the implications” of the systems.
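To make the mechanism concrete: Apple's actual system uses a proprietary perceptual hash ("NeuralHash") combined with cryptographic protocols like private set intersection, none of which is public, so the following is only a minimal sketch of the general idea of matching image hashes against a database of known hashes. It substitutes an ordinary cryptographic hash (SHA-256) for the perceptual one, and the sample database entry is purely illustrative.

```python
import hashlib

# Illustrative stand-in for a database of hashes of known images.
# (This example entry is the SHA-256 of the empty byte string.)
KNOWN_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def image_hash(data: bytes) -> str:
    """Hash raw image bytes. A real system would use a perceptual
    hash that tolerates resizing and re-encoding; SHA-256 does not."""
    return hashlib.sha256(data).hexdigest()

def matches_known(data: bytes) -> bool:
    """Report whether the image's hash appears in the database."""
    return image_hash(data) in KNOWN_HASHES
```

The key design difference in the real system is that the comparison happens on-device and the result is cryptographically blinded, so Apple learns of a match only after a threshold number of matches is crossed; a plain set lookup like the one above omits all of that machinery.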

Cathcart calls Apple’s approach “very concerning,” and he’s not alone

WhatsApp head Will Cathcart said in a Twitter thread that his company wouldn't adopt Apple's safety measures, calling Apple's approach "very concerning." Cathcart said WhatsApp's own system for fighting child exploitation, which relies in part on user reports, preserves encryption like Apple's and led the company to report more than 400,000 cases to the National Center for Missing and Exploited Children in 2020. (Apple is also working with the Center on its CSAM detection efforts.)

WhatsApp’s owner, Facebook, has reasons to pounce on Apple for privacy concerns. Apple’s changes to how ad tracking works in iOS 14.5 started a fight between the two companies, with Facebook buying newspaper ads criticizing Apple’s privacy changes as harmful to small businesses. Apple fired back, saying that the change “simply requires” that users be given a choice on whether to be tracked.

The list of people and organizations raising concerns about Apple’s policy includes Edward Snowden, the Electronic Frontier Foundation, professors, and more. We’ve collected some of those reactions here to act as an overview of some of the criticisms levied against Apple’s new policy.


Matthew Green, an associate professor at Johns Hopkins University, pushed back on the feature before it was publicly announced. He tweeted about Apple’s plans and about how the hashing system could be abused by governments and malicious actors.

The EFF released a statement blasting Apple's plan, describing it as, in effect, a "thoroughly documented, carefully thought-out, and narrowly-scoped backdoor." The EFF's press release details how it believes Apple's Child Safety measures could be abused by governments and how they diminish user privacy.

Kendra Albert, an instructor at Harvard’s Cyberlaw Clinic, has a thread on the potential dangers to queer children and Apple’s initial lack of clarity around age ranges for the parental notifications feature.

Edward Snowden retweeted the Financial Times article about the system, giving his own characterization of what Apple is doing.

Politician Brianna Wu called the system “the worst idea in Apple History.”

Writer Matt Blaze also tweeted about concerns that overreaching governments could abuse the technology, pressing to scan for content beyond CSAM.

Epic CEO Tim Sweeney also criticized Apple, saying that the company “vacuums up everybody’s data into iCloud by default.” He also promised to share more thoughts specifically about Apple’s Child Safety system.

Not every reaction has been critical, however. Ashton Kutcher (who has done advocacy work to end child sex trafficking since 2011) called Apple's work "a major step forward" for efforts to eliminate CSAM.