Craig Federighi says Apple’s child safety scanning will have ‘multiple levels of auditability’

‘It’s really clear a lot of messages got jumbled pretty badly’

Apple executive Craig Federighi says the company’s plan to scan iCloud Photos for child sexual abuse material (CSAM) will include “multiple levels of auditability.” In an interview with The Wall Street Journal, Federighi — Apple’s senior vice president of software engineering — offered new details about the controversial child safety measures. That includes a claim that the iPad and iPhone’s device-level scanning will help security experts verify that Apple is using the system responsibly.

Like many companies with cloud storage services, Apple will check iCloud Photos images against a list from the National Center for Missing and Exploited Children (NCMEC), looking for exact matches with known CSAM pictures. But unlike many services, it will run searches on the device, not fully remotely. “Imagine someone was scanning images in the cloud. Well, who knows what’s being scanned for?” Federighi said, referring to remote scans. “In our case, the database is shipped on device. People can see, and it’s a single image across all countries.”
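For a rough sense of what that on-device check amounts to, here is a minimal sketch, assuming a simple set-membership test over image hashes. The type names, hash values, and matching function below are illustrative stand-ins, not Apple’s NeuralHash system or its actual database format.

```swift
import Foundation

// Illustrative sketch of on-device matching against a shipped hash database.
// The hashes and types here are assumptions for the sake of the example.
struct KnownHashDatabase {
    private let knownHashes: Set<String>

    init(knownHashes: Set<String>) {
        self.knownHashes = knownHashes
    }

    // True if a photo's hash matches an entry in the database that ships,
    // identically, with the operating system in every country.
    func matches(photoHash: String) -> Bool {
        knownHashes.contains(photoHash)
    }
}

// The check happens locally, as a photo is uploaded to iCloud Photos.
let database = KnownHashDatabase(knownHashes: ["a1b2c3", "d4e5f6"]) // placeholder hashes
let uploadedPhotoHash = "a1b2c3"                                    // placeholder hash
print(database.matches(photoHash: uploadedPhotoHash)) // true: this photo is a match
```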

Federighi elaborated on how this approach might give people confidence that Apple isn’t expanding the database to include material beyond illegal CSAM, particularly in countries with restrictive censorship policies.

“We’re making sure that you don’t have to trust any one entity, or even any one country”

“We ship the same software in China with the same database we ship in America, as we ship in Europe. If someone were to come to Apple [with a request to scan for data beyond CSAM], Apple would say no. But let’s say you aren’t confident. You don’t want to just rely on Apple saying no. You want to be sure that Apple couldn’t get away with it if we said yes,” he told the Journal. “There are multiple levels of auditability, and so we’re making sure that you don’t have to trust any one entity, or even any one country, as far as what images are part of this process.”

Apple has previously said that it’s only rolling out the system in the United States and that it will consider launching in other countries on a case-by-case basis. The company confirmed to The Verge that it will ship the hash database of known CSAM with the operating system in all countries, but the database will only be used for scanning in the US. The Journal further clarifies that an independent auditor will be able to verify the images involved.
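As a rough illustration of that arrangement, the sketch below assumes a simple region gate: the database is present everywhere, but matching is only active for US accounts. The `ScanningPolicy` type and country check are hypothetical, not Apple’s actual mechanism for enabling the feature.

```swift
// Sketch of the rollout detail above: ship the database everywhere,
// enable matching only in the US at launch. Purely illustrative.
struct ScanningPolicy {
    // Regions where matching against the shipped database is active at launch.
    static let enabledRegions: Set<String> = ["US"]

    static func isScanningEnabled(forAccountRegion region: String) -> Bool {
        enabledRegions.contains(region)
    }
}

print(ScanningPolicy.isScanningEnabled(forAccountRegion: "US")) // true
print(ScanningPolicy.isScanningEnabled(forAccountRegion: "DE")) // false: database present, matching inactive
```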

Federighi also offered more detail on when the scanning system will notify an Apple moderator of potential illegal content. Apple has said before that a single match won’t trigger a red flag — a measure intended to prevent false positives. Instead, the system generates “safety vouchers” for each match and alerts Apple if the number hits a certain threshold. Apple has declined to publicize the exact threshold, saying this could let abusers evade detection. But Federighi says it’s “on the order of 30 known child pornographic images.”
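Federighi’s description suggests a simple counting scheme, sketched below with a plain voucher tally and the roughly 30-image figure from the interview. The types are illustrative, and the real threshold, data structures, and protections around the vouchers are not public.

```swift
import Foundation

// Illustrative sketch of the threshold behavior: single matches produce
// "safety vouchers," and Apple is only alerted once an account's voucher
// count crosses a threshold. The exact threshold is not public; 30 is the
// approximate figure Federighi gave the Journal.
struct SafetyVoucher {
    let photoID: UUID
}

struct Account {
    private(set) var vouchers: [SafetyVoucher] = []
    static let reviewThreshold = 30 // approximate, per the interview

    // Record a voucher for a matched photo and report whether review is triggered.
    mutating func recordMatch(for photoID: UUID) -> Bool {
        vouchers.append(SafetyVoucher(photoID: photoID))
        return vouchers.count >= Account.reviewThreshold
    }
}

var account = Account()
for _ in 1...29 {
    _ = account.recordMatch(for: UUID()) // individual matches alone do not trigger review
}
let needsReview = account.recordMatch(for: UUID()) // the 30th match crosses the threshold
print(needsReview) // true
```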

Some security experts have offered cautious praise of Apple’s system and acknowledged the importance of finding CSAM online. But many have criticized Apple’s abrupt rollout and a lack of clarity on how the system worked. In his interview with the Journal, Federighi acknowledged the confusion. “It’s really clear a lot of messages got jumbled pretty badly in terms of how things were understood,” he said.