Apple pushes back against child abuse scanning concerns in new FAQ

‘We will not accede to any government’s request to expand it’

Illustration by Alex Castro / The Verge

In a new FAQ, Apple has attempted to assuage concerns that its new anti-child abuse measures could be turned into surveillance tools by authoritarian governments. “Let us be clear, this technology is limited to detecting CSAM [child sexual abuse material] stored in iCloud and we will not accede to any government’s request to expand it,” the company writes. 

Apple’s new tools, announced last Thursday, include two features designed to protect children. One, called “communication safety,” uses on-device machine learning to identify and blur sexually explicit images received by children in the Messages app, and can notify a parent if a child age 12 or younger decides to view or send such an image. The second is designed to detect known CSAM by scanning users’ images if they choose to upload them to iCloud. Apple is notified if CSAM is detected, and it will alert the authorities after verifying the material exists.

“Apple’s CSAM detection capability is built solely to detect known CSAM images”

The plans met with a swift backlash from digital privacy groups and campaigners, who argued that the features introduce a backdoor into Apple’s software. These groups note that once such a backdoor exists, there is always the potential for it to be expanded to scan for content beyond child sexual abuse material. Authoritarian governments could use it to scan for political dissent, for example, or anti-LGBT regimes could use it to crack down on sexual expression.

“Even a thoroughly documented, carefully thought-out, and narrowly-scoped backdoor is still a backdoor,” the Electronic Frontier Foundation wrote. “We’ve already seen this mission creep in action. One of the technologies originally built to scan and hash child sexual abuse imagery has been repurposed to create a database of ‘terrorist’ content that companies can contribute to and access for the purpose of banning such content.”

However, Apple argues that it has safeguards in place to stop its systems from being used to detect anything other than sexual abuse imagery. It says that its list of banned images is provided by the National Center for Missing and Exploited Children (NCMEC) and other child safety organizations, and that the system “only works with CSAM image hashes provided by NCMEC and other child safety organizations.” Apple says it won’t add to this list of image hashes, and that the list is the same across all iPhones and iPads to prevent individual targeting of users.
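Apple has not published the code behind this pipeline, but the core idea it describes, checking each uploaded image’s fingerprint against a fixed list of known-CSAM hashes, can be sketched in a few lines of Python. The sketch below is illustrative only: it uses an ordinary SHA-256 digest from the standard library as a stand-in for Apple’s NeuralHash perceptual hash, and a plain in-memory set as a stand-in for the NCMEC-provided list, and it omits the on-device blinding, private set intersection, and match-threshold steps Apple says the real system uses. The file name and hash value shown are hypothetical.

```python
import hashlib
from pathlib import Path

# Hypothetical stand-in for the hash list Apple says is provided by NCMEC
# and shipped identically to every device. Real entries would be NeuralHash
# values, not SHA-256 digests.
KNOWN_CSAM_HASHES = {
    "d2a84f4b8b650937ec8f73cd8be2c74add5a911ba64df27458ed8229da804a26",
}


def image_fingerprint(path: Path) -> str:
    """Simplified fingerprint: an exact SHA-256 of the file's bytes.

    Apple's system uses NeuralHash, a perceptual hash designed to survive
    resizing and re-encoding; an exact digest is used here only to keep the
    sketch self-contained.
    """
    return hashlib.sha256(path.read_bytes()).hexdigest()


def flag_for_review(path: Path) -> bool:
    """Return True if the image matches an entry on the fixed hash list."""
    return image_fingerprint(path) in KNOWN_CSAM_HASHES


if __name__ == "__main__":
    sample = Path("example.jpg")  # hypothetical local file
    if sample.exists():
        print(flag_for_review(sample))
```

The point of Apple’s safeguard claim is visible even in this toy version: the match is only ever against the fixed list, and nothing about the image’s content is classified beyond whether its fingerprint appears in that list.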

The company also says that it will refuse demands from governments to add non-CSAM images to the list. “We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future,” it says.

It’s worth noting that despite Apple’s assurances, the company has made concessions to governments in the past in order to continue operating in their countries. It sells iPhones without FaceTime in countries that don’t allow encrypted phone calls, and in China it has removed thousands of apps from its App Store, as well as moved to store user data on the servers of a state-run telecom.

The FAQ also fails to address some concerns about the feature that scans Messages for sexually explicit material. The feature does not share any information with Apple or law enforcement, the company says, but it doesn’t say how it’s ensuring that the tool’s focus remains solely on sexually explicit images.

“All it would take to widen the narrow backdoor that Apple is building is an expansion of the machine learning parameters to look for additional types of content, or a tweak of the configuration flags to scan, not just children’s, but anyone’s accounts,” wrote the EFF. The EFF also notes that machine-learning technologies frequently classify this content incorrectly, and cites Tumblr’s attempts to crack down on sexual content as a prominent example of where the technology has gone wrong.
