Apple has ended the development of technology intended to detect possible child sexual abuse material (CSAM) while it’s stored on user devices, according to The Wall Street Journal.
That plan was unveiled last fall with an intended rollout for iOS 15, but backlash quickly followed as encryption and consumer privacy experts warned about the danger of creating surveillance systems that work directly from your phone, laptop, or tablet.
As recently as last December, Apple said its plans on that front hadn’t changed, but now Apple software VP Craig Federighi says, “Child sexual abuse can be headed off before it occurs... That’s where we’re putting our energy going forward.” Asked directly about the impacts of expanding encryption on the work of law enforcement agents investigating crimes, he said, “ultimately, keeping customer’s data safe has big implications on our safety more broadly.”
Now the company is expanding end-to-end encryption to cover phone backups and adding other new features aimed at preserving privacy and security in iMessage and for data stored in iCloud.
Apple did roll out part of the technology it announced last fall, dubbed “communication safety in iMessage,” in the US as part of the iOS 15.2 update and to other countries this year, albeit with some tweaks from the original plan. It’s an opt-in feature for the Messages app, connected to the Family Sharing setup, that scans incoming and outgoing pictures on children’s accounts for “sexually explicit” material.
If the feature detects something it deems to cross that bar, the image is blurred, and a pop-up message displays guidance on getting help or blocking the sender. The original plan appeared to suggest it would also automatically notify parents of any detection, but as implemented, notification is an option left to the user.