Latest iOS beta blurs nude images for children using Messages app

But parents are no longer notified

Promotional images of the feature released earlier this year.
Image: Apple

iOS 15.2’s latest beta adds Apple’s Communication Safety feature to the Messages app, MacRumors reports. The opt-in feature is designed to protect children from inappropriate imagery by scanning incoming and outgoing pictures for “sexually explicit” material. Any image meeting this description is blurred, and the child is warned about its contents and told it’s OK not to view it. The feature, which ties into Apple’s existing Family Sharing system, also points affected children toward resources where they can get help.

The version of the feature included in iOS 15.2’s latest beta has one crucial difference from what Apple originally announced in August: it won’t notify parents if a child decides to view a sexually explicit image. Critics like Harvard Cyberlaw Clinic instructor Kendra Albert objected to this element in particular because it could out queer or transgender children to their parents. MacRumors also notes that, in its original form, the feature could have created safety risks for children whose parents are violent or abusive.

CNET reports that children will instead be able to choose whether to alert someone they trust about a flagged photo, and that this choice is separate from the decision to unblur and view the image. The checks are carried out on-device and do not affect end-to-end encryption.

The Communication Safety feature was originally announced in August as part of a trio of features designed to protect children from sexual abuse. The following month, however, the company said it was delaying the features’ introduction in response to objections raised by privacy advocates.

Communication Safety is distinct from the CSAM detection (child sexual abuse material detection) feature, which scans a user’s iCloud Photos and reports offending content to Apple moderators, and which generated the bulk of the outcry from privacy advocates. There is also a coming update to Siri search designed to offer resources if a user searches for topics relating to child sexual abuse. It’s currently unclear when these two features will be released, and there haven’t been reports of them appearing in Apple’s public beta software.

It’s worth noting that features added to iOS 15.2’s latest beta could still change dramatically before the update’s official release, and Communication Safety could yet be removed entirely. Other new features arriving in the latest beta include a manual AirTag scanning option, as well as the ability to pass on your iCloud data to a loved one in the event of your death.