iPhones will be able to speak in your voice with 15 minutes of training

Apple’s new accessibility features can assist those who’ve lost the ability to speak or who are blind or have low vision.

Apple Personal Voice and Live Speech accessibility features.
Image: Apple

Today, Apple previewed a bundle of new features designed for cognitive, vision, hearing, and mobility accessibility. That includes a new Personal Voice feature for people who may lose their ability to speak, allowing them to create “a synthesized voice that sounds like them” to talk with friends or family members.

According to Apple, users can create a Personal Voice by reading a set of text prompts aloud for a total of 15 minutes of audio on the iPhone or iPad. Since the feature integrates with Live Speech, users can then type what they want to say and have their Personal Voice read it to whomever they want to talk to. Apple says the feature uses “on-device machine learning to keep users’ information private and secure.”
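Apple hasn’t published a developer API for Personal Voice in this announcement, but the type-to-speak flow it plugs into already exists on iOS through the system speech synthesizer. As a rough illustration only, here is a minimal Swift sketch of typed text being spoken aloud with the existing AVSpeechSynthesizer API; the idea that a created Personal Voice would slot in as the selected voice is an assumption, not a documented interface.

```swift
import AVFoundation

// Minimal sketch of a type-to-speak flow using the system
// text-to-speech API that ships on iOS today. Personal Voice has
// no public API in this announcement; the comment below about
// swapping in a Personal Voice is an assumption for illustration.
let synthesizer = AVSpeechSynthesizer() // must stay alive while speaking

func speak(_ text: String) {
    let utterance = AVSpeechUtterance(string: text)
    // Default system voice; hypothetically, a user's Personal Voice
    // would replace this selection once they had created one.
    utterance.voice = AVSpeechSynthesisVoice(language: "en-US")
    utterance.rate = AVSpeechUtteranceDefaultSpeechRate
    synthesizer.speak(utterance)
}

speak("On my way, see you soon.")
```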

Apple’s Assistive Access mode shown on an iPhone and iPad.
Image: Apple

Additionally, Apple is introducing streamlined versions of its core apps as part of a feature called Assistive Access, meant to support users with cognitive disabilities. The feature is designed to “distill apps and experiences to their essential features in order to lighten cognitive load.” That includes a combined version of Phone and FaceTime as well as modified versions of the Messages, Camera, Photos, and Music apps that feature high-contrast buttons, large text labels, and additional accessibility tools.

Work on a “custom accessibility mode” was spotted late last year in an iOS 16.2 beta release. Apple says the features will arrive “later this year,” which suggests they could be part of iOS 17.

There’s also a new detection mode in Magnifier to help users who are blind or have low vision, designed to help them interact with physical objects that carry numerous text labels. As an example, Apple says a user can point their device’s camera at a labeled object, such as a microwave keypad, and the iPhone or iPad will read the text aloud as the user moves their finger across each number or setting on the appliance.

Apple highlighted a number of other features coming to the Mac as well, including a way for deaf or hard-of-hearing users to pair Made for iPhone hearing devices with a Mac. The company is also adding an easier way to adjust the size of the text in Finder, Messages, Mail, Calendar, and Notes on Mac.

Users will also be able to pause GIFs in Safari and Messages, customize the rate at which Siri speaks to them, and use Voice Control for phonetic suggestions when editing text. All of this builds upon Apple’s existing accessibility features for the Mac and iPhone, which include Live Captions, the VoiceOver screen reader, Door Detection, and more.

“Accessibility is part of everything we do at Apple,” Sarah Herrlinger, Apple’s senior director of global accessibility policy and initiatives, said in a statement. “These groundbreaking features were designed with feedback from members of disability communities every step of the way, to support a diverse set of users and help people connect in new ways.”