iOS 14 is filled with accessibility improvements

Photo by Amelia Holowaty Krales / The Verge

Apple’s new operating systems — like iOS 14 and tvOS 14, which are due to be released later this year — include numerous features that should make them easier for people with disabilities to use. Apple announced the new features as part of its Worldwide Developers Conference this week, and Forbes and CNET have rounded many of them up.

These improvements range from new features like sound recognition to updates to Apple’s existing accessibility features like its VoiceOver screen reader. It’s a substantial list that should make Apple’s products easier to use for people with hearing, vision, motor, or other disabilities.

Sound recognition in iOS 14, for example, will let you tell your phone to constantly listen for 14 different sounds, including doorbells, sirens, smoke detector alarms, or a crying baby. It’s a feature that could be helpful for people who are deaf or hard of hearing by making them aware of critical sounds sooner than they otherwise might be. (Apple warns against relying on the feature in “high-risk or emergency situations,” however.)

Then there’s iOS 14’s new Back Tap feature. Twitter users were quick to point out that you can use the feature to make it easier to launch Google’s voice assistant if you’d rather not speak to Siri. But as Forbes notes, the more important aspect of this accessibility feature is that it can be used to replace screen gestures that might be tricky to perform for people with cognitive or motor disabilities. You could tap the back of your phone to access the notification center rather than stretching your thumb to swipe down, for example, or even set up more complex actions using shortcuts.

There’s a trend running throughout many of these features, which is that although they’re designed to make devices easier for people with disabilities to use, they can also benefit everyone else. People with disabilities should always be the focus when designing accessibility features, but the benefits can be much wider-ranging.

Next up is FaceTime, which will now be able to detect when someone is using sign language and automatically make that person the focus, making their signing easier to see. tvOS will soon work with Microsoft’s Xbox Adaptive Controller — a controller specifically designed for people with disabilities.

There’s also a new “headphone accommodations” feature in iOS 14, which adjusts the sound frequencies streamed through select Apple and Beats headphones to better match your hearing. Apple says the new accessibility feature should make “music, movies, phone calls, and podcasts” listened to using the headphones “sound more crisp and clear.” It also works with the AirPods Pro’s transparency mode to help make quiet voices around you more audible.

As well as big new features like these, Apple is making a host of other updates to its existing accessibility features. Its VoiceOver screen reader, for example, will now be able to recognize and describe more of what it sees on-screen, like reading text from images or photos. Apple’s Magnifier and voice control options have also been updated, and CNET notes that some of its Xcode coding tools are being updated to make them more accessible.

Apple isn’t the first company to have introduced features like these, and other companies are making big strides of their own (Android 11, for example, will deliver big upgrades to Android’s voice controls), but its commitment to introducing and then refining its accessibility options should be applauded.

Comments

As a longtime former Android user, Back Tap to launch Google Assistant is exactly the kind of gesture I’ve been waiting for.

What’s the wallpaper featured in the Federico Viticci tweet?

Why don’t you go ask him on Twitter?

A comprehensive article covering a range of accessibility features for a range of disabilities, all of which are welcome.

I welcome the fact that Voice Control will finally support British English voices when iOS 14 and macOS Big Sur launch in the autumn. This development, along with Announce Messages with Siri and Shortcuts coming to watchOS 7, makes a year of campaigning for these features worthwhile.

However, the article fails to report much needed accessibility features that have not been included:

1) Siri and Voice Control still can’t help a user hang up a live phone call

2) Siri and Voice Control still can’t help a user answer a phone call

3) No auto-answer feature on the Apple Watch

4) As mentioned, no ability to toggle the auto-answer feature on/off with a voice command on iPhone and Apple Watch

5) Can’t dictate into the Google search text box (and other text boxes, like in WordPress) without the text getting mangled

Until the public betas are out next month I can’t personally and definitively check for these, but nothing has been reported here, in the press, by developers, or by Apple itself.

I have also asked Apple’s accessibility department and they are unable to provide any information either.

I am sure there is time between now and the launch of iOS 14, watchOS 7, and macOS Big Sur in the autumn for Apple to implement these much needed features.

All I and other physically disabled Apple users want is world leading voice access across Apple devices and operating systems. At the moment we don’t have this.

It staggers me, and should cause red faces all round at Apple, that all the great and the good in Apple technology can create a function to start calls by voice command, but no one thinks to set up a function that means you can end the very call you started the same way.
