Apple announces a slew of accessibility updates

SignTime launches May 20th and software updates come later this year

Apple Watch with the Assistive Touch cursor. Image: Apple

Apple announced a variety of new and updated features for people with disabilities today. Beginning May 20th, customers can use the new SignTime sign language interpreter service to contact AppleCare and retail customer care through their web browsers. Software updates later this year to iOS, watchOS, and iPadOS will bring improved options for Assistive Touch, VoiceOver, hearing aid support, and background sounds.

The new SignTime service is launching first in the US, UK, and France, offering remote interpreter access for American Sign Language, British Sign Language, and French Sign Language. People can also use the service in person at Apple Store locations to access an interpreter without booking in advance, potentially sparing them the hassle of arranging one on short notice.

People with limb differences will be able to use Assistive Touch on watchOS, which should let them use the Apple Watch without touching the screen. (Apple did not specify which Watch models the feature will support.) Apple says the Watch can detect muscle movement and tendon activity through its built-in sensors, allowing users to control a cursor on the watch screen, answer calls, access notifications, and more by making various movements and gestures.

iPadOS will support third-party eye-tracking devices, allowing people with low mobility to move a cursor with their gaze and perform actions by holding eye contact, rather than tapping the screen.

VoiceOver, Apple’s built-in screen reader, will be updated to include more details in images. According to Apple, it will allow people to navigate images with text and data tables by rows and columns, and describe people and objects in images. People will also be able to add image descriptions with Markup.

Apple plans to upgrade the Made for iPhone hearing devices program, too, with support for bidirectional hearing aids. Users will also be able to upload their hearing test results to Headphone Accommodations to more easily customize how the feature amplifies sounds and adjusts different frequencies.

Three smiling Memojis: one with a cochlear implant on the left, one with nasal oxygen tubes in the center, and one with a soft helmet on the right. Image: Apple

For neurodiverse people (or anyone who likes white noise), Apple is introducing Background Sounds that can be incorporated with other audio and system sounds. They include “balanced, bright, or dark noise, as well as ocean, rain, or stream sounds” that can be set to play continuously and mask distracting or overwhelming noises.

Other features Apple is planning for later in the year include the ability to trigger actions with mouth sounds like clicks or pops instead of physical buttons, per-app customization of display and text size settings, and new Memoji options with cochlear implants, oxygen tubes, and soft helmets.

Companies large and small always have room for improvement when it comes to making their products accessible, but Apple has generally been a leader in the space. These updates hopefully signal a continued commitment to designing with disabled people in mind.