Apple today revealed several new accessibility features, including Assistive Access, Live Speech, and more, that will arrive later this year. Read about the new features below.
Assistive Access:
Assistive Access distills apps and experiences to their essential features in order to lighten cognitive load, making it easier for people with cognitive disabilities to perform common tasks on their iOS device. Here’s what Assistive Access offers:
- Assistive Access includes a customized experience for Phone and FaceTime, which have been combined into a single Calls app, as well as Messages, Camera, Photos, and Music.
- The feature offers a distinct interface with high-contrast buttons and large text labels, as well as tools to help trusted supporters tailor the experience for the individual they support.
- Users can also choose between a more visual, grid-based layout for the Home Screen and apps, and a row-based layout for those who prefer text.
Live Speech and Personal Voice:
With Live Speech on iPhone, iPad, and Mac, users can type what they want to say and have it spoken aloud during phone and FaceTime calls as well as in-person conversations. The new Personal Voice feature allows users at risk of losing their ability to speak to create a voice that sounds like them. A brief code sketch of this kind of text-to-speech follows the list below.
Here’s how this feature works:
- Users can create a Personal Voice by reading along with a randomized set of text prompts to record 15 minutes of audio on iPhone or iPad.
- Personal Voice uses on-device machine learning to keep users’ information private and secure, and integrates seamlessly with Live Speech so users can speak with their Personal Voice when connecting with loved ones.
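Apple hasn’t said how Live Speech or Personal Voice is built, but the basic pattern of turning typed text into on-device speech can be sketched with AVFoundation’s long-standing AVSpeechSynthesizer API. The code below illustrates that general idea only, not Apple’s implementation; the TypedSpeechController name is made up for the example.

```swift
import AVFoundation

// Minimal sketch: speak typed text aloud using on-device synthesis.
// Illustrative only; not the Live Speech or Personal Voice implementation.
final class TypedSpeechController {
    // Keep a strong reference; speech stops if the synthesizer is deallocated mid-utterance.
    private let synthesizer = AVSpeechSynthesizer()

    // Speaks the given text with a voice for the user's current language, if one is available.
    func speak(_ text: String) {
        let utterance = AVSpeechUtterance(string: text)
        utterance.voice = AVSpeechSynthesisVoice(language: AVSpeechSynthesisVoice.currentLanguageCode())
        synthesizer.speak(utterance)
    }

    // Stops speech immediately, for example when the user clears what they typed.
    func stop() {
        _ = synthesizer.stopSpeaking(at: .immediate)
    }
}

// Usage: speak whatever the user just typed.
let controller = TypedSpeechController()
controller.speak("I'll have a coffee, please.")
```

Because AVSpeechSynthesizer runs synthesis on the device, the typed text never has to leave it, which matches the privacy framing Apple gives Personal Voice above.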
Point and Speak in Magnifier:
Point and Speak in Magnifier will allow users with vision disabilities to interact with physical objects that have text labels. For example, if a user wants to use a door keypad, Point and Speak combines input from the Camera app, the LiDAR Scanner, and on-device machine learning to announce the text on each button as the user moves their finger across the keypad. Point and Speak is built into the Magnifier app on iPhone and iPad, works with VoiceOver, and can be used alongside other Magnifier features such as People Detection, Door Detection, and Image Descriptions.
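Apple hasn’t detailed how Point and Speak is put together, but the core step of recognizing printed text in a camera frame on-device and announcing it can be approximated with the existing Vision and UIKit accessibility APIs. The sketch below is only that approximation: the announceRecognizedText function is invented for the example, and it leaves out the LiDAR-based finger tracking that lets Point and Speak read just the label under the user’s fingertip.

```swift
import UIKit
import Vision

// Illustrative sketch only: recognize text in a captured camera frame on-device
// and announce it through VoiceOver. Not Apple's Point and Speak implementation.
func announceRecognizedText(in frame: CGImage) {
    let request = VNRecognizeTextRequest { request, error in
        guard error == nil,
              let observations = request.results as? [VNRecognizedTextObservation] else { return }

        // Take the best candidate string from each detected text region.
        let lines = observations.compactMap { $0.topCandidates(1).first?.string }
        guard !lines.isEmpty else { return }

        // Post the recognized text as a VoiceOver announcement.
        UIAccessibility.post(notification: .announcement,
                             argument: lines.joined(separator: ", "))
    }
    request.recognitionLevel = .accurate

    // Run the recognition request on-device against the supplied frame.
    let handler = VNImageRequestHandler(cgImage: frame, options: [:])
    try? handler.perform([request])
}
```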
Other new features:
- Deaf or hard-of-hearing users can pair Made for iPhone hearing devices directly to Mac and customize them for their hearing comfort.
- Voice Control adds phonetic suggestions for text editing so users who type with their voice can choose the right word out of several that might sound alike, like “do,” “due,” and “dew.” Additionally, with Voice Control Guide, users can learn tips and tricks about using voice commands as an alternative to touch and typing across iPhone, iPad, and Mac.
- Users with physical and motor disabilities who use Switch Control can turn any switch into a virtual game controller to play their favorite games on iPhone and iPad.
- For users with low vision, Text Size is now easier to adjust across Mac apps such as Finder, Messages, Mail, Calendar, and Notes.
- Users who are sensitive to rapid animations can automatically pause images with moving elements, such as GIFs, in Messages and Safari.
- For VoiceOver users, Siri voices sound natural and expressive even at high rates of speech feedback; users can also customize the rate at which Siri speaks to them, with options ranging from 0.8x to 2x. A rough sketch of this kind of rate scaling follows below.
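The 0.8x to 2x range applies to Siri voices inside VoiceOver and is set in system settings rather than by apps, and Apple hasn’t said how those multipliers are applied internally. The general idea of scaling a synthesized voice’s speaking rate can still be sketched with AVSpeechUtterance, whose rate runs on a normalized scale; the linear mapping below is an assumption made purely for illustration.

```swift
import AVFoundation

// Illustrative only: scale a synthesized voice's speaking rate by a user-chosen
// multiplier, clamped to the valid range. This is not how Siri or VoiceOver
// applies its 0.8x to 2x setting internally.
func utterance(for text: String, rateMultiplier: Float) -> AVSpeechUtterance {
    let utterance = AVSpeechUtterance(string: text)

    // AVSpeechUtterance.rate runs from AVSpeechUtteranceMinimumSpeechRate (0.0)
    // to AVSpeechUtteranceMaximumSpeechRate (1.0), with a default of 0.5.
    // Assume a simple linear scaling of the default rate for this example.
    let scaled = AVSpeechUtteranceDefaultSpeechRate * rateMultiplier
    utterance.rate = min(max(scaled, AVSpeechUtteranceMinimumSpeechRate),
                         AVSpeechUtteranceMaximumSpeechRate)
    return utterance
}

// Usage: a 2x multiplier tops out the normalized scale (0.5 * 2.0 = 1.0).
let synthesizer = AVSpeechSynthesizer()
synthesizer.speak(utterance(for: "Your meeting starts in five minutes.", rateMultiplier: 2.0))
```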