Apple’s new accessibility features include simplified apps and let your iPhone speak in your voice

The updates will roll out later this year, likely with iOS 17.

Apple Assistive Access and Live Speech features.

Apple is expanding its list of accessibility features in iOS, iPadOS, and macOS in some major ways. The company has announced new features designed for people with cognitive, vision, hearing, and mobility accessibility needs, as well as those who are nonspeaking or at risk of losing their ability to speak.

Apple's new Personal Voice and Live Speech accessibility features.

The most interesting of these features is Personal Voice, which uses on-device machine learning to mimic your voice when using text-to-speech. According to Apple, users will be guided through a setup process consisting of 150 randomized text prompts that they’ll be asked to read aloud. This gives your iPhone or iPad 15 minutes of audio to base its impression on. From then on, text-to-speech can use your Personal Voice, letting you speak over the phone in a voice that sounds like your own.

The feature ties in with Live Speech, which powers the text-to-speech experience when speaking with someone over a phone call or FaceTime session. You won’t need to set up a Personal Voice to use Live Speech, and Apple says you’ll be able to save common phrases to quickly chime in during a conversation.

Then there’s Assistive Access, which is a completely new user interface for iPhones and iPads that focuses on high-contrast buttons and large text labels for a simpler experience, lightening one’s cognitive load. Apple gives you access to five apps: Music, Calls (Phone and FaceTime combined), Messages, Photos, and Camera. All five have been boiled down to their most essential features, and each has a custom interface that makes specific features easier to use (such as an emoji-only keyboard in Messages). You can also choose between a grid or list layout for the home screen.

Apple is also releasing a new Point and Speak feature in Magnifier, which will let you point your device at physical objects and have it read aloud any recognized text (for example, the buttons on a microwave). Other new features include the ability to pair Made for iPhone hearing aids with Macs, phonetic suggestions for text editing while using Voice Control, the option to automatically pause GIFs in Messages and Safari, support for Switch Control in games, new text size settings in Mac apps, and the ability to adjust how quickly Siri speaks to you.

The new features were announced in time for Global Accessibility Awareness Day on Friday.

“At Apple, we’ve always believed that the best technology is technology built for everyone,” said CEO Tim Cook in a press statement. “Today, we’re excited to share incredible new features that build on our long history of making technology accessible, so that everyone has the opportunity to create, communicate, and do what they love.”

“Accessibility is part of everything we do at Apple,” said Sarah Herrlinger, the company’s senior director of Global Accessibility Policy and Initiatives. “These groundbreaking features were designed with feedback from members of disability communities every step of the way, to support a diverse set of users and help people connect in new ways.”

According to Apple, these features will roll out “later this year,” which likely means they’ll launch as part of iOS 17 (which we’ll learn more about next month at WWDC).