The announcements were made to celebrate Global Accessibility Awareness Day.
Door detection will use the lidar scanner and machine learning to identify doors and relay information about their location, labeling, and more to blind or low-vision users.
Live Captions will transcribe what people say on calls and in videos while you use the phone.
Apple Watch screen mirroring will give Watch users access to some accessibility features that are available on the iPhone but not the Watch.
Users will be able to set how long Siri waits for them to finish speaking before responding or performing an action.
Added touch gestures will give users more ways of controlling the Apple Watch.
You'll be able to train your iPhone to recognize important sounds like your doorbell.
Apple's Books app offers new customization options to make text more legible.
Global Accessibility Awareness Day is Thursday, so Apple took to its newsroom blog this week to announce several major new accessibility features headed to the iPhone, Apple Watch, iPad, and Mac.
One of the most widely used will likely be Live Captions, which is coming to iPhone, Mac, and iPad. The feature shows AI-driven, live-updating subtitles for speech coming from any audio source on the phone, whether the user is "on a phone or FaceTime call, using a video conferencing or social media app, streaming media content, or having a conversation with someone next to them."
The text (which users can resize at will) appears at the top of the screen and ticks along as the subject speaks. Additionally, Mac users will be able to type responses and have them read aloud to others on the call. Live Captions will enter public beta on supported devices ("iPhone 11 and later, iPad models with A12 Bionic and later, and Macs with Apple silicon") later this year.
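Apple hasn't published how Live Captions is built, but the underlying task it describes (live-updating, on-device speech transcription) can be sketched with Apple's public Speech framework. The code below is an illustrative analogue only, not the Live Captions implementation; the class name and callback are hypothetical.

```swift
import Speech
import AVFoundation

// A minimal sketch of live, on-device transcription using the Speech framework.
// This is not Apple's Live Captions implementation; a real app must also call
// SFSpeechRecognizer.requestAuthorization(_:) before starting.
final class LiveTranscriber {
    private let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US"))
    private let request = SFSpeechAudioBufferRecognitionRequest()
    private let audioEngine = AVAudioEngine()
    private var task: SFSpeechRecognitionTask?

    func start(onCaption: @escaping (String) -> Void) throws {
        // Prefer fully on-device recognition where the hardware supports it.
        if recognizer?.supportsOnDeviceRecognition == true {
            request.requiresOnDeviceRecognition = true
        }
        request.shouldReportPartialResults = true   // captions update as speech continues

        // Feed microphone audio into the recognition request.
        let input = audioEngine.inputNode
        let format = input.outputFormat(forBus: 0)
        input.installTap(onBus: 0, bufferSize: 1024, format: format) { buffer, _ in
            self.request.append(buffer)
        }
        try audioEngine.start()

        // Each partial result replaces the caption text shown on screen.
        task = recognizer?.recognitionTask(with: request) { result, _ in
            if let result = result {
                onCaption(result.bestTranscription.formattedString)
            }
        }
    }
}
```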
There's also door detection. It unfortunately will only work on iPhones and iPads with a lidar sensor (so the iPhone 12 Pro, iPhone 13 Pro, or recent iPad Pro models), but it sounds useful for those who are blind or have low vision. It uses the iPhone's camera and AR sensors, in tandem with machine learning, to identify doors and audibly tell users where the door is located, whether it's open or closed, how it can be opened, and what writing or labeling it might have.
Door detection will join people detection and image descriptions in a new "detection mode" intended for blind or low-vision users in iOS and iPadOS. Apple's blog post didn't say when that feature would launch, however.
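Apple hasn't detailed how door detection works internally either, but ARKit on lidar-equipped devices already exposes a classified scene mesh, and its ARMeshClassification enum includes a .door case. The sketch below is a rough illustration of reading those per-face classifications, not Apple's implementation; everything other than ARKit's own types and methods is hypothetical.

```swift
import ARKit

// A minimal sketch: scan the reconstructed scene mesh for faces ARKit has
// classified as doors. Requires a lidar-equipped iPhone or iPad.
final class DoorMeshSketch: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Scene reconstruction with classification is only available with lidar.
        guard ARWorldTrackingConfiguration.supportsSceneReconstruction(.meshWithClassification) else { return }
        let config = ARWorldTrackingConfiguration()
        config.sceneReconstruction = .meshWithClassification
        session.delegate = self
        session.run(config)
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let mesh as ARMeshAnchor in anchors {
            let geometry = mesh.geometry
            for face in 0..<geometry.faces.count where classification(of: face, in: geometry) == .door {
                // A real feature would go on to estimate distance, open/closed
                // state, and read any signage; here we only note the hit.
                print("Door-classified mesh face near \(mesh.transform.columns.3)")
            }
        }
    }

    // Per-face classification is stored in a Metal buffer, one byte per face.
    private func classification(of face: Int, in geometry: ARMeshGeometry) -> ARMeshClassification {
        guard let source = geometry.classification else { return .none }
        let byte = source.buffer.contents()
            .advanced(by: source.offset + source.stride * face)
            .assumingMemoryBound(to: UInt8.self).pointee
        return ARMeshClassification(rawValue: Int(byte)) ?? .none
    }
}
```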
Other accessibility additions that Apple says are just around the corner include 20 new VoiceOver languages, new hand gestures on Apple Watch, and a feature that allows game players to receive help from a "buddy" using another game controller without disconnecting their own. Additionally, there are new Siri and Apple Books customizations meant to expand accessibility for people with disabilities, sound recognition customizations, and Apple Watch screen mirroring on the iPhone, which gives Watch users access to many accessibility features that are available on the iPhone but not the Watch.
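On the sound recognition front, the developer-facing counterpart is Apple's SoundAnalysis framework, which runs sound classification on the device. The sketch below listens for a doorbell-like result from the built-in classifier; it is illustrative only (the user-facing feature needs no code), the "door_bell" label string is an assumption rather than a verified identifier, and the class name is made up.

```swift
import SoundAnalysis
import AVFoundation

// A minimal sketch of on-device sound classification with SoundAnalysis.
// The "door_bell" label is an assumption; a real app should inspect the
// request's knownClassifications for the exact identifiers it supports.
final class DoorbellListener: NSObject, SNResultsObserving {
    private let engine = AVAudioEngine()
    private var analyzer: SNAudioStreamAnalyzer?

    func start() throws {
        let input = engine.inputNode
        let format = input.outputFormat(forBus: 0)
        let analyzer = SNAudioStreamAnalyzer(format: format)

        // Use the built-in sound classifier that ships with the OS.
        let request = try SNClassifySoundRequest(classifierIdentifier: .version1)
        try analyzer.add(request, withObserver: self)

        // Stream microphone audio into the analyzer.
        input.installTap(onBus: 0, bufferSize: 8192, format: format) { buffer, when in
            analyzer.analyze(buffer, atAudioFramePosition: when.sampleTime)
        }
        try engine.start()
        self.analyzer = analyzer
    }

    // Called with classification results as audio streams in.
    func request(_ request: SNRequest, didProduce result: SNResult) {
        guard let result = result as? SNClassificationResult,
              let top = result.classifications.first else { return }
        if top.identifier == "door_bell" && top.confidence > 0.8 {   // assumed label
            print("Doorbell detected")
        }
    }
}
```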
Tech enthusiasts often lament that smartphones (and personal tech in general) have become stagnant, without many exciting new developments. But that couldn't be further from the truth for many people with disabilities. Google, Apple, and numerous researchers and startups have been making significant advancements, bringing powerful new accessibility features to mobile devices.