Apple Just Figured Out A Killer Use-Case For AR

Over the past decade, Google, Microsoft, Meta, and companies like Zoom have stepped up their efforts to offer accessibility-focused products for users who could benefit from them. According to a 2011 World Health Organization (WHO) report, over one billion people worldwide have some form of disability. And according to a research paper published in 3C Vision and hosted on ScienceDirect, "57% (74.2 million) of computer users are likely, or very likely, to benefit from the use of accessible technology because they have mild or severe difficulties/impairments."

To expand its efforts to provide accessible products for all users, and to celebrate Global Accessibility Awareness Day (GAAD) 2022, Apple released a preview of a set of accessibility-focused features coming to its software and hardware products.

"Apple embeds accessibility into every aspect of our work," said Sarah Herrlinger, Apple's senior director of Accessibility Policy and Initiatives, "and we are committed to designing the best products and services for everyone." New features and resources revealed by Apple include Live Captions for the deaf and hard hearing community and augmented reality (AR) Door Detection for individuals that are blind or have low vision. Apple also revealed its own Apple Watch Mirroring system to assist users with physical and motor disabilities. 

Live Captions on multiple platforms

Apple previewed Live Captions for iPhone, iPad, and Mac. Deaf and hard-of-hearing users can use Live Captions on audio-only phone calls, FaceTime calls, video meetings, social media, and streaming media. The captions' font size will be adjustable for easier reading, and the feature also works in the other direction: users will be able to type responses in real time and have that text spoken aloud to the person on the other end, effectively built-in text-to-speech (TTS). Apple says Live Captions is designed to protect users' privacy, and other participants won't be told when the feature is in use. Additionally, VoiceOver, Apple's screen reader, will add 20 new languages and locales.
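
Apple didn't describe Live Captions' internals, but the typed-response side of the feature is conceptually similar to the text-to-speech that Apple's public AVSpeechSynthesizer API already provides. The Swift sketch below is only an illustration of that kind of TTS call, not Apple's Live Captions implementation; the sample phrase, voice, and rate are placeholder choices.

```swift
import AVFoundation

// Illustrative sketch only: speak typed text aloud with AVSpeechSynthesizer.
// Keep a strong reference to the synthesizer so playback isn't cut short.
let synthesizer = AVSpeechSynthesizer()

func speakReply(_ text: String) {
    let utterance = AVSpeechUtterance(string: text)
    utterance.voice = AVSpeechSynthesisVoice(language: "en-US") // voice picked by locale (placeholder)
    utterance.rate = AVSpeechUtteranceDefaultSpeechRate         // system default speaking rate
    synthesizer.speak(utterance)                                 // audio is rendered asynchronously
}

speakReply("I'll call you back in five minutes.")
```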

AR, LiDAR, and Apple Watch

A new Door Detection feature helps people who are blind or have low vision locate doors when they arrive at a new space or destination. Door Detection works on iPhone or iPad to scan for doors and inform users of a door's attributes: how far away it is, whether it is open or closed, how it can be opened, and what any signs on or around it say. To do this, the iPhone or iPad uses LiDAR and machine learning to interpret the scene in real time. The device then overlays that information on its display using augmented reality and can read it aloud when needed. This is only the latest in a line of LiDAR-assisted detection features Apple has revealed in recent years.
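
Apple hasn't published how Door Detection is built, but the LiDAR-plus-AR plumbing it describes maps onto ARKit's public scene-depth API. Under that assumption, the Swift sketch below only shows how an app can request per-frame LiDAR depth data on supported devices; the door-recognition model itself is not shown.

```swift
import ARKit
import CoreVideo

// Hedged sketch: reading LiDAR scene depth via ARKit on supported iPhones and iPads.
// This is not Apple's Door Detection code, just the public depth-data entry point.
final class DepthReader: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Scene depth requires a LiDAR-equipped device.
        guard ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) else {
            print("LiDAR scene depth is not available on this device")
            return
        }
        let configuration = ARWorldTrackingConfiguration()
        configuration.frameSemantics.insert(.sceneDepth) // ask ARKit for depth on every frame
        session.delegate = self
        session.run(configuration)
    }

    // Delivered once per frame; depthMap is a CVPixelBuffer of distances in meters.
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let depthMap = frame.sceneDepth?.depthMap else { return }
        let width = CVPixelBufferGetWidth(depthMap)
        let height = CVPixelBufferGetHeight(depthMap)
        print("Received a \(width)x\(height) depth map")
        // A door detector would feed this depth map (plus camera imagery) into a
        // vision model, then surface distance and door state to the user via AR.
    }
}
```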

Users with physical and motor disabilities will be able to work with Apple Watch Mirroring in the near future. The feature pairs an Apple Watch with an iPhone so the watch can be controlled remotely from the phone. Through Voice Control and Switch Control, users can issue commands with their voice, other sound actions, head tracking, or screen taps. Apple Watch health features like Blood Oxygen, Heart Rate, and Mindfulness will soon be controllable from a connected iPhone.
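
Apple Watch Mirroring is an OS-level assistive feature, and the announcement doesn't describe a developer API for it. For context only, this hedged Swift sketch shows one existing way an iPhone app can read Watch-recorded heart-rate data through HealthKit, the same kind of data the mirrored health features surface; it is not part of the Mirroring feature itself.

```swift
import HealthKit

// Hedged sketch: fetch the most recent heart-rate sample (typically written by
// a paired Apple Watch) from HealthKit on the iPhone.
let healthStore = HKHealthStore()
let heartRateType = HKQuantityType.quantityType(forIdentifier: .heartRate)!

func fetchLatestHeartRate() {
    healthStore.requestAuthorization(toShare: nil, read: [heartRateType]) { granted, _ in
        guard granted else { return }
        // Sort newest-first and ask for a single sample.
        let sort = NSSortDescriptor(key: HKSampleSortIdentifierStartDate, ascending: false)
        let query = HKSampleQuery(sampleType: heartRateType, predicate: nil,
                                  limit: 1, sortDescriptors: [sort]) { _, samples, _ in
            guard let sample = samples?.first as? HKQuantitySample else { return }
            let bpm = sample.quantity.doubleValue(for: HKUnit.count().unitDivided(by: .minute()))
            print("Latest heart rate: \(bpm) bpm")
        }
        healthStore.execute(query)
    }
}

fetchLatestHeartRate()
```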