- One of the biggest accessibility improvements in iOS 14 is the new Screen Recognition feature
- It extends VoiceOver, which now uses “on-device intelligence to recognize elements on screen”
- Screen Recognition automatically detects interface controls to aid in navigating apps
Apple just overhauled its accessibility landing page to better highlight the native features in macOS and iOS. These features allow users’ devices to “work the way you do” and encourage everyone to “make something wonderful.”
Now a new interview with Apple’s accessibility and AI/ML engineers goes into more detail on the company’s approach to improving accessibility with iOS 14.
iOS accessibility engineer Chris Fleizach and AI/ML team member Jeff Bigham spoke about how Apple thought about evolving the accessibility features from iOS 13 to 14 and how collaboration was needed to achieve these goals.
What is Apple’s approach to improving accessibility with iOS 14?
One of the biggest accessibility improvements arriving with iOS 14 this fall is the new Screen Recognition feature. It extends VoiceOver, which now uses “on-device intelligence to recognize elements on screen to improve VoiceOver support for app and web experiences.”
Here’s how Apple describes Screen Recognition:
Screen Recognition automatically detects interface controls to aid in navigating apps
A companion feature, Sound Recognition, likewise uses “on-device intelligence to detect and identify important sounds such as alarms, and alerts you to them using notifications.”
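To make that description concrete: apps normally describe their controls to VoiceOver by hand through UIKit’s accessibility API. The short sketch below (the view controller and control names are illustrative) shows the kind of annotation Screen Recognition aims to infer automatically from pixels when an app ships without it.

```swift
import UIKit

final class PlayerViewController: UIViewController {
    // A custom-drawn control; without annotations, VoiceOver may have
    // little or nothing useful to announce for it.
    private let playButton = UIButton(type: .custom)

    override func viewDidLoad() {
        super.viewDidLoad()
        view.addSubview(playButton)

        // Standard UIKit accessibility annotations. Screen Recognition's
        // value is recovering equivalent information directly from the
        // rendered pixels when an app never supplies these.
        playButton.isAccessibilityElement = true
        playButton.accessibilityLabel = "Play"
        playButton.accessibilityTraits = .button
    }
}
```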
How did Apple approach improving accessibility with iOS 14?
Here’s how Fleizach describes the company’s approach to improving accessibility, including the speed and precision that come with Screen Recognition:
“We looked for areas where we can make inroads on accessibility, like image descriptions,” said Fleizach. “In iOS 13 we labeled icons automatically – Screen Recognition takes it another step forward. We can look at the pixels on screen and identify the hierarchy of objects you can interact with, and all of this happens on device within tenths of a second.”
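Apple hasn’t published how Screen Recognition is built, but Fleizach’s description suggests a pixels-in, hierarchy-out pipeline running entirely on device. The Swift sketch below is purely illustrative: every type in it is hypothetical and only mirrors the flow he describes.

```swift
import CoreGraphics

// Hypothetical output type: a tree of detected controls, since Fleizach
// describes recovering a hierarchy of interactable objects, not a flat list.
struct DetectedElement {
    let kind: String                 // e.g. "button", "slider", "text"
    let frame: CGRect                // where on screen the control was found
    let children: [DetectedElement]  // nested elements within this one
}

// Hypothetical interface for the detection step: a local ML model runs over
// a screenshot, and no pixels ever leave the device.
protocol ScreenElementDetector {
    func detectElements(in screenshot: CGImage) -> DetectedElement
}
```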
Bigham notes how crucial collaboration across teams at Apple was in going beyond VoiceOver’s capabilities with Screen Recognition:
“VoiceOver has been the standard-bearer for vision accessibility for so long. If you look at the steps in development for Screen Recognition, it was grounded in collaboration across teams: accessibility throughout, our partners in data collection and annotation, AI/ML, and design. We did this to make sure that our machine learning development continued to push toward an excellent user experience,” said Bigham.