- The iPhone’s camera view detects the presence of people and measures their distance
- The feature is available on the iPhone 12 Pro and Pro Max running the iOS 14.2 beta
- It uses the detailed depth information gathered by the LiDAR scanner on the iPhone 12 Pro
- The iPhone packs several other features for users with visual impairment or low vision
Apple has packed a new accessibility feature into the latest beta of iOS: a system that detects the presence of people in the view of the iPhone’s camera and measures their distance, so that blind users can social distance effectively, among other things.
The feature emerged from Apple’s ARKit, for which the company developed ‘people occlusion,’ a capability that detects people’s shapes and lets virtual items pass in front of and behind them.
The accessibility team realized that this, combined with the accurate distance measurements provided by the LiDAR units on the iPhone 12 Pro and Pro Max, could be an extremely useful tool for anyone with a visual impairment.
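As a rough sketch of how those two pieces fit together (illustrative Swift only, not Apple’s People Detection code; the class name is made up), an app can ask ARKit for person segmentation with depth and receive both a per-pixel person mask and distance estimates for those pixels:

```swift
import ARKit
import CoreVideo

// Illustrative sketch only (made-up class name): combining the person
// segmentation behind people occlusion with per-person depth estimates.
final class PersonAwareSession: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Person segmentation with depth requires a recent A-series chip.
        guard ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) else { return }
        let configuration = ARWorldTrackingConfiguration()
        configuration.frameSemantics.insert(.personSegmentationWithDepth)
        session.delegate = self
        session.run(configuration)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // segmentationBuffer marks which camera pixels belong to a person;
        // estimatedDepthData holds distance estimates for those same pixels.
        // Scanning the mask and reading the matching depths tells an app
        // where a person is in the frame and roughly how far away they are.
        guard let mask = frame.segmentationBuffer, frame.estimatedDepthData != nil else { return }
        print("Person mask received: \(CVPixelBufferGetWidth(mask)) x \(CVPixelBufferGetHeight(mask)) pixels")
    }
}
```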
The iPhone packs several other features that are particularly useful for visually impaired people. For example, “VoiceOver” describes exactly what is happening on the user’s iPhone, and “Magnifier” works like a digital magnifying glass.
iPhone features for visually impaired users
This useful feature stems from ARKit, Apple’s augmented reality (AR) platform for iOS devices. ARKit 4 introduced a brand-new depth application programming interface (API), creating a new way to access the detailed depth information gathered by the LiDAR scanner on the iPhone 12 Pro, iPhone 12 Pro Max, and iPad Pro. The LiDAR scanner measures the distance to surrounding objects.
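For context, here is a minimal, hedged Swift sketch of reading that depth data through ARKit’s scene depth API on a LiDAR-equipped device; the class name and the centre-pixel sampling are illustrative choices, not part of Apple’s feature:

```swift
import ARKit
import CoreVideo
import Foundation

// Minimal sketch (assumed class name) of ARKit 4's scene depth API.
// Scene depth requires a LiDAR-equipped device such as the iPhone 12 Pro.
final class LiDARDepthReader: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        guard ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) else { return }
        let configuration = ARWorldTrackingConfiguration()
        configuration.frameSemantics.insert(.sceneDepth)
        session.delegate = self
        session.run(configuration)
    }

    // ARKit delivers a new frame, including a LiDAR depth map, many times per second.
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let depthMap = frame.sceneDepth?.depthMap else { return }
        CVPixelBufferLockBaseAddress(depthMap, .readOnly)
        defer { CVPixelBufferUnlockBaseAddress(depthMap, .readOnly) }

        // The depth map stores one 32-bit float per pixel, in metres.
        let width = CVPixelBufferGetWidth(depthMap)
        let height = CVPixelBufferGetHeight(depthMap)
        let bytesPerRow = CVPixelBufferGetBytesPerRow(depthMap)
        guard let base = CVPixelBufferGetBaseAddress(depthMap) else { return }

        // Sample the centre pixel: the distance to whatever is straight ahead.
        let centreRow = base.advanced(by: (height / 2) * bytesPerRow)
        let centreDistance = centreRow.assumingMemoryBound(to: Float32.self)[width / 2]
        print(String(format: "Surface straight ahead: %.2f m away", Double(centreDistance)))
    }
}
```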
How does this feature help the visually impaired?
For ARKit, Apple developed a feature called “people occlusion,” which detects the shape of people and lets virtual items pass in front of and behind them.
The wide-angle camera of the iPhone 12 Pro and iPhone 12 Pro Max accurately measures the distance between users and the people and objects nearby. This ultimately helps provide support to users who are blind or have very low vision.
The People Detection feature tells users whether people are in their space. If it finds someone in close proximity, it measures the distance and produces a stereo sound corresponding to the direction of that person.
It also allows users to set particular tones to play once the iPhone measures a certain distance.
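To make the described behaviour concrete, below is a hypothetical Swift sketch of that kind of feedback loop: it pans a tone toward the person’s direction and switches tones when a user-set distance threshold is crossed. The class name, the default threshold, and the tone frequencies are all assumptions made for illustration, not Apple’s implementation.

```swift
import AVFoundation
import Foundation

// Hypothetical sketch of the feedback logic described above: pan a tone
// toward the person's direction and change it when they cross a
// user-configurable distance threshold. Not Apple's implementation.
final class ProximityFeedback {
    private let engine = AVAudioEngine()
    private let player = AVAudioPlayerNode()

    /// Distance (in metres) at which the user wants a different tone (assumed default).
    var alertThreshold: Double = 2.0

    init() {
        engine.attach(player)
        engine.connect(player, to: engine.mainMixerNode, format: nil)
        try? engine.start()
    }

    /// - Parameters:
    ///   - distance: estimated distance to the nearest person, in metres.
    ///   - horizontalAngle: person's direction relative to the camera,
    ///     in radians (negative = left, positive = right).
    func personDetected(distance: Double, horizontalAngle: Double) {
        // Map the direction onto the stereo field (-1 = full left, 1 = full right).
        player.pan = Float(max(-1, min(1, horizontalAngle / (.pi / 2))))

        // Pick a tone depending on whether the person is inside the threshold.
        let tone = distance < alertThreshold ? closeTone : farTone
        player.scheduleBuffer(tone, at: nil, options: .interrupts, completionHandler: nil)
        player.play()
    }

    // Pre-rendered sine-wave buffers; the frequencies are arbitrary for this sketch.
    private lazy var closeTone = ProximityFeedback.makeTone(frequency: 880)
    private lazy var farTone = ProximityFeedback.makeTone(frequency: 440)

    private static func makeTone(frequency: Double, duration: Double = 0.2) -> AVAudioPCMBuffer {
        let sampleRate = 44_100.0
        let format = AVAudioFormat(standardFormatWithSampleRate: sampleRate, channels: 1)!
        let frameCount = AVAudioFrameCount(sampleRate * duration)
        let buffer = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: frameCount)!
        buffer.frameLength = frameCount
        let samples = buffer.floatChannelData![0]
        for i in 0..<Int(frameCount) {
            samples[i] = Float(sin(2 * .pi * frequency * Double(i) / sampleRate)) * 0.5
        }
        return buffer
    }
}
```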