iPhone 12 Pro lets blind people ‘see’ others around them





The lidar scanner on Apple’s new iPhone 12 Pro and 12 Pro Max enables new AR features, and the ability for people who are blind or have low vision to detect others around them.

James Martin / CNET

Apple’s iPhone 12 Pro and 12 Pro Max have a new feature for users who are blind or have low vision: the ability to see, essentially, other people coming.

The devices make use of the new lidar sensor on the back of the phones to detect how close other people are to the user, something Apple has called People Detection. Lidar is a type of depth sensor that helps with augmented reality applications and serves as the eyes of autonomous cars. Now, Apple is applying it to accessibility in an effort to help people with vision problems better navigate the world around them.

When someone who is blind is shopping for groceries, for example, they will be able to activate People Detection on their iPhone 12 Pro to be alerted when it's their turn to move through the checkout line. Someone walking down a sidewalk will receive alerts about how close passersby are. People who are blind or have low vision can use the feature to find out whether a seat is available at a table or on public transport, and to maintain an appropriate social distance when going through health checkpoints or security lines at an airport.

People Detection reports the distance between the other person and the user in feet or meters, and it works up to 15 feet, or about 5 meters, away. Anyone in view of the iPhone 12 Pro’s wide-angle camera can be detected by the feature. If there are multiple people nearby, People Detection gives the distance to the person closest to the iPhone user.

The technology will be available as part of iOS 14.2 in the coming weeks. Apple released a beta version of the software to developers on Friday.

Globally, at least 2.2 billion people are visually impaired or blind, according to a World Health Organization report last year. In the United States, more than 1 million people over the age of 40 are blind, according to the Centers for Disease Control and Prevention. By 2050, that number could skyrocket to around 9 million due to “growing epidemics of diabetes and other chronic diseases and the rapid aging of our American population,” the CDC said.

Apple has made accessibility a focus for decades. It incorporates features into its technology to help people with low vision navigate the iPhone’s touch screen and allow people with motor disabilities to virtually touch the icons on the interface. Four years ago, Apple kicked off one of its flashy product launches by talking about accessibility and showcasing its new dedicated site.

“Technology should be accessible to everyone,” Apple CEO Tim Cook said at the time.

Apple, in particular, has developed features to help people who are blind or have low vision. Its new People Detection feature takes that a step further.

Lidar detection

The technology makes use of the new lidar scanner built into the iPhone 12 Pro and 12 Pro Max camera array. The scanner is also on the newer iPad Pro, and it’s likely to come to other devices in the future. The scanner itself is a small black dot near the camera lens on the back of the new high-end iPhones.

People Detection won’t work on older iPhones, the iPhone 12, the 12 mini or even the new iPad Air. None of those devices has a lidar scanner, which is essential for the people-detection technology.



People Detection uses Apple’s ARKit People Occlusion feature to detect whether someone is in the camera’s field of view and estimate how far away that person is. The lidar scanner makes the estimate more accurate: it sends out small bursts of light and measures how long the light takes to bounce back to the scanner. The feature doesn’t work in dark or low-light environments.
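To make the mechanics a little more concrete, here is a minimal, hypothetical sketch of how an app could combine ARKit's person segmentation with its per-pixel depth data to estimate the distance to the nearest person in view. It relies on public ARKit APIs (frameSemantics, segmentationBuffer, estimatedDepthData), but it is only an illustration of the general idea, not Apple's actual People Detection implementation.

```swift
import ARKit

// Hypothetical sketch: estimate the distance to the closest person in view
// using ARKit's person segmentation plus its per-pixel depth estimate.
final class NearestPersonEstimator: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // People Occlusion is driven by the personSegmentationWithDepth frame semantic.
        guard ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) else { return }
        let config = ARWorldTrackingConfiguration()
        config.frameSemantics = [.personSegmentationWithDepth]
        session.delegate = self
        session.run(config)
    }

    // Called for every camera frame while the session runs.
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let segmentation = frame.segmentationBuffer,          // per-pixel person mask
              let depth = frame.estimatedDepthData else { return }  // per-pixel depth, in meters
        if let meters = nearestPersonDistance(segmentation: segmentation, depth: depth) {
            print(String(format: "Closest person: %.1f m", Double(meters)))
        }
    }

    // Take the minimum depth over all pixels classified as "person".
    private func nearestPersonDistance(segmentation: CVPixelBuffer, depth: CVPixelBuffer) -> Float? {
        CVPixelBufferLockBaseAddress(segmentation, .readOnly)
        CVPixelBufferLockBaseAddress(depth, .readOnly)
        defer {
            CVPixelBufferUnlockBaseAddress(segmentation, .readOnly)
            CVPixelBufferUnlockBaseAddress(depth, .readOnly)
        }

        let width = CVPixelBufferGetWidth(segmentation)
        let height = CVPixelBufferGetHeight(segmentation)
        // The two buffers are expected to share a resolution; bail out if they don't.
        guard width == CVPixelBufferGetWidth(depth), height == CVPixelBufferGetHeight(depth),
              let segBase = CVPixelBufferGetBaseAddress(segmentation),
              let depthBase = CVPixelBufferGetBaseAddress(depth) else { return nil }

        let segStride = CVPixelBufferGetBytesPerRow(segmentation)
        let depthStride = CVPixelBufferGetBytesPerRow(depth)
        var minDepth = Float.greatestFiniteMagnitude

        for y in 0..<height {
            let segRow = segBase.advanced(by: y * segStride).assumingMemoryBound(to: UInt8.self)
            let depthRow = depthBase.advanced(by: y * depthStride).assumingMemoryBound(to: Float32.self)
            for x in 0..<width where segRow[x] == ARFrame.SegmentationClass.person.rawValue {
                let d = depthRow[x]
                if d > 0, d < minDepth { minDepth = d }
            }
        }
        return minDepth == .greatestFiniteMagnitude ? nil : minDepth
    }
}
```

On lidar-equipped devices, an app could presumably also request the sceneDepth frame semantic for a more accurate depth map, which is where the new scanner would make the biggest difference.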

All detection occurs in real time to provide feedback on how far a person is from the user of the iPhone 12 Pro.

Users receive People Detection feedback in four ways, which can be used in any combination and customized in settings. One option is an audible readout of the person’s distance: the phone calls out “15, 14, 13” and so on in feet, or gives the distance in half-meter increments for people who choose that unit of measure.

iPhone 12 Pro users can also set a threshold distance marked by two clearly different audio tones: one plays when people are beyond that distance, and another when they’re closer to the user. The default threshold is 6 feet, or 2 meters.

The third type of alert is haptic feedback. The farther away a person is, the lower and slower the physical pulse of the haptics; the closer the person gets, the faster the haptics buzz. For now, haptics come only through the phone, not the Apple Watch.

There is also the option of a visual readout on the screen itself. In that case, the display shows how far away the person is, and a dotted line indicates where the person is in the frame.
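As a rough, hypothetical sketch of how those feedback modes could be wired together for a single distance reading, the snippet below speaks the distance, picks a “near” or “far” tone based on a threshold, and paces haptic pulses by proximity. The threshold value and pacing formula are assumptions for illustration only; none of this is Apple’s actual implementation.

```swift
import UIKit
import AVFoundation

// Hypothetical sketch of combining the feedback modes described above.
// Threshold and pacing values are illustrative assumptions, not Apple's.
final class ProximityFeedback {
    private let synthesizer = AVSpeechSynthesizer()
    private let haptics = UIImpactFeedbackGenerator(style: .medium)
    private var hapticTimer: Timer?

    /// Distance separating the "near" and "far" tones (default 6 ft, roughly 2 m).
    var thresholdMeters: Double = 2.0

    func update(distanceMeters: Double, useMetric: Bool) {
        speak(distanceMeters: distanceMeters, useMetric: useMetric)
        playTone(near: distanceMeters <= thresholdMeters)
        scheduleHaptics(distanceMeters: distanceMeters)
    }

    // Audible readout: whole feet, or half-meter increments for metric users.
    private func speak(distanceMeters: Double, useMetric: Bool) {
        let text: String
        if useMetric {
            let halfMeters = (distanceMeters * 2).rounded() / 2
            text = "\(halfMeters) meters"
        } else {
            text = "\(Int((distanceMeters * 3.28084).rounded())) feet"
        }
        synthesizer.speak(AVSpeechUtterance(string: text))
    }

    // Two distinct tones: one inside the threshold distance, one outside.
    private func playTone(near: Bool) {
        // Placeholder: a real app would play two short, distinct audio assets here.
        print(near ? "near tone" : "far tone")
    }

    // Haptic pulses speed up as the person gets closer.
    private func scheduleHaptics(distanceMeters: Double) {
        hapticTimer?.invalidate()
        // Roughly: 5 m away -> one pulse per second, very close -> ten per second.
        let interval = max(0.1, min(1.0, distanceMeters / 5.0))
        hapticTimer = Timer.scheduledTimer(withTimeInterval: interval, repeats: true) { [weak self] _ in
            self?.haptics.impactOccurred()
        }
    }
}
```

In practice, a real app would throttle the spoken readout so it doesn’t repeat on every frame and would use dedicated audio files for the two tones rather than the placeholder above.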

People Detection lives in Apple’s Magnifier app. Users can launch it using Apple’s Back Tap settings or via the triple-click side button accessibility shortcut. Siri can launch Magnifier, but then users need to enable People Detection from there.

It is designed to be a situational tool that people activate when they need it, rather than an always-on function. Running it for a significant period of time will consume a lot of battery life.

For now, the iPhone only detects people, but developers could use lidar technology to create applications to detect objects.

Other iOS 14.2 features include 13 new emoji characters and a music recognition feature via Shazam in the Control Center.
