Lidar on iPhone 12 Pro: What You Can Do Now and Why It Matters for the Future





The iPhone 12 Pro’s lidar sensor, the black circle at the bottom right of the camera unit, opens up the possibilities of AR and much more.

Patrick Holland / CNET

Apple is going big on lidar, a technology that's brand-new to the iPhone 12 family, specifically the iPhone 12 Pro and iPhone 12 Pro Max. (All four iPhone 12 variants, including the iPhone 12 Mini, are on sale now.) Look closely at one of the new iPhone 12 Pro models, or the latest iPad Pro, and you'll see a small black dot near the camera lenses, about the same size as the flash. That's the lidar sensor, and it's a new type of depth sensing that could make a difference in a number of interesting ways.

Read more: The lidar technology in the iPhone 12 does more than just improve photos. Check out this cool party trick

If Apple has its way, lidar is a term you'll start hearing a lot, so let's break down what we know, what Apple is going to use it for and where the technology could go next. And if you're curious what it can do right now, I also spent some hands-on time with the technology.

What does lidar mean?

Lidar stands for light detection and ranging, and it's been around for a while. It uses lasers to ping objects and measure how long the light takes to bounce back to the source, gauging distance by timing the travel, or flight, of the light pulse.

How does lidar work to sense depth?

Lidar is a type of time-of-flight camera. Some other smartphones measure depth with a single pulse of light, whereas a smartphone with this type of lidar sends out waves of light pulses in a spray of infrared dots and can measure each one with its sensor, creating a field of points that map out distances and can "mesh" the dimensions of a space and the objects in it. The light pulses are invisible to the human eye, but you can see them with a night vision camera.
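The math behind time-of-flight sensing is simple in principle: a pulse travels to a surface and back at the speed of light, so the distance is the round-trip time multiplied by the speed of light, divided by two. Here's a minimal Swift sketch of that calculation (an illustration of the concept, not Apple's actual implementation; the function name and example timing are assumptions for the demo).

```swift
import Foundation

// Time-of-flight in a nutshell: time the round trip of a light pulse,
// then halve it to get the one-way distance to the surface.
let speedOfLight = 299_792_458.0 // meters per second

func distance(forRoundTripTime seconds: Double) -> Double {
    return speedOfLight * seconds / 2.0
}

// A surface about 5 meters away reflects a pulse in roughly 33 nanoseconds.
let roundTrip = 33.4e-9
print(distance(forRoundTripTime: roundTrip)) // ≈ 5.0 meters
```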

Isn’t this like Face ID on the iPhone?

It does, but with longer range. The idea is the same: Apple's Face ID-enabling TrueDepth camera also fires out an array of infrared lasers, but it can only work up to a few feet away. The rear lidar sensors on the iPad Pro and iPhone 12 Pro work at a range of up to 5 meters.

Lidar is already in many other technologies

Lidar is a technology that's popping up everywhere. It's used for self-driving cars, or assisted driving. It's used for robotics and drones. Augmented reality headsets like the HoloLens 2 have similar tech, mapping out room spaces before layering virtual 3D objects into them. But it also has a pretty long history.

Microsoft's old depth-sensing Xbox accessory, the Kinect, was a camera that had infrared depth scanning, too. In fact, PrimeSense, the company that helped make the Kinect technology, was acquired by Apple in 2013. Now we have Apple's face-scanning TrueDepth camera and rear lidar sensors.

The iPhone 12 Pro's camera works better with lidar

Time-of-flight cameras on smartphones tend to be used to improve focus accuracy and speed, and the iPhone 12 Pro does the same. Apple promises better low-light focus, up to six times faster in dim conditions. The lidar depth sensing is also used to improve night portrait mode effects. So far it makes an impact: read our iPhone 12 Pro review for more.

Better focus is a plus, and there's also a chance the iPhone 12 Pro could add more 3D photo data to images. Although that element hasn't materialized yet, Apple's front-facing, depth-sensing TrueDepth camera has been used in a similar way with apps, and third-party developers could dive in and come up with some wild ideas. It's already happening.

It also greatly improves augmented reality

Lidar allows the iPhone 12 Pro to launch AR apps a lot more quickly, and to build a fast map of a room to add more detail. A lot of Apple's AR updates in iOS 14 are taking advantage of lidar to hide virtual objects behind real ones (called occlusion), and to place virtual objects within more complicated room mappings, like on a table or chair.
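For developers, this surfaces through ARKit's scene reconstruction and occlusion options. Below is a minimal Swift sketch of how an app might opt into both on a lidar-equipped device; it assumes a RealityKit ARView already set up elsewhere, and the function name is just for illustration.

```swift
import ARKit
import RealityKit

// Sketch: enable lidar-backed scene reconstruction and occlusion
// in an ARKit/RealityKit session on an iPhone 12 Pro or iPad Pro.
func startLidarSession(on arView: ARView) {
    let config = ARWorldTrackingConfiguration()

    // Scene reconstruction builds a live triangle mesh of the room,
    // but only on devices with the lidar scanner.
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        config.sceneReconstruction = .mesh
    }

    // Let RealityKit hide virtual content behind real-world geometry.
    arView.environment.sceneUnderstanding.options.insert(.occlusion)

    arView.session.run(config)
}
```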

I've been testing it on an Apple Arcade game, Hot Lava, which already uses lidar to scan a room and all its obstacles. I was able to place virtual objects on stairs and have things hide behind real-life objects in the room. Expect a lot more AR apps to start adding lidar support like this for richer experiences.


The next wave of Snapchat lenses will start to embrace depth sensing using the iPhone 12 Pro’s lidar.

Snapchat

But there's extra potential beyond that, with a longer tail. Many companies are dreaming of headsets that blend virtual and real objects: AR glasses, being worked on by Facebook, Qualcomm, Snapchat, Microsoft, Magic Leap and most likely Apple and others, will depend on having advanced 3D maps of the world to layer virtual objects onto.

Those 3D maps are being built now with special scanners and equipment, almost like a world-scanning version of those Google Maps cars. But there's also a chance that people's own devices could eventually help gather that information, or add extra data on the go. Again, AR headsets like the Magic Leap and HoloLens already prescan your environment before layering things into it, and Apple's lidar-equipped AR tech works the same way. In that sense, the iPhone 12 Pro and iPad Pro are like AR headsets without the headset part ... and could pave the way for Apple to eventually make its own glasses.


A 3D room scan from Occipital's Canvas app, enabled by lidar depth sensing on the iPad Pro. Expect the same for the iPhone 12 Pro, and maybe more.

Occipital

3D scanning could be the killer app

Lidar can be used to mesh out 3D objects and rooms and layer photo imagery on top, a technique called photogrammetry. That could be the next wave of capture tech for practical uses like home improvement, or even social media and journalism. The ability to capture 3D data and share that info with others could turn these lidar-equipped phones and tablets into 3D-content-capture tools. Lidar could also be used without the camera element to acquire measurements for objects and spaces.
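The raw material for this kind of scanning is the per-pixel depth map that the lidar scanner feeds into ARKit. As a rough sketch of how a scanning app might read it, the Swift snippet below uses ARKit's scene depth frame semantics; the class name is a placeholder for illustration, and a real app would fuse these frames into a mesh rather than just printing their size.

```swift
import ARKit

// Sketch: read the lidar-derived depth map from each ARKit frame.
class DepthReader: NSObject, ARSessionDelegate {
    static func makeConfiguration() -> ARWorldTrackingConfiguration {
        let config = ARWorldTrackingConfiguration()
        // Scene depth is only available on lidar-equipped devices.
        if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
            config.frameSemantics.insert(.sceneDepth)
        }
        return config
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let sceneDepth = frame.sceneDepth else { return }
        // A CVPixelBuffer of distances in meters, one value per pixel.
        let depthMap = sceneDepth.depthMap
        let width = CVPixelBufferGetWidth(depthMap)
        let height = CVPixelBufferGetHeight(depthMap)
        print("Depth frame: \(width) x \(height)")
    }
}
```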

I've already tried a few of the early lidar-enabled 3D scanning apps on the iPhone 12 Pro with mixed success (3D Scanner App, Lidar Scanner and Record3D), but they can be used to scan objects or map out rooms with surprising speed. The lidar's 16-foot effective scanning range is enough to reach across most rooms in my house, but bigger outdoor spaces take more moving around. Again, Apple's front-facing TrueDepth camera already does similar things at closer range.


Now playing: Our in-depth review of the iPhone 12 and 12 Pro (13:48)

Apple is not the first to explore technology like this on a phone

Google had this same idea in mind when Project Tango, an early AR platform that appeared on only two phones, was created. The advanced camera array also had infrared sensors and could map out rooms, creating 3D scans and depth maps for AR and for measuring interior spaces. Google's Tango-equipped phones were short-lived, replaced by computer vision algorithms that do estimated depth sensing on cameras without needing the same hardware. But Apple's iPhone 12 Pro looks like a significantly more advanced successor, with possibilities for that lidar that extend into cars, AR headsets and much more.


