Apple is getting bullish on lidar, a technology that's brand new to the iPhone 12 family, specifically the iPhone 12 Pro and iPhone 12 Pro Max. (The iPhone 12 Pro is on sale now, with the iPhone 12 Pro Max following in a few weeks.) Take a closer look at one of the new iPhone 12 Pro models, or the latest iPad Pro, and you'll see a small black dot near the camera lenses, about the same size as the flash. That's the lidar sensor, and it's a new kind of depth sensing that could make a difference in a number of interesting ways.
If Apple has its way, lidar is a term you'll start hearing a lot, so let's break down what we know, what Apple will use it for, and where the technology could go next. And if you're curious what it does right now, I've also spent some hands-on time with the tech.
What does lidar mean?
Lidar stands for light detection and ranging, and it has been around for a while. It uses lasers to ping objects and measures how long the light takes to return to its source, gauging distance by timing the journey, or flight, of the light pulse.
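To make that timing idea concrete, here's a minimal, illustrative calculation in Swift (the numbers are hypothetical examples, not Apple's specs): the distance to an object is the round-trip time of the pulse multiplied by the speed of light, divided by two.

```swift
// A minimal sketch of the time-of-flight idea behind lidar, not Apple's
// implementation: distance is recovered from how long a light pulse takes
// to bounce off an object and come back to the sensor.
let speedOfLight = 299_792_458.0   // meters per second
let roundTripTime = 33.4e-9        // hypothetical round trip, in seconds

// The pulse travels out and back, so halve the round-trip distance.
let distance = speedOfLight * roundTripTime / 2.0
print("Estimated distance: \(distance) meters")   // ≈ 5 meters
```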
How does lidar sense depth?
Lidar is a type of time-of-flight camera. Some other smartphones measure depth with a single light pulse, whereas a smartphone with this kind of lidar technology sends out waves of light pulses in a spray of infrared dots and measures each one with its sensor, creating a field of points that maps out distances and can "mesh" the dimensions of a space and the objects in it. The light pulses are invisible to the human eye, but you could see them with a night vision camera.
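For developers, ARKit exposes that lidar-derived field of distances as a per-frame depth map on supported devices. Here's a rough Swift sketch of reading it; the class name is made up for illustration, not from any Apple sample.

```swift
import ARKit

// A rough sketch (not Apple's internal pipeline) of how an app can read the
// per-pixel depth data that the lidar sensor exposes through ARKit.
class DepthReader: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // The sceneDepth frame semantic is only offered on lidar-equipped devices.
        guard ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) else { return }
        let configuration = ARWorldTrackingConfiguration()
        configuration.frameSemantics.insert(.sceneDepth)
        session.delegate = self
        session.run(configuration)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // Each frame carries a depth map: a grid of distances, in meters,
        // built from the lidar's field of measured points.
        guard let depthMap = frame.sceneDepth?.depthMap else { return }
        let width = CVPixelBufferGetWidth(depthMap)
        let height = CVPixelBufferGetHeight(depthMap)
        print("Depth map resolution: \(width) x \(height)")
    }
}
```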
Isn’t this like Face ID on the iPhone?
It is, but with longer range. The idea is the same: Apple's TrueDepth camera, which enables Face ID, also fires out an array of infrared lasers, but it only works up to a few feet away. The rear lidar sensors on the iPad Pro and iPhone 12 Pro work at a range of up to 5 meters.
Lidar is already in many other technologies
Lidar is a technology that's sprouting up everywhere. It's used for self-driving cars, or assisted driving. It's used for robotics and drones. Augmented reality headsets like the HoloLens 2 have similar tech, mapping out room spaces before layering virtual 3D objects into them. But it also has a pretty long history.
Microsoft's old depth-sensing Xbox accessory, the Kinect, was a camera with infrared depth scanning, too. In fact, PrimeSense, the company that helped make the Kinect technology, was acquired by Apple in 2013. Now, we have Apple's face-scanning TrueDepth and rear lidar camera sensors.
The iPhone 12 Pro camera works better with lidar
Time-of-flight cameras on smartphones tend to be used to improve focus accuracy and speed, and the iPhone 12 Pro does the same. Apple promises better low-light focus, up to six times faster in low-light conditions. The lidar depth sensing is also used to improve night portrait mode effects. So far, it makes an impact: read our iPhone 12 Pro review to learn more.
Better focus is a plus, and there's also a chance the iPhone 12 Pro could add more 3D photo data to images. Although that element hasn't been laid out yet, Apple's front-facing, depth-sensing TrueDepth camera has been used in a similar way with apps, and third-party developers could dive in and develop some wild ideas. It's already happening.
It also greatly enhances augmented reality
Lidar allows the iPhone 12 Pro to launch AR apps a lot more quickly, and build a fast map of a room to add more detail. A lot of Apple's AR updates in iOS 14 are taking advantage of lidar to hide virtual objects behind real ones (called occlusion), and to place virtual objects within more complicated room mappings, like on a table or chair.
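As a sketch of how an app might opt into those iOS 14 features, here's roughly what enabling lidar-based scene reconstruction and occlusion looks like with ARKit and RealityKit, assuming a lidar-equipped device and a RealityKit ARView (this is an illustrative setup, not code from any particular app):

```swift
import ARKit
import RealityKit

// A minimal sketch of the iOS 14-era ARKit features described above:
// lidar-driven scene reconstruction plus occlusion of virtual content
// behind real objects.
func configureLidarAR(on arView: ARView) {
    let configuration = ARWorldTrackingConfiguration()

    // Build a live triangle mesh of the room (only supported on lidar devices).
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        configuration.sceneReconstruction = .mesh
    }

    // Let that mesh hide virtual objects that sit behind real ones, and feed it
    // into physics so virtual content can rest on real surfaces like tables.
    arView.environment.sceneUnderstanding.options.insert(.occlusion)
    arView.environment.sceneUnderstanding.options.insert(.physics)

    arView.session.run(configuration)
}
```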
I've been testing it out on an Apple Arcade game, Hot Lava, which already uses lidar to scan a room and all its obstacles. I was able to place virtual objects on stairs, and have things hide behind real-life objects in the room. Expect a lot more AR apps that will start adding lidar support like this for richer experiences.
But there's extra potential beyond that, with a longer tail. Many companies dream of headsets that blend virtual objects and real ones: AR glasses, being worked on by Facebook, Qualcomm, Snapchat, Microsoft, Magic Leap and most likely Apple and others, will rely on having advanced 3D maps of the world to place virtual objects onto.
Those 3D maps are being built now with special scanners and equipment, almost like the world-scanning version of those Google Maps cars. But there's a possibility that people's own devices might eventually help crowdsource that info, or add extra on-the-fly data. Again, AR headsets like the Magic Leap and HoloLens already scan your environment before layering things into it, and Apple's lidar-equipped AR tech works the same way. In that sense, the iPhone 12 Pro and iPad Pro are like AR headsets without the headset part ... and could pave the way for Apple to make its own glasses eventually.
3D scanning could be the killer app
Lidar can be used to mesh out 3D objects and rooms and layer photo imagery on top, a technique called photogrammetry. That could be the next wave of capture tech for practical uses like home improvement, or even social media and journalism. The ability to capture 3D data and share that info with others could turn these lidar-equipped phones and tablets into 3D content capture tools. Lidar could also be used without the camera element to acquire measurements for objects and spaces.
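As a rough illustration of that measurement use case, an app can ask ARKit for the real-world point under a screen tap and compute the distance between two such points. The helper names below are hypothetical, not from any Apple sample; on lidar devices the raycast lands on scene geometry quickly, even on plain surfaces.

```swift
import ARKit
import RealityKit
import simd

// Hypothetical helper: find the real-world point under a screen location.
func worldPoint(at screenPoint: CGPoint, in arView: ARView) -> SIMD3<Float>? {
    guard let result = arView.raycast(from: screenPoint,
                                      allowing: .estimatedPlane,
                                      alignment: .any).first else { return nil }
    // The last column of the transform holds the hit position in world space.
    let t = result.worldTransform.columns.3
    return SIMD3<Float>(t.x, t.y, t.z)
}

// Distance between two measured points, in meters.
func distanceInMeters(_ a: SIMD3<Float>, _ b: SIMD3<Float>) -> Float {
    simd_distance(a, b)
}
```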
I've already tried a few early lidar-enabled 3D scanning apps on the iPhone 12 Pro with mixed success (3D Scanner App, Lidar Scanner and Record3D), but they can be used to scan objects or map out rooms with surprising speed. The 16-foot effective range of lidar's scanning is enough to reach across most rooms in my house, but in bigger outdoor spaces it takes more moving around. Again, Apple's front-facing TrueDepth camera already does similar things at closer range.
Apple isn’t the first to explore technology like this on a phone
Google had this same idea in mind when Project Tango, an early AR platform that was only on two phones, was created. The advanced camera array also had infrared sensors and could map out rooms, creating 3D scans and depth maps for AR and for measuring indoor spaces. Google's Tango-equipped phones were short-lived, replaced by computer vision algorithms that have done estimated depth sensing on cameras without needing the same hardware. But Apple's iPhone 12 Pro looks like a significantly more advanced successor, with possibilities for that lidar that extend into cars, AR headsets and much more.