The iPhone 12 and 12 Pro are on sale now, but one of the key differences between this year's Pro and non-Pro models is a new type of depth-sensing technology called lidar. Take a closer look at one of the new iPhone 12 Pro models, or the latest iPad Pro, and you'll see a small black dot near the camera lenses, about the same size as the flash. That's the lidar sensor.
But why is Apple making such a big deal about lidar, and what can the technology do if you buy an iPhone 12 Pro or iPhone 12 Pro Max? It's a term you'll start hearing a lot now, so let's look at what we know, what Apple will use it for, and where the technology could go next.
What does lidar mean?
Lidar stands for light detection and ranging, and it's been around for a while. It uses lasers to ping objects and return to the laser's source, measuring distance by timing the travel, or flight, of each light pulse.
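The time-of-flight math is simple: distance is the speed of light multiplied by the round-trip time, divided by two. As a rough illustration (the 33-nanosecond figure below is just an example, not a spec of Apple's sensor), here is a short Swift sketch:

import Foundation

// Speed of light in meters per second.
let speedOfLight = 299_792_458.0

// Convert a pulse's round-trip ("time of flight") into a distance.
// The pulse travels to the object and back, so the result is halved.
func distance(forRoundTripTime seconds: Double) -> Double {
    return speedOfLight * seconds / 2.0
}

// A pulse that returns after roughly 33 nanoseconds corresponds to an
// object about 5 meters away (the quoted range of the iPhone's rear sensor).
let meters = distance(forRoundTripTime: 33e-9)
print(String(format: "%.2f m", meters))  // prints about 4.95 m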
How does lidar detect depth?
Lidar is a type of time-of-flight camera. Some other smartphones measure depth with a single light pulse, whereas a smartphone with this type of lidar sends out waves of light pulses in a spray of infrared dots and can measure each one with its sensor, creating a field of points that maps out distances and can "mesh" the dimensions of a space and the objects in it. The light pulses are invisible to the human eye, but you could see them with a night-vision camera.
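On lidar-equipped devices, ARKit surfaces that field of points to apps as a per-frame depth map. The following is a minimal sketch, assuming iOS 14 or later and a device with the lidar sensor; the class name is just for illustration:

import UIKit
import ARKit

class DepthViewController: UIViewController, ARSessionDelegate {
    let session = ARSession()

    override func viewDidLoad() {
        super.viewDidLoad()
        session.delegate = self

        let config = ARWorldTrackingConfiguration()
        // The .sceneDepth frame semantic is only available on lidar hardware.
        if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
            config.frameSemantics.insert(.sceneDepth)
        }
        session.run(config)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // Each frame carries a depth map built from the infrared dot spray,
        // along with a per-pixel confidence map.
        guard let sceneDepth = frame.sceneDepth else { return }
        let depthMap = sceneDepth.depthMap
        print("Depth map: \(CVPixelBufferGetWidth(depthMap)) x \(CVPixelBufferGetHeight(depthMap))")
    }
}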
Isn’t this like Face ID on the iPhone?
It is, but with longer range. The idea is the same: Apple's Face ID-enabling TrueDepth camera also fires out an array of infrared lasers, but it only works from a short distance away. The rear lidar sensors on the iPad Pro and iPhone 12 Pro work at a range of up to 5 meters.
Lidar is already in many other technologies
Lidar is a technology that's popping up everywhere. It's used for autonomous cars and assisted driving. It's used for robotics and drones. Augmented reality headsets such as the HoloLens 2 have similar technology, mapping out room spaces before layering virtual 3D objects into them. But lidar also has a fairly long history.
Microsoft's old depth-sensing Xbox accessory, the Kinect, was a camera with infrared depth scanning, too. In fact, PrimeSense, the company that helped create the Kinect technology, was acquired by Apple in 2013. Now we have Apple's face-scanning TrueDepth camera and rear lidar sensors.
The iPhone 12 Pro's camera should work better with lidar
Time-of-flight cameras on smartphones tend to be used to improve focus accuracy and speed, and the iPhone 12 Pro will do the same. Apple promises better low-light focus, up to six times faster in low-light conditions. The lidar depth sensing will also be used to improve night portrait mode effects.
Better focus is a plus, and there's also a chance the iPhone 12 Pro could add more 3D photo data to images. Although that element hasn't been laid out yet, Apple's front-facing, depth-sensing TrueDepth camera has been used in a similar way with apps.
It will also greatly improve augmented reality
Lidar will allow the iPhone 12 Pro to launch AR apps much more quickly, and to build a fast map of a room to add more detail. Many of Apple's AR updates in iOS 14 take advantage of lidar to hide virtual objects behind real ones (a trick called occlusion) and to place virtual objects within more complicated room mappings, like on a table or chair.
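For developers, those capabilities show up in ARKit and RealityKit as scene reconstruction and scene-understanding options. Here is a minimal sketch, assuming a lidar-equipped device on iOS 14; the helper function is purely illustrative:

import ARKit
import RealityKit

func configureLidarAR(for arView: ARView) {
    let config = ARWorldTrackingConfiguration()

    // Scene reconstruction builds a live triangle mesh of the room,
    // and is only supported on devices with the lidar sensor.
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        config.sceneReconstruction = .mesh
    }

    // Occlusion lets real-world geometry hide virtual objects;
    // physics lets virtual objects rest on or collide with the mesh.
    arView.environment.sceneUnderstanding.options.insert(.occlusion)
    arView.environment.sceneUnderstanding.options.insert(.physics)

    arView.session.run(config)
}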
But there's extra potential beyond that, with a longer tail. Many companies dream of headsets that blend virtual and real objects: AR glasses, which are being worked on by Facebook, Qualcomm, Snapchat, Microsoft, Magic Leap and most likely Apple, among others, will depend on having advanced 3D maps of the world on which to place virtual objects.
Those 3D maps are being built now with special scanners and equipment, almost like the world-scanning version of those Google Maps cars. But there's a chance that people's own devices will eventually help gather that information, or add extra data on the go. Again, AR headsets like the Magic Leap and HoloLens already pre-scan their surroundings before placing things into them, and Apple's lidar-equipped AR tech works the same way. In that sense, the iPhone 12 Pro and iPad Pro are like AR headsets without the headset part... and they could pave the way for Apple to eventually make its own glasses.
3D scanning could be the killer app
Lidar can be used to mesh out 3D objects and rooms and layer photographic imagery on top, a technique called photogrammetry. That could be the next wave of capture technology for practical uses like home improvement, or even social media and journalism. The ability to capture 3D data and share that information with others could turn these lidar-equipped phones and tablets into 3D content-capture tools. Lidar could also be used without the camera element to acquire measurements for objects and spaces.
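As a taste of the measurement side, an app can raycast against the lidar-derived scene geometry and take the distance between two hit points. This is a hedged sketch, not Apple's Measure app; the function name and the idea of passing in two screen taps are assumptions for illustration:

import ARKit
import RealityKit
import simd

// Measure the straight-line distance, in meters, between two points the
// user taps on screen, by raycasting into the reconstructed scene.
func measuredDistance(in arView: ARView,
                      from pointA: CGPoint,
                      to pointB: CGPoint) -> Float? {
    guard
        let hitA = arView.raycast(from: pointA, allowing: .estimatedPlane, alignment: .any).first,
        let hitB = arView.raycast(from: pointB, allowing: .estimatedPlane, alignment: .any).first
    else { return nil }

    // The last column of each world transform holds the hit's position.
    let posA = SIMD3<Float>(hitA.worldTransform.columns.3.x,
                            hitA.worldTransform.columns.3.y,
                            hitA.worldTransform.columns.3.z)
    let posB = SIMD3<Float>(hitB.worldTransform.columns.3.x,
                            hitB.worldTransform.columns.3.y,
                            hitB.worldTransform.columns.3.z)

    return simd_distance(posA, posB)
}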
Apple isn’t the first to explore technology like this on a phone
Google had this same idea in mind when it created Project Tango, an early AR platform that only made it onto two phones. The advanced camera array also had infrared sensors and could map out rooms, creating 3D scans and depth maps for AR and for measuring indoor spaces. Google's Tango-equipped phones were short-lived, replaced by computer vision algorithms that do estimated depth sensing on cameras without needing the same hardware. But Apple's iPhone 12 Pro looks like a much more advanced successor.