Lidar is boring on iPads, but it could go beyond AR on the iPhone 12 Pro




While many of Apple’s investments in breakthrough technologies pay off, some just don’t: Think of the “enormous amount” of money and engineering time it invested in force-sensitive displays, which are now being phased out of Apple Watches and iPhones, or its work on Siri, which still feels like a beta nine years after it was first integrated into iOS. In some cases, Apple’s backing is enough to bring a new technology into the mainstream; in others, Apple includes a feature in many devices only to see the innovation go nowhere.

Lidar has the potential to be Apple’s next “here today, gone tomorrow” technology. The laser-based depth scanner was the marquee addition to the 2020 iPad Pro that debuted in March, and it has been rumored for almost two years as a feature of the 2020 iPhones. Recently leaked rear glass panels for the iPhone 12 Pro and Pro Max suggest that Lidar scanners will appear on both phones, though they are unlikely to show up on the non-Pro versions of the iPhone 12. It may also be the only major change to the rear camera arrays on this year’s new iPhones.

If you don’t fully understand Lidar, you’re not alone. Think of it as an extra camera that rapidly captures a room’s depth data rather than creating conventional photos or videos. To users, Lidar visualizations look like black-and-white point clouds focused on the edges of objects, but when devices collect Lidar data, they know the relative depth locations of individual points and can use that depth information to enhance augmented reality, traditional photography, and various computer vision tasks. Unlike a flat photo, a depth scan offers finely detailed differentiation of what’s near, mid-range, and far.
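For the technically curious, that per-point depth is what ARKit exposes to developers on Lidar-equipped devices as a per-pixel depth map. The short Swift sketch below shows reading a single distance value, in meters, from that map; the helper name centerDepth is purely illustrative, and it assumes an iOS 14+ Lidar device running a session with scene depth enabled.

```swift
import ARKit

// A minimal sketch, assuming an iOS 14+ LiDAR device with the .sceneDepth
// frame semantic enabled: read one value from ARKit's per-pixel depth map.
// The helper name `centerDepth` is illustrative, not an Apple API.
func centerDepth(of frame: ARFrame) -> Float32? {
    // sceneDepth is only populated on LiDAR devices with scene depth enabled
    guard let depthMap = frame.sceneDepth?.depthMap else { return nil }

    CVPixelBufferLockBaseAddress(depthMap, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(depthMap, .readOnly) }

    guard let base = CVPixelBufferGetBaseAddress(depthMap) else { return nil }
    let width = CVPixelBufferGetWidth(depthMap)
    let height = CVPixelBufferGetHeight(depthMap)
    let rowBytes = CVPixelBufferGetBytesPerRow(depthMap)

    // Each pixel is a 32-bit float: the distance from the camera, in meters.
    let centerRow = base.advanced(by: (height / 2) * rowBytes)
        .assumingMemoryBound(to: Float32.self)
    return centerRow[width / 2]
}
```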

Six months after Lidar arrived on the iPad Pro, Apple’s software hasn’t matched the hardware’s potential. Rather than launch a new user-facing app to showcase the feature, or dramatically augment the popular iPad camera app with depth-sensing tricks, Apple pitched Lidar to developers as a way to instantly enhance their existing AR software, often without the need for additional coding. Depth and room-scanning features previously implemented in apps would work faster and more accurately than before. As just one example, AR content composited onto real-world camera video could automatically be partially hidden behind depth-sensed objects, a feature known as occlusion.
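To give a sense of how little code those “free” improvements require, here is a hedged sketch (not Apple’s own sample code) of enabling Lidar-backed scene depth, room meshing, and occlusion with ARKit and RealityKit; the function name startLidarEnhancedSession and the surrounding app setup are illustrative assumptions.

```swift
import ARKit
import RealityKit

// A hedged sketch of the "no extra coding" point: on a LiDAR-equipped device,
// a few configuration lines give an existing ARKit/RealityKit app per-frame
// depth, room-scale meshing, and occlusion of virtual content by real objects.
// The function name and the ARView passed in are assumed app boilerplate.
func startLidarEnhancedSession(on arView: ARView) {
    let config = ARWorldTrackingConfiguration()

    // Per-frame depth maps from the Lidar scanner (LiDAR devices only)
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
        config.frameSemantics.insert(.sceneDepth)
    }

    // Room-scale mesh reconstruction, also limited to LiDAR devices
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        config.sceneReconstruction = .mesh
    }

    // Let real-world geometry hide virtual content placed behind it (occlusion)
    arView.environment.sceneUnderstanding.options.insert(.occlusion)

    arView.session.run(config)
}
```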

In short, adding Lidar to the iPad Pro made a narrow category of apps a little better on a small slice of Apple devices. From a user’s perspective, Apple’s best examples of the technology’s potential were buried in the Apple Store app, which can display 3D models of certain devices (Mac Pro, yes; iMac, no) in AR, and the obscure iPadOS Measure app, which previously did a mediocre job of estimating the lengths of real-world objects without Lidar but does better with it. It’s worth stressing that those aren’t objectively good examples, and no one in their right mind except an AR developer would buy a device solely to get such marginal improvements in AR performance.

It remains to be seen whether Lidar will have a bigger impact on iPhones. If it’s truly a Pro-exclusive feature this year, not only will fewer people have access to it, but developers will have less incentive to build Lidar-dependent features. Even if Apple sells tens of millions of iPhone 12 Pro devices, the non-Pro models will almost certainly follow the pattern of the iPhone 11, which reportedly outsold its more expensive Pro siblings around the world. Consequently, Lidar would be a comparatively niche feature rather than a baseline expectation for all iPhone 12 series users.

Above: The iPhone XS’s Portrait mode lets you adjust background blur (bokeh) from f/1.4 to f/16 after taking a photo.

Image Credit: Jeremy Horwitz / VentureBeat

That said, if Apple uses its Lidar hardware properly on iPhones, it could become a big deal and a differentiator down the road. Industry rumors suggest that Apple will use Lidar to enhance the Pro cameras’ autofocus and depth-based processing effects, such as Portrait Mode, which artificially blurs photographic backgrounds to create a DSLR-like “bokeh” effect. Since Lidar’s invisible lasers work quickly even in dark rooms, they could serve as a better low-light autofocus system than current techniques that rely on minute differences measured by an optical camera sensor. Artificial bokeh and other visual effects could, and likely will, be applicable to video recordings as well. Developers such as Niantic could also use the hardware to improve Pokémon Go for a subset of iPhones, and given the massive size of its user base, that could be a win for AR gamers.

Apple won’t be the first company to offer a rear depth sensor on a phone. Samsung introduced similar technology in last year’s Galaxy S10 series and carried it into the subsequent Note 10 and S20 models, but a lack of killer apps and performance issues led the company to drop the feature from this year’s Note 20 and next year’s S series. While Samsung is apparently redesigning its depth sensor to better compete with the Sony-developed Lidar scanner Apple uses in its devices, finding great applications for the technology may remain a challenge.

Although consumer and developer interest in depth-sensing technologies may have (temporarily) stalled, there has been no shortage of demand for higher-resolution smartphone cameras. Virtually every Android phone maker made a leap in sensor technology this year, such that even mid-range phones now include at least one camera with four to ten times the resolution of Apple’s iPhone sensors. Leaning on Lidar alone won’t help Apple close the resolution gap, but it may bolster its earlier claims that it’s doing more with fewer pixels.

Ultimately, the problems with Apple-backed innovations like 3D Touch, Force Touch, and Siri haven’t come down to whether the technologies are inherently good or bad, but whether they’ve been widely adopted by developers and users. As augmented reality hardware continues to advance and demands fast room-scale depth scanning for everything from object placement to gesture-control tracking, there are plenty of reasons to believe Lidar will be either a foundational technology or a preferred solution. But Apple will need to make a better case for Lidar on iPhones than it has on iPads, and soon, lest the technology wind up forgotten and abandoned rather than at the core of the next generation of mobile computing.
