ARCore’s new Depth API is out of beta, bringing the next generation of hide-and-seek to phones


AR apps on Android have historically struggled with proper depth perception and with distinguishing foreground from background in the physical world. Any AR object you place simply sits on top of the entire scene in front of the camera, regardless of whether something should realistically block your view. After an extensive preview phase that began last year, Google is now rolling out its new Depth API in ARCore to all developers on Android and Unity.

The new API helps distinguish between real-world foreground and background, so digital objects can be properly occluded, and it also enables better pathfinding and physics.

Developers can integrate the technology into their projects starting today, and you should see it appear in some of Google’s own products. The API uses a depth-from-motion algorithm, similar to the one behind the Google Camera’s bokeh portrait mode, to create a depth map. It does this by capturing multiple images from different angles as you move your phone, letting the system estimate the distance to each pixel you see on screen. For now, the API relies on a single camera alone.
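For developers, opting in only takes a few lines of code. Here’s a minimal Kotlin sketch of what that looks like with the ARCore Android SDK; the helper names are our own, but Config.DepthMode, Session.isDepthModeSupported(), and Frame.acquireDepthImage() are part of the ARCore 1.18 API.

```kotlin
import com.google.ar.core.Config
import com.google.ar.core.Frame
import com.google.ar.core.Session
import com.google.ar.core.exceptions.NotYetAvailableException

// Opt in to depth. Not every ARCore-capable phone supports it, so check first.
fun enableDepthIfSupported(session: Session) {
    val config = session.config
    if (session.isDepthModeSupported(Config.DepthMode.AUTOMATIC)) {
        config.depthMode = Config.DepthMode.AUTOMATIC
    }
    session.configure(config)
}

// Each frame, the API can hand back the depth map as an android.media.Image
// in DEPTH16 format: one 16-bit distance value per pixel, in millimeters.
fun readDepth(frame: Frame) {
    try {
        frame.acquireDepthImage().use { depthImage ->
            // Typically uploaded to a GPU texture for occlusion rendering.
        }
    } catch (e: NotYetAvailableException) {
        // Depth needs a few frames of device motion before it's available.
    }
}
```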

This is how the depth mapping process works.

Thanks to this depth information, digital objects can be fully or partially hidden behind real-world objects. The first Google product equipped with the API is Scene Viewer, the AR viewer built into Google Search, which lets you place all kinds of animals and other objects right in front of your camera — just search for “cat” on an ARCore-enabled device, for example.
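Occlusion itself boils down to a per-pixel depth test: if the real world is closer to the camera than the digital object at a given pixel, that pixel of the object gets hidden. In practice the comparison runs on the GPU, but the idea can be sketched on the CPU. The helper functions below are hypothetical; the DEPTH16 buffer layout follows ARCore’s documentation.

```kotlin
import android.media.Image
import java.nio.ByteOrder

// Returns the estimated real-world distance (in meters) at pixel (x, y)
// of an ARCore DEPTH16 depth image.
fun depthMeters(depthImage: Image, x: Int, y: Int): Float {
    val plane = depthImage.planes[0]
    val buffer = plane.buffer.order(ByteOrder.nativeOrder())
    val byteIndex = x * plane.pixelStride + y * plane.rowStride
    val millimeters = buffer.getShort(byteIndex).toInt() and 0xFFFF
    return millimeters / 1000f
}

// A virtual object's pixel is occluded when the real world sits in front of
// it. A zero depth value means "no estimate yet", so it never occludes.
fun isOccluded(depthImage: Image, x: Int, y: Int, virtualDepthMeters: Float): Boolean {
    val real = depthMeters(depthImage, x, y)
    return real > 0f && real < virtualDepthMeters
}
```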

The Depth API will help AR objects feel more immersive.

Depth information can also be used to improve pathfinding (so digital characters stop walking through your furniture), enable proper surface interactions (so you can paint on objects more complex than flat walls), and deliver better physics (a thrown digital ball will be blocked by real-world objects). With more and more cameras on the backs of phones, Google is also teasing that the API will rely on additional depth sensors and time-of-flight lenses in the future to improve and speed up the mapping process: “We have only started to scratch the surface of what is possible.”
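Those physics and pathfinding improvements fall out of the same data: once you know how far away every pixel is, the depth map can be turned into 3D geometry for collisions. Here’s a rough sketch of the idea, with pixelToPoint as our own illustrative helper, assuming camera intrinsics already scaled to the depth image’s resolution.

```kotlin
import com.google.ar.core.CameraIntrinsics

// Unprojects a depth-map pixel into a 3D point in camera space — something a
// physics engine could collide a thrown virtual ball against.
fun pixelToPoint(intrinsics: CameraIntrinsics, x: Int, y: Int, depth: Float): FloatArray {
    val (fx, fy) = intrinsics.focalLength     // focal length in pixels
    val (cx, cy) = intrinsics.principalPoint  // optical center in pixels
    return floatArrayOf(
        (x - cx) / fx * depth,
        (y - cy) / fy * depth,
        -depth  // ARCore's camera space looks down the negative Z axis
    )
}
```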

Google’s demos of possible API applications.

In addition to Google Search, the ARCore Depth Lab (APK Mirror), and Lines of Play, a domino app designed specifically to show off the new API, the first third-party product to receive an update that takes advantage of occlusion is Houzz, an app that lets you furnish your home in AR. There’s also the TeamViewer Pilot app, which lets you draw in AR to remotely assist people who are less tech-savvy. Five Nights at Freddy’s is the first game to take advantage of the API, letting some characters hide behind real-world objects for extra jump scares. Snapchat has also updated its Dancing Hotdog and Undersea World lenses to take advantage of occlusion.

Left: Five Nights at Freddy’s. Middle: TeamViewer Pilot. Right: Dancing Hotdog Snapchat filter.

Samsung will also launch a new version of its Quick Measure app that takes advantage of the new depth capabilities, making it faster and more accurate.

Starting today, the API is available through ARCore 1.18 on a selection of compatible devices, including the latest flagships and even some mid-range phones. To take advantage of the new features, grab the corresponding update to Google Play Services for AR from the Play Store or APK Mirror. Interested developers can head to the ARCore website for more information, where they will also find the updated SDKs.

Updated with the official launch of the Depth API. This article previously covered the feature’s preview.

Updated to clarify that only a selection of devices supports the Depth API, not every device that works with ARCore.
