Google has revealed a new tool called the ARCore Depth API, which lets mobile devices create depth maps using a single RGB camera and makes AR experiences feel more natural.
Shahram Izadi, Director of Research and Engineering at Google, says the new tool will bring occlusion to mobile AR applications and enable more realistic physics and surface interactions.
“The ARCore Depth API allows developers to use our depth-from-motion algorithms to create a depth map using a single RGB camera,” Izadi said. “The depth map is created by taking multiple images from different angles and comparing them as you move your phone to estimate the distance to every pixel.”
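The quote above describes depth-from-motion: as the phone moves, each pixel shifts by an amount (its disparity) that is inversely proportional to its distance. The sketch below illustrates only that core triangulation relationship; the function name and values are illustrative and are not part of Google's actual algorithm, which estimates motion and disparity automatically for every pixel.

```python
def depth_from_disparity(focal_length_px, baseline_m, disparity_px):
    """Triangulate depth (in meters) from the pixel shift between two views.

    A nearby point shifts a lot between frames; a distant point barely moves.
    depth = focal_length * camera_translation / pixel_disparity
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_length_px * baseline_m / disparity_px


# Example: a 500 px focal length, 10 cm of camera motion, and a 25 px
# shift imply the point is 2 meters away.
print(depth_from_disparity(500.0, 0.10, 25.0))  # 2.0
```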
Full-fledged AR headsets rely on multiple dedicated depth sensors; the notable point is that the new tool produces comparable results with a single RGB camera.
“One important application for depth is occlusion: the ability for digital objects to accurately appear in front of or behind real-world objects,” Izadi noted. “Occlusion helps digital objects feel as if they are actually in your space by blending them with the scene. We will begin making occlusion available in Scene Viewer, the developer tool that powers AR in Search, to an initial set of over 200 million ARCore-enabled Android devices today.”
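Occlusion as described here amounts to a per-pixel depth test: a virtual object is drawn only where it is closer to the camera than the real surface recorded in the depth map. The following is a minimal sketch of that idea with made-up arrays; it is not Google's renderer.

```python
import numpy as np

# Illustrative depth maps (meters), not real sensor data.
real_depth = np.array([[1.0, 1.0],    # top row: a real wall 1 m away
                       [3.0, 3.0]])   # bottom row: a real wall 3 m away
virtual_depth = np.full((2, 2), 2.0)  # a virtual object placed 2 m away

# Per-pixel depth test: draw the virtual object only where it is
# nearer to the camera than the real surface behind it.
visible = virtual_depth < real_depth

print(visible)
# Top row: hidden behind the 1 m wall; bottom row: in front of the 3 m wall.
```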
You can learn more here.