ARCore & Unreal Engine: What to Look Forward to?
30 August, 2017

Epic Games has published a short introduction detailing the most important aspects of developing projects with ARCore support.

Google ARCore brings AR functionality to Android smartphones. The developer preview of ARCore includes support for many modern engines, including UE4. ARCore gives developers the ability to build interesting AR experiences without the need for any additional hardware.

The ARCore SDK supports the Google Pixel, Pixel XL, and Samsung Galaxy S8 (Android 7.0 Nougat and above). Google aims to bring ARCore to over 100 million devices by its official launch.

Epic is supporting all of the new AR platforms with Unreal Engine. Unreal Engine 4.18, coming in mid-October, will be a major release for AR, with more mature ARKit support and new support for ARCore. Here are the most important points that, according to Epic, will help transform how mobile users see the world.

Motion Tracking

As your mobile device moves through the world, ARCore combines visual data from the device’s camera and inertial measurements from the device’s IMU to estimate the pose (position and orientation) of the camera relative to the world over time. This process, called visual inertial odometry (VIO), lets ARCore know where the device is relative to the world around it.

By aligning the pose of the virtual camera that renders your 3D content with the pose of the device’s camera provided by ARCore, virtual content is rendered from the correct perspective. Rendered virtual images are then overlaid on top of the image obtained from the device’s camera, making it appear as if the virtual content is part of the real world.
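The alignment step above can be sketched in a few lines of Python. This is an illustration of the idea, not ARCore code: the function name is made up for this sketch, and the pose is simplified to a position plus a yaw rotation, whereas ARCore reports a full six-degree-of-freedom pose (position plus quaternion orientation).

```python
import math

def world_to_camera(point, cam_pos, cam_yaw):
    """Transform a world-space point into camera space, assuming the
    camera pose is a position plus a yaw rotation about the vertical
    axis (a simplification of the full 6-DoF pose a tracker reports)."""
    # Translate so the camera sits at the origin.
    dx = point[0] - cam_pos[0]
    dy = point[1] - cam_pos[1]
    dz = point[2] - cam_pos[2]
    # Rotate by the inverse of the camera's yaw to enter camera space.
    c, s = math.cos(-cam_yaw), math.sin(-cam_yaw)
    return (c * dx + s * dz, dy, -s * dx + c * dz)
```

Rendering every virtual point through this transform (and then through the camera's projection) is what makes the overlay line up with the live camera image.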

Environmental Understanding

ARCore is constantly improving its understanding of the real world environment by detecting feature points and planes. Feature points are visually distinct features in the captured camera image that ARCore can recognize even when the camera’s position changes slightly. ARCore estimates pose changes by triangulating on these feature points between successive frames.
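Triangulating a feature point from two frames can be illustrated with the classic midpoint method: each observation defines a viewing ray from the camera position through the feature, and the point is estimated as the spot closest to both rays. This is a generic sketch of the technique, not ARCore's internal implementation; the function name and inputs are assumptions.

```python
def triangulate(p1, d1, p2, d2):
    """Estimate a 3D feature point from two observations, each given as
    a camera position (p) and a viewing-ray direction (d), by taking
    the midpoint of the closest approach between the two rays."""
    dot = lambda u, v: u[0]*v[0] + u[1]*v[1] + u[2]*v[2]
    w0 = (p1[0]-p2[0], p1[1]-p2[1], p1[2]-p2[2])
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, w0), dot(d2, w0)
    denom = a * c - b * b  # zero when the rays are parallel
    t = (b * e - c * d) / denom
    s = (a * e - b * d) / denom
    closest1 = [p1[i] + t * d1[i] for i in range(3)]
    closest2 = [p2[i] + s * d2[i] for i in range(3)]
    return tuple((closest1[i] + closest2[i]) / 2 for i in range(3))
```

When the two rays actually intersect (perfect observations), the midpoint is the intersection itself; with noisy observations it is the best compromise between them.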

ARCore looks for clusters of feature points that appear to lie on common horizontal surfaces, like tables and desks, and makes these surfaces available to your app as planes. ARCore can also determine each plane’s boundary and make that information available to your app. You can use this information to place virtual objects resting on flat surfaces, such as a character running around on the floor or a table.
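The idea of grouping feature points into horizontal planes can be sketched as clustering points by height and reporting each cluster's extent. This is a deliberately naive illustration under assumed inputs (a flat list of 3D points with y as the up axis), not how ARCore's plane detection actually works.

```python
def detect_horizontal_planes(points, height_tol=0.02, min_points=4):
    """Group 3D feature points (x, y, z) whose heights (y) agree within
    height_tol into candidate horizontal planes; return each plane's
    average height and axis-aligned boundary in the x-z plane."""
    clusters = []
    for p in sorted(points, key=lambda p: p[1]):
        # Chain each point onto the previous cluster if its height is close.
        if clusters and abs(p[1] - clusters[-1][-1][1]) <= height_tol:
            clusters[-1].append(p)
        else:
            clusters.append([p])
    planes = []
    for c in clusters:
        if len(c) >= min_points:  # ignore clusters too small to be surfaces
            xs = [p[0] for p in c]
            zs = [p[2] for p in c]
            planes.append({
                "height": sum(p[1] for p in c) / len(c),
                "bounds": (min(xs), max(xs), min(zs), max(zs)),
            })
    return planes
```

A scene with feature points on the floor and on a tabletop would yield two planes, and the reported bounds are what an app could use to keep a placed object on the surface.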

Light Estimation

Finally, through light estimation, ARCore can detect information about the lighting of its environment and provide you with the average intensity of a given camera image. This information enables you to light your virtual objects under the same conditions as the environment around them, increasing the sense of realism.
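The average-intensity idea can be sketched directly: reduce a grayscale camera frame to a single brightness scalar, then scale your virtual object's color by it. Function names and the 8-bit pixel assumption are illustrative, not ARCore's API.

```python
def estimate_light_intensity(gray_pixels):
    """Average 8-bit grayscale pixel values of a camera frame into a
    single brightness scalar in [0, 1]."""
    return sum(gray_pixels) / (255.0 * len(gray_pixels))

def lit_color(albedo, intensity):
    """Scale a virtual object's base color (an RGB tuple) by the
    estimated scene brightness so it matches the real environment."""
    return tuple(min(255, int(round(c * intensity))) for c in albedo)
```

Feeding the per-frame estimate into the virtual lighting each frame is what keeps an object from glowing unnaturally when the user walks into a dim room.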

You can check out the ARCore developer preview. Some details on the production of ARCore projects are also available.
