ARCore & Unreal Engine: What to Look Forward to?
30 August, 2017

Epic Games published a little introduction, detailing the most important aspects of developing projects with ARCore support.

Google ARCore brings AR functionality to Android smartphones. The developer preview of ARCore includes support for many modern engines, including UE4. ARCore gives developers the ability to build interesting AR experiences without the need for any additional hardware.

The ARCore SDK supports the Google Pixel, Pixel XL, and Samsung Galaxy S8 (Android 7.0 Nougat and later). Google aims to bring ARCore to over 100 million devices by the official launch.

Epic is supporting all of the new AR platforms with Unreal Engine. Unreal Engine 4.18, coming in mid-October, will be a major release for AR, with more mature ARKit support and support for ARCore. Here are the most important points that, according to Epic, help transform how mobile users see the world.

Motion Tracking

As your mobile device moves through the world, ARCore combines visual data from the device’s camera and inertial measurements from the device’s IMU to estimate the pose (position and orientation) of the camera relative to the world over time. This process, called visual inertial odometry (VIO), lets ARCore know where the device is relative to the world around it.
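The fusion idea behind VIO can be illustrated with a toy example. This is not ARCore's actual implementation (which is far more sophisticated); it is a minimal complementary-filter sketch, with hypothetical function and parameter names, showing how a fast-drifting inertial estimate can be corrected by a slower visual one:

```python
import numpy as np

def fuse_pose(imu_position, visual_position, alpha=0.98):
    """Complementary filter: trust the high-rate IMU integration in the
    short term, and pull it toward the visual estimate to limit drift."""
    imu_position = np.asarray(imu_position, dtype=float)
    visual_position = np.asarray(visual_position, dtype=float)
    return alpha * imu_position + (1.0 - alpha) * visual_position

# The IMU integration has drifted 10 cm along X; the visual estimate
# nudges the fused position back toward the true value.
fused = fuse_pose([1.10, 0.0, 0.0], [1.00, 0.0, 0.0])
```

Real VIO systems typically use a Kalman filter or nonlinear optimization over full 6-DoF poses rather than a fixed blend, but the principle of combining the two sensor streams is the same.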

By aligning the pose of the virtual camera that renders your 3D content with the pose of the device’s camera provided by ARCore, virtual content is rendered from the correct perspective. Rendered virtual images are then overlaid on top of the image obtained from the device’s camera, making it appear as if the virtual content is part of the real world.
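"Aligning the virtual camera with the device camera" boils down to applying the inverse of the tracked camera pose to world-space content. As a hedged sketch (the function name is illustrative, not an ARCore or UE4 API):

```python
import numpy as np

def world_to_camera(point_world, cam_rotation, cam_position):
    """Express a world-space point in camera space by inverting the
    tracked camera pose: p_cam = R^T (p_world - t)."""
    R = np.asarray(cam_rotation, dtype=float)   # camera-to-world rotation
    t = np.asarray(cam_position, dtype=float)   # camera position in world
    return R.T @ (np.asarray(point_world, dtype=float) - t)

# Camera at the origin with identity rotation (looking down -Z): a point
# one meter in front of it stays one meter in front in camera space.
p_cam = world_to_camera([0.0, 0.0, -1.0], np.eye(3), [0.0, 0.0, 0.0])
```

The renderer then projects camera-space points through the device's intrinsics, so virtual geometry lines up pixel-for-pixel with the live camera image behind it.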

Environmental Understanding

ARCore is constantly improving its understanding of the real world environment by detecting feature points and planes. Feature points are visually distinct features in the captured camera image that ARCore can recognize even when the camera’s position changes slightly. ARCore estimates pose changes by triangulating on these feature points between successive frames.
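The triangulation step described above can be sketched geometrically: each observation of a feature point defines a viewing ray, and two rays from different camera poses pin down a 3D position. This is a simplified midpoint-of-closest-approach method, not ARCore's internal algorithm:

```python
import numpy as np

def triangulate_midpoint(o1, d1, o2, d2):
    """Triangulate a feature point as the midpoint of the closest
    approach between two viewing rays (origins o, directions d)."""
    o1, d1 = np.asarray(o1, float), np.asarray(d1, float)
    o2, d2 = np.asarray(o2, float), np.asarray(d2, float)
    # Solve for ray parameters s, t minimising |(o1 + s*d1) - (o2 + t*d2)|.
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    w = o1 - o2
    denom = a * c - b * b            # zero only for parallel rays
    s = (b * (d2 @ w) - c * (d1 @ w)) / denom
    t = (a * (d2 @ w) - b * (d1 @ w)) / denom
    return ((o1 + s * d1) + (o2 + t * d2)) / 2.0

# Two cameras a meter apart both observe a point at (0.5, 0, 1).
p = triangulate_midpoint([0, 0, 0], [0.5, 0, 1.0],
                         [1.0, 0, 0], [-0.5, 0, 1.0])
```

With noisy real-world observations the two rays rarely intersect exactly, which is why the midpoint (or a least-squares variant over many frames) is used rather than a direct intersection.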

ARCore looks for clusters of feature points that appear to lie on common horizontal surfaces, like tables and desks, and makes these surfaces available to your app as planes. ARCore can also determine each plane’s boundary and make that information available to your app. You can use this information to place virtual objects resting on flat surfaces, such as a character running around on the floor or a table.
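A crude stand-in for the plane-clustering idea can be written in a few lines: group feature points whose heights agree, treat their extent as the plane boundary, and test candidate placements against it. All names here are hypothetical and the logic is deliberately simplified compared to what ARCore actually does:

```python
import numpy as np

def fit_horizontal_plane(points, height_tolerance=0.02):
    """Group 3D feature points whose Y-heights agree within a tolerance
    and report the plane height plus its rectangular XZ boundary."""
    pts = np.asarray(points, dtype=float)
    median_y = np.median(pts[:, 1])
    on_plane = pts[np.abs(pts[:, 1] - median_y) < height_tolerance]
    height = float(on_plane[:, 1].mean())
    # Axis-aligned boundary of the supporting points in the XZ plane.
    min_xz = on_plane[:, [0, 2]].min(axis=0)
    max_xz = on_plane[:, [0, 2]].max(axis=0)
    return height, (min_xz, max_xz)

def can_place(position_xz, boundary):
    """True if an object's XZ position lies inside the plane boundary."""
    lo, hi = boundary
    p = np.asarray(position_xz, dtype=float)
    return bool(np.all(p >= lo) and np.all(p <= hi))

# Four corners of a table at ~0.75 m plus one floor-level outlier.
pts = [[0, 0.75, 0], [1, 0.76, 0], [0, 0.749, 1],
       [1, 0.751, 1], [0.5, 0.0, 0.5]]
height, boundary = fit_horizontal_plane(pts)
```

In the real SDK the plane's pose, extent, and polygonal boundary are exposed directly, so an app only needs the final "is this point on the plane?" check before placing a virtual object.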

Light Estimation

Finally, through light estimation, ARCore can detect information about the lighting of its environment and provide you with the average intensity of a given camera image. This information enables you to light your virtual objects under the same conditions as the environment around them, increasing the sense of realism.
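Since the article says the estimate is the average intensity of the camera image, the core computation can be sketched directly; the helper names below are illustrative, not ARCore or UE4 API calls:

```python
import numpy as np

def estimate_light_intensity(camera_image):
    """Average pixel intensity of an 8-bit camera frame, normalised
    to [0, 1] - the kind of scalar the text describes."""
    img = np.asarray(camera_image, dtype=float)
    return float(img.mean() / 255.0)

def scaled_light_color(base_color, intensity):
    """Dim or brighten a virtual light to match the real environment."""
    return [c * intensity for c in base_color]

# A mid-grey frame yields 50% intensity, halving the virtual light.
frame = np.full((4, 4), 127.5)
intensity = estimate_light_intensity(frame)
```

Feeding this scalar into the brightness of the lights on your virtual objects is what makes them dim in a dark room and brighten in sunlight along with their surroundings.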

You can check out the ARCore developer preview. Some details on the production of ARCore projects are also available.
