Yuri Popov talked about the development of his AR application, in which you can view masterpieces by famous painters.
My name is Yuri, I am a software developer. I develop games, interactive and audiovisual installations, and other software solutions. And in my free time, I do the same.
ARt is a collection of world art recreated in augmented reality.
In the application, you can find scenes from works by René Magritte, Salvador Dalí, Mondrian, Malevich, and others.
The main idea was to add a third dimension to canvases that live in two, and bring them to life. The main goal of the project was to create a hub to which recreated art pieces could be added gradually.
Thus, the project became a small encyclopedia with descriptions of the original works, including interesting facts about them.
ARt makes it possible to add art to everyday scenes of your life. Your phone is always at hand with all the app content.
The selection of works is rather subjective. Since the project was personal, I was completely free to make the choice.
The project started on UE4 4.18 and was ported and released on 4.19, when the engine unified the AR interfaces for Apple and Android.
Time to market: 2.5 months.
The project was developed on Windows up to a certain point, which I will talk about below.
A Blueprint-only project (without C++ code) lets you do the whole build on Windows and test it directly on Apple devices.
All content is stored in an internal data table: title, description, image, and the main art Blueprint for each piece. This makes it possible to add new art to the app quickly and independently, by simply adding a new row to the table.
The application supports English and Russian. Unreal Engine has a great localization mechanism (the Localization Dashboard) that lets you gather all the text in a project (it must be of type FText), translate it, and compile the translations.
The thing I hate most in AR apps is that you can get stuck inside a 3D model and see its anatomy. To avoid this, we calculate the camera position relative to the model and hide the model if the camera gets too close.
Some geometric art pieces (by Kandinsky, Mondrian, and Malevich) were generated programmatically. The matrix that reproduces Mondrian's Broadway Boogie Woogie was recreated by hand.
Unreal Engine has a mechanism for adding native Objective-C code, but once you add code sources to your project, you can no longer build it on Windows. This is where development branches from Windows to Mac.
Unreal doesn’t provide native screenshot functionality out of the box, so a native screenshot maker was written during development.
In the end, all these sources evolved into a standalone plugin: a Blueprint function library with fully customized native alerts, screenshots, and sharing.
Right now it’s available on the UE4 Marketplace.
I always try to bring a CI/CD pipeline to my projects as early as possible. It enforces discipline and helps avoid wrong assumptions at the start. In short: one new change produces one new build.
My usual pipeline (it depends on the project) is the following:
- Bitbucket for sources (with Git LFS for assets)
- Jenkins for builds and deployments
- Slack for ChatOps (and sometimes for logging).
But this is a story for another article.
If you are just starting out in AR, I recommend reading Apple’s official Human Interface Guidelines.
The main idea is to use a natural user interface (NUI): interact with models through gestures (rotate, scale, spawn) rather than standard buttons (or, if there are buttons, they should also live in AR). Interaction should be natural and intuitive for the user, with no instructions needed (every instruction is clutter).
It can take a user some time to inspect the space. During this time, you can show them exactly what to do for a better experience: move the phone around the space and find a well-lit place.
You can also add a scan plane to the interface. In an AR app, the UI automatically hides after some time in order to maximize the working field. My app does not require finding stable, well-tracked planes: you can spawn art simply by finding a feature point.
In development, the hardest work is foolproofing: handling errors and unusual user behavior.
For example, a user may deny access to the camera or the photo library. Good design shows the user what went wrong and how to fix it.
Publishing the app on Google Play was not a problem, and the app was available in just an hour without any review.
Publishing on the App Store was more complicated. At first, Apple rejected the app for using the TrueDepth API. In response, the native Unreal AR plugin was trimmed and simplified, and the face-tracking part was removed.
The second rejection was strange: they sent me a black screen. I don’t know the specifics of how Apple tests apps, but I suspect they ran it in a simulator, or maybe in a totally dark room. This problem was solved via chat with the Apple team.
On top of that, after I uploaded an update, I got a message that my icon was already being used by another app. But it was my own app… Again, chat solved the problem, and all subsequent updates were published successfully.
The moral: fight for your rights.
And finally… Don’t use AR in elevators!