Mirko Vescio from Oneiros was kind enough to talk about the way his studio sets up VR experiences with mind-blowing visuals in Unity.
Introduction
I am the CEO of Oneiros, a startup based in Milan that I co-founded in mid-2016 with Ruggero Corridori, Lead Artist and head of the ArchVizPro series, and Antonella Contin, Academy Director. Our aim is to offer enterprise virtual reality solutions built with Unity, mainly for the architectural industry but also for anything related to environments and other products.
Launching a VR-focused company
We realized that many virtual reality experiences, even those made by big companies, were not realistic at all: they had flaws in graphics quality, which is a big problem if you are trying to use VR to simulate the real world around you. Since then, our goal has been to create realistic virtual reality experiences, seeking out new approaches and methodologies.
If a user forgets he is in a simulated reality, it means we have done a good job.
Our clients are usually architectural studios, real estate companies, and worldwide brands, but also other VR companies.
Demo
The first step when creating our virtual reality demos is to find the right inspiration.
It is not only about copying an existing project — we search for something that fascinates us with its lights, the composition of its volumes, and its materials.
For our last demo (ArchVizPro Vol.6), for example, we started from this project, modifying it and adding some personal touches. Ruggero, our Lead Artist and head of the ArchVizPro series, does a great job handling all these aspects.
Talking about the challenges, they are common to all our virtual reality experiences. First, of course, there is no post-production. Almost everything must be done in 3D; you cannot simply frame what you want the viewer to see — a user chooses his own focus. It is necessary to find the right balance between detail and optimization for the whole scene.
The tech
In general, we found Unity more flexible as a real-time rendering engine.
Moreover, there is a huge amount of resources online and great documentation. It is almost impossible to come across a problem in Unity and not be able to find a solution online.
It is not easy to find beautiful ArchViz scenes made with Unity. Unity is mostly known for mobile platforms and non-realistic graphics, but that reputation is not deserved at all. Unity has made big steps forward, and their incredible demo “Book of the Dead” is the proof of it.
Our production process can vary depending on the specific needs of the project but, if we take our last demo as an example, the first step is to split the work between single assets and the environment.
The creation of the assets is generally a standard process. We start by modeling the high-poly meshes and then make low-poly versions of all of them. Once the 3D modeling is finished, the final step is to create PBR materials using Substance.
Those who work on the environment do not just handle the 3D modeling and the creation of the shaders; they also have to manage the scene in Unity. This means checking the 3D model of the environment in 3ds Max and the lighting in Unity to avoid errors and artifacts.
You also have to deal with the interfaces and the coding side for the interactivity or anything else required for a VR experience.
Speaking of the differences between developing a VR experience and building a game scene, first of all there is the issue of frame rate. For a game, it is okay to run between 30 and 60 fps, but for VR you should target at least 75-90 fps. Detail and comfort are fundamental.
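To keep an eye on that budget during development, a small monitor script can flag frame drops. This is only a minimal sketch — the class name VRFrameBudgetMonitor and the thresholds are our illustration, not part of the Oneiros pipeline:

```csharp
using UnityEngine;

// Hypothetical helper: warns whenever the smoothed frame rate drops below
// the VR comfort target discussed above (75-90 fps).
public class VRFrameBudgetMonitor : MonoBehaviour
{
    [SerializeField] private float targetFps = 90f;   // headset refresh rate
    [SerializeField] private float smoothing = 0.1f;  // exponential smoothing factor

    private float smoothedDelta;

    private void Awake()
    {
        smoothedDelta = 1f / targetFps;
    }

    private void Update()
    {
        // Smooth the frame time so a single hitch does not spam the console.
        smoothedDelta = Mathf.Lerp(smoothedDelta, Time.unscaledDeltaTime, smoothing);
        float fps = 1f / Mathf.Max(smoothedDelta, 0.0001f);

        if (fps < targetFps)
            Debug.LogWarning($"Frame rate {fps:F0} fps is below the {targetFps} fps VR target.");
    }
}
```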
Vegetation
Vegetation is always a big challenge inside a real-time engine. Getting a realistic result is a complex task of finding the right compromise between the number of polygons and realism.
In our case, the vegetation required the use of different software solutions and several tests.
We started working on a 3D model made in 3ds Max (Guruware Ivy Generator), but with a great deal of geometry optimization. Once the low-poly mesh was ready, we continued with baking in Substance Designer, vertex color painting, and creating the final shader using Vertex Motion.
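Vertex colors are a common way to tell a vegetation shader how much each vertex is allowed to sway. As a rough illustration of the idea only (not the actual Oneiros/Vertex Motion setup), a hypothetical script like this bakes a height-based weight into the mesh's vertex colors so a wind shader can scale motion per vertex:

```csharp
using UnityEngine;

// Illustrative only: writes a height-based sway weight into the red vertex
// color channel, which a wind/vertex-motion shader can read to scale movement.
[RequireComponent(typeof(MeshFilter))]
public class PaintSwayWeights : MonoBehaviour
{
    private void Start()
    {
        Mesh mesh = GetComponent<MeshFilter>().mesh; // instance, not the shared asset
        Vector3[] vertices = mesh.vertices;
        Color[] colors = new Color[vertices.Length];

        // Find the vertical extent of the mesh in local space.
        float minY = float.MaxValue, maxY = float.MinValue;
        foreach (Vector3 v in vertices)
        {
            minY = Mathf.Min(minY, v.y);
            maxY = Mathf.Max(maxY, v.y);
        }

        // Vertices near the base stay still (weight 0), tips move the most (weight 1).
        for (int i = 0; i < vertices.Length; i++)
        {
            float weight = Mathf.InverseLerp(minY, maxY, vertices[i].y);
            colors[i] = new Color(weight, 0f, 0f, 1f);
        }

        mesh.colors = colors;
    }
}
```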
For the moss material, we used Substance Designer.
Rain
We love using our demos as tests to face new challenges and experiment with new approaches and workflows, and rain is another tough challenge in VR.
The rain can be divided into several parts: the rain itself, the rain splashes on the ground, and the rain on the windows.
The rain itself was made using the Unity Shuriken Particle System.
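Shuriken can be configured from script as well as in the Inspector. The snippet below is only a sketch of a basic falling-rain setup; the specific values and the SimpleRain component are our assumptions, not taken from the Oneiros project:

```csharp
using UnityEngine;

// Rough sketch of a Shuriken-based rain emitter configured from code.
[RequireComponent(typeof(ParticleSystem))]
public class SimpleRain : MonoBehaviour
{
    private void Start()
    {
        var ps = GetComponent<ParticleSystem>();

        var main = ps.main;
        main.startSpeed = 0f;                          // motion comes from gravity instead
        main.gravityModifier = 2f;                     // make drops fall fast
        main.startLifetime = 1.5f;
        main.startSize = 0.02f;
        main.maxParticles = 5000;
        main.simulationSpace = ParticleSystemSimulationSpace.World;

        var emission = ps.emission;
        emission.rateOverTime = 1500f;                 // density of the rain

        var shape = ps.shape;
        shape.shapeType = ParticleSystemShapeType.Box; // emit from a box above the scene
        shape.scale = new Vector3(10f, 0.1f, 10f);

        var renderer = GetComponent<ParticleSystemRenderer>();
        renderer.renderMode = ParticleSystemRenderMode.Stretch; // elongate drops along velocity
    }
}
```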
Rain splashes required a more complex workflow: a particle simulation in 3ds Max, producing the ground texture in Substance Designer, and the creation of a custom shader using Parallax Occlusion Mapping and a flipbook.
The rain on the windows was made using a particle simulation in 3ds Max to obtain a sequence of frames. The tiling and adjustments required some work in After Effects, followed by the creation of an 8×8 flipbook texture using VFX Toolbox and, as the final step, a custom refraction shader.
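A flipbook like that is just a grid of frames packed into one texture; at runtime the shader or a small script steps the UVs through the grid. The following is a generic sketch of that stepping logic (our own illustration, not the actual Oneiros refraction shader), driving a material's main texture tiling and offset through an 8×8 sheet:

```csharp
using UnityEngine;

// Generic flipbook player: steps the material's main texture UVs through
// an NxN sprite sheet (8x8 in the rain-on-glass case described above).
[RequireComponent(typeof(Renderer))]
public class FlipbookPlayer : MonoBehaviour
{
    [SerializeField] private int columns = 8;
    [SerializeField] private int rows = 8;
    [SerializeField] private float framesPerSecond = 30f;

    private Material material;

    private void Start()
    {
        material = GetComponent<Renderer>().material;            // instanced material
        material.mainTextureScale = new Vector2(1f / columns, 1f / rows);
    }

    private void Update()
    {
        int frameCount = columns * rows;
        int frame = Mathf.FloorToInt(Time.time * framesPerSecond) % frameCount;

        int x = frame % columns;
        int y = frame / columns;

        // Texture space has its origin at the bottom-left, so flip the row index.
        material.mainTextureOffset = new Vector2(
            x / (float)columns,
            1f - (y + 1) / (float)rows);
    }
}
```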
Post-production
Working in Linear Color Space produces more natural light but tends to flatten the image a lot. To make up for this, Color Grading is very important to add contrast to the image and balance shadows and highlights.
Working with correct PBR values is also very important in this process in order to get correct-looking materials. In particular, you should avoid an albedo that is completely black or white; this is one of the most common mistakes in the PBR workflow.
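One cheap way to catch that mistake is an editor script that scans the project's materials and flags albedo tints that are nearly pure black or white. This is only a sketch under the assumption that materials expose the standard `_Color` property; checking the albedo textures themselves would need a separate pass with readable textures:

```csharp
using UnityEditor;
using UnityEngine;

// Editor-only sanity check (place in an Editor folder): warns about albedo
// tints that are almost pure black or pure white, a common PBR mistake.
public static class AlbedoValueCheck
{
    [MenuItem("Tools/Check Albedo Values")]
    private static void CheckAlbedos()
    {
        string[] guids = AssetDatabase.FindAssets("t:Material");
        foreach (string guid in guids)
        {
            string path = AssetDatabase.GUIDToAssetPath(guid);
            Material mat = AssetDatabase.LoadAssetAtPath<Material>(path);
            if (mat == null || !mat.HasProperty("_Color"))
                continue;

            // Use the luminance of the albedo tint as a rough proxy.
            float luminance = mat.GetColor("_Color").grayscale;
            if (luminance < 0.02f || luminance > 0.98f)
                Debug.LogWarning($"Suspicious albedo on '{path}': luminance {luminance:F2}", mat);
        }
    }
}
```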
The Eye Adaptation is also very important to “feel the light”.
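Both effects can also be driven from script. The block below is a sketch assuming the Post Processing Stack v2 package; the parameter values and the GradingSetup component are our own illustration, not settings from the demo:

```csharp
using UnityEngine;
using UnityEngine.Rendering.PostProcessing; // Post Processing Stack v2 (assumed)

// Sketch: pushes some contrast back into the flat linear-space image via
// Color Grading and enables Auto Exposure so the eye adapts between
// dark and bright areas.
public class GradingSetup : MonoBehaviour
{
    [SerializeField] private PostProcessProfile profile;

    private void Start()
    {
        if (profile.TryGetSettings(out ColorGrading grading))
        {
            grading.contrast.overrideState = true;
            grading.contrast.value = 15f;             // counteract the "flat" linear look
            grading.postExposure.overrideState = true;
            grading.postExposure.value = 0.3f;
        }

        if (profile.TryGetSettings(out AutoExposure exposure))
        {
            exposure.minLuminance.overrideState = true;
            exposure.minLuminance.value = -4f;        // how dark the adapted eye can go
            exposure.maxLuminance.overrideState = true;
            exposure.maxLuminance.value = 4f;         // how bright it can go
        }
    }
}
```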
When it comes to optimization, the first step is to optimize all the geometry, avoiding crazy-heavy one-billion-polygon meshes.
Once the whole geometry is optimized, it is necessary to avoid too many draw calls and, of course, use baked lights whenever possible. Occlusion Culling is another “trick” to improve the frame rate of the Unity scene.
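In practice this usually means marking the non-moving geometry as static so Unity can batch it and use it for occlusion culling. Below is a hedged sketch of an editor utility that applies those flags to the current selection; note that the flag for lightmap participation is named differently across Unity versions (LightmapStatic vs. ContributeGI), so it is left out of the mask here:

```csharp
using UnityEditor;
using UnityEngine;

// Editor utility (place in an Editor folder): marks the selected objects and
// their children as static for batching and occlusion culling.
public static class MarkStaticForOptimization
{
    [MenuItem("Tools/Mark Selection Static")]
    private static void MarkSelection()
    {
        const StaticEditorFlags flags =
            StaticEditorFlags.BatchingStatic |
            StaticEditorFlags.OccluderStatic |
            StaticEditorFlags.OccludeeStatic;

        foreach (GameObject root in Selection.gameObjects)
        {
            foreach (Transform t in root.GetComponentsInChildren<Transform>(true))
            {
                GameObjectUtility.SetStaticEditorFlags(t.gameObject, flags);
            }
        }
    }
}
```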