Lucas Smaga from Illusion Ray Studio talked about the production of VR movie experiences with Unreal Engine 4.
Hi, my name is Lucas Smaga and I am a CG artist and the founder of Illusion Ray Studio. Previously, I worked at Platige Image for a couple of years. Today, together with my friend and art director Dominik Sojka, who spent several years in the game industry, we will answer some questions about the VR and 3D movies we are creating in Unreal Engine 4.
We began with a project called Dino Safari just as the Oculus DK1 came out, using UDK. With the release of UE4 we moved to the new engine and created our first stereoscopic movie for cinemas, and so our journey with VR and 3D films began. So far we have made five VR projects (Dino Safari, Motoride, Afterlife, Colossus, Solar System) and one stereoscopic film, Pirates 3D. They are all designed for 4D cinemas, which use effects like shaking seats, blowing wind, splashing water, and so on.
Building Virtual Reality
In virtual reality, it is very important to follow some rules to achieve the best immersion possible. Every project is a little different, so you must keep an eye on various things. The point of view matters a lot: when the character is sitting, it is good for the player to sit; when the character is walking, it is best to stand up. Of course, standing for a whole game is not very practical, but it really helps with scale and immersion. Showing hands in VR can also be tricky. It is best to show them only when you are literally holding something in reality (a pad, joystick, or steering wheel), or to hide them entirely. One of the most important things is to avoid too many spins or shakes. These will obviously make most people sick, but even a scripted 180-degree turn of your in-game head while your real head stays still breaks the illusion. There are genres where you can't avoid such situations (shooters, for example), but when you can, try to make the motion as smooth as possible.
Our primary target is the GTX 970. We can't allow even a single frame to drop below 75 FPS, because that causes unpleasant effects for the viewer, such as smearing and delays, which are noticeable even at 74 FPS.
To reduce draw calls, we try to limit the number of materials per model. In addition, we use Cull Distance Volumes together with our own blueprints that remove meshes, especially Skeletal Meshes, that are no longer visible (the GPU profiler showed they were still being processed in many cases where they shouldn't be).
Our primary “weapons” are the GPU and CPU profilers; they are very helpful in finding bottlenecks. Every project is divided into smaller levels which are loaded at the beginning; later we just change their visibility state. Next, we reduce the visibility of transparent objects close to the camera (using Camera Distance Fade), but the most important thing is to lower dynamic shadows on Skeletal Meshes, since they have the biggest impact on frame drops.
Of course it is good to use LODs not only for meshes but for particles as well.
If we still have some FPS budget left, we increase the Screen Percentage right up to the point where frames start to drop. We usually start at 100 and go up from there; it gives a sharper image, which Temporal AA otherwise blurs.
It is also worth checking the UE docs on VR optimization.
For optimization reasons, we mainly use baked lighting. It is better to use Stationary lights with Dynamic Shadows turned off (if we don't need them) to get better reflections. Stationary lights are limited to 4 overlapping in the same region because of the RGBA channels, so you have to place them smartly. In many situations we add extra Static lights with very subtle intensity to mimic GI, with roughness set to 1 (so they create no visible reflections). Almost every light should use a Light Profile, as it gives a more realistic falloff. We are excited to use the new Portal feature in 4.11; until now we have used Koola's method or something similar (aiming a Spot Light at a white plane to soften the light).
Creating Realistic Scenes
The basis of every realistic scene is good models and textures. We sometimes buy models, but we always tweak and polish them, at least on the shader side. Next comes the lighting stage we mentioned earlier, and the final step is post-process effects, which are split into global (for the whole project) and local (e.g. for a specific location or room). All assets must be normalized, and the post-process gives us the final mood we are looking for. For example, if a room is ruined to some degree, then everything in it must carry a matching level of damage. If a cave uses one kind of rock material, then all the other rock meshes should use the same material or a very similar one. Material Functions are handy when you have one base material and want to add subtle variations like moss, snow, or dirt.
Top-down approach. The most important thing is the pipeline. We start with a screenplay, an animatic, a prototype in UE, and base models, then add details with every iteration. Once prototyping and camera movement are done (even though the player can turn a full 360 degrees), we know where to focus, where to add details, and where we can cut them. Just as in real cinematography you tweak a shot, we do the same, only in 360 degrees. That is why we prototype the levels and the camera: it helps us judge not only where to add detail but also how to optimize the scenes.
About the “Sparkles”
We didn't have problems with them. We know there is an issue with black squares blinking in one eye since UE 4.9, but to avoid it you just need to maintain 75 FPS.
Using Unreal Engine for VR
Working with VR is sometimes very difficult. It is more pleasant to build a scene in UE for stereoscopic rendering, but the final effect in VR is more satisfying and interesting for the viewer. One of the coolest features of UE is Blueprints, which let us take our creative process to a higher level. It looks like the Sequencer in 4.11 will be the tool we have been waiting for since UE 2.5, and it should greatly help us in creating VR and 3D movies.