Bernhard Rieder told us about the workflow behind the Unreal Engine 5 tech demo, talked about making and animating MetaHumans, and shared some early vehicle concepts made in ZBrush.
Introduction
Hi there. My name is Bernhard, I am an Unreal Engine Artist, born and raised in Austria, working and living in the U.S. for 18 years now. Like many others, I love film and cinematography. As a kid, I watched the first epic Mad Max movie with Mel Gibson. Its specific style and creative direction are still unique today.
In the last 5 years, a lot of things have changed in the entertainment industry. Nvidia released RTX, and Epic Games' flagship Unreal Engine 5 was announced with a major change to its render engine: Nanite, Lumen, MetaHumans, Volumetric Effects, etc. – all of that at runtime.
Below, you can watch a very short clip I created over a weekend. I should call it a weekend production because I have a regular job that keeps me very busy. Being a 3D Visualizer is fun but also requires a lot of hard work.
I love cinematography, lighting, and look development, so I thought it would be really cool to create some fan art and a Tech Demo level. It's also important to say that I will not create a full fan game because Mad Max is a copyright-protected title, so I am not allowed to do that.
Moreover, I wouldn’t have time to create a real full game, so this is just a small level I created to test the new Unreal Engine 5.
When I showed my short clip to a few of my friends, they told me it was a nice offline ray-traced 3D animation. "But wait a second!" I responded. "This is not an offline render, this is an in-game cinematic short, it runs in real time, and I hit a solid 60 FPS on my RTX 2080."
How the heck is that possible? Especially when the game level contains millions and millions of polygons, with high-resolution scanned data provided by the Quixel Megascans team. How can that all be true?
Well, that’s what I couldn’t believe when I was running my first tests using the unofficial beta release of UE5. But once I installed the new engine and tested Nanite and Lumen – kaboom!
Behind the Scenes
I started with a new level and focused first on setting up the environment. For that purpose, I used Megascans assets.
Once I had the landscape roughly done, I started working on some look development. Let there be light, right? I decided to go with a totally different tonal direction compared to the one used in Mad Max: Fury Road, which was very saturated, with the vibrancy of the colors cranked up quite a bit.
I wanted to create something a bit different and also wanted to make sure the overall look and feel fit the Mad Max theme. And of course, I was curious to test all the new features Unreal Engine 5 had to offer.
For my level, I used the built-in SunSky plug-in. It’s a Blueprint that already contains a Sky Light, a Directional Light, and the Sky Atmosphere. In order to use all the integrated Volumetric Effects, you need to turn on Volumetrics in your project settings.
On top of that, I used the Exponential Height Fog to get as much realism as possible. All these actors combined allowed me to play with the digital weather and atmospheric fog.
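As a rough sketch of the lighting setup described above: these renderer options can also be set in the project's `DefaultEngine.ini` rather than through the editor UI. The exact console-variable names below are assumptions based on UE5's standard renderer settings; Volumetric Fog itself is additionally toggled per Exponential Height Fog actor.

```ini
; DefaultEngine.ini – a minimal sketch, not the project's actual config
; (CVar names assumed from UE5's standard project settings)
[/Script/Engine.RendererSettings]
r.SupportSkyAtmosphere=True
r.SupportSkyAtmosphereAffectsHeightFog=True
r.VolumetricFog=True
```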
I also used all possible options within the Post Process Volume and all my camera settings: adding glare, changing camera blades, lenses, etc. I couldn’t afford all the fancy cameras in real life, but in the digital world – heck yeah.
MetaHumans for Game Production
Here's a short clip that shows a bit of what's going on behind the scenes of level creation.
For my characters, I also used Epic Games’ MetaHuman Creator. You can use Bridge to launch the Creator, and once you are happy with your result, you can export the fully rigged and skinned character into your game.
Inside the Unreal Engine Sequencer, I can use the Control Rig for the body and the face to create poses. And yes, if you are an animator, you can actually manually animate your character using the Control Rig and the Sequencer as a combo.
MetaHumans for VR Production – Aerobics from the 80s
Yes, I loved the 80s and their wonderful TV-aerobics sessions. Guess what? They're coming back soon, this time fully immersive. That’s right. Please check out my crazy VR development process for Oculus Quest 2.
My mom always told me stretching your legs helps with your mobility
As you can imagine, I always believe everything my mom says, so I highly recommend doing some aerobics
MetaHumans for Animation using Motion Capture and Facial Expression Capturing
For the game production, it would take me too long to animate the characters using the Control Rig. Also, if you want to achieve really good animations, you need to be a real master in animating characters. It's art, and it’s very time-consuming.
That made me rely mainly on motion capture performance and facial expression capturing. There is one master in that field, and I happen to know her. Let me introduce you to my friend Gabby, also known as Feeding_Wolves. Her full name is Gabriella Krousaniotakis. We found each other in the metaverse, and since then, we can’t let go of each other anymore.
We both have so much fun working together on our ideas that it’s just a natural and fluid process now. Gabby records all the mocap and facial expressions we use for all our animations, and let me tell you – it’s not just about wearing a jumpsuit, acting, and recording. There is a ton of other stuff you need to consider and set up before you get to this point.
However, we can use it for the gameplay in runtime and also for offline animations, shows, events, and, of course, for our VR experiences. Check out Gabby's latest performance tests.
Gabby records everything in real-time. Once we have our takes, we bring them into the Sequencer and create our edits, actions, etc. – whatever we need to tell our stories and to entertain our audience. This is how we combine our Jedi Forces and create our current Metaverse.
MetaHuman Control Rig
Here I was just having some fun playing around with the MetaHuman Control Rig.
Game & Asset Creation
Since UE5 and Nanite allow me to use assets with a lot of polygons, I don’t use super low-poly models anymore. For the V8 model, I used Blender and Substance 3D Painter. I also like to use Marmoset Toolbag. Its baking process is a bit different, but I love the fact that it also includes a render engine. That helped me visualize my model instantly – not just the textures in the viewport, but how the entire model will look inside a render engine.
Mad Max Interceptor V8 Model textured in Substance 3D Painter
Some early quick vehicle concepts made in ZBrush:
VR Unreal Engine Artist
Aside from this project, I am working really hard with the latest VR capabilities and MetaHumans. I never thought I would have so much fun as a VR Unreal Engine Artist. I am pretty happy with my latest results: achieving high-quality rendering while maintaining high performance using forward shading.
You can follow my VR development process on my Slack channel. I am testing with the Oculus Rift S and Oculus Quest 2. I can’t wait until the Quest 3 comes out, possibly with an LED screen.
You’ll find my latest Virtual Reality development tests on my VR blog, especially shader and material tests for maximum performance. I am currently profiling a ton of different methods to hit a solid 11 ms frame time (90 FPS), also while running Air Link.
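The 11 ms / 90 FPS figure is just the per-frame budget a VR headset's refresh rate imposes. A quick sanity check of that arithmetic (plain Python, nothing Unreal-specific):

```python
def frame_budget_ms(target_fps: float) -> float:
    """Per-frame time budget in milliseconds for a target refresh rate."""
    return 1000.0 / target_fps

# 90 Hz (Quest 2 / Rift S) leaves ~11.1 ms per frame for the whole pipeline.
print(round(frame_budget_ms(90), 1))  # -> 11.1
# For comparison, the 60 FPS desktop demo has a roomier ~16.7 ms budget.
print(round(frame_budget_ms(60), 1))  # -> 16.7
```

Every pass listed below (shadows, translucency, fog, particles, hair, cloth) has to fit inside that budget together, which is why each feature gets profiled individually.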
My latest performance tests with Forward Rendering cover the following features:
- Distance Field Shadows
- DX12, ray tracing, and GPU Lightmass
- Emissive Material
- Translucency to render the "cheapest" Glass material possible
- Normal Maps, Displacement Maps
- Volumetric Fog
- Niagara Particle Effects
- Optimized Shaders for MetaHumans
- Hair Solutions
- Real-Time Cloth Simulation (UE5 Native)
Shout-out to all my annual Mad Max Wasteland party fans. I’ll be there again this year – I'm still working on my Mad Max outfit. I could use some serious help with the costume design, so if you can help me with that, please DM me on Slack. That would be epic.
Please check out my blog for more information and my current VR development process. Feel free to download all my free Unreal Engine projects for learning and studying purposes or just download some of my game experiences that you can run on your computer.
May the pixel be with you, talk to you soon.