Engine for VR Scene Production: Unity or UE4?

Daniel Stringer recapped how he created a VR Breaking Bad Lab in Unity, ported it to UE4 a few years later, and compared the two engines.

Introduction & Career

My name is Daniel Stringer, I’m 38 and live in Cornwall, UK.

I currently work as Lead 3D Artist at a company called The Moment, which is part of The Creative Engagement Group. Here, I specialize in creating content and applications for VR/AR, as well as occasional rendered animation.

Since the age of about eight, I’ve been fascinated with 3D graphics. I remember playing Sonic the Hedgehog and Super Mario and wondering how they were created. When I was about fourteen, I tried some 3D software - I think it was a predecessor of 3ds Max - and as soon as I’d made my first cube, I was hooked. I love the feeling of being able to bring ideas and concepts to life through 3D modeling and design.

I took an Advanced Diploma in Computer Science at college. It wasn’t very art-based, but for my final year project, I created a Space Invaders clone using 8-bit graphics and the C programming language.

Life then took its twists and turns, and I decided to focus on learning 3D on my own and working as a freelance 3D artist. Over the years, I also had a couple of unusual jobs (police, property industry). I like to think I learned a little from each, and that knowledge contributed to my creativity and approach to anything 3D-related.

Now, I’m a full-time Lead 3D Artist working for an amazing company. I’ve worked on a multitude of projects: submarine walkthroughs, AR applications for the agriscience industry, a large VR recreation of a construction site that teaches students the correct health and safety procedures while they work, and more. A couple of years ago, we created a realistic radio station studio in VR to teach presenters what to do when things don’t go as planned, and it won a couple of awards. My current project is a 3D character that will be composited into live-action footage for some training videos.

Breaking Bad Lab: Start of the Project


About four years ago, I started watching Breaking Bad. I thought the show was unique and I wanted to create some fan art for it. I began looking around to see if other artists had produced anything similar, and I was amazed to find that lots of creators had taken the time to make some really amazing ZBrush sculpts of the main characters, plus plenty of Photoshop paintings that all looked stunning. Yet, I wanted to do something a little different. At the time, Oculus had released the DK1 and we had a development kit at the office, so I thought how great it would be to make a scene from the show and walk around it in VR.

One scene in particular really got my attention - the superlab. Featured in many episodes, it was a really good example of set design, including props, lighting, and general atmosphere.

First Version of the Scene in Unity

As soon as I started the project, however, I kind of regretted it! I soon realized that it was quite an undertaking for a personal project.

I wanted it to be a perfect replica that would do the show justice, so I aimed to obtain as much reference material as I could. By studying hundreds of shots of the set from different angles, I was able to create a floor plan of the space.

From there, it was all about looking at references whilst scrutinizing every angle and then modeling the space. For this, I used 3ds Max - it’s always been my go-to application for 3D asset creation.

Once I was happy with the layout of the rooms, I started making the props. I had to be mindful here as polycount was important. So where possible, I utilized normal maps that had been created in ZBrush to make models look more detailed.

After that, I had to texture the models in Photoshop. There was no Substance Painter back then, and Unity didn’t have the current PBR workflow, so texturing was quite simple and quick... apart from the signs. And there were a lot of them! I had to find a reference for each one, then remake it in Photoshop. This probably took more time than anything else.

Once I had created all the assets, I brought them into Unity and managed to get everything working in VR. The scene relied heavily on occlusion culling for optimization, but it worked!
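The interview doesn’t go into how that culling was set up, but as a rough sketch of the Unity side of the workflow - with purely illustrative names - an editor script along these lines can flag scene geometry so the built-in occlusion culling bake can use it:

    using UnityEditor;
    using UnityEngine;

    public static class OcclusionSetup
    {
        // Mark every MeshRenderer's GameObject as Occluder/Occludee Static
        // so Unity's built-in occlusion culling bake takes it into account.
        [MenuItem("Tools/Mark Scene For Occlusion Culling")]
        public static void MarkStatic()
        {
            foreach (MeshRenderer renderer in Object.FindObjectsOfType<MeshRenderer>())
            {
                GameObject go = renderer.gameObject;
                StaticEditorFlags flags = GameObjectUtility.GetStaticEditorFlags(go)
                    | StaticEditorFlags.OccluderStatic
                    | StaticEditorFlags.OccludeeStatic;
                GameObjectUtility.SetStaticEditorFlags(go, flags);
            }
            // The visibility data itself is still baked via
            // Window > Rendering > Occlusion Culling.
        }
    }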

Second Version of the Scene in UE4

Fast forward to 2019: I was watching what Unreal was doing - the quality and ease of use were becoming hard to ignore, and I really wanted to make something in the engine. Then, I saw that the producers of Breaking Bad were about to release a movie. This got me thinking: I could reuse all the assets I had made previously and try to get them working on the Oculus Quest.

After learning Unreal fundamentals, I tried to bring the models in and see what would happen, but the results were not good. Whilst Oculus Quest is undoubtedly an amazing piece of hardware, my models and textures were going to require some optimization. 

I decided to combine a lot of meshes, grouping them by the material I had assigned: all the chrome items together, all the steel items together, all the black rubber wires together, and so on. This resulted in about ten different meshes as opposed to the previous 100.
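The interview doesn’t say which tool the merging was done in - it could just as easily have been 3ds Max - but the idea is engine-agnostic. A minimal, hypothetical Unity/C# sketch of grouping meshes by shared material and combining each group into a single mesh might look like this:

    using System.Linq;
    using UnityEngine;

    public static class MeshMerger
    {
        // Combine every MeshFilter in the scene into one mesh per shared material.
        public static void MergeByMaterial()
        {
            var groups = Object.FindObjectsOfType<MeshFilter>()
                .Where(mf => mf.GetComponent<MeshRenderer>() != null)
                .GroupBy(mf => mf.GetComponent<MeshRenderer>().sharedMaterial);

            foreach (var group in groups)
            {
                var combines = group.Select(mf => new CombineInstance
                {
                    mesh = mf.sharedMesh,
                    transform = mf.transform.localToWorldMatrix
                }).ToArray();

                var merged = new Mesh();
                merged.indexFormat = UnityEngine.Rendering.IndexFormat.UInt32; // allow > 65k verts
                merged.CombineMeshes(combines, mergeSubMeshes: true);

                var go = new GameObject(group.Key.name + "_merged");
                go.AddComponent<MeshFilter>().sharedMesh = merged;
                go.AddComponent<MeshRenderer>().sharedMaterial = group.Key;
            }
            // The original, un-merged objects would then be disabled or deleted.
        }
    }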

Next came texture optimization. There are a lot of signs in the lab - those were combined into one mesh, and a texture atlas was created for them. Now I had one mesh and one texture, which was far more efficient.
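The atlas itself was presumably authored by hand, but purely as an illustration of the concept, here is a small, hypothetical C# sketch that packs a set of sign textures into one atlas with Unity’s Texture2D.PackTextures and remaps each sign mesh’s UVs into its sub-rectangle:

    using UnityEngine;

    public static class SignAtlasExample
    {
        // Pack the individual sign textures into one atlas and remap each
        // sign mesh's UVs into that sign's region. Names are illustrative.
        public static void Pack(Texture2D[] signTextures, Mesh[] signMeshes)
        {
            var atlas = new Texture2D(4096, 4096);
            Rect[] rects = atlas.PackTextures(signTextures, padding: 2, maximumAtlasSize: 4096);

            for (int i = 0; i < signMeshes.Length; i++)
            {
                Rect r = rects[i];
                Vector2[] uvs = signMeshes[i].uv;
                for (int j = 0; j < uvs.Length; j++)
                {
                    // Squeeze the original 0..1 UVs into this sign's atlas rectangle.
                    uvs[j] = new Vector2(r.x + uvs[j].x * r.width,
                                         r.y + uvs[j].y * r.height);
                }
                signMeshes[i].uv = uvs;
            }
        }
    }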

I also did a lot of work to bring the textures up to standard. I used some Megascans textures for the walls, tweaking them a bit in Substance Painter and then using Unreal's material pipeline to make sure everything was PBR-friendly.

I really love Unreal's shader editor. I used it to create shaders for different surfaces like glass, metal, etc. It’s easy to use and I got some great results. I tried to avoid making everything shiny and instead aimed for a variety of different surfaces to make the scene more compelling.

Once the models and the textures had been redone, I needed to think about the lighting. I knew dynamic lighting would probably not work, so I used Rect Lights for all the lights around the perimeter of the room, combined with an emissive material on the mesh. For the spotlights, I used Unreal's Spot Light. The caged lights around the walls had an area light placed inside. I then tweaked the colors to suit the mood, baked the lights, and kept tweaking until my lighting matched the reference.


Unity vs. Unreal

To be honest, there is probably nothing in the scene that I could not have done in Unity. However, there is something about the baked lighting quality that seemed easier to achieve in UE4 (it was certainly faster to bake) and more accurate in terms of shadow calculation. For example, the lights placed inside metal cages around the walls always seemed a bit of a problem in Unity and took a lot of trial and error to get looking right, but Unreal dealt with them really well.

Overall, the two engines have a lot in common. I think Unreal has a few more features that feel like they have matured and are part of a complete package. Unity sometimes feels like you are bolting systems on to achieve what you need or using a feature that is in beta. 

If I need to create an environment or a product animation, I personally will go straight to Unreal, particularly because of the Megascans integration and the introduction of the HDRI Backdrop. If a project involves a lot of interaction, I would probably use Unity, purely because it’s my comfort zone and I know C# well enough to make what I want happen. But as soon as I get my head around the visual scripting side of Unreal, I think I might take a leap!

Daniel Stringer, Lead 3D Artist

Interview conducted by Ellie Harisova
