Creating a Real-Time UFO in VR Using Unreal Engine 5 and Houdini

Edward Dawson-Taylor told us about the work process behind the UFO project, discussed the effects in the scene, and showed the camera setup he used with the HTC Vive.

Introduction

80.lv: Please introduce yourself. Where did you study? What companies have you worked for? What projects have you contributed to?

Edward Dawson-Taylor: My name is Edward Dawson-Taylor. I am originally from London in the UK but moved to Los Angeles about ten years ago, where I now live with my wife and two daughters. I did a Computer Science degree at Bristol University in the UK and was a software engineer/architect for about 6 years working on corporate software mainly in the tax, legal and geological industries, before transitioning into VFX. I was a hobbyist and VJ but wanted to go pro, so I studied Maya at Escape Studios in 2006.

I didn't want to do another degree due to the time and money involved, and I wanted an up-to-date, professional-grade education. I had been teaching myself for years and was not production-ready by any means. I was looking for somewhere to help beat me into shape, and they did it in 3 months. Escape got me my first paid 3D work on Harry Potter 5, and through that, I realized I wanted to be a generalist, not a specialist, so I chose to work in commercials rather than film. In a hair-raising moment, I was lucky to be offered some freelance work at a mid-sized VFX house called Golden Square Post Production in Soho, London.

I had to leave my full-time software position after only 2 weeks of work, which felt crazy but you only get one life, and this felt like my chance. This fortunately became a full-time job almost straight away and gave me what I wanted which was an apprenticeship with a great mentor, Sean Elliott, to whom I owe so much of what I know. I was very lucky to be in such a wonderful VFX family-like bubble at a great time and loved it for years.

While there, I met my wife, who came over from LA to work on Harry Potter 7 Part II with MPC. She wanted to come back to the US, so we moved here, and I spent a while rebuilding my career through commercials, then TV at Zoic, getting to do some cool shows like Arrow and The Flash. I then went to DD for a game trailer, and that led to an offer to work on The Jungle Book, which felt like a risk as it was this new way of working, and a lot of people turned it down because it was more like previz than VFX. I am so glad I took the risk, as it led to my childhood dream of working on the Jurassic series at ILM, leading the Virtual Art Department for MPC on The Lion King, and more.

I then spent some time working on location-based VR projects at Pure Imagination, then working for Roborace/Arrival, an autonomous electric race car championship, creating a world-building machine in Houdini. Finally, after years of teaching and using real-time engines, I was inspired to create a school called CG Pro, which I now run with my wife.

We teach cutting-edge Virtual Production techniques and are an Unreal Authorized Training Center teaching the world's first Unreal Connectors program. Epic Games has created this as a way to help people train in the style of the fellowship program through affordable experiences with training partners. We have been doing the same thing under a different name for over a year now and our students have gone on to work at studios like ILM and MPC.

The UFO Project

80.lv: How did you get started with your UFO project in UE5? What was the goal?

Edward Dawson-Taylor: The goal with the UFO project was to put Unreal Engine 5 through its paces, in as many ways as I could, on a laptop (Alienware X17 R1), in as short a time as possible (I am very busy), and to have fun. So as soon as the City Sample from the Matrix demo came out, I knew I wanted to make something with it. I wanted to test as many systems as I could in less than a day and come out with a complete animation. Obviously, I am standing on the shoulders of giants.

Unreal Engine and the city, with all of its people, cars, and interactions, were already done for me. So I wanted to add something fun to it and test some of the main things like Virtual Camera, Sequencer, Materials, Niagara, Nanite, Lumen, World Partition, and the Movie Render Queue. I didn't get to use Composure or nDisplay in this shot, but that's next. I started by just exploring the world and marveling at it. Then I thought about what might fit in there, and as I had been playing with making UFOs in Houdini, I figured that would be a nice, simple way to get some complex geometry in there and make a fun moment.

Unreal Engine 5

80.lv: What are your thoughts on Unreal Engine 5 and its first stable version? What’s good? Is there something that needs to be improved? What impressed you most?

Edward Dawson-Taylor: My first thought on UE5, after having tested the Early Access and Preview versions, was how stable it was. I didn't have a single crash, and I threw a lot of geo, lights, emissive surfaces, particles, and more at it. It really exceeded my expectations here. I know it's been worked on for a while and they were being careful and working extremely hard, but as a recovering software developer, I take my hat off to them for pulling this off.

It's not only groundbreaking in so many ways but also really nice to use, very familiar coming from UE4, and it ran at 50fps on my laptop with no crashes. It's just crazy. The things that impressed me the most were the amount of geometry it can handle, the visual quality of the lighting over such a huge scale, the amount of detail possible from very close up to far away, and, again, the stability. Some of the systems, like the LiveLink inputs, probably still need some tweaking, but that could just be a matter of more testing. I also haven't tested compositing and LED wall use, but these are stated as not quite ready anyway.

Houdini Workflow

80.lv: How did you generate the UFO and its patterns with Houdini? What algorithm did you use here?

Edward Dawson-Taylor: It's actually fairly simple in a way, which I love about it, as I also wanted to show how Houdini can be used without fear. Houdini is my favorite 3D software, and it has historically been really hard to learn. It's still a beast, but I wanted to create training that eases the way into it, and to use this project as a way to show how you can use it to great effect without the stress.

I worked on the shape of the UFO from two simple primitive spheres, with a simple deformation to squash the sphere into shape, and merged in the "Eye" of the UFO with a separate material that would help me assign different shaders in Unreal. Then, on the body of the UFO, I applied one of the amazing Labs tools called Lot Subdivision, which is meant for dividing geo into lots to make city maps. It works really well for adding detail like greebles on ships, and there is a lot of control.

Because everything is driven by attributes in Houdini, it was fairly easy to assign random values to the faces and use that information to drive extrusions all over it. So the result is a lot of detail with a very adjustable look, controlled by just a few sliders. I then projected UVs so the textures would work, and ran a few tests before exporting the geo as FBX and importing it into UE5. Then it was simply a matter of developing the shaders, enabling Nanite (a checkbox), and animating it in Sequencer.
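To make that attribute-driven idea concrete, here is a minimal Houdini Python sketch of the same pattern: a random per-face attribute feeding a PolyExtrude. It is not the project's actual network; the Labs Lot Subdivision tool is stood in for by a plain Attribute Wrangle, and the node and parameter names are assumptions that may need adjusting for your Houdini version.

```python
# Minimal sketch (not the project's exact network): a per-face random attribute
# driving extrusion depth, as a stand-in for the Labs Lot Subdivision setup.
# Node/parameter names are assumptions and may differ between Houdini versions.
import hou

obj = hou.node("/obj")
geo = obj.createNode("geo", "ufo_body")

# Start from a polygon-mesh sphere and squash it into a saucer-like profile
sphere = geo.createNode("sphere", "base_sphere")
sphere.parm("type").set("polymesh")
sphere.parm("rows").set(30)
sphere.parm("cols").set(60)

squash = geo.createNode("xform", "squash")
squash.setInput(0, sphere)
squash.parm("sy").set(0.3)  # flatten on Y

# Assign a random value to each face; this attribute will drive the extrusion
wrangle = geo.createNode("attribwrangle", "random_per_face")
wrangle.setInput(0, squash)
wrangle.parm("class").set("primitive")  # run over primitives
wrangle.parm("snippet").set("f@zscale = rand(@primnum) * 0.1;")

# Extrude every face; enable PolyExtrude's local Z scale attribute option in the
# parameter UI so the per-face 'zscale' value modulates the depth (the internal
# parm name for that toggle varies by version, so it is not set here).
extrude = geo.createNode("polyextrude::2.0", "greebles")
extrude.setInput(0, wrangle)
extrude.parm("dist").set(1.0)

extrude.setDisplayFlag(True)
geo.layoutChildren()
```

From there, a handful of controls (the sphere's row/column counts, the random scale in the wrangle, and the extrusion distance) are enough to dial the amount of detail up or down, which mirrors the "few sliders" workflow described above.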

Effects in the Scene

80.lv: Could you tell us about tweaking FX with Niagara and Materials? What effects did you prepare for the scene?

Edward Dawson-Taylor: FX-wise, the laser is actually driven by a material, which has an animatable texture in UV space. It's a simple cylinder with a material on it. I added some controls to the material using a Material Parameter Collection, which allows you to animate the speed and position of the noise-based transparency, so you can see the laser energy moving down and animating on at the right moment.
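As a rough illustration of how a Material Parameter Collection can be driven from outside the material graph, here is a small Unreal Python sketch; the asset path and the "LaserProgress" parameter name are invented for the example. In the shot itself, the values were simply keyed in Sequencer.

```python
# Hypothetical sketch: pushing a scalar into a Material Parameter Collection so a
# noise-based opacity mask slides along the laser cylinder. The asset path and the
# "LaserProgress" parameter name are examples, not the project's actual names.
import unreal

# Load the collection asset that the laser material reads from
mpc = unreal.load_asset("/Game/UFO/MPC_Laser")

# KismetMaterialLibrary mirrors the Blueprint node used to set MPC values
world = unreal.EditorLevelLibrary.get_editor_world()
unreal.KismetMaterialLibrary.set_scalar_parameter_value(world, mpc, "LaserProgress", 0.75)
```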

I then created a Niagara system for the debris because it seemed like something should react. I wanted to go way further, but one of my requirements was to stay under a day. Niagara is so flexible, and in UE5 it now has a much easier way to add an array of meshes to the renderer. You can also add a random number to the emission, so you can pump out a lot of variation easily. This is way easier than in a lot of other software. There is so much geo in there that I just chose suitable pieces from the city to fire out as debris and dialed it all in to make the scale work.

Camera Setup

80.lv: Please also discuss your camera setup with the HTC Vive. How did you use the device to get the needed result? Could you explain the workflow to beginners?

Edward Dawson-Taylor: This is also pretty straightforward. There are a few ways to do it, including Blueprints, but I wanted to show how simple it could be, so I used the Live Link XR plugin. Make sure you have your HTC Vive connected and calibrated. You then just have to enable the Live Link, LiveLinkXR, and Virtual Camera plugins. Then, from the Live Link window, add a LiveLinkXR source and check all of the HMD and controller boxes. It should then find any SteamVR controllers you have connected and list them below.

Once you have that, add a camera to the scene. In the components panel of the camera actor, click the green Add button and search for LiveLinkXR. Then click on that new component, and in the Details panel, choose the controller you want to use from the dropdown. Once this is connected, it should work in the editor, and if it's in the wrong place or orientation, you can flip the axes and offset the orientation and position in the Live Link window to make sure it is where you want it.

You can then simply add the camera to Take Recorder, hit Simulate to get the city moving, and hit Record in Take Recorder to capture the motion of the camera. After that, go to the Cinematics folder and find the Level Sequence that Take Recorder created in the dated folders. In a subfolder, you will find a subsequence that has the camera animation in it. Just copy that into your own sequence, and you will have the recorded animation you can use to render in Movie Render Queue.
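For anyone who would rather script that last render step, here is a hedged sketch of queuing the recorded Level Sequence into Movie Render Queue with Unreal's Python API; the sequence and map paths are placeholders, and the shot described here was rendered through the Movie Render Queue UI rather than from script.

```python
# Minimal sketch: render a recorded Level Sequence through Movie Render Queue.
# The asset paths are placeholders for whatever Take Recorder created in your project.
import unreal

subsystem = unreal.get_editor_subsystem(unreal.MoviePipelineQueueSubsystem)
queue = subsystem.get_queue()

job = queue.allocate_new_job(unreal.MoviePipelineExecutorJob)
job.sequence = unreal.SoftObjectPath("/Game/Cinematics/Takes/UFO_CameraTake")  # placeholder
job.map = unreal.SoftObjectPath("/Game/Maps/City_Sample_Map")                  # placeholder

config = job.get_configuration()
output = config.find_or_add_setting_by_class(unreal.MoviePipelineOutputSetting)
output.output_resolution = unreal.IntPoint(1920, 1080)
config.find_or_add_setting_by_class(unreal.MoviePipelineDeferredPassBase)
config.find_or_add_setting_by_class(unreal.MoviePipelineImageSequenceOutput_PNG)

# Kick off the render using the in-editor (PIE) executor
subsystem.render_queue_with_executor(unreal.MoviePipelinePIEExecutor)
```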

I rendered it several times until the random city activity produced a good take, and I kept the one I liked most. I added a second sun so I could get the right lighting on the sides of the buildings as well as under the ship, and I also cranked up the fog a bit to help sell the scale.
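For completeness, a tweak like that second sun and the fog boost could also be done from a short editor script; the sketch below is purely illustrative (the light intensity, rotation, and fog density values are arbitrary), and in the actual shot these were adjusted by hand in the editor.

```python
# Illustrative sketch: add a second "sun" and raise the fog density a touch.
# All values here are arbitrary examples, not the settings used in the shot.
import unreal

# Spawn a second directional light aimed at the building facades
sun2 = unreal.EditorLevelLibrary.spawn_actor_from_class(
    unreal.DirectionalLight,
    unreal.Vector(0, 0, 0),
    unreal.Rotator(roll=0.0, pitch=-35.0, yaw=120.0),
)
light = sun2.get_component_by_class(unreal.DirectionalLightComponent)
light.set_intensity(4.0)

# Crank up any existing exponential height fog to help sell the scale
for actor in unreal.EditorLevelLibrary.get_all_level_actors():
    if isinstance(actor, unreal.ExponentialHeightFog):
        fog = actor.get_component_by_class(unreal.ExponentialHeightFogComponent)
        fog.set_fog_density(0.05)
```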

Challenges

80.lv: How much time did it take to finish the project? What were the main challenges? Was there some kind of a bottleneck in UE5 that was slowing you down? 

Edward Dawson-Taylor: It took just over half a day to complete the construction of the spaceship, lighting, lookdev, animation, and FX. My friend Jamesen Re did the sound design after that. The main challenge, I think, was having control over the city animation. I know it can be controlled more precisely, but instead of digging deeper into it, I just ran a lot of takes until it looked right.

The animation from the camera was a bit jerky, which could be an issue that needs fixing in UE5, but I managed to smooth most of it out with animation filters. Other than that, it was really a joy to make. I think if I had used my desktop it would have been even smoother, but I wanted to see what the laptop could do, which is an i9, 64 GB, 3080. The fact that I could do something like this in such a short time is a testament to how little slowed me down. It's a very exciting time in CG!

Edward Dawson-Taylor, Unreal Authorized Instructor partner for Epic Games

Interview conducted by Arti Burton
