The War of The Worlds: Virtual Production with Unreal Engine

Chris Farkouh talked about the production process of The War of The Worlds, a short cinematic based on H.G. Wells' novel, and discussed his learning process at CG Spectrum.

Introduction

My career in entertainment goes back to when I discovered a passion for theatrical lighting as a teenager. My first job was as a production electrician for theatre and live events, which became the catalyst for completing an MA in Theatre Practice, specializing in lighting design, at The Royal Central School of Speech and Drama in London. I was then lucky enough to work as a professional theatrical lighting designer for some years.

I later stepped up to production manager for live events, working at HEART | Productions and becoming the agency’s technical director after a short while. We were one of the production companies that delivered film premieres in London, and I will always remember the Twilight: Breaking Dawn Part 2 UK Premiere because of the scale of the technical production across the three largest cinemas in London’s Leicester Square.

In 2013, we won the contract to deliver the technical production for the Coca-Cola ‘Be Active’ Pavilion at the Sochi Winter Olympics. I ran this large project as the technical project lead, including working for 2 months at the Olympic Park in Sochi.

In 2016, I became a freelance production manager and delivered the scenic fabrication for the main stage of Tomorrowland: The Elixir of Life for Brilliant Stages. In this role, I worked with some of the world’s greatest scenic artists and design engineers in the special events industry. More recently, I have been working for a brand experience agency, Jack Morton Worldwide, as a technical production manager, and I was onsite delivering an exhibition project at Mobile World Congress in Barcelona just as the Covid-19 pandemic arrived in Europe.

Background

Regarding my interest in moving into Realtime Production, I had started to be impressed by the fidelity of the games being released on console from around 2015. The first game that blew me away was Uncharted 4: A Thief’s End by Naughty Dog. I remember being captivated by the storyline, the gameplay, and the visual fidelity of the worlds the game asked me to explore.

However, the game that really pulled me into Realtime Production was Until Dawn by Supermassive Games. As far as I could see, this wasn’t just a game; it was a high-fidelity Hollywood production made within a game engine and sliced up into interactive events and narrative branches. I recognized it as an unusual next-generation product, and I became fixated on figuring out the technological and creative process required to make it.

Joining CG Spectrum

So this was my mandate to re-enter education, and in July 2020 I decided to enroll at CG Spectrum, having spent 3 months scaffolding into it via self-study in Unity and C#. I weighed up the options of joining CG Spectrum or the NFTS MA in Game Design and Development. I decided to join CG Spectrum because I thought it would offer more up-to-date, targeted game production training designed by actively working AAA personnel.

I entered CG Spectrum with the specific mandate to learn how to make an Interactive Drama. I met with Game Design Head, Troy Dunniway, explained my position, and he was happy to invite me onto the game design course. Knowing that I was going to be designing a game in the first term, I decided to bring The War of The Worlds into the CG Spectrum framework. I did this partly because I am from the town in England where the book was written, so I felt it was something I could make very authentically, and partly because it is a highly recognized piece of literature in the UK.

The course material was well suited to my aspiration of creating a substantial AAA game (it is focused more on FPS and adventure games but designed to be flexible enough to support alternative genres as well). During this time, I began constructing a microsite that a studio could use to host the design material teams need to successfully go and make a game.

By December 2020, Troy had given me an in-depth understanding of the process required to design a AAA game. I had also mentioned by this point that I envisaged my final output at the school would be a cinematic for The War of The Worlds. Simultaneously, we had established that my creative passion in games may well be more aligned with a Technical Artist role than a Game Designer, and on this basis, he recommended that I move into the Art track at the school.

I then had the pleasure of working with Clinton Crumpler, learning the process of hard surface modeling and texturing for games. This part of the course was much more tools-focused, and I was working across Maya, Marmoset Toolbag, Substance 3D Painter, and Unreal Engine. Clinton has in-depth knowledge and skills in both the technical and creative aspects of game art production, which at times can feel like a bit of a black art, and he is very willing to share as much of his knowledge as he can in the time you spend with him, turning you into a professional game artist!

It was at this time, in April 2021, that I began preparing the Martian Tripod in Maya. I had also now created the framework for the cinematic, and with a full scope of work, I decided to add a 4th term to my CG Spectrum education to have the time to complete the project. I then transferred over to Virtual Production mentor Deepak Chetty, who was the best-placed mentor in the school to support me in delivering the project.

Between May and October 2021, I worked to a specific schedule that I produced, primarily in Unreal Engine. I was simultaneously making a game prototype based on the Unreal Production teaching materials created by Simon Warwick and producing The War of The Worlds cinematic with Deepak, roughly split into 2 days on the game and 3 days on the cinematic.

The way the Virtual Production course is designed, Simon gives you the framework, knowledge, and tools to create a quality game prototype, but there is enough creative freedom to make the project your very own. I built on his teaching materials to create a fully working prototype with a strong emphasis on learning technical art, animation, and Blueprint interactivity in UE4.

The War of the Worlds Project

Meanwhile, regarding The War of The Worlds cinematic, I had the idea of conveying the journey of the Martian invaders from Mars to Victorian London, and I selected a soundtrack that I felt conveyed the emotional tone as the initial basis of the cinematic. With the soundtrack in place, I looked for shots that would help convey the story – from within the BBC series, across the internet, and by exploring assets I could purchase on the Unreal Engine Marketplace.

I had also already identified the most important frame whilst planning the game: the scene where The Narrator looks up at the Tripod towering over the church. It is a really nice piece of emotional cinematic storytelling because you feel the sense of helplessness looking up at the gigantic Martian machine.

I then turned the blueprint for the trailer into a shot breakdown and started to build it into a schedule of work. Deepak and I agreed to put the most difficult frames into production first, and we also decided to use source control, so I set the project up on a cloud-based Perforce server, both as a backup and so that we could both have access to the project.

The most complex part of the entire production was always going to be the creation of the Martian Tripod. For this part, I bought a model from TurboSquid, imported it into Maya, retopologised the geometry to reduce the polycount, and simplified some of the parts. Then I textured it using a trim sheet and gave it a skeletal rig in Maya and an FK rig in Unreal so I could animate it. All these processes for making the Martian Tripod character took approximately 5 weeks to complete, which is nearly a third of the entire production time.

Virtual Production in Unreal

The project was made using Unreal Engine 4.26. Prior to starting this project, I had about 3 months of experience making small game prototypes with Unity, and about 6 weeks of experience with Unreal Engine, working across the core front-end components of the engine: The Editor, Blueprints, and Sequencer.

For Virtual Production projects, Sequencer is the primary toolset for generating your animated content. It is a tool designed to deliver in-game cinematics, and it effectively turns the game engine into a 3D nonlinear editing tool (NLE) as well as an interactive software production tool. What this means is that you don’t need to think of Unreal Engine as just a game or interactive toolset that requires an understanding of Blueprint logic or C++ coding. Instead, you can sequence an entire Virtual Production along an intuitive timeline editing tool without ever writing a single line of code, and then quickly render the content out as pre-rendered video files.

Unreal Engine has a huge emphasis on motion picture cinematography, and the cinematic camera and post-processing systems have a high level of precision that has been transposed directly from real-world filmmaking techniques.

There are also a large number of additional features that ship with Unreal Engine designed with Virtual Production workflows in mind, many of which were used for this project.

  • Virtual Cameras. Some of the scenes are shot using the virtual camera system. I attached an HTC Vive wand to a DSLR camera rig and used an iPhone X as a preview monitor to view the camera viewport inside the virtual world. I also built a small tool using Blueprints to make the camera behave like a Steadicam, and used a game controller to position myself accurately inside the virtual world and adjust the aperture and focus of the camera before taking a shot (a rough C++ sketch of this idea follows this list).
  • Take Recorder. The camera movement is then recorded using Take Recorder, a feature that allows you to record content onto a timeline and then bring it back into your main timeline. So in this example, the positional data and movement of the camera are recorded and then placed back into the main timeline sequence.
  • Live Link. This is the feature that enables both Virtual Cameras and Facial Capture. Using an application that you can download onto an iPhone, it is possible to record your facial movements and apply them to characters inside Unreal Engine via Take Recorder.
  • MetaHuman Creator. This is a website that allows you to create custom characters that you can import into Unreal Engine. These characters are ultra-realistic and have advanced facial animation rigs that you can either animate using Facial Capture via Live Link or keyframe animate using Control Rig.
  • Control Rig. This is a feature that enables you to rig skeletal meshes inside the engine and then keyframe animate them using Sequencer. The MetaHumans use this feature so that you can hand-key them within Unreal. I also used this feature to create an FK rig to animate the Martian Tripod.
  • Quixel Megascans. This is an online database of real-world assets that are 3D scanned and manually processed to be used inside 3D content creation tools as either 3D assets or textures. The assets are free to use in Unreal Engine and Quixel Bridge has subsequently become a fully integrated feature in UE5.
  • Movie Render Queue. This is a new high-quality rendering feature designed with professional Virtual Production pipelines in mind. It delivers high-quality anti-aliasing and motion blur, enabling very high-quality pre-rendered content to be produced in Unreal, and it can output in EXR format, which is the go-to file format for high-end computer graphics.
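
As a rough illustration of the virtual camera tool mentioned above, here is a minimal C++ sketch of the kind of smoothing and lens control a "Steadicam" helper might apply. The original tool was built in Blueprints; the class, property, and parameter names below are illustrative assumptions, not the project's actual implementation.

```cpp
// Hypothetical sketch: smoothing a tracked virtual camera for a "Steadicam" feel,
// plus simple focus/aperture nudges that could be bound to a game controller.
// Requires the CinematicCamera module. Names are illustrative, not from the project.
#include "CoreMinimal.h"
#include "CineCameraComponent.h"
#include "GameFramework/Actor.h"
#include "VirtualSteadicam.generated.h"

UCLASS()
class AVirtualSteadicam : public AActor
{
    GENERATED_BODY()

public:
    // Cine camera that Sequencer / Take Recorder will record.
    UPROPERTY(VisibleAnywhere)
    UCineCameraComponent* CineCamera;

    // Raw tracked pose fed in each frame (e.g. from the Vive wand via Live Link).
    UPROPERTY(EditAnywhere, BlueprintReadWrite)
    FTransform TrackedTransform;

    // How quickly the smoothed camera catches up with the tracker.
    UPROPERTY(EditAnywhere)
    float SmoothingSpeed = 4.f;

    AVirtualSteadicam()
    {
        PrimaryActorTick.bCanEverTick = true;
        CineCamera = CreateDefaultSubobject<UCineCameraComponent>(TEXT("CineCamera"));
        RootComponent = CineCamera;
    }

    virtual void Tick(float DeltaTime) override
    {
        Super::Tick(DeltaTime);

        // Ease toward the tracked pose instead of copying it 1:1,
        // which damps hand jitter and gives the "Steadicam" feel.
        const FVector SmoothedLocation = FMath::VInterpTo(
            GetActorLocation(), TrackedTransform.GetLocation(), DeltaTime, SmoothingSpeed);
        const FRotator SmoothedRotation = FMath::RInterpTo(
            GetActorRotation(), TrackedTransform.GetRotation().Rotator(), DeltaTime, SmoothingSpeed);
        SetActorLocationAndRotation(SmoothedLocation, SmoothedRotation);
    }

    // Called from input bindings (e.g. gamepad triggers) to ride focus and aperture.
    UFUNCTION(BlueprintCallable)
    void AdjustFocus(float DeltaDistance)
    {
        CineCamera->FocusSettings.ManualFocusDistance += DeltaDistance;
    }

    UFUNCTION(BlueprintCallable)
    void AdjustAperture(float DeltaFStop)
    {
        CineCamera->CurrentAperture = FMath::Clamp(CineCamera->CurrentAperture + DeltaFStop, 1.2f, 22.f);
    }
};
```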

Production Process

In order to deliver a project of this scale in the timeframe, I knew that I would be working with a mixture of free, purchased, and hand-built assets. Once I had the shot breakdown I was able to start building a list of assets that I needed to acquire or make.

Unreal Engine has a large marketplace of professional quality art, animation, VFX, and gameplay assets you can purchase. I also used Quixel Megascans, TurboSquid, CGTrader, and KitBash3D assets. Autodesk Maya and Adobe Substance Painter were the digital content creation tools that I used to pipeline content into the engine.

Most of the scenes are created using the principles of Virtual Set Design. This is a virtual production technique transposed directly from live-action: you only build the scene based on what the audience sees, in this case from the viewpoint of the camera. This is different from game production, where you build 3D virtual worlds that can be explored by a player to complete missions and quests.

As an example of my working process, Horsell Common was planned to be the most complicated real-world scene to produce. It needed to be designed as an amphitheater with shots from 2 different camera angles. 

I created the landscape by forming a plane in Maya into a landscape topology, then textured it using a Megascans Blend Material within Unreal, which allows you to paint and blend 3 textures together. I populated the scene with realistic pine trees purchased from the UE Marketplace and added additional rocks, trees, scatter, and decals using Megascans and the UE foliage painting tool. I built the Martian Capsule in Maya and textured it using Substance Painter, before lighting it in Unreal so it appears to have an ethereal green and gold quality.

However, to make the scene believable, many more tricks are required. Behind the perimeter of the trees is a billboard: a photograph taken at Horsell Common. There is also a perimeter fog card that uses a PNG fog texture which slowly animates and fades out where it meets the ground. This is done using Unreal Engine’s material editor system, which is an incredibly powerful tool for creating film-quality materials and effects to apply inside your virtual productions. I also built an animated fog VFX using the material editor and Unreal’s Cascade system. This is a customizable fog cloud whose size, density, and color you can adjust, and it gives the environment a more magical appearance.
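
To give a sense of how a fog card like this can be exposed for per-shot tweaking, here is a hedged C++ sketch that wraps the card's material in a dynamic instance and sets a couple of scalar parameters. The actor class and the parameter names ("PanSpeed", "Density") are illustrative assumptions; in the project itself the panning and ground fade were authored in the material editor rather than in code.

```cpp
// Hypothetical sketch: driving a fog-card material from an actor. The panning
// and ground fade live in the material graph; here we only assume the material
// exposes scalar parameters named "PanSpeed" and "Density" (illustrative names).
#include "CoreMinimal.h"
#include "Components/StaticMeshComponent.h"
#include "GameFramework/Actor.h"
#include "Materials/MaterialInstanceDynamic.h"
#include "FogCardActor.generated.h"

UCLASS()
class AFogCardActor : public AActor
{
    GENERATED_BODY()

public:
    UPROPERTY(VisibleAnywhere)
    UStaticMeshComponent* FogCard;

    UPROPERTY(EditAnywhere)
    float PanSpeed = 0.02f;   // how fast the fog texture drifts

    UPROPERTY(EditAnywhere)
    float Density = 0.35f;    // overall opacity of the card

    AFogCardActor()
    {
        FogCard = CreateDefaultSubobject<UStaticMeshComponent>(TEXT("FogCard"));
        RootComponent = FogCard;
    }

    virtual void BeginPlay() override
    {
        Super::BeginPlay();

        // Wrap the assigned fog material in a dynamic instance so the exposed
        // parameters can be tweaked per shot without editing the asset itself.
        if (UMaterialInstanceDynamic* MID = FogCard->CreateAndSetMaterialInstanceDynamic(0))
        {
            MID->SetScalarParameterValue(TEXT("PanSpeed"), PanSpeed);
            MID->SetScalarParameterValue(TEXT("Density"), Density);
        }
    }
};
```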

For creating the characters and the animations, there are a number of techniques used:

The Victorian characters were made by creating three characters in MetaHuman Creator and then importing them into Unreal. At the same time, I created three Victorian characters using Daz Studio and imported them into Unreal. The black art is that you then attach the MetaHuman and the Daz characters together and hide the parts you don’t want the audience to see. The face is animated using the MetaHuman facial rig and the body is animated using the Genesis skeleton that comes with Daz characters. By attaching the MetaHuman head to the neck bone of the Daz character, the head automatically moves correctly with the motions of the body.
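
The attachment step described above essentially boils down to snapping one skeletal mesh onto a bone of another. Below is a minimal C++ sketch of that idea, assuming a face mesh and a body mesh are already spawned; the bone name "neck" is a placeholder (the actual Genesis bone name may differ), and the project performed this attachment in the editor and Blueprints rather than in code.

```cpp
// Hypothetical sketch of the head-to-body attachment: the MetaHuman face mesh
// is parented to a bone/socket on the Daz body so it follows the body animation.
#include "CoreMinimal.h"
#include "Components/SkeletalMeshComponent.h"

static void AttachFaceToBody(USkeletalMeshComponent* MetaHumanFace,
                             USkeletalMeshComponent* DazBody)
{
    // Snap the face mesh onto the body's neck bone; from here on the facial
    // rig animates the face while the Genesis skeleton drives the body.
    MetaHumanFace->AttachToComponent(
        DazBody,
        FAttachmentTransformRules::SnapToTargetNotIncludingScale,
        TEXT("neck")); // placeholder bone name
}
```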

Most keyframe animation is done using Sequencer, so, for example, if you want to move or rotate an object, you adjust the keyframes along the timeline. This is also how you can animate the movement of the cameras, and there is a curve tool that gives you very fine control of the animation and the easing in and out.

Motion capture is retargeted onto the Genesis skeleton using animations from Mixamo or ActorCore. For character walk cycles, I used motion capture applied to the characters in Sequencer and then plotted their movement within the virtual world. Facial animation was done either using Live Link Facial Capture or by hand-keying the MetaHuman blendshapes using Control Rig. I was getting mixed results from Live Link during production and found I needed to hand-key one of the characters for the frame to be of a high enough quality.

For the Tripod, there are two methods of animation. In the scene with the Tripod and the Church, the character is animated by hand in Sequencer. For the shots of the Tripods walking around London, I used a plugin that procedurally animates the Tripod: you play the Tripod like a 3rd-person game character using a game controller and record the animations in Take Recorder.

Many of the atmospheric FX are hand-built in the material editor and adjusted in the scene. For example, the vapour trail in the sky over Horsell Village is created by blending a static vapour trail PNG with two panning and rotating cloud textures. The animation speed, colour, opacity, and emissive brightness are all adjustable to suit the composition of the frame.

I also wanted to highlight the procedural vein growth effect. For the red weed that grows over the statue, I always expected this to be the hardest effect to pull off without learning an FX tool like Houdini. I researched a number of ways to do this, but in the end, I created a red rust effect that travels across the body of the statue, and I also found a very unusual tool on the UE Marketplace that procedurally generates red veins around a static mesh.

Once I was happy with the general appearance of the veins, I transferred the geometry back into Maya to clean it up before reimporting it into Unreal. The material applied to the geometry has a gradient mask applied to the opacity channel. You can keyframe the value of this mask in Sequencer, which means you can create the appearance of a weed growing across the object.
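
As a sketch of the same growth idea expressed in code rather than Sequencer keys, the snippet below drives a hypothetical "GrowthAmount" scalar parameter from 0 to 1 over a set duration, revealing the veins along the gradient mask. The class and parameter names are illustrative assumptions; the project keyframed this value in Sequencer.

```cpp
// Hypothetical sketch: sliding a gradient opacity mask over time so the red
// weed appears to grow across the statue. "GrowthAmount" is an illustrative
// parameter name on the weed material, not the project's actual parameter.
#include "CoreMinimal.h"
#include "Components/StaticMeshComponent.h"
#include "GameFramework/Actor.h"
#include "Materials/MaterialInstanceDynamic.h"
#include "RedWeedActor.generated.h"

UCLASS()
class ARedWeedActor : public AActor
{
    GENERATED_BODY()

public:
    UPROPERTY(VisibleAnywhere)
    UStaticMeshComponent* WeedMesh;

    UPROPERTY(EditAnywhere)
    float GrowthDuration = 8.f;  // seconds for the weed to fully cover the statue

    ARedWeedActor()
    {
        PrimaryActorTick.bCanEverTick = true;
        WeedMesh = CreateDefaultSubobject<UStaticMeshComponent>(TEXT("WeedMesh"));
        RootComponent = WeedMesh;
    }

    virtual void BeginPlay() override
    {
        Super::BeginPlay();
        WeedMID = WeedMesh->CreateAndSetMaterialInstanceDynamic(0);
    }

    virtual void Tick(float DeltaTime) override
    {
        Super::Tick(DeltaTime);
        if (WeedMID && Elapsed < GrowthDuration)
        {
            Elapsed += DeltaTime;
            // Slide the opacity gradient mask from hidden (0) to fully grown (1).
            WeedMID->SetScalarParameterValue(
                TEXT("GrowthAmount"), FMath::Clamp(Elapsed / GrowthDuration, 0.f, 1.f));
        }
    }

private:
    UPROPERTY()
    UMaterialInstanceDynamic* WeedMID = nullptr;

    float Elapsed = 0.f;
};
```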

Cinematography

Reflecting on the project, I would say that the wonderful thing about using Unreal Engine for Virtual Production is that you are able to single-handedly produce large-scale theatrical productions to an incredibly high standard with an intermediate level of training and experience. I have proven this by making this cinematic.

I remember at the start of the process I was unsure how to produce the scene with the Tripods walking across London. Deepak gave me strong technical direction on how I could produce this scene. The Tripod animations were procedural and recorded by using the engine as a game production tool, then transferred into a linear timeline. This is a very unusual but efficient way of producing animated content that would never have been possible without Unreal Engine being simultaneously a game and timeline animation tool. As for the cityscape in the background, this was constructed using a traditional VFX technique of combining a matte-painted sky with an edited panoramic photograph of a low-rise city. The film damage, lens dirt, vignette, and color correction were added as post-processing effects in DaVinci Resolve. By combining these VFX techniques, this scene became one of the easier scenes to produce across the cinematic.

Conclusion

Although I had no background as a filmmaker prior to creating this cinematic, I think that having a background in theatre production helped during production. In particular, theatrical lighting, set design, and cinematography are quite similar disciplines, as they are the departments that define what the audience sees and the emotional qualities of that visual composition.

I think the complexity of taking virtual production work in Unreal Engine from an intermediate to an advanced level grows immeasurably, and to take work of this scale to the next level, I would need to divide my projects up into their respective CG art disciplines, both within Unreal (e.g. Environment, Technical Art, Animation, Cinematics, VFX) and across the tools and processes that pipeline into the engine (e.g. Concept Art, Storyboarding, 3D Modelling, Texturing, FX, Performance Capture).

Chris Farkouh, Technical Producer

Interview conducted by Arti Sergeev
