Using Virtual Cameras In GameDev & Filmmaking

Matt Workman helped us understand what a virtual camera is, how it works, and how it can help game developers and filmmakers. 

Using a virtual camera means tracking a CG camera to a real-world camera rig that you can operate. Essentially, it lets live-action filmmakers control a CG camera as they would in the real world, for a more traditional cinematic feeling. How can it be used for games, you might ask? God of War, for example, was directed almost entirely with a handheld virtual camera and blended together in real-time.

Matt Workman, a live-action cinematographer and software developer, joined us to explain how it works, how you can set it up, what you need in terms of hardware, and more.

Introduction

My name is Matt Workman, and I’m a live-action cinematographer and software developer.  As a DP, I’ve shot music videos for Justin Bieber, 50 Cent, Diddy, etc., and commercials for Facebook, Google, BMW, Estee Lauder, etc.  As a developer, I created a cinematography simulator called Cine Tracer using Unreal Engine that is used by filmmakers around the world today and is available on Steam.

About the Career Path

I got into cinematography by shooting music videos for indie artists in NYC and slowly worked my way up to shooting for big labels and, eventually, moved into shooting commercials.  I got started with motion control (programmable robot machines) and ended up shooting some commercials that involved motion capture, virtual production, and virtual cameras.

A few years later, I was building my own virtual cameras in Unreal Engine and building them into Cine Tracer.

What Is a Virtual Camera?

“Virtual camera” or “VCam” is when you parent and track a CG camera to a real-world camera rig that a human is operating. A common example is tracking a handheld camera for kinetic, cinematic action footage. At the end of the day, it allows live-action filmmakers to control a CG camera like they would in the real world. In the right hands, the tool gives CG footage a much more traditional cinematic feeling.

About the Hardware

The easiest way to try a virtual camera is to use Unreal Engine’s free “Virtual Camera Plugin” and then use an iPad with their free URemote app.  This allows you to use the iPad to control the position and rotation of a CineCamera in Unreal Engine. You can change the focal length and even record the camera motion “take” to be edited later.

UE4 Virtual Camera Plugin Documentation

Another affordable approach is to use an HTC Vive Controller or Vive Tracker and attach it to a real-world camera rig.  It’s very straightforward to use SteamVR and UE4 together to make a Virtual Camera that way.

At a professional level, there are dedicated enterprise camera tracking solutions from companies like Mo-Sys, NCam, Stype, Vicon, Optitrack, etc.  These are very high-quality and production-ready solutions that are reliable, but expensive.

My goal with Cine Tracer is to make Virtual Production tools accessible to everyday filmmakers, and if you have a gaming PC or laptop and an HTC Vive, you can be experimenting with VCam in minutes.

The Setup Process in UE4

With UE4, you can use the Virtual Camera Plugin, which is free, and download the free URemote app for your iPad. There are tutorials online that walk through the process step by step.

For a Vive VCam, you simply ask for the position and rotation of the tracked device (the Vive Controller or Vive Tracker) and then set a CineCameraActor to that position and rotation. This is all native UE4 functionality using the included SteamVR plugin, and it’s the same approach you would use to write a normal VR game in UE4.
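As a rough illustration of that loop, here is a minimal UE4 C++ sketch, assuming the SteamVR plugin is enabled. The class name AVCamDriver and its TrackedCamera property are hypothetical names for this example; in practice, the same logic is often built in Blueprints:

```cpp
// Minimal sketch of a Vive-driven VCam in UE4 C++ (SteamVR plugin enabled).
// "AVCamDriver" and "TrackedCamera" are illustrative names, not engine API.

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "CineCameraActor.h"
#include "SteamVRFunctionLibrary.h"
#include "VCamDriver.generated.h"

UCLASS()
class AVCamDriver : public AActor
{
    GENERATED_BODY()

public:
    AVCamDriver() { PrimaryActorTick.bCanEverTick = true; }

    // The CineCameraActor in the level that should follow the tracked device.
    UPROPERTY(EditAnywhere)
    ACineCameraActor* TrackedCamera = nullptr;

    virtual void Tick(float DeltaSeconds) override
    {
        Super::Tick(DeltaSeconds);
        if (!TrackedCamera) return;

        // Vive Trackers report as "Other"; use Controller for a Vive wand.
        TArray<int32> DeviceIds;
        USteamVRFunctionLibrary::GetValidTrackedDeviceIds(
            ESteamVRTrackedDeviceType::Other, DeviceIds);

        FVector Position;
        FRotator Rotation;
        if (DeviceIds.Num() > 0 &&
            USteamVRFunctionLibrary::GetTrackedDevicePositionAndOrientation(
                DeviceIds[0], Position, Rotation))
        {
            // Drive the CG camera with the real-world rig's pose each frame.
            TrackedCamera->SetActorLocationAndRotation(Position, Rotation);
        }
    }
};
```

Note that SteamVR reports the pose relative to the tracking origin, so you would typically apply an offset or parent transform if your level’s origin doesn’t line up with your physical play space.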

You can use any scene that you would normally use for game development. Megascans is free with UE4, so you can whip up a photorealistic forest or mountain scene pretty quickly and then use the VCam as if it were a real location.

Some people record the camera tracking data to Sequencer or Take Recorder in UE4. Others, like me, record the footage directly from the viewport, and that is the final product. That approach basically treats UE4 like the camera: I just get footage at the end of the session to edit normally.

Combining Virtual Cameras and Mixed Reality

The next step up in Virtual Production from VCam is matching the virtual camera to a real-world camera. So, in addition to position and rotation, we need to match a real sensor and lens. The end effect allows you to film a real person (against a green screen), put them perfectly into a virtual environment, and see the result in real-time. You can then move the camera freely because everything is tracked. You can now track the camera and key the footage natively in Unreal Engine.
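To give a sense of the sensor-and-lens half of that matching, here is a small UE4 C++ sketch that sets a CineCamera component’s filmback and lens to the measured values of a physical camera. The numbers are placeholders (roughly a Super 35 sensor with a 35mm prime), not values from the interview:

```cpp
#include "CineCameraComponent.h"

// Match the CG camera's sensor and lens to the physical camera being tracked.
// Substitute the real measurements of your camera body and lens.
void MatchPhysicalCamera(UCineCameraComponent* CineCam)
{
    if (!CineCam)
    {
        return;
    }

    // Filmback = sensor dimensions in millimeters.
    // (The property is named FilmbackSettings in older UE4 versions.)
    CineCam->Filmback.SensorWidth = 24.89f;
    CineCam->Filmback.SensorHeight = 18.66f;

    // Lens: focal length in mm and aperture as an f-stop.
    CineCam->CurrentFocalLength = 35.f;
    CineCam->CurrentAperture = 2.8f;
}
```

With the sensor and focal length matched, the CG camera’s field of view lines up with the real lens, which is what lets keyed green-screen footage sit correctly in the virtual environment.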

Mixed reality is done in broadcast all the time, and the next steps after that are Augmented Reality and then LED-wall virtual production like on The Mandalorian.

The Biggest Challenges

Virtual Camera is pretty straightforward because everything is CG. The main thing you want to consider is the quality of the tracking data, so it’s not jumpy or laggy. With the iPad (ARKit) and Vive tracking solutions, you generally want a well-lit room that doesn’t have windows, mirrors, or black shiny surfaces. The two use different tracking technologies, but both are “confused” by those features.

To avoid lag, you want to output the viewport preview directly from the GPU through HDMI. There is very little latency that way.

Finally, you may want to implement some basic motion and rotation smoothing if that fits the type of camera work you are trying to capture. From there, it’s business as usual: executing good camera work is universal as far as composition and movement are concerned.
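One simple way to implement that smoothing, offered as a sketch rather than Cine Tracer’s actual code, is to interpolate the applied pose toward the raw tracked pose each frame using UE4’s built-in FMath helpers; SmoothingSpeed is an illustrative tunable:

```cpp
#include "CoreMinimal.h"

// Frame-rate-aware smoothing of the tracked pose before it is applied to the
// CG camera. Higher SmoothingSpeed follows the rig more tightly; lower values
// feel more like a stabilized, "floaty" camera.
void SmoothTrackedPose(
    const FVector& RawPosition, const FRotator& RawRotation,
    FVector& AppliedPosition, FRotator& AppliedRotation,
    float DeltaSeconds, float SmoothingSpeed = 8.f)
{
    // Each frame, move a fraction of the remaining distance to the raw pose.
    AppliedPosition = FMath::VInterpTo(
        AppliedPosition, RawPosition, DeltaSeconds, SmoothingSpeed);
    AppliedRotation = FMath::RInterpTo(
        AppliedRotation, RawRotation, DeltaSeconds, SmoothingSpeed);
}
```

Call it every Tick with the raw tracker pose, then set the CineCameraActor from AppliedPosition and AppliedRotation instead of the raw values.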

On Using Virtual Cameras in Gamedev

Game cinematics have used virtual cameras for a long time: ever since actors were motion-captured, cameras have been tracked as well. God of War, for example, was basically captured in its entirety with a handheld virtual camera and perfectly blended together in real-time.

UE4 in Filmmaking

Unreal Engine has been diligently building tools to make Virtual Production (including VCam) free and part of its toolset. I was the Director and DP on the SIGGRAPH 2019 LED Virtual Production demo that showed a small-scale version of how The Mandalorian was shot. That exact tech is now part of the engine for free or will be added in future releases.

It’s getting easier and more affordable to experiment with and learn virtual production because of Unreal Engine, and I believe we will see a Virtual Production Revolution where productions of all scales use it and innovate new ways of producing images and telling stories.

Matt Workman, Cinematographer

Interview conducted by Arti Sergeev
