A Look at Indie Filmmaking with Mocap and Real-Time Tech

Ryan Garry, Film Director and Founder of Unlimited Motion Ltd, discussed the use of mocap and VR technologies in film production and talked about his micro-budget motion capture feature film Anghenfil.

Introduction

I'm Ryan Garry, and I run a motion capture studio in the UK called Unlimited Motion Ltd. We work on a load of projects both in this country and all over the world – mainly immersive and animation content – but I myself have a background in filmmaking.

I'm a director but have worked in many other roles on a variety of short films (narrative and documentary) around the North West and Wales. Working across roles like this gives you a well-rounded view of the entire production process. That experience is very handy when making an indie film: even when you're not the one holding the camera or the boom pole, you've got to know how to overcome the challenges that crop up on every shoot.

In the past, I've also been involved with organising networking events in Liverpool for local filmmakers. And now? I'm making a feature film called Anghenfil – the world's first micro-budget motion capture feature film.

Anghenfil: About the Project

Anghenfil (translate that for a little easter egg) is the world's first micro-budget motion capture feature film. Because of the extra cost involved with mocap for film (like compositing the rendered image), it's generally not used in indie projects – and certainly not to the degree we're using it here, with the mocap character as a main focus.

I use the 2017 film War for the Planet of the Apes as a comparison – it's the film that inspired me to create Anghenfil, in fact. The modern Planet of the Apes series set out (and succeeded!) to integrate motion capture performers into the live-action filming, using virtual production and limb extensions to get as close as possible to actually having apes on set. That way, the live-action actors can give a much better performance because they're acting opposite another person, rather than a ball on a stick (think Dobby in Harry Potter). Pirates of the Caribbean: Dead Man's Chest was another film that made incredible use of on-set motion capture (the technical details behind that are quite amazing, too).

I'm using these technologies now in the production of a micro-budget film – applying the skillset I offer through my business to create a 'real' virtual character, and filming as if there were an actual monster on set. We're also incorporating virtual production to make the on-set experience even better – live-streaming mocap data from the performer to the CG character so I can see what the final render is going to look like and direct the performance from there.

As for the storyline, I wanted to keep it simple but compelling – a high-concept film like Jaws. Anghenfil is similar in its overall plot: it tells the story of a legendary monster, long thought slain, that returns to life and terrorises the small British town of Anghenfil. It's up to two residents of the town – unlikely heroes – to take the monster down again.

While the main plot is straightforward, the film comes alive with the performances, soundtrack, and visuals. I think it's really important not to get overwhelmed by the technology on a film like this – if you do, there's a risk you forget to make a film that's actually interesting. A project like that might be interesting to study, but not to watch.


Advantages of Real-Time Workflow

My workflow started out as quite a standard one: storyboarding a scene, filming it, taking the footage back, then processing it with offline tools like Blender and After Effects.

These are great tools, but what made a huge difference to my workflow (and is, in a way, the reason I'm giving this interview) was the 2018 Reflections demo – a tech demo showing off, in Unreal Engine, the real-time ray tracing that NVIDIA was putting into its RTX 20-series cards. It was a great showcase of Unreal Engine's real-time capability – used there for an animated short, whereas here, of course, it's integrated with live-action shooting.

The fact that you can now import assets into a game engine and have them animated in real time is amazing, and it lets you make so many more creative decisions on set. It also shrinks the post-production period, as you've already made a lot of your choices while filming. What I'd be really excited to see next is using everything shot on set as a final (or almost final) render – I think this might be the next step in virtual production. Indeed, this is already done when shooting against LED walls.

Probably the most important factor in speeding up production and post-production is real-time VFX. In the case of motion capture, you can stream a performance (both facial and body capture) live to your mesh. That mesh can be displayed either on a monitor (disconnected from the environment) or – as I've done – live in the environment with a 3D tracked camera:

This shoot was purely for promotional purposes (so it doesn't use the final monster mesh from the film), but it gives you an idea of what is now possible on a small budget – compared to very expensive technologies like Simulcam, which was used in the production of Avatar in 2009.
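
To give a rough idea of how this kind of live streaming can be wired together on a small budget, here's a minimal Python sketch. It isn't the actual pipeline used on Anghenfil – the port number, packet format, and joint names are all hypothetical – but it shows the shape of the idea: the suit software broadcasts one pose per frame over the network, and a listener applies each pose to the CG character as it arrives.

    # Minimal sketch of live mocap streaming: a performer's suit broadcasts
    # joint rotations over the network, and a listener applies them to a CG
    # character every frame. The port and packet format here are hypothetical -
    # a real pipeline would use the vendor's own protocol (e.g. a Live Link
    # source inside the game engine).
    import json
    import socket

    MOCAP_PORT = 9763  # hypothetical port the suit software streams to

    def apply_pose_to_character(pose: dict) -> None:
        """Stand-in for 'drive the CG character' - here we just print the hips."""
        hips = pose.get("hips", [0.0, 0.0, 0.0])
        print(f"frame {pose.get('frame', '?')}: hips rotation {hips}")

    def listen_for_mocap() -> None:
        """Receive one UDP packet per frame and apply it to the character."""
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        sock.bind(("0.0.0.0", MOCAP_PORT))
        print(f"Listening for mocap frames on UDP port {MOCAP_PORT}...")
        while True:
            data, _addr = sock.recvfrom(65536)
            # Assume each packet is JSON: {"frame": n, "hips": [x, y, z], ...}
            pose = json.loads(data.decode("utf-8"))
            apply_pose_to_character(pose)

    if __name__ == "__main__":
        listen_for_mocap()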

Affordable Mocap

Optical mocap systems are often used on big-budget film productions for tracking many actors at a time – this was the system used on Planet of the Apes. There are two main issues with them for an indie film that shoots on location:

  • The cost. This depends on how many cameras you have, but for an outdoor system (like the one I'd need for this film) it'd be in excess of $30,000 for a 7 m² area. The cost only goes up for a larger capture volume.
  • The setup. These systems are usually installed in studios and aren't necessarily designed for outdoor use. Even with cameras that are suitable for outdoor shoots, the setup time is massive.

This is why I prefer using an inertial suit like the Xsens. The data is really high-quality and you can track a performance over a theoretically infinite area (if you move the capture laptop and connection hub with you). Drift is also very low with this system (1 m per 100 m travelled, or 1%), but if you need accurate positional data, it can be combined with the Vive Tracker and SteamVR Base Stations – both inexpensive IR tracking solutions.
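
As a rough illustration of why that positional fix matters, here's a toy Python sketch. It doesn't use the Xsens or SteamVR APIs and all of the numbers are illustrative – it simply shows how a slowly drifting inertial position estimate can be pulled back towards an occasional absolute fix from something like a Vive Tracker.

    # Toy illustration only: inertial position drifts by roughly 1% of the
    # distance travelled, and an occasional absolute fix (e.g. from a Vive
    # Tracker) pulls the estimate back. Numbers and blend weight are made up.

    def blend_position(inertial_pos: float, absolute_pos: float, weight: float = 0.8) -> float:
        """Pull the drifting inertial estimate towards the absolute fix."""
        return inertial_pos + weight * (absolute_pos - inertial_pos)

    true_pos = 0.0          # metres along a straight walk
    estimate = 0.0          # what the inertial system believes
    drift_per_metre = 0.01  # ~1% drift, as quoted for the suit

    for step in range(1, 201):
        true_pos += 1.0                    # performer walks 1 m per step
        estimate += 1.0 + drift_per_metre  # inertial estimate accumulates drift
        if step % 50 == 0:                 # absolute fix every 50 m (illustrative)
            estimate = blend_position(estimate, true_pos)
            print(f"{step:3d} m walked, error after fix: {estimate - true_pos:.2f} m")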

The suit can also be set up anywhere fairly quickly. That's very important not only for this film but also for my business. I offer affordable motion capture to clients across the UK and the world, which often involves travelling to their performance location. By using this system rather than an optical one, I can make it much more affordable for the clients I work with.


Introduction of VR into Workflow

While I haven't used VR tools in the workflow for this film, I have worked with a number of immersive theatre producers and am familiar with their setups. I think that for content created in 3D, having a 3D view of it is really useful – rather than just rotating around it on a 2D monitor. To this end, using VR – or even holographic displays such as the Looking Glass, which let you see assets in 3D – has its advantages.

Andrew Price of Blender Guru recently commented that VR workflows are going to become more common, with the tools becoming more accessible and integrated into more pieces of software. I definitely agree with this – and I also think real-time engines and renderers (such as Unreal Engine, or Eevee in Blender) are going to become more prevalent both for final rendering and for higher-quality previews while editing. VR tools will play a big role here, as you'll be able to inspect an asset as if it were right in front of you.

I think the key will be how you interact with and edit assets in VR. The main challenge is that unless you have a very high-fidelity tracker (the equivalent of a high-DPI mouse), it's going to be difficult to do this well and with enough precision.

I think the ultimate end goal of real-time workflows (incorporating VR as a preview medium) will be much better live interoperability between software. For instance, you can currently one-click export between After Effects and Premiere, and between 3ds Max and Maya. The next step has got to be real-time previews of 3D content within an NLE (like Adobe's Dynamic Link between After Effects and Premiere for 2D content). Then you'd no longer have to export a 3D track to your 3D application and export the render back to your NLE – you could preview everything in real time between the programs and tweak parameters much faster.

Challenges

The biggest challenges so far haven't necessarily been connected with the CG aspects – they're the normal challenges that come with shooting an indie film without a large budget! Most of these issues are solved by spending time on logistics and organisation, rather than the money that would otherwise sort them out.

Afterword

In terms of motion capture and VR workflows, I think these technologies will become more and more prevalent as they become more accessible – both in the hardware and in the software integration (the latter is key).

I also think VFX will become more common in films that aren't traditionally seen as 'effects' films (the way a superhero movie might be). Look at a drama like The Irishman – it isn't sold on its effects, yet it features heavy use of de-aging technology. I think AI will make the integration of CGI (including motion capture) a lot, lot easier – automating repetitive, dull tasks like rotoscoping and letting filmmakers' creativity take over instead.

Ryan Garry, Film Director and Founder of Unlimited Motion Ltd

Interview conducted by Ellie Harisova
