Cinecom's Jordy Vandeput elaborated on the creation process of The Last of Us project, delving into the steps involved in the Move.ai workflow and sharing insights on the utilization of Unreal Engine 5 during the project development.
I’m Jordy Vandeput, a content creator from Belgium. In 2012 I graduated from the Film School of Brussels, after which I began working as a freelancer.
In my first years, I mostly worked on small projects as a cameraman or video editor. I also found my way into online teaching for the company Envato. That's where I truly learned how to make educational videos.
In 2014, I got fired from Envato (still for reasons unknown). That's when I decided to start Cinecom.net and distribute my own courses. Using YouTube as a platform to build an audience, the channel quickly grew and became a business of its own.
The Last of Us Project
When there's something trending, content creators hop on that train! HBO had just released the final episode of The Last of Us series, so it was still fresh in everybody's mind.
If you follow our channel, you’ll see that we always add humor to our videos. I find that very important, which also contributed to our success.
So it was a no-brainer for us to transform The Last of Us into a funny sitcom show!
How the Team Learned about Move.ai
One day we got contacted by a company called Move.ai. They gave us some free credits to play around with the app and we were astounded by the motion capture quality!
That’s when we decided to make a video about the app. In fact, we’re using it now for all things motion capture at the studio.
Working with Move.ai
Move.ai is basically a video-to-motion-data converter. The AI analyzes the footage to estimate where the skeleton is. That sounds good on paper, but with only one video source it can get janky: if certain body parts are occluded, the AI has trouble figuring out where all the limbs are.
That’s where Move.ai came up with a brilliant solution. Their app allows you to connect multiple devices together. We’re using six iPhones in total, which can capture talent from every angle. In fact, we were even able to record three people at the same time!
There's always one host device; we use an iPad for that. With it, we can synchronize all the iPhones and upload the videos to the cloud to create the motion data.
It's important to add that you don't need six iPhones; two angles already work great if you're capturing a single talent. Move.ai provided us with the iPhones, which were all refurbished iPhone 8s. So for around $1,000 we were already set, which is peanuts compared to what traditional motion capture costs.
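The intuition behind why extra angles help can be sketched in a few lines. This is purely an illustration, not Move.ai's actual algorithm (and the `intersect_rays` helper is hypothetical): a joint seen from one camera only constrains a ray, so its depth stays ambiguous, while a second camera lets you intersect two rays and pin the position down. Shown here in 2D for simplicity.

```python
# Illustration only (not Move.ai's real pipeline): a joint detected by one
# camera constrains only a ray from that camera; a second viewpoint lets us
# intersect two rays and recover the position. 2D for simplicity.

def intersect_rays(o1, d1, o2, d2):
    """Solve o1 + t1*d1 == o2 + t2*d2 for two 2D rays via Cramer's rule."""
    (ox1, oy1), (dx1, dy1) = o1, d1
    (ox2, oy2), (dx2, dy2) = o2, d2
    det = dx1 * (-dy2) + dx2 * dy1
    if abs(det) < 1e-9:
        # Parallel viewing rays: a second camera at the same angle adds nothing.
        raise ValueError("parallel rays: position stays ambiguous")
    t1 = ((ox2 - ox1) * (-dy2) + dx2 * (oy2 - oy1)) / det
    return (ox1 + t1 * dx1, oy1 + t1 * dy1)

# Camera A at the origin sees the joint along (1, 1); camera B, four units
# to the right, sees it along (-1, 1). The rays cross at the joint itself.
joint = intersect_rays((0, 0), (1, 1), (4, 0), (-1, 1))
print(joint)  # (2.0, 2.0)
```

With six phones you get many such ray pairs per joint, so occlusion in one view can be compensated by the others.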
Using Unreal Engine
Although Unreal Engine has long been used by visual effects artists, it's only since version 5 that it has become more widely known in the industry.
Unreal Engine 5 is incredibly user-friendly, with so many features we take for granted. As a digital artist, I don't have to think about lighting, optimization, atmosphere, physics, and so much more. It's all built-in, and Unreal Engine 5 does it automatically for me. Think of Lumen and Nanite, two of the biggest features introduced with Unreal Engine 5.
This makes the engine perfect for Virtual Production. The only problem is that there’s still very little to be found about it online. We have to figure out most things ourselves, but it’s very rewarding once you get it all to work!
We also created a course that explains the basics of virtual production and will soon release an advanced course as well.
Directing the Sequence
Since we had little time, we started off by downloading a free abandoned apartment from the Epic Games marketplace. This gave us a sense of the space we had for the motion capture.
The beautiful thing about Unreal Engine is that everything works in real-time and nothing is set in stone. So even after you link your motion tracking data to the 3D model, you can still move it around to the spot you want.
When I think of sitcoms, I always picture those hard, ugly studio lights. So that's exactly what we did: we made the scene very bright by adding a bunch of virtual spotlights. This created the right mood and feel.
And of course, you need the laugh tapes to go with that!
Insights and Tips for Getting Started in Virtual Production
We worked on the entire video for about three days, including the tutorial part. Building such a scene goes very fast since it all runs in real time; even rendering took just a minute or two.
Since we already did some tests before starting to work on the video, we didn’t really run into any challenges. However, during the two years that we’ve been working in Unreal Engine 5 and virtual production, we bumped into many roadblocks.
First and foremost, it's a game engine, and only recently have we started seeing more and more tools for filmmaking.
If you're planning to step into virtual production, I highly recommend getting familiar with the software first rather than spending thousands of dollars on gear right away, because you'll be surprised how little works out of the box.
For instance, to get camera tracking into Unreal Engine, you can use something like VIVE Mars. But that's not the only thing you need: to keep everything in sync, you need genlock and timecode; to get a live video reference into the engine, you need a capture card; perhaps you even need network DMX for the lighting. For real-time chroma keying, you can start by learning Aximmetry. And if you're looking to work with an LED wall, things get even more complicated.
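Of those pieces, chroma keying is the easiest to reason about. As a purely conceptual sketch (real-time keyers like Aximmetry are far more sophisticated, with spill suppression, soft edges, and GPU processing), the core idea is just: replace a pixel with the virtual background when its green channel dominates.

```python
# Toy chroma key, illustration only: a pixel is keyed out (replaced with the
# virtual background) when green dominates red and blue by some threshold.
# Production keyers such as Aximmetry do far more than this.

def chroma_key(foreground, background, threshold=60):
    """Composite two equal-length lists of (r, g, b) pixels."""
    out = []
    for fg, bg in zip(foreground, background):
        r, g, b = fg
        if g - max(r, b) > threshold:   # strongly green -> show background
            out.append(bg)
        else:                           # anything else -> keep the talent
            out.append(fg)
    return out

green_screen = [(20, 220, 30), (200, 180, 170)]  # one green pixel, one skin tone
virtual_set  = [(90, 90, 120), (90, 90, 120)]
print(chroma_key(green_screen, virtual_set))
```

The hard part in practice isn't this math; it's doing it at full resolution, every frame, in sync with the tracked camera, which is exactly why the genlock, timecode, and capture-card plumbing above matters.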
Virtual production is a very broad subject, and there are hundreds of different ways to apply the concept. So start with the basics and find a good community, like the Facebook virtual production group, where you can ask for help and avoid wrong decisions along the way.