Brett Ineson from Animatrik Film Design talked about the mocap solutions for film and games his company provides.
80.lv: Could you tell us a little bit about your company? When was it founded? How big is your team now? What was your first project? What films and games have you worked on?
Brett Ineson: Animatrik Film Design was founded in 2009. My goal was to provide the most up-to-date and efficient stage and location-based Virtual Production services available.
The first production that we worked on was Neill Blomkamp’s District 9 – it was an incredible experience. Neill is as cool as they come, he’s talented and professional, and the film he created was amazing. It’s still my favourite after all these years.
Years later, Animatrik now operates the largest independent stage in North America, located in Vancouver, Canada. Our team ranges between 20 and 55 people depending on the schedule, and we’ve worked on a wide variety of blockbuster films and AAA games. Some standout examples are Warcraft, Deadpool 2, Gears of War 4 & 5, Dead Rising 4, Spider-Man: Homecoming, and Thor: Ragnarok. But that just scratches the surface.
80.lv: What solutions do you offer? How does their implementation differ for games and films?
Our goal is to capture any object in any environment. We lead the industry in building capture volumes in unique locations and challenging environments; we work in daylight, outdoors, and on crowded film sets where we may also be dealing with the added elements of rain and fire.
At its core, Animatrik utilizes a range of industry-leading motion capture pipelines, including Giant, OptiTrack, and Vicon. We provide robust real-time solutions for all setups. We are proud to be one of the few studios worldwide granted the licensing rights for Giant from Lightstorm Ent. Inc.
Our complete motion capture setup allows for the simultaneous capture of a performer’s body, face, fingers, and audio, all integrated with virtual cameras, reference video, live-comp, and camera tracking, as well as real-time character retargeting and game-engine visualization.
We’ve also partnered with Jumpin’ Joe Productions, meaning we can provide a full talent casting and union signatory service. They have deep experience in casting talent and stunts for mocap, and specialize in cross-border and work permit services.
Stunt and IBC Capture
80.lv: You have a handful of different capture solutions. Can you tell us more about IBC and Stunt Capture? How do these technologies work? What are they used for mostly?
Whether we’re capturing stunts for previz or for final shots, we partner with stunt teams to construct a volume that captures the action safely and effectively. Our stages have heavy-load stunt beams with high ceilings to allow for complex rigging and construction. With stunts, there are always serious technical considerations: we need to factor in the technology and materials attached to the actor. You don’t want anything to obstruct their movement or injure them if they fall.
IBC refers to Image-Based Capture. This is the process of completing motion capture with video cameras as opposed to purpose-built machine-vision cameras. The post-process has a manual component to it, but it gives you the ability to capture 3D in unconventional environments, such as rivers.
80.lv: How does animal capture work? What core difficulties connected with the tech setup and the production process have you faced?
We have completed some very memorable mocap shoots with animals. A lot of planning goes into these shoots and the first thing we think about is the safety and wellbeing of the animals we will be working with.
Most recently, we worked on a motion capture project with horses. You can’t mocap a horse in a studio so we had to reconfigure everything to work in an indoor riding arena. One of the biggest difficulties was figuring out how to set up all of the cameras in the space to capture the movements. We were working in a whole new environment.
The technology side of it was easy compared to markering and tracking the horses. You’ve got dusty, sweaty horses and dirt on the arena floor that covers the markers on their hooves. A lot of experimenting went into finding the best way for the horse to wear the markers. In the end, we created our own version of a horse mocap suit.
For the Twilight movies, we captured wolves – that was incredibly interesting. Even though the wolves were movie-business animals and not wild, at the end of the day, they’re still wolves. It was incredible to be behind the scenes and see how trainers motivate them. Horses get a little irritable here and there, just like us on the job – it’s no big deal! But when wolves aren't happy there is this look in their eyes...that’s when it’s time to get lost and let the trainers do their thing. The trainers have a wonderful relationship with their wolves but a random mocap guy on-set doesn’t earn enough trust in a single shoot.
VFX and Game Engines
80.lv: You also work on VFX, post-production, game engine integration. How do you work on VFX in particular? What software do you mostly rely on? How do you work with game engines in film production?
Our work in VFX is very focused on captured animation. Typically, we track bodies, faces, and cameras, but it could also include many things, such as whips, chains, cars, or animals. We spend most of our time in Lightstorm’s Giant software these days, but we do use the industry staples such as Maya and MotionBuilder.
Working with game engines has become incredibly popular in recent years – we visualize almost all of our work in Unreal Engine on stage now. Many of our game clients use the engine so it’s great for them to have continuity in production. With film projects, it has opened up entirely new worlds. We can interface with LED walls and provide in-camera FX on green screen shoots – it’s really changing approaches to film production.
80.lv: How do you plan to expand your solutions line? What do you think about games becoming more realistic and looking more like films? How does such a tendency influence the production technology in both areas?
We have been doing considerably more live shows as of late. This is new for us – live computer graphics has historically been too big a risk. However, we’ve proven on more than one occasion that it can be successful. For example, we successfully completed a broadcast to an audience of 50 million people.
As far as realism goes, the technology is pretty much there. For certain projects, you need some heavy hardware – especially if you have multiple characters with real-time hair and subsurface scattering on the skin. But it is here and it will commoditize very soon.