Brett Ineson, founder of Animatrik Film Design, returns to 80 Level to tell us more about Unreal Engine's new MetaHuman Animator toolset and explain what the new technology means for the animation industry.
80.lv: Could you introduce yourself and tell us a little bit about your company?
Brett Ineson: My name is Brett Ineson, and I'm the founder of Animatrik Film Design. Animatrik has been in business for about 13 years now. I started this business to dive into the early days of virtual production with a core focus on performance capture. However, that has led to various other on-set activities, such as camera tracking and XR applications on set.
I'm also the founder of a sister company called Shocap Entertainment, which we incubated at Animatrik Film Design. We use the technologies we employ in service to the film and game industries to produce real-time graphics for live theater applications that are delivered to in-person audiences and live VR applications.
Unreal Engine 5 & MetaHuman Animator
80.lv: Do you utilize Unreal Engine 5 in your pipelines?
Brett Ineson: We are, yeah. On one side, it's simply been a great application for the work we do. We're in the business of real-time computer graphics, so Unreal Engine has been a go-to for us in terms of workflow. On the other side, a lot of our clients are building applications using Unreal Engine, so it's been driven from that direction as well.
80.lv: I know that you work a lot with Unreal Engine's MetaHuman Animator, what's your angle here? What are you trying to do with this software? How did you get into it?
Brett Ineson: We first started using Unreal Engine before MetaHuman, of course, and it was simply about achieving a higher-fidelity image. A lot of the work we do is very technical and structural, so we're not displaying the final picture to the consumer. We're collaborating on the projects with the IP owners and the developers. Watching gray-shaded humans running around the stage served the purpose, allowing us to confirm that they're in the right place, whether they need to be hopping in the Jeep or they're on this side of the console in the operating room, and so on.
It allowed us to focus on the work we needed to do without worrying about the aesthetics. However, once we started delivering an image on a film set, all of a sudden there's a group of creative people around, and they're drawn to a more visually appealing picture. Consequently, the demand for a better-looking image has been steadily increasing.
Unreal Engine has made this possible for all of us. Once people get used to seeing that, it's a bit challenging to revert to the old gray-shaded models. So we've begun developing our own plugins and tools to customize Unreal Engine to suit our on-set needs. And then MetaHuman came along. For our internal pipelines and testing, it has provided us with high-fidelity characters that we can use to represent our vision for a specific project. The MetaHuman pipeline has also seen considerable success right from the start, and as a result, our clients have been using MetaHumans much more frequently than we initially anticipated. In the past year, I would estimate that 50% of the projects we've worked on have included the MetaHuman pipeline.
80.lv: How is Unreal Engine 5 used in your line of work? Is it used mostly for PreViz or is it more for the actual visualization?
Brett Ineson: We certainly use it for PreViz, and it's an easy way to quickly create something that looks good, even if you're just refining your characters' appearance. You can get a fully-rigged body and facial character running in no time. It's not limited to PreViz; these assets are designed to look good in final products, so we see them used extensively, especially by our video game customers.
The Future of MetaHuman
80.lv: What do you think is the future of MetaHuman Animator? Nowadays, many people are afraid to lose their jobs because of new technology, what's your take on that?
Brett Ineson: I would say I'm definitely in the optimistic camp, and I suggest people embrace new technology, just as I have. I realize I have a 25-year career in performance capture, which means I've been involved in the keyframe vs. mocap argument for a long time. I started out my career as a Keyframe Artist, so I completely understand.
In my experience, human beings are the most challenging subjects to hand-animate. The nuance involved is incredible, and you can never get it quite right. That's why we have the Uncanny Valley, which has been such a difficult thing to bridge. Procedural animation does a better job than we humans do at crossing the Uncanny Valley. Even so, if I were a Keyframe Artist, I'd be happy not to animate humans; give me the dragons, give me stylized characters. Trying to recreate humans is a struggle, and I think it's better left to actors and solvers.
I do think that, for better or worse, new technologies are going to affect, at least to some extent, the style of some films coming out. Epic's MetaHuman Animator has really provided easy access to high-quality characters for lower-budget operations. So, we can expect to see more people jumping in and creating films more easily than ever before, and we'll probably see a style that works well with that and levels the playing field.
Is MetaHuman Replacing Actors?
80.lv: During the Actors and Writers' Strikes, many actors criticized Hollywood for scanning their faces and using their likenesses for movies without their consent, what's your take on that?
Brett Ineson: I'm not really an expert on this, but I suppose that I've been involved in enough projects and actor negotiations to have some knowledge on the subject. With MetaHuman, you can create a human being that doesn't exist using parameterized control. This isn't a scan of somebody; it's virtual DNA creation.
So, if you're hiring an actor to play that role, there are different fees in the actor world depending on whether you use the actor's face, their voice, or their likeness. For example, in a video game, if the character is meant to be the actual actor, there's obviously a different fee structure than if they're playing someone else.
Just from a financial perspective behind the scenes, you can see that for some projects it's a big advantage. They might not be able to afford to use the actors in this way, but the rates for an actor providing the performance for another character may fit the budget. As for creating many characters based on an actor's likeness (you can, of course, scan actors and bring them in), I suppose that's what we're all waiting to see resolved in the negotiations between the unions and the studios.
It's probably a bit above my expertise, but it does seem to me like a new twist on branding and similar matters. You can't just start a burger company with a yellow "M" on it. So the actors have an argument that their likeness is a brand they've built, and they should be compensated for it. I do support that, and I believe the legal system supports them as well.
The Future of the Animation Industry
80.lv: From your professional standpoint, where do you think the animation industry is going to go in the coming years? Will AI affect it?
Brett Ineson: In the next five years, I think that AI is going to become integrated into what we do. So, certainly, there will be little parts of our pipeline where AI is helpful, and we will just rely on it for those tasks.
I don't believe that in the next five years, we're at a point where AI is doing all the work. It's hard for me to imagine that kind of world because I feel that some of the beauty in all of this is the human element and what we do. AI seems somewhat like a dark, cold world, I suppose, but I don't think it will reach that point in five years. Part of me wonders if the endgame for AI is creating this very conversation. 200 years from now, will we be the AI?