JSFILMZ pushed Epic's recently-released MetaHuman Animator to its limits, demonstrating some appealing results.
CG artist, YouTuber, and Unreal Engine expert Jae Solina, also known as JSFILMZ, has recently shared a series of impressive demos showcasing the capabilities of MetaHuman Animator, Epic Games' recently released toolset that turns facial motion capture data into high-quality facial animations quickly and easily.
Using the new toolset alongside Unreal Engine 5 and data captured on a $270 iPhone 12 Mini, the creator produced multiple photorealistic digital human animations. Solina describes them as the most realistic MetaHumans yet made by anyone other than Epic Games, the Unreal Engine team, or 3Lateral, the studio behind Blue Dot, a short film created to demonstrate "the level of fidelity that artists and filmmakers can expect when using MetaHuman Animator".
The creator has also shared a brief explainer video answering some of the questions about the project. You can watch it, along with all the other demos, on JSFILMZ's YouTube channel.
This outstanding series, however, is not the first time JSFILMZ has impressed us with an Unreal Engine-powered project. Back in 2022, the creator wowed us with Prisoner, a short film inspired by The Mandalorian and the Star Wars universe.
According to the artist, he used an Xsens kit for motion capture, Xsens Motion Cloud for processing, and Reallusion's iClone for adding hand gestures. The characters and animations were then exported to NVIDIA Omniverse via the iClone-to-Omniverse connector. Prisoner's environment was built quickly using ScansLibrary assets, while the jetpack effects were created with Omniverse's built-in Flow simulator.
Last week, Creature Supervisor Madhav Shyam also showcased some experiments with MetaHuman Animator, demonstrating several lifelike facial animations created with the toolset using self-recorded iPhone source footage, Marmoset Toolbag, Unreal Engine 5, and one of 3D Scan Store's head scans.
For those unaware, MetaHuman Animator was first introduced during Epic's keynote session at GDC 2023. The toolset lets users capture an actor's performance with an iPhone or a stereo head-mounted camera system and then apply the captured data to any MetaHuman character, producing high-quality facial animations without the need for manual editing. You can learn more and get started with MetaHuman Animator here.
Also, don't forget to join our 80 Level Talent platform and our Telegram channel, follow us on Threads, Instagram, Twitter, and LinkedIn, where we share breakdowns, the latest news, awesome artworks, and more.