Creating Natural MetaHuman Animations with iClone

Petar Puljiz was recognized for his exceptionally natural MetaHuman animation. Here, he explains how he leverages Reallusion’s latest iClone 8.1 update to create his new character animations.

Introduction

Hello, dear readers of 80 Level! My name is Petar (Peter) Puljiz, a real-time enthusiast from Split, Croatia, and, at the time of writing, at the 26th level of life. To cut a long story short, I’ve been producing and directing videos for most of my life, studied digital marketing, and worked as head of marketing for a game development academy in Croatia. Now I’m working as a visual artist at Lowly, where I’ve produced more than 80 experimental music videos using Unreal Engine. I’ve spent the last two years researching and developing a sustainable pipeline for virtual influencers.

Petar’s award-winning Love, Rosie project at the 2021 Lip Sync Animation Contest.

Discovering Reallusion's Tools

Since moving on from linear mediums such as film, I wanted to explore and build a new kind of storytelling: virtual influencers. The idea is to create a space and a timeline to tell a story over a span of several years. A character pipeline is therefore a long-term commitment, and as I was building mine, I knew that animation was the key piece, so a sustainable and flexible pipeline was a must.

iClone provided a one-stop shop with flexible, great tools and UI. In one word, it provided an ecosystem. DCCs like Maya and Blender have their advantages, but iClone proved to be a great fit. Over the last two years, Reallusion has upgraded the ecosystem so much, and what’s most important is that they listened to us, the users, and implemented the features we needed.

Petar’s Approach to Animating MetaHumans

The main goal of the project was to see how well I could express emotion in animation using iClone and MetaHumans. The contest was great because it gave me a deadline that forced me to finish things. Of course, MetaHumans were very new at that moment, and the whole pipeline was a bit fragile, with a lot of crashes. But in the end, the goal was accomplished, and I was really happy with the result.

Working on the Character

The character was created using MetaHuman Creator and took around 6-7 iterations. The mesh customization was minimal work, just a little retouching of the Albedo and Normal textures. In the end, it took me only a few hours to set up a whole character of that fidelity, which is crazy.

For me, clothing is as important as body animation for believable results. I used Marvelous Designer to create the garment and simulated it once the animation was done. I’m not a fan of classic clothing rigs for cinematics, but with the newly improved Chaos Physics in Unreal, which lets you combine rigged cloth with painted regions that are simulated, this approach is becoming a staple of the pipeline.

Motion Capture Workflow

For motion capture, I used Rokoko’s Smartsuit and Smartgloves. To be honest, my studio is crowded with equipment, so magnetic interference was a problem and caused a lot of noise in the data. So I used motion capture mainly for blockout, keeping mostly the spine and neck movements as original data.

The whole capture was referenced, prepared, and rehearsed. It took me 15 takes to get it right, which really makes you appreciate the talent of actors and performers. The data was then processed with Rokoko Studio filters like Locomotion and Drift Fix, which really saved the day.

Once done with that, you just export an FBX from Rokoko and import it into iClone. No retargeting is needed; the team at Reallusion has already done that for you. The MetaHuman was also incredibly easy to set up. Reallusion provides the Live Link plug-in along with MetaHuman Dummies, which are basically iClone-retargeted bodies for the specific MetaHuman body types. It was so easy and profoundly joyful to avoid all the hassle of retargeting.

Animation

Once everything is technically set up, you need to make yourself comfortable with the animation. I approached the initial motion capture data very carefully. The first thing is to inspect the data for any glitches that need to be fixed manually, then apply a filter to smooth it out. But pay attention and smooth it just a little bit so all the spontaneous movements stay intact. There are a lot of moving parts in the human body, most of them moved subconsciously, and they are very hard to recreate by hand.
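To illustrate that "smooth just a little" idea, here is a minimal Python sketch of a light moving-average pass on a single mocap channel. This is not iClone’s or Rokoko’s actual filter (their internals aren’t public); the window size and the sample data are assumptions for demonstration:

```python
import numpy as np

def light_smooth(channel, window=5):
    """Gently smooth one 1D animation channel (e.g., a spine rotation curve).

    A small window knocks down sensor noise; a large one would also flatten
    the quick, spontaneous motions that sell the performance.
    """
    kernel = np.ones(window) / window
    # mode="same" keeps the curve length; values near the ends are slightly damped.
    return np.convolve(channel, kernel, mode="same")

# Hypothetical example: a noisy rotation channel sampled at 60 fps for 4 seconds.
noisy = np.sin(np.linspace(0, 4 * np.pi, 240)) + np.random.normal(0, 0.05, 240)
cleaned = light_smooth(noisy, window=5)
```

Raising `window` from 5 to, say, 31 is exactly the over-smoothing trap: the noise disappears, but so does the life in the motion.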

Reallusion’s Curve Editor is a lifesaver when it comes to cleanup because it gives you visual feedback on how your data behaves. After that, it was time to have fun with the hands. Reach targets turned out to be great allies for that. They are similar to constraints in Maya, but with a much simpler UI: you define a target for your arms or legs, attach props to it, and the IK system animates the body based on the target.
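Under the hood, a reach target is driving the limb with inverse kinematics. As a conceptual illustration only (not Reallusion’s actual solver), here is a minimal analytic two-bone IK sketch in Python; the bone lengths and target are made-up numbers:

```python
import math

def two_bone_ik(l1, l2, tx, ty):
    """Given upper/lower bone lengths and a 2D reach target, return
    (shoulder, elbow) angles in radians for one of the two solutions."""
    d = math.hypot(tx, ty)
    # Clamp the reach distance so the triangle (l1, l2, d) stays valid.
    d = max(abs(l1 - l2) + 1e-6, min(d, l1 + l2 - 1e-6))
    # Law of cosines: interior elbow angle, then bend relative to a straight arm.
    cos_elbow = (l1**2 + l2**2 - d**2) / (2 * l1 * l2)
    elbow = math.pi - math.acos(max(-1.0, min(1.0, cos_elbow)))
    # Shoulder aims at the target, offset by the triangle's inner angle.
    cos_inner = (l1**2 + d**2 - l2**2) / (2 * l1 * d)
    shoulder = math.atan2(ty, tx) - math.acos(max(-1.0, min(1.0, cos_inner)))
    return shoulder, elbow

# Hypothetical arm: 0.30 m upper arm, 0.28 m forearm, reaching for a prop.
shoulder, elbow = two_bone_ik(0.30, 0.28, 0.35, 0.20)
```

The solve runs every frame, so moving the target smoothly drags the whole limb along with it.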

In the timeline, you just set when you want to activate or release a target and what kind of transition you want. As for facial animation, iClone is just a killer there: from AccuLips to the Face Puppet, from the iPhone Live Link plug-in to manual keyframing with its picker interface. I didn’t know until that point that facial animation could be that much fun.

Utilizing the iClone Live Link Plug-In for Unreal

There are two ways of transferring animation: exporting it as FBX and reapplying the animation to the character in Unreal, or using Take Recorder and Live Link to capture data in real time from iClone into Unreal. On one part of the screen there is iClone, and on the other there is a real-time preview of your MetaHuman in Unreal. This really matters since MetaHumans are driven by an Animation Blueprint for corrective deformations. I prefer using Live Link to transfer data because it’s much easier, and if something is not right, it is far easier to repeat and fix than in the FBX pipeline.
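For the FBX branch of that choice, the reapplying step can be scripted with Unreal’s editor Python API. This is a sketch under assumptions, not Petar’s setup: the file and asset paths are hypothetical, and property details can shift between engine versions:

```python
import unreal

def import_metahuman_anim(fbx_path, skeleton_path, dest="/Game/Anims"):
    """Import an FBX as an AnimSequence targeting an existing MetaHuman skeleton."""
    ui = unreal.FbxImportUI()
    ui.import_mesh = False                 # animation only, no new skeletal mesh
    ui.import_as_skeletal = True
    ui.import_animations = True
    ui.automated_import_should_detect_type = False
    ui.mesh_type_to_import = unreal.FBXImportType.FBXIT_ANIMATION
    ui.skeleton = unreal.load_asset(skeleton_path)

    task = unreal.AssetImportTask()
    task.filename = fbx_path
    task.destination_path = dest
    task.automated = True                  # suppress the import dialog
    task.save = True
    task.options = ui

    unreal.AssetToolsHelpers.get_asset_tools().import_asset_tasks([task])
    return list(task.imported_object_paths)

# Hypothetical paths for illustration:
# import_metahuman_anim("D:/mocap/take_15.fbx",
#                       "/Game/MetaHumans/Common/metahuman_base_skel")
```

Treat the skeleton path as an assumption and point the function at whatever skeleton asset your MetaHuman actually uses.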

With the new version of iClone Live Link, you no longer have to worry about data loss since it runs on timecode; it’s little things like this that make your life easier.
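Timecode transport boils down to both applications stamping every frame as HH:MM:SS:FF, so sender and receiver agree on absolute frame indices and a dropped packet leaves a gap at a known frame instead of silently shifting the rest of the take. A tiny illustrative conversion (the frame rate here is an assumption):

```python
def timecode_to_frame(hh, mm, ss, ff, fps=60):
    """Convert an HH:MM:SS:FF timecode stamp to an absolute frame index."""
    return (hh * 3600 + mm * 60 + ss) * fps + ff

# 00:01:30:12 at 60 fps lands on frame 5412, no matter when it arrives.
assert timecode_to_frame(0, 1, 30, 12) == 5412
```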

The good thing about Unreal is the ability to bake imported animation down to a rig so you can tweak the data further in Unreal. That said, even if you can’t pull something off in iClone, or you need to fix it later, you can do it in Unreal, which makes the collaboration between Unreal and iClone even better.

Conclusion

I’m even afraid to think about how much time I would’ve spent working in a more traditional manner. I tried animating in Maya, and it was very difficult. I’m a Blender guy, but since MetaHumans are tied to Maya, and Maya has better and faster rigging and animation tools, you have to know a little bit of Maya. It just took me a lot of time, and it wasn’t fun at all. That’s what matters to me: I want to have space for playing around. Only that way can I produce something good and worthy. Because the time you enjoyed wasting is not wasted time!

Petar Puljiz, Real-time Artist

Interview conducted by Theodore McKenzie
