Creating a Post-Apocalyptic Animation With iClone, Character Creator & Unreal Engine 5

Senior VFX Artist Stan Petruk spoke about the working process behind the animated movie The Remnants, showed how the hair and clothes were created, and explained how Reallusion's Character Creator 4 and iClone 8 helped him simplify and speed up the workflow.

Introduction

My name is Stanislav, but I prefer Stan. I am a self-taught artist working in the gamedev industry. I started as a Motion Designer in a small Siberian town and eventually moved to a bigger city chasing my goal of making games. The first job I landed was on a startup mobile game, and I think it was the perfect project to help me transition from video production to a real-time environment. My next job was at Sperasoft, a big outsourcing company that has co-developed a lot of great AAA games. I started as a VFX Artist and worked on several big titles there – WWE Immortals, Mortal Kombat (mobile), Agents of Mayhem, Overkill's The Walking Dead, and Saints Row. It was a great time, and it helped me grow a lot as a professional artist.

After over 4 years at Sperasoft, it was time to move on, and in 2019 I moved to Poland, where I spent almost 2 years working at Techland Warsaw. In 2021, I moved once again, this time to Stockholm, and I currently work at Avalanche Studios as a Senior VFX Artist.

The Remnants Project

Besides making games, I love films, and in parallel to my real-time VFX career, I am trying to develop myself as a filmmaker. I have attended several filmmaking courses, the most recent being a short online course from Vancouver Film School. I am also learning from books and trying to break down the films I watch. But it was all theory, and at some point, I realized that I needed to get my hands dirty and make my first film. That is when I started to develop an idea for a short film with a simple story. I had explored a similar theme in a previous project made for a VFX contest two years ago, but that was not a film yet.

The production of The Remnants started maybe a year ago, but in the beginning, it was mostly research and development. I took a pack of sticky notes and started to break the project down into small elements: first splitting it into the acts of the story structure, then adding the events in between, and making a list of the software and techniques I would have to learn to bring the project to life. Everything was glued to the wall, and it visually looked like there was no way I could make it alone with the skills I had at the time. At the same time, I didn't want the production to turn into just a technical presentation; my goal was to make a film, and the main focus had to be on the film direction itself.

So, I started to dig into how I could optimize my work. Working in real-time in every possible way was the obvious answer. Merging my two passions – gamedev and films – into one was the core idea that got production going.

Discovering the Reallusion Tools

I was very concerned about the characters in my film: they are very important, they must look alive, and it must be possible to empathize with them. So, I googled how to make a character without being a Character Artist, and after some research, Reallusion's products seemed like the perfect solution. They are simple to use, and with some basic knowledge of CG, you can get an awesome result.

Reallusion software works very well with other programs, especially with Unreal Engine. All the file formats I needed are supported, and export presets are there too. You just use it as part of a solid pipeline. It simply does the job and helps save a massive amount of time.

The creation process in Character Creator is as simple as it can be: you just pull the sliders and get the shapes you want. But to get the result you want, you must understand exactly what you need. So, I held a virtual film casting: I used an AI that generated a bunch of photos and just chose the faces I liked best. After that, it takes maybe an hour per character, and you are ready for the next production stages.

Hair and Clothes

I did not want the clothes on the characters to look like a standard game asset, where they are part of the skeleton. So, I decided to use Marvelous Designer and simulate the clothes as a separate element. The pipeline was the following:

  • Export an animated character from Unreal Engine.
  • Import it into Marvelous Designer.
  • Create a 3D model of clothes in Marvelous Designer.
  • Add UV and textures.
  • Simulate the clothes on top of the animated character.
  • Export the simulation as an Alembic cache.
  • Import it into Unreal Engine as a skeletal mesh, using the 3ds Max conversion preset.

Eventually, I used a mix of simulated clothes and 3D model skinning to save some time.
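
For reference, the Alembic import step can also be scripted through Unreal's editor Python API. The snippet below is only a minimal sketch, not code from the project: the file path, destination folder, and the import_clothes_alembic helper are assumptions, and it simply brings a simulated-clothes cache in as a skeletal mesh with the 3ds Max conversion preset.

    # Minimal sketch (not project code): import a Marvelous Designer Alembic
    # cache into Unreal as a skeletal mesh. Paths and names are placeholders.
    import unreal

    def import_clothes_alembic(abc_path, destination="/Game/Characters/Clothes"):
        options = unreal.AbcImportSettings()
        # Bring the cache in as a skeletal mesh so it can play back like an animation
        options.set_editor_property("import_type", unreal.AlembicImportType.SKELETAL)
        # The 3ds Max conversion preset handles the axis/scale conversion
        conversion = unreal.AbcConversionSettings()
        conversion.set_editor_property("preset", unreal.AbcConversionPreset.MAX)
        options.set_editor_property("conversion_settings", conversion)

        task = unreal.AssetImportTask()
        task.set_editor_property("filename", abc_path)
        task.set_editor_property("destination_path", destination)
        task.set_editor_property("automated", True)
        task.set_editor_property("options", options)
        task.set_editor_property("save", True)
        unreal.AssetToolsHelpers.get_asset_tools().import_asset_tasks([task])

    import_clothes_alembic("D:/Remnants/Export/jacket_sim.abc")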

Hair was a tricky part, mostly because I wanted to use UE's hair and fur system. A lot of hair is also added to the characters' clothes, and there are reasons for it: it looks great and hides the imperfections of the simulated cloth.

Here is the pipeline:

  • Export the character (or the clothes) from Unreal.
  • Add particle hair in Blender. Blender has a lot of great tools for styling hair.
  • Export the hair as an Alembic file (only the first frame, not the whole simulation).
  • Optionally, export the first frame of the mesh as Alembic and import it as a skeletal mesh in UE. This extra stage is only necessary if the fur in UE does not bind to the mesh correctly; I used it only for the clothes.
  • Import the fur into Unreal Engine.
  • Create the hair binding (use the additionally exported mesh as the source skeletal mesh if required).
  • Add the groom component to the mesh.
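
The Blender export step (writing only the first frame of the hair as Alembic) can likewise be done with a small script. This is just a sketch under assumed settings – the output path, frame range, and flags are examples, not values from the interview:

    # Illustrative sketch: export Blender particle hair as a single-frame
    # Alembic groom for Unreal. The output path is a placeholder.
    import bpy

    bpy.ops.wm.alembic_export(
        filepath="//exports/character_hair.abc",
        start=1,            # export only the first frame,
        end=1,              # not the whole simulation
        selected=True,      # limit the export to the selected character/clothes
        export_hair=True,   # write particle-system hair as curves
    )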

The real problem with the hair is that it does not always sort correctly: if, for example, there is a transparent particle in front of the fur, the fur will be rendered on top anyway. Because of that, I had to remove or tweak a lot of the smoke and fog.

Creating Animations in iClone

Before making the final animations, I made the whole film as a simplified cinematic. I used a mannequin in UE and just moved it around the scene, and all the cameras were there as well. Thus, I had a very good plan: I knew what animations I needed and even how long every movement should last (roughly, of course). A lot was changed in the final version, but the core ideas came from that initial cinematic.

For the motion capture, I used Xsens, which is really great but still requires a bit of cleanup and tweaking. In iClone 8, I was able to do all the mocap clean-up that I had previously done in MotionBuilder, but without the complexity or the higher cost. Xsens' inertial sensors understand the movement well but have no idea where they are actually located in 3D space.

In iClone, I was able to fix most of such issues and also correct some of the mistakes I had made during the pre-production stage. The most complicated scenes were the ones involving interaction, especially because I had to act out both characters myself. Such things required a lot of manual fixing: for example, there is a scene where one character takes a cigarette from the other, or throws and catches a potato, and none of that originally moved correctly. I didn't have access to finger tracking, so the finger animation was done in iClone.

For the facial mocap, I used an iPhone, which also required a bit of tweaking later; for example, chewing the potato was animated manually, as the tracking was not able to understand the recorded movement.

I also used the bridge to export the characters to Unreal, and it is very simple and smooth, just a click of a button. The animations were imported manually because I needed to clean up and fix the movements before using them in production.

VFX

All of the effects are animated flipbooks, mostly fire and smoke. This was the simplest part for me because I am a VFX Artist. The pipeline is standard: make a simulation in Houdini, render it as a flipbook, and build a particle effect in Niagara.
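
To make the flipbook idea concrete, here is a tiny illustrative sketch of the sub-UV lookup that a flipbook material or Niagara module performs when playing back such a baked sheet; the 8x8 layout and the flipbook_uv name are example assumptions, not taken from the project:

    # Illustrative sketch: the sub-UV math behind flipbook playback.
    # A Houdini sim baked to an 8x8 sprite sheet is sampled one cell per frame.

    def flipbook_uv(uv, frame, columns=8, rows=8):
        """Remap a 0-1 UV into the cell of the given flipbook frame."""
        frame = int(frame) % (columns * rows)   # wrap so playback loops
        col = frame % columns                   # horizontal cell index
        row = frame // columns                  # vertical cell index (top-down)
        u = (uv[0] + col) / columns
        v = (uv[1] + row) / rows
        return (u, v)

    # Example: the center of the sprite while showing frame 10 of a fire flipbook
    print(flipbook_uv((0.5, 0.5), frame=10))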

Conclusion

The production started quite a long time ago, mostly because in the beginning it was research and development for me. To put it in perspective, the last 5 minutes of the film were made in maybe a couple of months. Once I had developed a good pipeline, finished all the pre-production, and had a good plan, production started to go very fast. You basically have a scene set up, add the lights, drop the characters into the scene, and roll the camera. I really loved this approach because it actually simulates film production and shifts most of your attention to the actual filmmaking.

I think that with a traditional approach, I would most likely never have made it alone. Now, with all the knowledge I have gained, I have started a new short animated film with a slightly more complicated story. It is currently in pre-production, but soon I will be able to share more information about it.

Stan Petruk, Senior VFX Artist

Interview conducted by Arti Burton
