Epic Games on Integrating MetaHuman & Marvelous Designer Capabilities into UEFN

Epic Games' Senior Technical Animator Jared Monsen told us about the recent updates for UEFN, explaining how the average size of a MetaHuman was reduced for the release and providing an overview of the new production pipeline for digital clothes in Marvelous Designer and Unreal Engine.

During its State of Unreal presentation at GDC 2024, Epic Games officially announced that the MetaHuman toolset, including both MetaHuman Creator and MetaHuman Animator, has been integrated into Unreal Editor for Fortnite, allowing UEFN creators to set up photorealistic digital humans as NPCs for their projects.

For the release, the software was fine-tuned to prioritize quality and efficiency, leading to a substantial decrease in the average size of a MetaHuman from almost 1GB in Unreal Engine 5 to just 60MB in UEFN.

To learn more about this release, we spoke to Epic Games' Senior Technical Animator and lead in the MetaHuman character team, Jared Monsen, who explained how the average size of a MetaHuman was reduced, discussed the new character animation workflows MetaHuman enables in UEFN, provided an overview of the new production pipeline for digital clothes in Marvelous Designer and Unreal Engine, and more.

At GDC, you announced the integration of the MetaHuman toolset into UEFN. Could you please tell us how this feat was accomplished? How was the software fine-tuned to decrease the average size of a MetaHuman?

Jared Monsen: The MetaHumans we export for use in UEFN are based on the same MetaHumans that are used in Unreal Engine. In other words, we did not create bespoke geometry or rigs; the entire optimization process is based on deriving and tuning existing data. Only grooms were changed at the source, and this was for the benefit of MetaHumans in general.

To reduce the mesh sizes of MetaHuman characters, we created a new LOD table for UEFN derived from the original UE MetaHuman LODs. For example, UEFN LOD0 uses the UE LOD1 face mesh. We also created animation scaling options using the MetaHuman component, and the Animation Blueprint now has options for performance LODing. Additionally, we reduced texture sizes and baked out Normal Maps. We also simplified the material used for the MetaHuman faces in order to take advantage of these reduced texture sizes while retaining a high level of quality. In addition to these improvements, we tweaked the complexity of materials at varying LOD levels, such as replacing subsurface scattering with baked ambient occlusion at the highest LOD level.
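The core of this approach is a derived LOD table: each UEFN LOD simply points at an existing, already-built UE MetaHuman LOD. The sketch below is purely illustrative; only the LOD0-to-LOD1 face mesh mapping comes from the interview, and all other entries and names are hypothetical.

```python
# Hypothetical mapping: UEFN LOD index -> source UE MetaHuman LOD index.
# Only the 0 -> 1 entry is stated in the interview; the rest are
# placeholders to illustrate the idea of deriving a cheaper LOD table
# from existing data rather than authoring new meshes.
UEFN_LOD_FROM_UE_LOD = {
    0: 1,  # UEFN LOD0 uses the UE LOD1 face mesh (per the interview)
    1: 3,  # illustrative placeholder
    2: 5,  # illustrative placeholder
}

def source_ue_lod(uefn_lod: int) -> int:
    """Return which existing UE LOD a given UEFN LOD is derived from."""
    return UEFN_LOD_FROM_UE_LOD[uefn_lod]
```

Because every UEFN LOD reuses a UE LOD, no new geometry needs to ship; the table is just an index remap over data the engine already has.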

We also optimized grooms and reduced their cook sizes with a variety of approaches. For example:

  • We improved LODing of strand-based hair curves and optimized the curves themselves.
  • We improved the effectiveness of texture usage for cards and reduced texture sizes.
  • Cook optimizations reduced the cook size of grooms by up to 50%.
  • Some grooms, such as eyebrows and moustaches, were baked out at higher LODs.

For more details please see this talk from GDC:

What can you tell us about the new character animation workflows MetaHuman enables in UEFN?

Jared Monsen: With the recent updates, we brought the feature set from the UE MetaHuman Plugin directly into UEFN. With it, users can start creating MetaHumans from an existing mesh, from footage, or via the MetaHuman Creator web app. Users can also animate MetaHumans based on performances captured on an iPhone or professional HMCs. Once created, animation can be applied to any MetaHuman, as well as to the Fortnite Characters that are available to creators in UEFN. Some recent articles on MetaHuman Animator can be found here and here.

How did you enable the use of MH Animator's data for Fortnite characters?

Jared Monsen: MetaHuman Animator outputs animation against the MetaHuman Facial Description Standard (MHFDS). MHFDS can then be mapped onto other rig types. Fortnite rigs don't fully follow the MHFDS but are built on top of MetaHuman technologies, so a procedural remapping was created and added to the Fortnite rigs to allow them to be driven directly by MHFDS animation. Animation produced by MetaHuman Animator can be applied to any character that uses this standard.
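Conceptually, the remapping is a translation layer between two control sets. The sketch below is a loose illustration, not Epic's implementation: MHFDS is a real standard, but the control names, target names, and scale factors here are invented to show the shape of a procedural remap from MHFDS animation onto a rig that doesn't follow the standard one-to-one.

```python
# Hypothetical remap table: MHFDS control name -> (target rig control, scale).
# All names and factors are made up for illustration.
MHFDS_TO_FN = {
    "jaw_open":     ("FN_jawOpen", 1.0),
    "brow_raise_l": ("FN_browUp_L", 0.8),
    "brow_raise_r": ("FN_browUp_R", 0.8),
}

def remap_frame(mhfds_frame: dict) -> dict:
    """Remap one frame of MHFDS control values onto the target rig's controls."""
    out = {}
    for control, value in mhfds_frame.items():
        if control in MHFDS_TO_FN:
            target, scale = MHFDS_TO_FN[control]
            out[target] = value * scale
    return out
```

Because the remap is data-driven, the same MHFDS animation clip can drive any rig for which such a table exists, which is the property that lets MetaHuman Animator output play back on Fortnite characters.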

Now, let's discuss UEFN's new tools for setting up digital clothes. How did you collaborate with CLO on this initiative? 

Jared Monsen: Marvelous Designer is built on the foundation of taking traditional clothing production concepts and applying them to 3D cloth modeling. The tool empowers artists to streamline their creative processes by enabling anyone to master cloth modeling – regardless of sculpting skill level – as well as providing fast and accurate cloth simulation.

We wanted to give users working with Unreal Engine and UEFN the ability to access these benefits for the content they are creating using our tools. This integration debuted with Unreal Engine 5.4, providing more possibilities for clothing your MetaHumans. It is underpinned by a new USD export option for garments, including geometry, materials, and the necessary data for simulation setup. 

Could you please provide an overview of the new production pipeline for digital clothes in Marvelous and Unreal? What are the features of the new Cloth Asset Editor? 

Jared Monsen: Clothing is an integral part of making characters believable and can be a visual expression of a character's personality. Creating clothing assets and getting them into your projects to use is not always an easy task, but with 5.4, we now have a pipeline that allows users to integrate clothing assets from outside software packages, such as CLO and Marvelous Designer.

Until now, Marvelous Designer users have created artwork, including garments, creatures, and various other assets, using its unique pattern-based workflow. However, bringing the clothing into other software has always been troublesome because of the inherent limitations of the available file formats. Clothing-specific properties would be lost in traditional export and import processes, and users would be left with a mesh that merely happened to look like clothes.

Marvelous Designer's support for the USD file format now provides the foundation for an effective pipeline into Unreal Engine. Clothing panels and parameters from Marvelous translate directly into the engine, letting you easily see a real-time representation of your simulation setup.

Once you've imported your clothing assets into Unreal, you can start setting up the cloth panel graph. You can choose a workflow anywhere from fully automated to completely hand-tweaked. This flexibility lets you get something simulating faster while retaining the option to customize the areas you feel need it.

An auto-generated graph is built when you first create a cloth asset node, saving you time getting started. It includes the USD import node, some core nodes for setting up the simulation, value transfers from Marvelous Designer if selected, a physics asset node, and parameters set to good initial values.

The USD import brings in both the simulation mesh and the render mesh for you to use immediately. There are also nodes, such as auto-skinning and auto-LODing, that speed up the creation process. All of this streamlines the cloth setup, allowing you to arrive at a working simulation much faster and more easily.

Although we provide automated paths for many aspects of the pipeline, we also allow you to customize basically every part of it to arrive at the results you need for your project – for example, creating your own customized high-resolution render mesh for a hero character's costume. The clothing graph makes it really easy to swap out the render mesh for your own version.

Users can also take advantage of the Kinematic Collider, a deformable collider mesh that moves with the skinned asset and allows for more accurate collisions at relatively low cost. The kinematic collider is really easy to set up in the graph and solves a lot of the problems that come from using capsules alone.

The ability to customize the graph where you want, while taking advantage of the automated processes elsewhere, lets you increase the efficiency of your pipeline without giving up creative freedom or control over the end result. You have the option to use those results in Unreal or send them on to be used in UEFN.

How does one import assets to UEFN? How does the Cloth Asset import tool work? How can UEFN creators utilize the Chaos Cloth tool in production?

Jared Monsen: Once you have your cloth asset set up, getting it into UEFN is straightforward using the migrate tool. The only essential assets you need to migrate are the cloth asset itself and the physics asset, if you've chosen to use one; materials and textures can also be migrated. The clothing asset bundles up almost everything you need into one asset, which makes the process really simple. There is a comprehensive overview of these topics in this talk – it's about 15 minutes long, but it provides all the details on the new integration and workflow:

Jared Monsen, Senior Technical Animator at Epic Games

Interview conducted by Theodore McKenzie
