
Creating Characters for a Police Simulator with CC3 & Headshot Plug-In

Members of the Aesir Interactive team have told us about their game Police Simulator: Patrol Officers, explained how they use Reallusion's Character Creator to make the game's characters, and discussed how the Headshot plug-in adds more personality to NPCs.

Introduction

Nicolas Friedl, Junior 3D Artist: Hi, my name is Nicolas. I’m a Junior 3D Artist at Aesir Interactive. I joined in September 2021 and have mostly worked on Police Simulator: Patrol Officers since then. Before that, I studied Fine Arts and Multimedia at LMU Munich. I also worked as a real-time artist at a small agency during that time, but mostly did broadcast-related work.

Manuel Wagner, Senior Technical Artist: I am Manuel, Senior Technical Artist at Aesir Interactive. I’ve been with the company since 2015, initially starting as an intern while studying Games Engineering at the University of Applied Sciences in Kempten. It has been an awesome learning experience. Right now I am back at university for my master's degree at the Technical University of Munich, but I am still working for the company and helping out with Police Simulator: Patrol Officers. Before that, I worked on a bunch of VR projects, such as the VR arcade port of World of Tanks, as well as a turn-based strategy game prototype called Restless Echo. Besides that, I have some experience as a Full-Stack Web Developer, but that just isn’t as fun as making games.

Police Simulator: Patrol Officers

Manuel Wagner: Police Simulator: Patrol Officers is a chill game where you are a police officer on patrol in the fictional American city of Brighton. You need to keep your eyes open for any lost citizens you can help or any mischievous people who think the laws of Brighton are just a suggestion. Ideally, you play the game with a buddy so you can keep Brighton safe together! We even have modding support, so you can customize your player or police car just the way you like it. We talked to a lot of actual police officers to make the game as realistic as possible while still keeping it fun to play.

Working with Character Creator

Manuel Wagner: The decision was made early on, when we realized that we needed to get a lot of variation into the game as conveniently as possible. The choice was to either outsource the creation of the characters or do it in-house with CC3. In the end, we decided to take the modular approach and use Character Creator for content.

Headshot Plug-In

Nicolas Friedl: We used the Headshot plug-in a lot. It definitely helps in giving our police officers and modular NPCs much more personality. The best part is that it shapes the model and also adds a texture to the head from just one image. Tweaking the texture and adding finer detail with SkinGen afterward is super fast and easy as well.

I also can’t wait to try out the improved version of the plugin for Character Creator 4 which comes with some neat features to make it even easier to achieve a good final texture. Blending the face texture with the rest of the head, for example, looks really awesome!


Manuel Wagner: In the beginning, we did not use the Headshot plug-in and configured the heads by hand. In hindsight, this was the wrong choice because the heads that were created with Headshot look way more believable and diverse than the manually configured ones.

Using Generated Assets With Other Tools

Manuel Wagner: By itself, CC3 already provides a lot of information about the exported FBX if you check “export JSON for auto material setup”, but because every clothing asset, head, and skin part has its own texture and UV layout assigned, we needed to repack the textures and UVs so the material uses fewer textures. To that end, we built a Python script that uses the Pillow (PIL) library to repack the textures into a new layout, which we defined in JSON.

If you take a look at the exported FBX, you will quickly see that the head alone already has a large number of materials. This is why we wanted to reduce this to a single set of head textures (Diffuse + Normal + ORM). The challenge was that the separate meshes in the FBX were not always named consistently, so we needed to add aliases for the different names that are used for the same mesh.
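As a rough illustration of what such a JSON layout definition can look like, here is a small sketch; the schema, field names, and mesh/alias names are invented for this example and are not the project's actual format:

```json
{
  "atlasSize": 4096,
  "targets": ["Diffuse", "Normal", "ORM"],
  "regions": [
    { "mesh": "Head",  "aliases": ["CC_Base_Head", "Std_Skin_Head"], "rect": [0, 0, 2048, 2048] },
    { "mesh": "Teeth", "aliases": ["CC_Base_Teeth"],                 "rect": [2048, 0, 1024, 1024] },
    { "mesh": "Eyes",  "aliases": ["CC_Base_Eye"],                   "rect": [2048, 1024, 1024, 1024] }
  ]
}
```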

After the Python script has merged the textures, we import the FBX into Blender, where we apply the same transformation to the UVs of the separate meshes and merge, for example, the teeth into the head mesh. In Blender, we also make sure that all the characters are aligned with regard to their pelvis height, because different shoes have different offsets from the ground, which causes offsets in the vertices that make it harder to merge the meshes together at runtime. We also cut off the head at a fixed position in order to have a uniform edge interface that is the same across all heads.

I worked on the first iteration of the pipeline for a total of about 3 months. Most of the time was spent figuring out how to best interpret the Character Creator output and how to process it in Blender automatically because the UE4 plugin is closed source and CC3 itself sadly does not provide any automated scripting facilities. It was a lot of fun to get into Blender scripting for this project, however. Luckily the JSON file provided enough information to make up for the lack of scripting in CC3.

Time 

Nicolas Friedl: To be honest, I don’t even want to estimate how long the character creation process would take without these tools, but as a rough timeframe I can say that creating a new head for our modular NPCs takes between 5 and 7 hours. We spend only about 2 hours in Character Creator, while the rest of the time goes into tweaking and setting up the model so it works with the rest of our characters. So yes, the Character Creator part is incredibly fast. It’s also way easier to iterate on the features of your character.

Solving Performance Issues

Nicolas Friedl: We solved performance issues with optimization on both the art and the tech side. For example, from LOD1 downwards we remove teeth (which have many polygons) and eyelashes. We also reduce the number of materials per NPC between LOD0 and LOD1 to have fewer draw calls. Not having extremely large Normal Maps for the NPCs helps as well. To add finer skin detail, we use a shared, tileable skin texture instead.

Manuel Wagner: Unreal Engine uses 4 draw calls per mesh section per skeletal mesh in a single frame. The number varies with the material and shadow settings, but under normal circumstances, 4 is what we see. When potentially rendering hundreds of NPCs, this quickly adds up to thousands of draw calls if you are not careful. Luckily, UE4 provides a neat utility in its C++ code that allows you to merge multiple skeletal meshes into a single one at runtime. You can read about it and the alternatives in the official documentation. The provided source code for the skeletal mesh merge does not compile anymore – if it ever did – but it is not too complicated to fix.
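As a minimal sketch of how that engine utility can be driven (assuming all modules already share a single skeleton; error handling and LOD settings are omitted, and the function name is ours):

```cpp
// Minimal sketch of driving UE4's runtime skeletal mesh merge (FSkeletalMeshMerge).
// Assumes every module mesh uses the same USkeleton.
#include "Animation/Skeleton.h"
#include "Engine/SkeletalMesh.h"
#include "SkeletalMeshMerge.h"

USkeletalMesh* MergeNpcModules(const TArray<USkeletalMesh*>& Modules, USkeleton* Skeleton)
{
    // Target mesh that will receive the merged geometry.
    USkeletalMesh* Merged = NewObject<USkeletalMesh>();
    Merged->Skeleton = Skeleton; // UE4: public member; newer engine versions use SetSkeleton()

    // No forced section mapping, keep every LOD.
    TArray<FSkelMeshMergeSectionMapping> SectionMappings;
    FSkeletalMeshMerge Merger(Merged, Modules, SectionMappings, /*StripTopLODs=*/0);

    return Merger.DoMerge() ? Merged : nullptr;
}
```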

The code blocks the game thread by default to wait for a render command fence. This, alongside the huge amount of data that needs to be copied for merging, causes tons of hitching when generating NPCs at runtime. By removing the render command fence and using the async task graph, we successfully moved the actual mesh merge to a separate thread to fix these hitches.
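In sketch form, the threading idea looks roughly like this; the engine-side fence removal itself is not shown, and the function signature and callback are illustrative rather than the project's actual API:

```cpp
// Rough sketch: create the UObject on the game thread, do the heavy merge work
// on a worker thread, then hand the result back to the game thread.
// Assumes the render command fence inside FSkeletalMeshMerge has been patched
// out as described above.
#include "Async/Async.h"
#include "Engine/SkeletalMesh.h"
#include "SkeletalMeshMerge.h"
#include "Templates/Function.h"

void BuildMergedMeshAsync(UObject* Outer, TArray<USkeletalMesh*> Modules, USkeleton* Skeleton,
                          TFunction<void(USkeletalMesh*)> OnMergedMeshReady)
{
    // Created on the game thread; in real code, keep it referenced (UPROPERTY or
    // AddToRoot) so it is not garbage collected while the task runs.
    USkeletalMesh* Merged = NewObject<USkeletalMesh>(Outer);
    Merged->Skeleton = Skeleton;

    Async(EAsyncExecution::TaskGraph,
          [Merged, Modules = MoveTemp(Modules), OnMergedMeshReady = MoveTemp(OnMergedMeshReady)]()
    {
        TArray<FSkelMeshMergeSectionMapping> SectionMappings;
        FSkeletalMeshMerge Merger(Merged, Modules, SectionMappings, /*StripTopLODs=*/0);
        const bool bSuccess = Merger.DoMerge();

        // Only touch gameplay objects back on the game thread.
        AsyncTask(ENamedThreads::GameThread, [Merged, bSuccess, OnMergedMeshReady]()
        {
            if (bSuccess)
            {
                OnMergedMeshReady(Merged); // e.g. assign the mesh to the NPC's component
            }
        });
    });
}
```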

We then merge all the modules, which are randomly picked according to a set of rules, into a single mesh that has 2 material sections on LOD0: one for the hair, which uses a masked material, and one for the rest, which uses an opaque material. On LOD1+ we use only a single mesh section with an opaque material. In order to shade the mesh with the correct textures, we have to resolve the required texture objects at runtime. We use the 2nd UV channel to detect which ID a given area uses and pick the appropriate set of textures accordingly. If we had to do it again, we would use a UDIM workflow because that’s basically the same thing and less complicated.
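A simplified sketch of that runtime texture resolution could look like this; the texture-set struct and the material parameter names ("Diffuse_0", "Normal_0", "ORM_0", …) are ours for illustration, not the project's:

```cpp
// Simplified sketch: feed each module's texture set into the merged mesh's
// material via a dynamic material instance. The material itself then picks the
// right set per pixel from the ID stored in the 2nd UV channel.
#include "Components/SkeletalMeshComponent.h"
#include "Engine/Texture2D.h"
#include "Materials/MaterialInstanceDynamic.h"

struct FNpcTextureSet // illustrative container, one per module ID
{
    UTexture2D* Diffuse = nullptr;
    UTexture2D* Normal = nullptr;
    UTexture2D* ORM = nullptr;
};

void ApplyNpcTextures(USkeletalMeshComponent* MeshComp, const TArray<FNpcTextureSet>& TextureSets)
{
    UMaterialInstanceDynamic* MID = MeshComp->CreateDynamicMaterialInstance(/*ElementIndex=*/0);
    if (!MID)
    {
        return;
    }

    for (int32 Id = 0; Id < TextureSets.Num(); ++Id)
    {
        MID->SetTextureParameterValue(FName(*FString::Printf(TEXT("Diffuse_%d"), Id)), TextureSets[Id].Diffuse);
        MID->SetTextureParameterValue(FName(*FString::Printf(TEXT("Normal_%d"), Id)),  TextureSets[Id].Normal);
        MID->SetTextureParameterValue(FName(*FString::Printf(TEXT("ORM_%d"), Id)),     TextureSets[Id].ORM);
    }
}
```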

As you can see, we need to pass a lot of textures into the material. We apply a little trick in the sampling of the textures in order to stick to the limit of 16 texture samplers that are allowed in the shader model.

While looking through the generated shader code, we found that when you pass a texture object into a custom node, you actually pass the texture and its sampler into the function. So what if you just don’t use all of them? Under the assumption that all textures use the exact same sampler settings, we only use the sampler of the first texture object. As it turns out, this leads to the shader compiler optimizing out all of the other samplers and allows us to use more than 16 textures in our material! Keep in mind: when implementing custom sampling for non-linear color or normal textures, you will have to convert the data to sRGB and unpack the normals manually. This is still better than just using shared samplers, of course.

We also make use of the ID mapping to apply different shading models to the different areas by means of the “From Material Expression” shading model. Hair, head, and skin use the Subsurface shading model, while the clothes use the Default Lit shading model.

Despite its many benefits, the skeletal mesh merge brought many challenges with it, too. In no particular order:

  • Meshes are merged in their bind pose, which means all merged meshes need to fit together perfectly when in their respective bind poses. This is why we had to create our Blender tooling to ensure that all the modules fit together.
  • Morph targets need to be remapped and merged for any module after the first. We only needed them on the head, so we did not have to remap the others anyway.
  • Morph targets of the first mesh need to be copied to the merged mesh for each new mesh. This takes up a LOT of memory if you consider that Character Creator 3 characters have more than 80 morph targets. We ended up stripping all but a few of them to save on memory. We recently did an engine mod that shares morph target data between meshes so that we do not have to copy them, but we did not yet find the time to adjust our animations to the new capabilities.
  • LODs are merged as well, which means generated LODs must still fit together at their seams to not have holes. Enabling the Lock Mesh Edges checkbox for the reduction settings helped resolve most of the issues.
  • Modules do not keep their own skeletons (as opposed to the master pose approach), which causes issues for modules that require bones in different locations than others. This is the case for different heads/faces where the mouth and eye sockets are in different positions. We worked around this by using the head and its skeleton as the base for each NPC. It still creates slight issues with accessories such as sunglasses.
  • When playing animations, the skin sometimes clips through the clothes due to differences in topology between the clothing and the skin. We applied a depth offset in the material to make sure that the clothes always render on top of the skin.

Adding Variety

Nicolas Friedl: We added variety by using randomization and modularity. Each NPC consists of 8 different regions, which makes it easy to give them different outfits, hairstyles, heads, etc. Not only is the kind of mesh randomized, we also randomize the texture and color of certain meshes. To create texture variations for a skinned mesh, we need to input them as custom asset user data.

The custom asset user data consists of multiple parts which need to be configured. “Type” defines where a certain part belongs on an NPC. “Covered skin areas” is important when setting up clothing because it defines the skin areas that are hidden underneath to prevent clipping.

The assigned textures are applied to the character. However, the randomization gets interesting with the “Can be colorized” checkbox. If this is ticked, the modular NPC pipeline picks colors out of either the color sets or data tables that are assigned elsewhere. The color is multiplied with the texture in the shader, which is why we need to desaturate the parts of the texture that we want to be colorized (e.g. eyes, hair). There are multiple color palettes that allow us to balance how often certain colors appear on NPCs. For example, NPCs with dyed hair occur far less often than the ones with generic hair colors. This flexibility is one of the biggest strengths in my opinion.
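For illustration, such asset user data could be modeled roughly like this; UAssetUserData is the stock UE4 mechanism for attaching custom data to assets such as skeletal meshes, but the class, enum, and field names below are our own examples, not the project's real types:

```cpp
// Illustrative sketch of per-module asset user data for the modular NPC pipeline.
#pragma once

#include "Engine/AssetUserData.h"
#include "Engine/Texture2D.h"
#include "NpcModuleUserData.generated.h"

UENUM()
enum class ENpcModuleType : uint8
{
    Head, Hair, Torso, Legs, Feet, Accessory // the game uses 8 regions; shortened here
};

UCLASS()
class UNpcModuleUserData : public UAssetUserData
{
    GENERATED_BODY()

public:
    // Where this module belongs on an NPC ("Type").
    UPROPERTY(EditAnywhere, Category = "NPC")
    ENpcModuleType Type = ENpcModuleType::Torso;

    // Skin areas hidden under this clothing piece to prevent clipping ("Covered skin areas").
    UPROPERTY(EditAnywhere, Category = "NPC")
    TArray<FName> CoveredSkinAreas;

    // Texture variations the randomizer may pick from.
    UPROPERTY(EditAnywhere, Category = "NPC")
    TArray<TSoftObjectPtr<UTexture2D>> TextureVariations;

    // If true, the pipeline multiplies a palette/data table color onto the (desaturated) texture.
    UPROPERTY(EditAnywhere, Category = "NPC")
    bool bCanBeColorized = false;
};
```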

To constrain the generator, we implemented rules that can be set by the artists. For example, there is a rule that prevents sunglasses from spawning on the head of an NPC if they have headphones on (the meshes would clip into each other).
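A simplified sketch of such an artist-configurable exclusion rule might look like this; the struct and function are illustrative, not the project's actual code:

```cpp
// Illustrative exclusion rule the generator checks while picking modules.
#pragma once

#include "NpcExclusionRule.generated.h"

USTRUCT()
struct FNpcExclusionRule
{
    GENERATED_BODY()

    // Module that triggers the rule, e.g. "Headphones".
    UPROPERTY(EditAnywhere)
    FName IfPresent;

    // Modules that may no longer be picked once the trigger is present, e.g. "Sunglasses".
    UPROPERTY(EditAnywhere)
    TArray<FName> Excludes;
};

// Reject a candidate module if any rule excludes it based on what was already picked.
inline bool IsModuleAllowed(FName Candidate, const TArray<FName>& AlreadyPicked,
                            const TArray<FNpcExclusionRule>& Rules)
{
    for (const FNpcExclusionRule& Rule : Rules)
    {
        if (AlreadyPicked.Contains(Rule.IfPresent) && Rule.Excludes.Contains(Candidate))
        {
            return false;
        }
    }
    return true;
}
```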

Future Plans

Manuel Wagner: We have some rough ideas about using CPU Skinning to deform the modules before merging in order to resolve some of the constraints during character creation, but we have not prototyped this yet. I am also quite excited about the Wave Function Collapse plugin in UE5, which will hopefully make it easier for us to specify better rules for the randomization than just simple if-this-then-that rules.

The render pipeline for the NPCs is optimized toward minimizing CPU and GPU load at the cost of memory. This tradeoff has, in turn, caused issues on older hardware. We are currently working on a fallback that uses multiple skeletal mesh components with the head as their master pose component. However, for “just” 75 NPCs this costs an additional 10ms(!) in CPU time, so we might not even be able to afford this. But this also shows how much performance we saved by taking this path.
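In code, that fallback boils down to something like this (component setup heavily simplified; the function name is ours):

```cpp
// Sketch of the master-pose fallback: each module keeps its own skeletal mesh
// component and follows the head component's pose instead of being merged.
#include "Components/SkeletalMeshComponent.h"

void SetupMasterPoseNpc(USkeletalMeshComponent* HeadComp,
                        const TArray<USkeletalMeshComponent*>& ModuleComps)
{
    for (USkeletalMeshComponent* ModuleComp : ModuleComps)
    {
        // UE4 API; renamed to SetLeaderPoseComponent in newer engine versions.
        ModuleComp->SetMasterPoseComponent(HeadComp);
    }
}
```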

We consider our merged-mesh pipeline a successful part of our project and will continue to build on this tech stack in the future. Other projects are already building on top of this pipeline in UE5 and also seeing great results. If you also want to be part of our future projects and endeavors, we are hiring!

Nicolas Friedl, Junior 3D Artist and Manuel Wagner, Senior Technical Artist at Aesir Interactive
