
YouXia: Developing A Tech-Noir Game With CC3 & iClone

Stephane Biesse, CEO of Aardwolf Interactive, shares some information about the game, how to create crowds, and how to generate human faces using Reallusion's toolkit.


Introduction

Hi, my name is Stephane Biesse, CEO of Aardwolf Interactive. I studied English and Network Security in France and Business Management in England. I started my career in 2007 as a Network Security Consultant and got promoted to different managerial positions. Before founding Aardwolf Interactive in 2019, I was in charge of business development for a multinational company. At the same time, I kept coding, making games for my own enjoyment, and modding, modeling, and texturing characters and props. 

Birth of YouXia

In 2018, I wanted to play a third-person action/adventure game in a Tech-Noir/Cyberpunk or Biopunk universe and couldn’t find any, so I decided to develop one in Unreal Engine 4, as I was familiar with the engine.

In the beginning, YouXia was a pet project of mine, an action game with investigation elements inspired by Philip K. Dick’s novels, Judge Dredd comics, Altered Carbon, and Akira.

After writing the first draft of the GDD, I sent it to friends I used to mod with. One of them, who now works in the game industry, was very excited about the project and advised me to turn it into a full-fledged game and release it.

I wrote a proper GDD between June and December 2018, quit my job, and founded Aardwolf Interactive in January 2019 so I could work full-time on the project. I started developing the prototype with a freelance programmer and a 3D artist, who is now our Art Director, in order to secure more funding. In 2020, I hired five more people to speed up development, and we are now in the alpha phase of the project.

Gameplay

You Xia, which means “Wandering Vigilante” in Mandarin Chinese, is a third-person action and investigation game in a Tech-Noir/Biopunk world where Earth has been devastated by wars and climate cataclysms and humanity’s remnants are crowded into shantytowns and overpopulated megalopolises. Artificial intelligences are now sentient beings, genetic engineering is the norm, and democracy is slowly dying. The player switches between Nick, the YouXia, a private eye and bounty hunter who investigates and fights in the real world, and Mariko, a teenage hacker and psychic who projects her mind into the virtual world called the Dream, a drug-induced collective hallucination where she battles against AIs to break computer countermeasures.

When controlling Nick, the player can investigate crime scenes, talk with different characters, and fight with guns or his bare hands. When Mariko hacks a system, she literally fights the intrusion countermeasures in arena melee battles. I wanted the hacking gameplay to be as action-oriented as the real-world gameplay instead of a Mastermind-style puzzle clone, to keep the player immersed in the world.
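For illustration only, here is a minimal sketch (not the studio's actual code; the function and variable names are hypothetical) of how handing player control from one character to the other could be wired in Unreal Engine 4 C++, using the engine's standard possession API:

#include "GameFramework/PlayerController.h"
#include "GameFramework/Pawn.h"

// Hypothetical helper: hands player control from the currently possessed
// pawn (e.g. Nick in the real world) to another one (e.g. Mariko in the Dream).
void SwitchPlayableCharacter(APlayerController* PC, APawn* NewCharacter)
{
    if (!PC || !NewCharacter || PC->GetPawn() == NewCharacter)
    {
        return;
    }

    PC->UnPossess();           // release the current character
    PC->Possess(NewCharacter); // take control of the other one
}

In practice, such a switch would also move the camera and swap input mappings, but the possession call is the core of it.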

Creating Characters

YouXia has a lot of NPCs, and I knew from the beginning it would be impossible to model hundreds and hundreds of characters in a short period of time. I learned about Character Creator on the Unreal forum and started using it in 2019. CC3 looked perfect for me as the skeleton was compatible with the Epic skeleton, and rigging, skinning, and UV unwrapping were already done, a big plus as I’ve always found these tasks particularly tedious and time-consuming.

YouXia characters are modeled with Character Creator; outfits are modeled or edited in ZBrush using CC3's GoZ feature, automatically skinned in CC3, and textured using the appearance editor. When further texture editing is needed, for main characters, for example, we use the Substance Painter pipeline: we export our character to Substance Painter, texture it, and get the textures automatically updated in CC3.

After we export to Unreal Engine 4, the Reallusion plugin automatically sets up the materials. It is a huge time saver, especially with skin shaders.

Secondary Characters

Our secondary characters are the ones that won’t appear in close-ups or cutscenes, so we needed a quick way to make them look good without spending as much time on them as we do on main characters.

When we need a piece of clothing like slacks, jeans, or a suit, we use the Reallusion asset library: the assets look good, are already skinned, and we just have to drag and drop them into the CC3 viewport. When needed, we use the CC3 appearance editor to add variations to the assets. The library is expanding quickly, and so far we’ve always found what we needed.

Generating Human Faces

To generate crowds of NPCs, we use the generated.photos website. After we choose the sex, ethnicity, and age of the character we want to create, an AI randomly generates a realistic human face. We then drag and drop the generated photo into Character Creator using the Reallusion Headshot plug-in and get a realistic-looking human in a matter of minutes.

Recently, we needed a crowd for the nightclub scene. There are 30 different character models in the scene, and with texture variations, we got more than a hundred character variants. Thanks to generated.photos, CC3, and the Headshot plugin, these characters were modeled, textured, and imported into Unreal in a single day by a single artist.

In Unreal, we add material color variations to avoid having identical-looking characters in the same level.
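As a rough illustration of that kind of variation (not the project's actual setup; the "TintColor" parameter name is assumed), a spawned crowd character can be tinted through dynamic material instances in UE4 C++:

#include "GameFramework/Character.h"
#include "Components/SkeletalMeshComponent.h"
#include "Materials/MaterialInstanceDynamic.h"

// Hypothetical helper: gives each spawned crowd character its own color
// variation by creating per-instance dynamic materials and setting a
// vector parameter the materials are assumed to expose.
void ApplyRandomTint(ACharacter* CrowdCharacter)
{
    if (!CrowdCharacter)
    {
        return;
    }

    USkeletalMeshComponent* Mesh = CrowdCharacter->GetMesh();
    const FLinearColor Tint = FLinearColor::MakeRandomColor();

    for (int32 Index = 0; Index < Mesh->GetNumMaterials(); ++Index)
    {
        UMaterialInstanceDynamic* DynMat =
            Mesh->CreateAndSetMaterialInstanceDynamic(Index);
        if (DynMat)
        {
            DynMat->SetVectorParameterValue(TEXT("TintColor"), Tint);
        }
    }
}

The same idea can be driven from a Blueprint or from a material parameter collection; the point is that one base material serves many differently colored NPCs.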

Animation

When I started developing the game, I had no prior experience with animation and was quite intimidated by animation software. With iClone, I was able to quickly create animations and edit them to suit our needs. Since then, we’ve been using iClone for animating characters and props. iClone's Live Link to Unreal allows us to see our animated characters in the game engine in real time. We can modify animations, lighting, and camera angles in iClone to get the perfect rendering for our cutscenes. There is no trial-and-error approach, and no time wasted exporting and importing.

Dialogues

YouXia is dialogue-heavy, so we had to find a motion capture and lip-sync solution that didn't cost tens of thousands of euros or need hours of cleanup. The process is pretty straightforward: we capture the actors' performances with an HD smartphone linked to the iClone LiveFace plug-in, clean and edit the capture in iClone, lip-sync the recorded audio with the AccuLips tool, and then check how it will look in Unreal with the Live Link.

Stephane Biesse, CEO at Aardwolf Interactive

Interview conducted by Arti Sergeev
