Realistic Human Character Study

Braulio “BraV” FG was kind enough to talk about his recent project, where he models and animates awesome photorealistic characters.

Introduction

Hello everyone, my name is Braulio “BraV” FG and I’m from San Jose, Costa Rica. I’m a 3D digital artist, passionate about real-time texturing, lighting, and shading, but also about motion capture, a subject I have experience with from working at a tech lab here in my country. I also studied digital animation here in San Jose, but for now I’ve mostly been studying real-time rendering and Unreal Engine on my own. Right now I’m developing an in-game cinematic trailer for my own IP, which is why this test was made: to show the quality I’m trying to achieve.

Realistic human characters

Any human being, even without being an artist, can tell when something is wrong with a face; they may not be able to name the problem right away, but they will feel that something isn’t right. We as artists need to actually name the problem and try to solve it. For some time now, affordable 3D scanning has been helping with the problem of proportions and resemblance, and mocap helps solve part of the problem with movement. These two methods have something in common: they both record minor details, like skin imperfections or the natural timing of human movement, such as pauses and changes of pace. All of this really adds up to a successful face. One of my goals was to bring these solutions into my test and take this battle into a real-time game engine, to see whether I could create a moving face not so far off from a real one on a low budget.

Project

First of all, I want to thank David Marmor for providing the base model and primary blendshapes, and also Baongoc for letting me use his UE4 eye material. With that said, it all started when I saw a video of the base model and blendshapes in action inside Maya. I thought it would be cool to see that face in real time, with really high-quality detail, animated by a real human face using mocap. Now, the rig setup I got from David was not designed to work outside of Maya, due to its connections and constraints, and was not real-time friendly. The base model didn’t have any textures, just a default gray material, not even UVs, and some parts were extremely high poly. So despite having some head start, there was still a long way to go before the final result. Here’s what I started with, and what I ended up with:

Perfecting the features of the character

Even though I was provided with a base model to work with, it was like a blank canvas: any look could be applied to her face. For this test, I wanted the girl to look real, in the sense that she would not have perfectly smooth skin, because there’s this idea that to texture a female face you need to make her skin smooth and perfect. From the start I wanted the opposite of that: I wanted her face to have pimples, moles, imperfections, and some signs of aging, but still look feminine, with glossy lips and makeup. To achieve this I used two sets of displacement and color textures from TexturingXYZ, one female and one male. After the UVs were completed, I used MARI for all the texturing and for combining the maps from those two sets.

Once the projection was done, I moved to ZBrush to actually embed the details into the mesh, and also to create extra details manually. Using layers in ZBrush was really important, because I wanted to keep the details separated at all times: a different layer each for primary, secondary, micro, and custom details. For me it’s simpler to go extra hard on the details and then tone them down to the result you want, not the other way around. That’s how I ended up with a very detailed and custom female skin surface. I used this exact method to create the tongue, gum, and teeth details.

Hair

The first step was to make a quick hair sketch in ZBrush to get a notion of what kind of hairstyle I wanted to create. Once that was done, I studied the guide Epic Games provides for their recently added hair material, mainly to understand how the engine manages to render hair and, of course, to know what kind of maps I would need to create. I found out that Unreal handles hair differently than other real-time software, like Marmoset for example. For the hair I ended up using five maps: Alpha, Color, Depth, ID, and TipRoot. I used Maya’s XGen to create both the alpha and color maps, and then I created the rest of the maps in Photoshop, using a combination of ambient occlusion passes and hand painting.
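
The TipRoot map is essentially a gradient running from the root of each card to its tip, so it can also be roughed in with a small script instead of being painted by hand. Here is a minimal sketch of that idea, assuming the strands run vertically in the texture layout; the resolution and file name are placeholders, and this is an alternative to the Photoshop pass described above, not part of it.

```python
# A scripted root-to-tip gradient, assuming the hair-card strands run
# vertically in the texture. Resolution and file name are placeholders.
import numpy as np
from PIL import Image

size = 2048
gradient = np.linspace(255, 0, size).astype(np.uint8)  # white at the top (tips), black at the bottom (roots)
tip_root = np.tile(gradient[:, None], (1, size))        # repeat the column across the full width
Image.fromarray(tip_root, mode="L").save("T_Hair_TipRoot.png")
```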

The process of creating the hair was actually slow, because I decided to place all the cards manually, one by one. I did it this way because I wanted to always be in control, and also because I wanted to go from simple to complex. I ended up with seven different layers of hair, from big blocks of hair down to tiny loose hairs made out of geometry, to break up the symmetry and make it look more real. The eyelashes were created in a similar way, but a bit more simply: just a plain alpha and color, and a single piece of geometry for all the eyelashes. I did it this way because I knew these eyelashes would later be added to the rig with their own blendshapes as well, so to avoid having to deal with a lot of nodes and blendshapes for every individual eyelash, I decided to work with just one big eyelash geo, two per eye. As for her eyebrows, because they are so thin, I didn’t find it necessary to add extra geometry just for them; the result with textures alone worked for me in this case.

Skin

The skin material was created from scratch. I know Epic Games provides a skin material example you can use, but as I stated before, I want control over my shaders and materials, and the provided ones sometimes turn out to be confusing or really complex to understand. So over time I’ve been building a specific skin shading network inside UE4 that works for me, based on previous tests and failures.

Using MARI and Photoshop, I created a total of five base maps: Color, Roughness, Specular, Translucency, and Ambient Occlusion, plus three normal maps: one for primary details, one for secondary details, and one for extra details. I did it this way so I could control how much each normal map contributes without leaving the engine and going back to ZBrush. This is useful because in close-up shots you get really high detail, while in wider shots you can simply turn the maps off if you want. In addition to these maps, I also created three mask maps to have independent control over the ears, caruncles, and mouth. All these maps vary in size, from 1K for the masks up to 8K for the normal maps.
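
To make the per-map contribution idea concrete, here is a minimal sketch of the math such a setup boils down to. It is not the exact material graph; the whiteout-style combine is just one common way of merging detail normals, and the sample values are made up.

```python
# A conceptual sketch: fading detail normal maps in and out with per-map
# intensity scalars, then combining them with the primary normal.
# Normals are assumed to be tangent-space, encoded in [0, 1].
import numpy as np

FLAT = np.array([0.5, 0.5, 1.0])  # a "flat" normal, (0, 0, 1) encoded as color

def fade_normal(encoded_normal, intensity):
    """intensity = 0 turns the map off, 1 keeps it at full strength."""
    return FLAT + (np.asarray(encoded_normal) - FLAT) * intensity

def combine_normals(primary, secondary, micro, w_secondary=1.0, w_micro=1.0):
    """Whiteout-style blend: add the XY detail of the faded maps onto the
    primary normal, then renormalize."""
    decode = lambda n: np.asarray(n) * 2.0 - 1.0  # [0, 1] -> [-1, 1]
    base = decode(primary)
    d1 = decode(fade_normal(secondary, w_secondary))
    d2 = decode(fade_normal(micro, w_micro))
    combined = np.array([base[0] + d1[0] + d2[0],
                         base[1] + d1[1] + d2[1],
                         base[2]])
    combined /= np.linalg.norm(combined)
    return combined * 0.5 + 0.5  # back to [0, 1]

# Micro detail fully on for a close-up, faded out for a wide shot.
closeup = combine_normals([0.5, 0.6, 0.95], [0.55, 0.5, 0.98], [0.52, 0.48, 0.99], 1.0, 1.0)
wide    = combine_normals([0.5, 0.6, 0.95], [0.55, 0.5, 0.98], [0.52, 0.48, 0.99], 1.0, 0.0)
```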

In the end, the skin material had a total of twenty exposed parameters that allowed me to control everything I needed: roughness amount, Fresnel, tessellation, subsurface contribution, caruncle color, wrinkle displacement, and so on.
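
Exposed parameters like these can also be driven from the editor’s Python scripting, which is handy for batch tweaks across shots. Below is a minimal sketch; the asset path and the parameter names ("RoughnessAmount", "SSS_Contribution", "CaruncleColor") are hypothetical placeholders, not the actual names used in the material.

```python
# A minimal sketch of setting exposed parameters on a material instance via
# Unreal's editor Python API. Asset path and parameter names are placeholders.
import unreal

skin_mi = unreal.EditorAssetLibrary.load_asset(
    "/Game/Characters/Girl/Materials/MI_Skin")  # hypothetical path

mel = unreal.MaterialEditingLibrary
mel.set_material_instance_scalar_parameter_value(skin_mi, "RoughnessAmount", 0.65)
mel.set_material_instance_scalar_parameter_value(skin_mi, "SSS_Contribution", 1.2)
mel.set_material_instance_vector_parameter_value(
    skin_mi, "CaruncleColor", unreal.LinearColor(0.55, 0.18, 0.15, 1.0))

unreal.EditorAssetLibrary.save_loaded_asset(skin_mi)
```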

This material was also used as a base for the tongue, gums, and teeth.

Eyes

Some time ago, Baongoc Vu shared his UE4 scene of the Assassin’s Creed character, and that was an amazing opportunity to study a subject that has always challenged me: the eyes. Bao’s eye material was very user-friendly, I managed to understand the shading network, and the quality was superb. It had essentially everything I needed, and because I was in a rush to finish the test, instead of making one from scratch I asked him for permission to use it, and he kindly agreed. Now, this material works with specific UV positions and geometry, and since I already had my own geometry, the very first challenge was to match the girl’s eye UVs to Bao’s eye material.

Then I was able to customize the eyes: change the color, iris radius, pupil scale, etc. But I realized that for my test, due to the closer shots, I needed to go further with the eye material, so I modified it by adding another fake shadow pass that only affects half of each eye, mixing it in with a LinearBurnBlend. This allowed me to control an independent fake shadow to recreate the red/pinkish transition between the eye and the caruncle, giving the eye a more natural look. It’s just a subtle effect, but it goes a long way when animated.
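
For reference, Linear Burn is simply base + blend - 1, clamped at zero, so the extra shadow pass boils down to something like the sketch below. The half-eye mask and opacity values are simplified stand-ins for the material’s own controls.

```python
# A numeric sketch of the Linear Burn blend behind the extra fake-shadow pass:
# result = base + blend - 1, clamped to [0, 1]. Mask and opacity are stand-ins.
import numpy as np

def linear_burn(base, blend):
    """Photoshop-style Linear Burn: always darkens, clamped to [0, 1]."""
    return np.clip(np.asarray(base) + np.asarray(blend) - 1.0, 0.0, 1.0)

def fake_shadow(eye_color, shadow_tint, half_eye_mask, opacity):
    """Apply the burned color only where the mask is white (the caruncle side)."""
    eye_color = np.asarray(eye_color, dtype=float)
    burned = linear_burn(eye_color, shadow_tint)
    return eye_color + (burned - eye_color) * half_eye_mask * opacity

# Example: a reddish tint darkening the inner half of the eye at 40% opacity.
print(fake_shadow([0.8, 0.8, 0.8], [0.9, 0.55, 0.5], half_eye_mask=1.0, opacity=0.4))
```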

Animation

For the animation and motion capture side of the test, I first needed to rig the girl all over again; this included the face, eyes, teeth, tongue, earring, etc., plus all the corresponding weight painting. Then all the blendshapes David provided needed to be reconnected. I also created secondary blendshapes, not only for the face but also for the eyelashes, so they follow the eyelids and produce a correct blinking animation. In total the rig has over 50 blendshapes.
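
The eyelash-follows-eyelid behavior can be wired up with a simple attribute connection in Maya, so the lashes track every blink without extra animation. A minimal sketch is below; the node and target names (face_BS, lash_BS, blink_L/R) are hypothetical placeholders.

```python
# A minimal Maya sketch: drive the eyelash blendshape weights with the matching
# eyelid shapes. Node and target names are hypothetical placeholders.
from maya import cmds

for side in ("L", "R"):
    cmds.connectAttr(
        "face_BS.blink_%s" % side,  # eyelid blink weight (driver)
        "lash_BS.blink_%s" % side,  # eyelash blink weight (driven)
        force=True)
```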

To record all the mocap data I used a piece of software called Faceshift. It does markerless motion capture, generating tracking data for your face from video input and depth information. This time I used a cheap Kinect for Xbox 360 that I had at home. That was one of my problems: without a decent budget for a good camera, the motion capture couldn’t be recorded flawlessly, and the low video quality introduced noise. To fix this, I decided to give the recorded data a pass inside Maya to clean up all the glitches and noise errors, and then export everything back to Unreal. All the data needs to be baked into an FBX so Unreal can actually read it. This is where a lot of errors can happen, so make sure to use the latest FBX version.
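
The clean-up and export step can also be scripted in Maya. Below is a rough sketch of baking the blendshape weights and exporting the FBX; the node, mesh, and file names are placeholders, and it assumes the fbxmaya plug-in is available.

```python
# A rough sketch of the bake-and-export step in Maya. Node, mesh, and file
# names are placeholders; the fbxmaya plug-in is assumed to be available.
from maya import cmds

cmds.loadPlugin("fbxmaya", quiet=True)

start = cmds.playbackOptions(query=True, minTime=True)
end = cmds.playbackOptions(query=True, maxTime=True)

# Bake the cleaned mocap curves on the blendshape node into plain keyframes,
# which is the form Unreal reads back from the FBX.
cmds.bakeResults("face_BS", time=(start, end), simulation=True,
                 sampleBy=1, preserveOutsideKeys=True)

# Export the head mesh along with its baked animation.
cmds.select("head_GEO", replace=True)
cmds.file("D:/export/girl_face_anim.fbx", force=True, options="v=0;",
          type="FBX export", preserveReferences=True, exportSelected=True)
```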

Once inside Unreal Engine, I placed various lights and animated some cameras using Sequencer.

Using the model for real-time games

One of the main goals while planning this test was not only to show quality, but also to see how the engine would handle a complex model. I created this test to find out whether a real-time solution could compete with pre-rendered quality. I was surprised that the engine handles it pretty well, running at 50-60 fps at 1080p. Bear in mind that this is a very high-poly model, because it’s a test intended for cinematic quality; it also has complex shaders, and some textures are even 8K. Probably a few years ago nobody would have said this was possible.

From a hardware point of view, we are surprised every other month by new and optimized video cards hitting the market, and consoles are getting stronger with each passing generation, so I’m optimistic that we will soon have these kinds of complex models inside real-time games. Maybe we are not quite there yet, but it’s a matter of time; I think we are close to making it happen.

I hope this information was useful to you in some way. If you like what you saw and want to know more about my future projects, or if you have any questions, you can ask me or follow me on Facebook at www.facebook.com/BraVlio.FG. Thank you guys for your time!

Braulio “BraV” FG, 3D Artist

Interview conducted by Kirill Tokarev

Comments

  • FG Braulio: Hi John, I had an old license from 2014, before it was bought by Apple. Another solution would be Brekel Face 2.0.

  • Neural_Coder: Hi BraV FG! We are a team of software enthusiasts who are going to develop software for the new RealSense cameras to work with Faceshift. How can we contact you?

  • FG Braulio: Hello, you can send me an email at bravliofg@gmail.com
