Aitziber Azkue talked to us about the Henry V – Drip Fit for a King project, discussing the process of custom MetaHuman creation with Unreal Engine 5.5, achieving a realistic look for a digital character, and sculpting and texturing cloth and armor using ZBrush, Maya, Marvelous Designer, and Substance 3D Painter.
Introduction
Hello! I’m Aitziber Azkue, a 3D Character Artist originally from Caracas, Venezuela. I’ve written a couple of articles about my work before, so if you’re curious, check out Fish and Vammatar. The long and short of it is that since leaving my Technical Writer job in October 2023, I’ve fully dedicated myself to learning 3D art. This piece is the result of my final 4-month mentorship project with Alena Dubrovina: a full character plus a modular armor set, a Drip Fit for a King.
Preparation & Reference Hunting
I had the idea of creating a modular armor set at the end of my Intermediate term, so I had some time to let the thought simmer. I practiced armor modelling during my Advanced term when creating Camilla, and made sure to gather as many real-life armor references as possible in preparation for what was to come.
The most important thing when modelling realistic armor is to find high-resolution real-life armor references. This can be tricky, since it’s easier to find either full armor sets where it’s hard to see the details, or individual pieces without context. What you’re looking for is something in the middle: close-up pictures of full armor sets and their individual corresponding pieces – pictures that give you information about both how the piece is worn and how it is constructed.
Even though looks are important, I like to prioritize functionality. While I try to stay as close to the concept as possible, I think it’s more important to have armor that makes sense than to have armor that looks pretty. This also means that, on top of modelling the assets, I also had to do a fair bit of concepting to ensure everything felt right under the tabard. Fortunately, I did a volumetric ton of that for Camilla, so there were some hard lessons already etched in my brain. Whenever I was out of ideas or needed some inspiration, I watched videos of people putting on real armor and even ended up falling down the Buhurt rabbit hole; Buhurt is a sport where people fight each other in full suits of armor (it’s the little things!).
Custom MetaHuman Face Sculpting
With armor references ready, I then decided I had to make at least two different heads to showcase modularity (I ended up making about six heads, but only used three). At the time, the latest version of Unreal Engine was 5.5, so we didn’t have the fancy MetaHuman Creator. I didn’t want to spend money on fancy plugins, so, after a ton of trial and error, I found a workflow that allowed me to preserve my sculpts entirely and still use the MetaHuman Live Link to animate the faces.
While I can’t say I reached a single tried and true strategy that will work with any character, I did find that there were two crucial aspects to clean custom MetaHuman creation: edge flow and non-destructive modelling. For reference, this is the MetaHuman base mesh I used from start to finish. The only prominent features I wanted were hooded eyelids and a longer nose, so I could save some time. I reused this mesh for all my characters, creating a new MetaHuman with my latest sculpt for each iteration:
MetaHuman Edge Flow
When it comes to edge flow, the most important thing is that you preserve the edge loops that determine the location of the back of the eyes, eyelid borders, and mouth contact points. I made custom MetaHumans, opened them in Maya, then sent them straight into ZBrush where I immediately created my protective Polygroups. Now, this doesn’t mean that I didn’t touch those areas, because I had a fair bit of sculpting to do before those MetaHuman base meshes looked more like a realistic person. Whenever I sculpted in those areas, I was extra careful, making sure that the direction and integrity of the loops weren’t disrupted and/or made sense for the anatomy of my meshes. The Polygroups were there for quick masking, which I used together with Gizmo masking whenever it was necessary. Here’s one of the squire meshes at an early sculpting stage:
There are good reasons why you want to preserve those loops, but they only matter if you intend to use the MetaHuman facial animation features: if you screw up the eye edge flow, your eyes won’t fit where they should and may even clip through the eyelids; if you screw up the mouth edge flow, your character will have that uncanny MetaHuman smile that makes MetaHumans look artificial and, well, like MetaHumans. To clarify, I mean the situation where the bottom lip seems to hug an invisible barrier around the teeth when the character smiles, leaving an unnaturally large gap between the lips and teeth, while the depressor anguli oris muscles, or DAO (aka the marionette lines), look stretched and flat, tugging at the corners instead of relaxing, which makes the smile look strange. A “Duchenne smile”, which is a natural smile, doesn’t involve the DAO muscles. Sometimes, bad mouth topology in MetaHumans can also cause the lips to turn outwards during animation and even create a strange, rectangular bulging volume between the bottom lip and the chin, which we want to avoid. I can’t say I reached a perfect result, but I think I got rather close by preserving the edge flow.
Here’s a video I recorded during the project while testing one of the heads, showing some of the issues I’ve mentioned so far:
Non-Destructive Modelling
Now, as to non-destructive modelling, there’s the obvious part that corresponds with what I just mentioned (do not change the topology of the mesh whatsoever!), but then there’s also the vertex order issue. While MetaHumans can be a great base mesh after a few tweaks for realistic anatomy, they have the underlying problem that their lowest subdivision is far too dense for tweaking primary shapes. The fix is to reconstruct a lower subdivision in ZBrush, which effectively changes the vertex order. When your vertex order goes kaput, UE 5.5 doesn’t recognize your mesh as a MetaHuman. The solution is simple but annoying: when I was finished with a particular stage in the face sculpting, I exported the second subdivision (previously the lowest) to Maya and transferred the vertex order from the original lowest to the (now) second subdivision.
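The Maya step at the end is essentially a vertex-order transfer between two meshes that share identical positions. Purely to illustrate the idea (this is a hypothetical sketch, not my actual tooling; a tool like Maya’s Transfer Vertex Order does the real work), matching vertices by quantized position looks like this:

```python
def transfer_vertex_order(reference, target, tol=1e-4):
    """Reorder `target` vertices to match `reference` order.

    Both inputs are lists of (x, y, z) tuples describing the same shape;
    vertices are matched by position, quantized to `tol` to absorb
    floating-point noise from export/import round trips.
    """
    key = lambda v: tuple(round(c / tol) for c in v)
    lookup = {key(v): i for i, v in enumerate(target)}
    order = [lookup[key(v)] for v in reference]  # target index per reference slot
    return [target[i] for i in order], order
```

Thinking of it this way made it click for me why a rebuilt lowest subdivision stops being recognized: same positions, different order.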
This mesh, my second subdivision and lowest for MetaHuman, is the one I used in Unreal Engine to create my custom MetaHuman. Every time I wanted to work in UE with a new head, I imported the mesh with the fixed vertex order to UE and created a new MetaHuman Identity. I then used the option to Configure Components from Conformed, which allows you to select a mesh in your project that already has a MetaHuman DNA, and it applies all the MetaHuman bells and whistles (animation-ready eyes and groom, plus facial rig).
The extra step for vertex order was a bit tiresome but necessary, and I can confidently say that, together with preserving proper topology, the whole workflow ensured I preserved my sculpts entirely and allowed me to animate all my characters’ faces with both Live Link and the Control Rig. I’ve found that recording yourself in Unreal and then using it as an animation sequence is far more precise than the direct Live Link connection to the MetaHuman, probably because Unreal is rendering your textures and lighting at the same time as it’s retargeting the animation. I made a recording for the cinematic in my ArtStation post. Here’s a video of me playing around with Live Link directly in the viewport purely for your entertainment.
Anatomy
With the MetaHuman woes out of the way, I focused on sculpting my characters in the most realistic way I could within my time frame. It was a very complex project, and I made tons of head iterations because I couldn’t decide whether I wanted to make a historically accurate Henry V (he died at 35) or stick to the concept. My more notable sculpts included James Cromwell, Timothy Dalton, and Tom Hiddleston. None of these guys made the cut, except for a sneaky Tom in one of my final renders.
In the end, I decided to imagine my own old man, and I mixed and matched features from different people. As for the squires, I did loose likenesses of Ben Whishaw and Timothée Chalamet, since I was watching The Hollow Crown and The King for modelling references and thought it’d be cool to sneak them in. Here’s my reference board for the heads, which I also used while texturing their skin.
Because I wanted to use pore detail and base textures from 3D Scan Store meshes, when I was done sculpting, I used ZWrap to wrap my heads with those meshes. I then projected history back and forth, first to my 3D Scan Store mesh so that it would have all the secondaries I sculpted, and then from the highest subdivision of the 3D Scan Store onto my MetaHuman mesh, so I could get that pore detail. I saved a backup of this mesh, and as I continued to sculpt on the heads, I reimported a middle subdivision to ensure I wasn’t deleting any pore details when smoothing.
After I was confident that I was going to keep the tertiaries, I kept sculpting on the highest subdivision, adding more wrinkles and details without symmetry. I did this with all the heads to some degree, but I put in much more work on Henry since he’s my main character. The best brushes for sculpting faces are the Move Topological, Clean Buildup, and Gio brushes. You can get the last two, and many more, here. I can’t recommend Clean Buildup enough; I’ve replaced Clay Buildup with it entirely.
Modelling
Even though I started this article with the heads, I worked on every aspect of the project simultaneously. I only had four months to make this happen, so I had to move fast.
Modelling Armor
I’ve never been a big fan of DynaMesh, and I avoid it like the plague when working with armor. I like to make planes and shape them accordingly, using Dynamic Subdivision/Thickness and creasing the edges I need to preserve shapes. Polygroups are very useful for complex shapes when working like this, and working with ZModeler to have proper subdivision topology is crucial in order to save time later during retopo.
I stayed in this stage for a while, making sure all my secondaries were perfect before committing to anything. A big part of making a modular set is making sure that every piece fits with the next one, layering them correctly. I wanted to be able to show every asset individually without hiding anything, so the references came in very handy for the more hidden armor components.
Very early on, I identified and singled out everything that would be repeated across the armor. I made a high poly of those meshes and immediately retopologized them so I could bake them and place them with an IMM brush afterwards. This strategy proved super useful, especially with the buckles, straps, and crown (which I made using Array Mesh), and kept my polycount under control. I like to go back and forth between different stages of the pipeline, since I find it easier to identify what’s working and what isn’t, and act as early as possible.
Bringing everything to high poly was nothing short of scary, since it meant leaving the safety of Dynamic Subdiv.
I first brought the spaulder and its plates to high poly to test the waters, and started experimenting with armor trims. This small detail bothered me for a while, because I couldn’t get the trims to feel right, so I iterated a lot and looked at a myriad of references until I landed on a method and was happy with the result. Some of them I sculpted directly on the piece using this method, and others I made with a simple Curve Tube brush. The Curve Tube ones I UV’d afterwards, making sure that they were aligned horizontally, and used a simple line pattern to mask and inflate/deflate some areas. I ended up making a lot of different trims, so I played around with them and tried different combinations, making sure that the composition wasn’t too loud and that they matched the armor’s original design language and its historical period.
When adding wear and tear to armor, it’s important to consider that different armor pieces will sustain different types of injuries, so to speak. Some pieces are more prone to rust, others to scratches, some might get bumps, some might deform, and some pieces might be mostly pristine if they’re covered or if they’re well taken care of. Since this armor set primarily had to fit a king, I made sure it reflected the kind of attention and care it would receive. I added some scratches and bumps and polished some of them off, focusing on damage induced by longevity and normal wearing conditions. Here, my best friends were the Orb brushes and a lot of different metal alphas I’ve collected over the last few months.
Sculpting Cloth
For this project, even though I tried working with Marvelous Designer for early versions of the tabard, I relied completely on ZBrush modelling and sculpting. Simulating cloth wasn’t my strong suit, and my mentor suggested I’d have better results if I sculpted everything from scratch instead, especially since the tabard had a peculiar form. That was, of course, intimidating. Sculpting cloth is not for the feeble, but I was ready to give it my best shot and wasn’t afraid to start from scratch whenever my folds weren’t hitting the mark.
While the squires’ tabard, shirts, shoes, pants, and gloves gave me some work, the king’s tabard was by far the most complex cloth asset I had to make. It had so many details, and it could easily look silly. I armed myself with stubbornness and kept refining it until it felt right. I relied almost entirely on the Standard and Cloth Nudge brushes, and a large but subtle noise towards the end.
The embroidery was a different beast altogether, but I think what helped me get it done fast was to stop and plan before diving into Substance 3D Sampler.
I wanted the highest quality possible, so I knew I couldn’t use the tabard UVs to project it. Instead, I made a single plane with UVs and distributed it across the tabard, using Project History to get the planes to stick to it. I drew the alphas in Photoshop and took them to Substance 3D Sampler, where I created Height Maps. I later used those Height Maps in ZBrush in two ways: to displace my planes into embroidery, and to crop the shapes exactly to fit where the threads were. There are a lot of tutorials online on how to do this, but here are my two cents:
- To displace: Either use the Height texture as a Displacement Map, or use Mask by Noise and inflate/deflate with the Gizmo (Ctrl + click + drag on uniform scale) or the Deformation palette options. I went with the masking route, but you can get good results with displacement as well.
- To crop the shapes: Subdivide the mesh until you have the necessary resolution, use the alpha as a Noise and Mask by noise, then use Hide Part, then Delete Hidden. I did this at the very last stage, when I was sure I didn’t need to move or adjust anything.
This part was rather straightforward; with enough subdivisions, you get the amount of detail that you need, and it’s smooth sailing from there. It helped a lot that all the planes had the same UVs, so I only had to do everything once.
The show wasn’t over after cropping and displacing, though. I kept using the Height Maps to mask and inflate some areas, and played around with different types of Noise to weather the threads and make them feel more believable. In the end, I also did a manual sculpting pass where I added more noise, adjusted their position, made sure that they were hugging the fabric underneath, and deleted all the geometry that went outside the borders. This part can be tedious but has a very high payoff, since it makes the embroidery feel more organic.
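For the curious, the displacement half of this workflow can be sketched in a few lines of NumPy. This is only a schematic stand-in for what ZBrush does with a Displacement Map or Mask by Noise; the function and its parameter names are hypothetical:

```python
import numpy as np

def displace_by_height(verts, uvs, normals, height_map, strength=0.1):
    """Displace vertices along their normals by sampling a height map at UVs.

    verts, normals: (N, 3) float arrays; uvs: (N, 2) in [0, 1];
    height_map: (H, W) float array in [0, 1] (e.g. the embroidery height map).
    """
    h, w = height_map.shape
    px = np.clip((uvs[:, 0] * (w - 1)).astype(int), 0, w - 1)
    py = np.clip(((1 - uvs[:, 1]) * (h - 1)).astype(int), 0, h - 1)  # flip V axis
    heights = height_map[py, px]           # one height sample per vertex
    return verts + normals * heights[:, None] * strength
```

The cropping step is the same sampling with a threshold instead of an offset: keep the faces whose sampled height is above some cutoff, delete the rest.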
Topology, UVs, & Baking
Because I did everything in ZBrush using ZModeler, this part was fairly simple. I was, of course, overwhelmed anyway because of the sheer amount of stuff I had to retopologize. There isn’t much to it other than deleting edges and triangulating whatever I could, but there were a couple of things that helped a lot.
The first one is to use the lowest subdivision of the trims as a starting point for their corresponding piece, which helps ensure that your trims are clean and the rest of the topology flows properly. The second one is to take your low poly to ZBrush and use the Move Topological brush and Project History to make sure it’s hugging the high poly right (inflating with the Gizmo is super useful here too). The rest is just tweaking, test baking, and good old Quad Draw fun in Maya.
One big piece of advice I’ll give is to test bake as early as you can. I identified my texture sets early on and exported all my meshes properly named in ZBrush as objs. With everything named properly in Maya, I tested baking using Match by Name in Substance 3D Painter and went back and forth to make sure everything was baking properly. For this purpose, I had a separate baking project in Painter where I could make sure I wasn’t missing anything and check that the resolutions I picked made sense.
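Since Match by Name baking lives or dies by the naming, a tiny script that audits your low/high pairs before export can save a wasted bake cycle. This is a hypothetical helper, and the `_low`/`_high` suffixes are just whatever convention your Painter project expects:

```python
def check_bake_pairs(names, low="_low", high="_high"):
    """Return (bases missing a high poly, bases missing a low poly).

    `names` is a flat list of mesh names as they'll be exported, e.g.
    from ZBrush SubTool names; suffix matching mirrors Match by Name.
    """
    lows = {n[: -len(low)] for n in names if n.endswith(low)}
    highs = {n[: -len(high)] for n in names if n.endswith(high)}
    return sorted(lows - highs), sorted(highs - lows)
```

Running this on the export list before leaving ZBrush catches typos that would otherwise only show up as empty bakes in Painter.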
I also went back to ZBrush and used ZWrap to transfer textures to each of the heads. I can’t recommend ZWrap enough, since it opens up a lot of possibilities when you’re working with different proprietary geometries. I also opened the mouths of all my characters in Maya with the Control Rig and reimported the corresponding subdivision before baking to avoid the ugly dark lines in the lips (which involved that extra vertex order step I mentioned earlier, a little back and forth, and a lot of patience).
When I was finished with the retopo and test baking, I organized my UVs, making sure I utilized as much as possible from each square since I wanted very high-fidelity renders. It was tricky to organize the UV shells properly while taking into consideration that I had some assets that repeated across texture sets (like the buckles and hinges, and straps), but I had to bite the bullet and make this truly modular. I’m not good at Tetris. I whined a lot about this to my friends during the process. That probably helped my sanity; theirs, not so much. Here are some of the more painful texture sets.
Texturing
After baking everything, I gave every texture set a pass where I cleaned up any baking artifacts and tweaked my Ambient Occlusion bakes. This is how goofy it all looked in Painter.
I then made six separate Painter projects for groups of texture sets (Armor, Squire stuff + swords + belts, Crown + necklace + tabard, 3x heads) where I imported my bakes. I was never a big fan of splitting my projects, but my computer can only handle so much, and the graphics drivers were tanking my performance tremendously at the time.
In all my Painter projects, I enabled/added the Specular level, Ambient Occlusion, Sheen and Sheen color channels. I also added three custom channels, which I used to change different properties of my textures directly in the UE shader (Cavity and 2 different switching masks). With these maps, I was able to mask and modulate Base color, 2 different types of cloth fuzz, Roughness, Specular, and Micronormal, when incorporating modularity. All my textures were packed as ORM (RGB+A) to prevent having too many texture files and making too many calls to the shader in UE.
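Channel packing itself is simple to picture: each grayscale map becomes one channel of a single texture, so the shader makes one sampler call instead of three or four. Here's a minimal NumPy sketch, assuming the common Occlusion/Roughness/Metallic channel assignment (your own packing, like my custom masks in the alpha, may differ):

```python
import numpy as np

def pack_orm(occlusion, roughness, metallic, alpha=None):
    """Pack grayscale maps into one RGB(A) texture: R=AO, G=Roughness, B=Metallic.

    Each input is an (H, W) float array in [0, 1]; `alpha` is an optional
    fourth mask (e.g. a cavity or switching mask). Returns an 8-bit array
    ready to be written out as a single texture file.
    """
    channels = [occlusion, roughness, metallic]
    if alpha is not None:
        channels.append(alpha)
    packed = np.stack(channels, axis=-1)
    return (packed * 255).astype(np.uint8)  # one file, one sampler call in-engine
```

In the UE material, you'd then split the channels back out with component masks and feed each into the matching input.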
Texturing the armor was both straightforward and complicated. I relied largely on my references, custom masks, and anchor points. The key point here is to add a lot of color and roughness variation: while roughness always gives us extra detail for free, color variation ensures that our metals look good in almost, if not all, light conditions. Grunges are good and all, but some stuff you have got to paint (Dirt brushes OP).
When you’re working with separate texture sets, it’s important to ensure your textures are cohesive, so I instanced a lot of my layers. One thing I’m super happy with is the custom material I made for burn marks. I originally made it for my last project, Camilla, and for this one, I tweaked the colors and roughness a bit. In principle, what the material does is create the brown to blue transition you often see in burnt metal. Camilla was more of a fantasy project, so her burn marks were more exaggerated, but for this one, I went with something subtler. The material uses a master burn mask, which is anchored and referenced in other burn layers to give that soft transition. I also made a layer for hand smudges, which isn’t all that complicated, but I thought it was a lot of fun.
The tabard was the first thing I textured, and I was surprised that it was also the quickest. It was a lot of back and forth since I wanted the embroidery to stand out at a distance without it being too bright. I played around with metalness, the base color, and how much sheen it had. In the end, removing the sheen altogether from the embroidery proved to give the best outcome.
I made sure to add a lot of small wear details to the tabard, and even though you can only really see them in close-ups, they make the fabric more believable and improve the overall feel. I’ll say, however, that the cloth shader can truly make or break a good cloth texture, so it’s important to go back and forth between the rendering engine and texturing. My tabard looks very different in Painter than it does in Unreal, but that wasn’t a big problem since I was testing my textures under a lot of different light conditions in Unreal. I used to worry a lot about having the same conditions in Painter and my rendering engine, but in a real production scenario, you’ll likely have either a LUT or color profile, or simply have to test your textures in the engine continuously. I focused instead on making sure everything looked good in both places; even if Painter gave me lower shader and lighting quality than Unreal, if it looked good in Painter, then it was bound to look great anywhere else.
When it comes to texturing faces, I like to keep a basic starting point and build from there. Since the ZWrap texture transfer isn’t perfect, I start by adjusting the base texture using the smudge brush, and then I make the typical color zones on top. Henry’s face, being an old man’s face, gave me a lot more work than the rest. I stared for a long time at old men’s faces and thought about my dad, who’s also an old man, and focused on capturing the feel of the skin without exaggerating any features. I wanted him to look his age, not a minute older, so I relied mostly on redness, sun spots, and specific discoloration areas to emphasize the storytelling. I like to have a slightly lower roughness in Painter, since my shader in Unreal has an additional Fresnel roughness slider that works together with my Cavity Map.
For the squires, I wanted them to still have some character so they wouldn’t fade into the background. I incorporated storytelling through flushing/redness, scratches, and even some fruit juice for some renders where the squire had a pear in his hand. The key here was to collect a lot of references and test different shades and intensities of red for the flushing. It was a delicate balance, but after some trial and error, I was quite happy with the results.
Tweaking the shaders in Unreal, tweaking my textures, testing in different light conditions – it’s all part of the fun. I missed ZBrush dearly, but I can’t say I wasn’t having a lot of fun exploring the possibilities and making combinations of all the assets. Some of my MetaHuman rigs were all scuffed at this point, but that didn’t stop me from fooling around with some poses and testing my textures in the meantime.
Rendering
I rendered my scenes using a collection of assets from different packs and environments. The final rigs were made in CC4, and I animated the faces using my own face with Live Link and even tweaked some poses with both the face and body Control Rigs. Going through each scene would take much longer than I can keep your attention in this article, but I’ll name the most important settings:
- Enable Lumen in Project Settings and Post Process Volume
- Ray Lighting Mode set to Hit Lighting (UE 5.6)
- Max Roughness to 0.8
- Lumen Scene Lighting Quality, Detail, and Final Gather quality to 9 (I don’t fear God)
- Played around with the Exposure Compensation and Shadow offset values to ensure my scenes and shadows weren’t too dark
- All lights set to Movable, enable Transmission and Deep Shadow. Set Samples per pixel to whatever your computer can handle. Mine were set between 4 and 16, depending on their impact on the scene
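For reference, the project-level Lumen toggles from the list above roughly correspond to these DefaultEngine.ini settings (a sketch of the common console variables; exact names can shift between engine versions, and the per-light and Post Process Volume tweaks are still done in the editor):

```ini
[/Script/Engine.RendererSettings]
; Lumen for dynamic GI and reflections
r.DynamicGlobalIlluminationMethod=1
r.ReflectionMethod=1
; Hardware ray tracing, required for Hit Lighting reflections
r.RayTracing=True
r.Lumen.HardwareRayTracing=True
r.SkinCache.CompileShaders=True
```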
The only post-processing I did was adding watermarks, combining images together, and some small focus tweaks. I did all of this in Photopea because my Photoshop license had expired. If you don’t have a Photoshop license, I can’t recommend Photopea enough.
For anyone out there looking to build an environment or a scene for character renders, I can recommend going to the Fab and UE stores and watching the demo videos of any environments that match the vibe you’re after. Some environments even have a playable free demo. I like to take screenshots of these environments and demos, and download any free environments and do the same thing for inspiration. Because these environments have a lot of meshes and specific Post Process Volume settings, I prefer to avoid rendering directly in there (learned that the hard way while working on Camilla). I usually identify the meshes I like and create a new level where I add them or migrate the meshes to my rendering project. I also like to keep a PureRef with any technical details, inspiration, and test renders. I use ShotDeck, Shotcafe, and FILMGRAB when I’m hunting for shot and lighting references.
Looking Back
I feel so grateful to be on the other side of this project today. There were many times when I wasn’t sure whether I was going to make it in four months. I shared my progress daily, sometimes multiple times a day, and I kept all the encouraging words of other students close to my heart. Whenever I felt the insecurity and fear creeping in, I returned to my work, thinking that time spent dwelling was time better spent working. It was a super fun project, I learned more than I even imagined I could, but the most important thing I’m taking with me from this experience is how much fun 3D can be.
If I were to give any advice to artists who are just getting started, I’d first quote my Advanced instructor, Luis Omar: “Just keep going”. It sounds silly, but when you’re passionate about something and want to see it through, that’s the only thing that makes sense. There’s no problem without a solution, and having the right mindset will make things much easier. Another one is to find your kind of people and stick with them. I prefer to pair my Tetris struggles with a good laugh; so, rely on your community, be kind to others, and don’t isolate yourself. We’re all humans sitting alone in front of a computer at the end of the day. The last one, and it’s a big one, so I saved it for last, is to stop comparing yourself to others as early as you can. When you see a great artist’s new post, try to approach it with curiosity: wonder how they did it, look at their work and think about their process, and try to learn as much as you possibly can from them. I have a Goals folder in ArtStation where I save everything I love, and I go back to it quite often to get inspired and motivated.
My most sincere gratitude goes to Alena Dubrovina, who saw my potential and inspired me greatly while leading me with faith, determination and kindness; to Luis Omar, who taught me so much about perseverance and almost everything I know about concepting; and to Alfonso Zambrano, Sriram Venkatesh, Cyrus Kian and Pontus Bengtsson, who made sure I was fed, held my heart, kept me sane and put up with my endless whining about UV packing. It takes a village to build an army! If you read this far and would like to check out what I’m working on lately, I post WIPs quite frequently on my Instagram account and keep my ArtStation up to date with my latest finished work.