I am a generalist developer who began to look into 3D and game development topics about two years ago. I initially focused on learning Unreal Engine 4 and photogrammetry techniques, and more recently Substance Designer. At that time, I was also looking at modeling characters and creatures, which is how I got interested in hair and fur generation.
My first introduction to real-time hair was Johan Lithval’s CGMA webinar Creating hair for games. I was blown away by the quality of the results and the techniques used, and by finally understanding how the different parts fit together. However, not being a Maya user, I had to find another way to generate the hair textures. I looked at other tools but was not very successful in producing something easily and in a reasonable amount of time.
From what I could see, the most common way to make hair/fur for games is to use hair cards: low-poly geometry onto which hair clump textures are mapped. These are then handled by a specific shader that can drive various effects such as depth, transparency, anisotropic reflection, etc. The types of texture maps required, and the way shaders use them, are not common to all workflows; they depend on how the shader works and on performance requirements. I think the fact that different workflows require different maps (sometimes only the map name changes, not its function) creates some confusion among artists; at least it did for me at first. Hair & Fur can deliver 10 types of maps (including derived ones, not counting all the possible generation modes), which generally covers most usages.
In parallel to the hair topics, I was looking at Substance Designer and was especially interested in its programmatic capabilities (functions). What really linked hair texturing to Substance Designer was Vincent Gault’s enlightening FX-Map introduction stream (in French). I understood then that most Substance Designer nodes are built upon a reduced set of atomic nodes, and that the FX-Map acts as an advanced drawing device able to place patterns anywhere in a texture, with various options, most of which can be driven by functions. Add to this the ability to iterate, and the FX-Map becomes a freely programmable, self-contained texture generator. I started by implementing a Bézier curve, and gradually it grew into a tool that could generate hair/fur textures.
I want to thank Nicolas Wirrmann from Allegorithmic who provided invaluable help regarding the understanding of Substance Designer programming specificities, advice, and optimization tips. Hair & Fur wouldn’t have been the same without his support.
Hair & Fur lets users shape hair clumps of various types and styles to be used as hair cards, and generates the PBR texture maps plus others typically needed by hair shaders. It also provides tiling tools to assemble clumps into larger textures, so rendering engines have fewer textures to handle. The tool may also be used for 2D work (samples are provided). It features two optional colorization modes; the Depth map, for instance, may also be colored outside of Hair & Fur, either in Substance Designer or in other applications. Samples (examples), Templates (start-up files) and Presets (base designs) are also provided so people can quickly get started with various hair designs.
Even though it runs in Substance Designer (SD), only a basic knowledge of the latter is required, as most items are self-contained in the tool’s Substances. Substance Academy has great introduction videos on the UI basics of Substance Designer, and Hair & Fur also has tutorial videos, the first part of which contains an overview of the SD UI.
A challenge of hair texture generation is that there are a lot of hairstyles (including coloring), yet the feature set and the user interface need to be limited to a few items only. So every time I look at supporting a given hairstyle, I need to abstract it from a specific example into something more general that works for this case but also for others. The approach I came up with is to view things at several levels and provide shaping functionalities for each. The top level, called Parent, represents either the clump shape or a part of the clump. Up to ten Parent Strands can be generated this way, each usually representing some large-scale specificity within the clump. They can be individually positioned, rotated and shown/hidden along with their associated strands (children and subdivisions, as described next). Child Strands are spawned per parent; their count is not limited, and they are used to “fill up” the clump section associated with their Parent Strand. Then come Subdivision Strands, which are spawned per parent or child strand; they provide thickness, as strands are sometimes aggregated together along a common shape, as in curly hair. Finally, Related Strands share an organizational relationship; they are used to create braids but also cordage.
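The strand levels above form a simple tree. As an illustration only (Hair & Fur is built from SD nodes, not scripting), the hierarchy could be sketched like this in Python, with every class and field name being hypothetical:

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class Strand:
    control_points: List[Tuple[float, float]]  # up to 10 (x, y) points shaping the strand
    visible: bool = True

@dataclass
class ParentStrand(Strand):
    children: List[Strand] = field(default_factory=list)         # unlimited, fill the clump section
    subdivisions: List[Strand] = field(default_factory=list)     # give strands thickness
    related: List["ParentStrand"] = field(default_factory=list)  # braid/cordage partners

@dataclass
class Clump:
    parents: List[ParentStrand] = field(default_factory=list)  # at most 10 Parent Strands

    def add_parent(self, parent: ParentStrand) -> None:
        if len(self.parents) >= 10:
            raise ValueError("a clump supports up to 10 Parent Strands")
        self.parents.append(parent)
```

Showing/hiding a Parent Strand would then naturally carry its children and subdivisions with it, matching the behavior described above.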
Strands can be shaped and organized using up to 10 control points. Their position is defined either by various parameters or directly by moving them in Substance Designer’s 2D view.
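A strand centreline driven by control points can be evaluated as a Bézier curve. As a minimal sketch (not the tool’s actual implementation, which lives inside an FX-Map), de Casteljau’s algorithm handles any number of control points, including the 10 mentioned above:

```python
def bezier_point(control_points, t):
    """Evaluate a Bézier curve at parameter t in [0, 1] using de
    Casteljau's algorithm: repeatedly lerp adjacent points until
    one point remains."""
    pts = [(float(x), float(y)) for x, y in control_points]
    while len(pts) > 1:
        pts = [((1 - t) * x0 + t * x1, (1 - t) * y0 + t * y1)
               for (x0, y0), (x1, y1) in zip(pts, pts[1:])]
    return pts[0]

def sample_strand(control_points, samples=64):
    """Sample the strand centreline as a polyline along which
    patterns could then be placed."""
    return [bezier_point(control_points, i / (samples - 1))
            for i in range(samples)]
```

The curve passes through the first and last control points and is pulled toward the intermediate ones, which is what makes direct manipulation in a 2D view intuitive.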
Child hair strands can be combed, either along the whole clump length or partially.
Strands are made of patterns that are either bitmaps or procedural textures generated by Substance Designer. Four default patterns are available, and users can also provide their own, grayscale or color. Pattern size and resolution can be chosen; patterns can be oriented to follow the clump flow or not, and can be dynamically sized, which reduces the count needed to produce a continuous line. Dynamic size can also be used to create a fluff effect, or even straws when pushed to an extreme.
Strand thickness can be controlled at root and tip along a defined length to which randomness factors can be added. This is mostly useful for thick strands which may be used for stylized hair.
A fade effect at root and tip can be applied to strands in several maps; it can be either uniform or more or less random, based on user settings.
Strand length can be controlled as well as root grouping with a proximity factor.
Subdivision strands are spawned per parent or child strand (then called the referral strand), usually very close to their referral strand to give it thickness. Users define a spreading space into which they can drift; their dispersion can be controlled, and their depth may be more or less tied to that of their referral strand. Subdivision depth can also be attenuated on the sides of the spreading space (Side Depth); braids use this feature for roundness. Some subdivisions may be stray and move away from their spreading space to create flyaway hair. Subdivision strands may be randomly distributed among their referrals to create diversity.
Modulation is used to create wavy/curly hair, and braids/cordage when associated with Related Strands. There are two modulation functions, sine (with variations) and triangle; the latter may be used for non-hair designs. Users define the frequency/amplitude of the waves. The amplitude can be more or less randomized, as can the frequency through the use of frequency modulation. These settings help with realism, so that not every curl is exactly the same.
Depth can also be affected by modulation, creating a 3D effect the placement of which is configurable and can be combined with the actual strand depth. Modulation amplitude can be faded at root and tip, so it is possible to have the root part less wavy than the tip, for instance. Modulation follows the strand flow so it can be used for strands in any direction such as round shapes (curved hair or hair buns).
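Conceptually, the lateral offset applied at each point along a strand could look like the following sketch. The parameter names are hypothetical; the real tool exposes analogous controls as SD function graphs, and per-strand randomization of amplitude and frequency would be applied by the caller before evaluating:

```python
import math

def modulate(s, frequency=8.0, amplitude=0.05, waveform="sine",
             fm_depth=0.0, root_fade=0.2, tip_fade=0.2):
    """Lateral offset for a strand at normalized position s in [0, 1]."""
    # Frequency modulation: let the local frequency drift along the strand.
    freq = frequency * (1.0 + fm_depth * math.sin(math.pi * s))
    if waveform == "sine":
        wave = math.sin(2.0 * math.pi * freq * s)
    else:  # "triangle"
        wave = 2.0 * abs(2.0 * ((freq * s) % 1.0) - 1.0) - 1.0
    # Fade the waviness near root and tip, so e.g. the root can be
    # less wavy than the tip.
    fade = min(1.0,
               s / root_fade if root_fade else 1.0,
               (1.0 - s) / tip_fade if tip_fade else 1.0)
    return amplitude * fade * wave
```

Because s is measured along the strand (not along a texture axis), the same offset logic works for strands flowing in any direction, which is what allows round shapes such as buns to stay wavy.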
Hair & Fur outputs multiple map types, some are useful for shaders, others are more suited to 2D works.
Hair depth is managed through an independent depth profile. It generates a texture where each column represents a distinct depth profile which can be associated with a hair strand.
Depth profiles are generated automatically with general generation parameters such as frequency/amplitude of the depth variations. The first 10 columns of the depth profile texture are reserved for parent strands. These can either use general/specific generation parameters or even a user-provided external depth profile (which may be generated by SD nodes or other apps).
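As a rough illustration of the column-per-profile idea (a sketch with assumed parameter names, not the tool’s actual generator), a depth profile texture could be built like this: each column gets its own random phase and frequency variation, and a strand then reads its assigned column from root to tip:

```python
import math, random

def depth_profile_texture(width=256, height=256, frequency=3.0,
                          amplitude=0.25, base=0.5, seed=0):
    """Return a 2D list of floats in [0, 1] where each column is an
    independent depth profile a strand can index along its length."""
    rng = random.Random(seed)
    # One random phase and frequency variation per column/profile.
    cols = [(rng.uniform(0.0, 2.0 * math.pi),
             frequency * rng.uniform(0.75, 1.25)) for _ in range(width)]
    tex = []
    for y in range(height):
        t = y / (height - 1)  # normalized position along the strand
        row = []
        for phase, freq in cols:
            v = base + amplitude * math.sin(2.0 * math.pi * freq * t + phase)
            row.append(min(1.0, max(0.0, v)))  # clamp to a valid depth
        tex.append(row)
    return tex
```

Reserving the first 10 columns for parent strands, as the tool does, would simply mean generating or overwriting those columns from the parents’ specific parameters (or from an external profile).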
Hair & Fur has two colorization modes, Length and Group, which can be used independently or combined. Both operate on a user-provided color source texture. Length colorization colors hair strands along their length, while Group colorization picks color groups from the color source and applies them to groups of strands. The two modes can be combined using multiple blend modes such as Multiply, Soft Light, Overlay, etc.
Both Group and Length colorizations can be applied to groups of hair: Parent, Child or Related Strands (braids). Using Group colorization, each parent and relatives (child, subdivisions) or each child strand and relatives (subdivisions) can use a distinct color group. For braids, each braid component may use a different color group.
Color source textures can have various organizations, below are some examples as well as the results they can produce.
Length colorization usually uses a color source organized by row, as each column then represents the potential color variations of a hair strand along its length. Colors do not have to be uniform within each row, though, and below are examples of what can be achieved by modulating either the luminosity or the color along the texture width.
Regarding Group colorization, the illustration below shows how color groups are made as a function of the color source organization. With no specific color organization, a group is defined by a circle whose center is picked randomly and whose radius is configurable. For color sources organized horizontally or vertically, the group is made using a random center and an extent in the direction where the colors vary. Once a group is defined in the color source, random colors are picked inside it and assigned to hair strands grouped by either Parent, Child or Related Strands.
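The “no specific organization” case can be sketched as follows: pick a random disc in the color source, then sample random pixels inside it for each strand of the group. This is an illustrative approximation with hypothetical names, not the tool’s actual sampling code:

```python
import random

def pick_color_group(source, radius, rng):
    """Define a color group as a disc with a random centre in the
    color source. `source` is a 2D list of RGB tuples. Returns a
    sampler that yields one random color from the group per call."""
    h, w = len(source), len(source[0])
    cx, cy = rng.randrange(w), rng.randrange(h)

    def sample():
        # Rejection-sample a point inside the disc, clamped to the texture.
        while True:
            dx = rng.uniform(-radius, radius)
            dy = rng.uniform(-radius, radius)
            if dx * dx + dy * dy <= radius * radius:
                x = min(w - 1, max(0, int(cx + dx)))
                y = min(h - 1, max(0, int(cy + dy)))
                return source[y][x]
    return sample

# Usage: one group per Parent Strand, one sampled color per child strand.
# rng = random.Random(7)
# group = pick_color_group(source, radius=8, rng=rng)
# child_colors = [group() for _ in child_strands]
```

The horizontally or vertically organized cases would replace the disc with a band along the axis where the colors vary.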
It is also possible to get coloring without using the colorization features, here is an example where the depth map has been colored using SD nodes.
Hair & Fur can generate multiple hairstyles: straight, wavy, curly, large curls, braids (and cordage) as well as round shapes suitable for hair card buns. It should be noted, however, that the tool currently only generates textures, so designs such as braids or large curls are generally suited to small features on hair cards. For larger ones, 3D positioning (such as interlacing for braids) is needed for more realism; the hair textures then do not need to be braids but straight hair, as the 3D model handles the braid design.
Braids can take on different aspects by adjusting the frequency and amplitude of the modulations, the spacing between components, their overall shape, how much components are compressed when passing under others, added flyaway hair, etc.
Large curls can also be configured in various ways regarding their thickness, depth aspect and how organized they are.
Within the context of hair cards, several hair clumps are generally assembled into larger textures in order to reduce the texture count for game engines. For this, tiling tools are provided:
These tiling tools can operate on any texture, not only hair textures; they may also be used to organize atlases, for instance.
Tiling tools enable the user to set up the aspect ratios of both the inputs (the clump textures to tile, generally square) and the output (here a rectangular 2:1 texture). Clump spacing/sizing/offsetting can be applied to all clumps at once, so a tiled texture is obtained very quickly. Clumps that require further adjustments can then be positioned/rotated/sized independently. Note that during this processing the input clumps are never upscaled: the various scaling adjustments they receive are combined and then applied to the original clumps in a single operation, resulting in downscaling only, which preserves texture quality.
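The downscale-only idea can be sketched in a few lines: instead of resampling the clump after each adjustment, all scale factors are multiplied together, capped at 1.0, and resolved against the original clump in one resample. A minimal sketch, assuming the names below:

```python
def final_scale(original_size, *scale_factors):
    """Combine every scale adjustment a clump has received and resolve
    them against the ORIGINAL clump resolution, so repeated edits never
    compound resampling loss and the result is never upscaled."""
    s = 1.0
    for f in scale_factors:
        s *= f
    s = min(s, 1.0)  # downscale only: never exceed the source resolution
    w, h = original_size
    return max(1, round(w * s)), max(1, round(h * s))
```

For example, scaling a clump down to 50% and then back up by 150% resolves to a single 75% downscale of the original, rather than an upscale of an already-degraded intermediate.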
Creating hair textures procedurally has several benefits:
Substance Designer is a procedural texture generator, so Hair & Fur is in a well-suited environment there. The benefits of operating in this context are mainly modularity, the non-destructive aspect, and the additional functionalities which, combined with those of the hair generator, lead to even more possibilities. Here is a non-exhaustive list of features provided by the interactions with SD:
The generated textures can be used in 3D applications and sometimes also in 2D ones with appropriate map blending, as mentioned above. To that end, the samples show various examples of what can be made in 2D.
The output textures can generally be used as-is in 3D applications such as game engines; they do not need further adjustments. If adjustments are wanted, though, they can be made inside SD by further processing the outputs, or in an external application. The package comes with a Toolbag template, and the documentation contains usage instructions for the Unreal Engine 4 hair shader and for Toolbag.
Finally, I tried to make it easy for users to get into the app by providing a detailed manual covering all parameters, various concepts, and tips on making specific hairstyles, a free try-out app, and a four-hour, five-part video tutorial:
I can’t wait to see how artists are going to use these tools! Feedback is greatly appreciated (for instance via the Polycount thread); it will help enhance existing features and shape additional ones alongside those currently planned. For further updates, you may check my Twitter and ArtStation accounts. Have fun making hair!