My name is Steven Skidmore and I am a Principal Artist at Gunfire Games in Austin, Texas. I got my start in video games after shifting focus from film VFX. I knew early on in my life that I wanted to work in Computer Graphics. My original goal was to try and work on game cinematics like those from Blizzard and Blur Studio.
I studied at the Savannah College of Art and Design (SCAD) in Savannah, GA, where I received a degree in Visual Effects. Toward the end of my junior year, I started using Unreal Engine to try to bridge the gap between the games and visual effects departments. After college, I interned briefly at Raven Software, where I worked on Singularity. About a year later, I landed a full-time position at Vigil Games, and I’ve been with the same team (Vigil Games, Crytek USA, Gunfire Games) for the last 10 years, working on titles such as Darksiders 2, Darksiders 3, Remnant: From the Ashes, Herobound (VR), Chronos (VR), Dead and Buried (VR), and From Other Suns (VR).
I’ve recently become very interested in procedural workflows through Substance Designer and Houdini. Having worked with Houdini in the past for film, it was a no-brainer to incorporate it into our pipeline here at Gunfire once we saw all of the work being done with it in real-time engines through the Houdini Engine bridge. As a smaller developer, we could see the huge potential of procedural content to help alleviate bottlenecks in our pipeline.
Rock Generator Results
About Rock Generator
Procedural rock generation in Houdini has become a bit of an obsession for me and our Principal Tech Artist, Cort Soest. It started out as an experiment in learning Houdini’s noise implementations and how to mix and combine noises to get various shapes. It then became a challenge to recreate some of the techniques I had often employed when sculpting rocks by hand in ZBrush for Darksiders 3 and some of our other titles.
The goal was to get to a point where an artist could input a loose base mesh and the Houdini Asset would handle all of the other steps of generating a high-poly mesh, creating a low-poly game asset, UV unwrapping, baking/applying textures in Substance using photogrammetry textures, and scattering points for instancing foliage and rocks.
The base meshes themselves can be created with any traditional box-modeling techniques; in the case of the pig head, it is just a default example mesh from Houdini used to test that the tool is working properly. The base mesh is there only to define the rough volume and scale of the output mesh. Ultimately the tool voxelizes the mesh and converts it back to geometry in order to add enough detail and ensure the mesh is airtight.
The first pass of the network handles the basic macro shaping of the rock forms. This is predominantly done through a mountainSOP (surface/geometry operator) that applies displacement to the mesh. Under the hood, the mountainSOP is nothing more than a digital asset containing noise functions built in VOPs (VEX Operators), Houdini’s node-based visual scripting context.
I ended up tweaking the default mountainSOP to add some more functionality. This consisted of masking out the noise on the vertical axis (Y, in Houdini’s case) and only applying it to the X and Z axes.
The noise I use for most rock shaping is Worley F2-F1. This noise combination is one of the best for blocky, cellular shapes. The Y frequency of the 3D Worley noise has been scaled up to give the appearance of vertically stretched cells. The ‘voxelmesh’ node in my main graph is a Houdini Digital Asset created by the SideFX Labs gamedev team; their tools are great for speeding up workflows. This particular asset is basically a VDB-to-polygon conversion tool: it voxelizes a mesh at a certain density and converts it back to polygons, very similar to DynaMesh in ZBrush.
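To give a feel for why Worley F2-F1 produces blocky, cellular shapes, here is a minimal Python sketch of the idea: jittered feature points on a grid, with the noise value being the distance to the second-nearest point minus the nearest. The grid hashing and constants are illustrative, not Houdini's actual implementation:

```python
import math
import random

def feature_point(ix, iy, seed=0):
    """Deterministic jittered feature point inside grid cell (ix, iy)."""
    rng = random.Random((ix * 73856093) ^ (iy * 19349663) ^ seed)
    return ix + rng.random(), iy + rng.random()

def worley_f2f1(x, y, y_freq=1.0, seed=0):
    """Worley F2-F1: distance to the second-nearest feature point minus the
    nearest. Near zero along cell borders, larger inside cells, which gives
    flat plateaus with sharp creases between them when used as displacement.
    A non-uniform y_freq stretches or squashes the cells vertically."""
    y *= y_freq
    ix, iy = math.floor(x), math.floor(y)
    dists = []
    for dx in (-1, 0, 1):
        for dy in (-1, 0, 1):
            fx, fy = feature_point(ix + dx, iy + dy, seed)
            dists.append(math.hypot(x - fx, y - fy))
    dists.sort()
    f1, f2 = dists[0], dists[1]
    return f2 - f1

print(worley_f2f1(0.5, 0.5), worley_f2f1(3.2, 1.7, y_freq=2.0))
```

F2-F1 is always non-negative, so using it directly as a displacement amount pushes cell interiors outward while leaving the borders recessed, which is where the blocky look comes from.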
The next part of the tool slices the mesh into different strata layers. This can be accomplished in a hundred different ways in Houdini (boolean, clip, VDB fracture, etc.). The particular method I chose was to copy planes to points and slice the mesh into parts based on those planes. There is a very good explanation of the technique in Gatis Kurzemnieks’ video Creating Procedural Game Assets with Houdini, Part 1; I’m using a method similar to the breakdown covered on his YouTube channel.
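The core idea of the slicing step, reduced to its simplest form, is classifying geometry by which pair of cutting planes it falls between. A minimal Python sketch, assuming horizontal planes stacked along Y (the real SOP network cuts actual geometry rather than just classifying points):

```python
import bisect

def assign_strata(points, plane_heights):
    """Assign each (x, y, z) point a layer index based on which pair of
    horizontal cutting planes its Y coordinate falls between. A stand-in
    for slicing the mesh with planes copied to points in Houdini."""
    planes = sorted(plane_heights)
    return [bisect.bisect_right(planes, y) for (_, y, _) in points]

# Three cutting planes produce four strata layers (indices 0-3).
pts = [(0, -1.5, 0), (0, -0.2, 0), (0, 0.4, 0), (0, 2.0, 0)]
print(assign_strata(pts, [-1.0, 0.0, 1.0]))  # -> [0, 1, 2, 3]
```

With a per-layer index like this, each stratum can then be iterated over independently, which is exactly what the ForLoops later in the network rely on.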
Gatis also talks about adding different thickness values to each layer. While I didn’t choose this method, it’s certainly a viable approach. The issue I had with my mesh was that when displacing the points along their normals, they eventually overlap one another and begin to interpenetrate. There are techniques that blur the normals by a large amount before displacing, but all of these methods result in soft, mushy edges, and I wanted to maintain sharp strata layers.
My solution was to create a local transform for each layer and randomly scale each one on X and Z. This consisted of generating an oriented bounding box for each layer and constructing a transform matrix from it. That gave me the local rotation of each strata layer so I could align my TransformSOP node's pivot and scale in local space. There is a really nice video on Houdini matrices and local transforms on VFXHive’s YouTube channel (https://www.youtube.com/watch?v=t9yoUkFzInA).
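The matrix math behind scaling in a layer's local space can be sketched in plain Python. This assumes the layer's oriented bounding box has already given us a pivot (centroid) and a rotation angle in the X-Z plane; the real network builds a full 3D transform matrix, but the move-rotate-scale-unrotate-unmove chain is the same idea:

```python
import math

def scale_in_local_frame(points, pivot, angle, sx, sz):
    """Scale 2D (X, Z) points around a pivot along a rotated local frame:
    translate the pivot to the origin, rotate into the layer's local axes,
    scale, then rotate and translate back. Mimics aligning a TransformSOP
    pivot to a layer's oriented bounding box before scaling."""
    ca, sa = math.cos(angle), math.sin(angle)
    out = []
    for x, z in points:
        # World -> local: remove pivot, rotate by -angle.
        lx = ca * (x - pivot[0]) + sa * (z - pivot[1])
        lz = -sa * (x - pivot[0]) + ca * (z - pivot[1])
        # Scale in local space.
        lx, lz = lx * sx, lz * sz
        # Local -> world: rotate by +angle, restore pivot.
        out.append((ca * lx - sa * lz + pivot[0],
                    sa * lx + ca * lz + pivot[1]))
    return out

square = [(1, 0), (-1, 0), (0, 1), (0, -1)]
# Each point is scaled 1.2x along local X and 0.8x along local Z.
print(scale_in_local_frame(square, (0, 0), 0.0, 1.2, 0.8))
```

Because the scale happens along the layer's own axes rather than world axes, tilted strata layers grow and shrink along their length instead of shearing.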
Using expressions is where Houdini really starts to shine. In the X and Z scale values of a Transform SOP I am able to supply a random value.
Since all of these layers are iterated over in a ForLoop, each strata layer has a unique iteration number assigned to it. I can fetch that global iteration value with a detail() function and use it in the expression as a unique seed to drive the rand() function. That returns a random float between 0 and 1, which I then remap with a fit01() function into a new range of 0.8–1.2, meaning a layer's scale will never go below 0.8 or above 1.2. The seed channel provides control for changing the seed and recooking the scale with a slider.
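The expression logic above can be mirrored in a few lines of Python. Here Python's seeded random stands in for Houdini's rand(), and the iteration-plus-seed sum is the same trick: each layer gets a stable, unique value that only changes when the user's seed slider changes:

```python
import random

def fit01(v, new_min, new_max):
    """Remap a 0-1 value into a new range, like Houdini's fit01()."""
    return new_min + v * (new_max - new_min)

def layer_scale(iteration, seed=0):
    """Per-layer random scale: the ForLoop iteration number plus a user
    seed drives a deterministic rand(), remapped into 0.8-1.2. Python's
    random.Random stands in for Houdini's rand() here."""
    v = random.Random(iteration + seed).random()
    return fit01(v, 0.8, 1.2)

scales = [layer_scale(i, seed=42) for i in range(5)]
print(scales)  # every value stays within [0.8, 1.2]
```

The key property is determinism: recooking the network with the same seed reproduces the same scales, while nudging the seed slider reshuffles every layer at once.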
These pieces are then passed into another ForLoop to add the final noise. This step uses very similar methods to my first macro pass, but at a higher detail level. AttributeVOPs and mountainSOPs were used along with mix nodes to blend noises of different frequencies together to break up the procedural noise shapes, much like breaking up patterning with warping in Substance Designer. After each noise, I used an AttributeBlurSOP to blur the normals on the points so that adding and subtracting more noise would not cause extreme pinching. One interesting thing worth playing with is using noise to drive distortion on other noises. For example, one way to break up Worley F2-F1's geometric shapes is to use another noise to drive the offset of the Worley noise. Changing the amplitude of the AANoise distorts the Worley noise and helps break up the procedural look.
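The noise-warping-noise trick can be shown with a toy example. In this sketch a cheap sine-based noise (standing in for Houdini's AANoise) offsets the lookup position of a toy periodic cellular pattern (standing in for Worley F2-F1); the functions and constants are illustrative only:

```python
import math

def soft_noise(x, y):
    """A cheap smooth pseudo-noise, a stand-in for Houdini's AANoise."""
    return math.sin(x * 1.7 + math.cos(y * 2.3)) * 0.5

def cells(x, y):
    """A toy periodic cellular pattern, a stand-in for Worley F2-F1."""
    return abs(math.sin(math.pi * x) * math.sin(math.pi * y))

def warped_cells(x, y, amplitude=0.5):
    """Distort the cellular lookup with another noise: increasing the
    amplitude pushes the regular cell borders around and breaks up the
    procedural look, just like warping in Substance Designer."""
    ox = soft_noise(x, y) * amplitude
    oy = soft_noise(y + 31.7, x) * amplitude  # decorrelated second offset
    return cells(x + ox, y + oy)

print(warped_cells(0.25, 0.75, amplitude=0.0),
      warped_cells(0.25, 0.75, amplitude=0.8))
```

At amplitude 0 the pattern is untouched; as the amplitude rises, straight cell borders start to wander, which is exactly the effect used here to hide the geometric regularity of the Worley shapes.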
The final part of the high-poly creation was to chip off the edges of the layers. This was done by taking a low-res version of the layer mesh, distorting it with noise from a mountainSOP, and then using that mesh to VDB-fracture the high-detail voxel layer.
That wrapped up the high-poly creation. The next step was the low-poly. I took the high-poly mesh and voxelized it again at a much larger voxel size. This produces far fewer, larger voxels and thus less detail, which made an airtight shell that fit the rough profile of the high-poly mesh. This was then reduced with a polyReduceSOP to bring the triangle count down to something acceptable for a game engine.
Because I knew I was going to be triplanar-projecting textures in Substance Painter, I wasn’t too concerned with hiding UV seams, as long as the texel density was consistent and there were no overlapping UVs. For this, I used a simple UVUnwrap node, which is similar to box projection in 3ds Max. However, this results in hundreds of tiny UV islands. To solve this I used another SideFX Labs tool, Merge Small Islands, to combine tiny UV islands back into large clusters based on a threshold. There are better approaches than this, and I am currently exploring other options such as using curvature to drive UV seams and/or using the layer separations themselves to cut seams. Regardless, this worked for me at this stage of the experiment.
Now it was time to bake in Substance.
The low-poly mesh was loaded automatically through a plugin my buddy Cort Soest wrote for Substance Painter and baked with the high-poly mesh to generate all the necessary mesh maps (AO, normal, World normal, curvature, position). One major limitation of this approach, for the time being, is that Substance Painter’s API does not support injecting resources from the shelf into the layer stack, meaning we couldn’t automatically assign a smart material to the mesh from the command line. I had to manually apply the smart material to the new rock asset. However, this was very quick and only took a few seconds to apply.
The smart material itself was created from a blend of two Megascans texture sets, both set to triplanar projection mode on the layer and then blended on top of one another through various mask/generator methods in Painter. The rest of the look was just a combination of generators and fill layers to build up colorization. Strata layer colors were driven by an ID map baked from vertex color: in Houdini, each layer was assigned a unique vertex color and exported with the high-poly mesh.
At this point, it is just a matter of importing the FBX files and the textures exported from Substance Painter into the game engine and setting up the rocks like any other asset in the pipeline.
Customization comes in the form of random seed values on all of the noise functions as well as on the Transform that handles the strata layer scaling. The noises themselves can be tweaked and rescaled for different rock types. There are also controls for the number of strata layers and the distance between them. From start to finish it generally takes 5–10 minutes to cook everything and texture the asset in Substance. Going forward, the goal is to automate the entire process through PDG to crank out hundreds of rock iterations on our build machines. The only major step left is figuring out how to generate unique rock-like base meshes to feed into the tool.
Houdini is a very challenging 3D program, but it is also a very rewarding and empowering one. This is a great time in the industry to pick up Houdini and give it a shot. More and more studios are starting to harness its procedural nature to help drive all kinds of games, from large-scale open-world titles to small indie projects. Houdini has a ton to offer any team. It is a program that is constantly evolving, and SideFX has done a great job over the last few years of making Houdini a viable tool for game creation. I would suggest that anyone looking to incorporate it into their toolset start by scouring YouTube for the many, many tutorials out there. Some great resources include Kenny Lammers and his courses (https://www.indie-pixel.com/), Rohan Dalvi and his training videos (https://www.rohandalvi.net/home), as well as all of the amazing tutorials and resources on the SideFX website (https://www.sidefx.com/tutorials/).
Learning how to copy shapes to points, and how to use attributes to control the scale and orientation of those copies, is a simple but great way to start exploring the power Houdini can offer. At first, it seems like every other 3D application out there, until you realize you have everything under the hood available to you at any time, and it’s how you decide to mix it all together that determines the end result. It is a problem solver’s most valuable 3D tool.