Pete McNally, a 3D artist with an impressive career, shared his thoughts on Artomatix, Reality Capture, Substance Designer and the future of 3D production in general.
Hi! My name is Pete McNally, and I live and work in Dublin, Ireland. I started my career back in 2003, when I was hired out of art college to work as a character artist/animator for a small startup in the North West of Ireland called Torc Interactive. Although I was primarily hired to create high- and low-poly characters, the work soon shifted to many other areas of game art and I learned a lot. I often had to leave my comfort zone completely and try new things. This led me to more of a 3D Generalist/Technical Artist role. Techniques and tools for normal maps were in their early stages, and I often had to get creative with the applications available to raise the quality bar of the textures we were putting out.
Over the next 5 years or so I was promoted to Lead Artist and then Tech Art Manager before leaving the games industry for a while. I then went to work in broadcast animation as a 3D Supervisor in the Commercials department of Brown Bag Films, a twice Oscar-nominated animation studio in Dublin. It was also a great learning experience.
The following year an opportunity arose to work with Havok on some new products they were launching (Cloth and Destruction). I knew my heart lay in game art, so I moved again and have been with Havok ever since. Havok was acquired by Microsoft in 2015, but my role hasn’t changed that much. I’ve done lots of demo work to show off new features to customers, and other stuff I can’t really talk about, sorry!
3D modeling software
Photogrammetry and procedural materials are really pushing things forward right now. A lot of my spare time experiments involve lightweight photogrammetry. I often use just my phone to take photos and then build full PBR materials from the resulting models in 3ds Max, Photoshop, and Knald.
Powerful tools like Capturing Reality for photogrammetry and Substance Designer for materials are not out of reach. I’m constantly amazed by the quality and cleanliness of materials created in Substance Designer by the community. Some of them exceed 3D scans in the clarity of output. AI-based tools like Artomatix (also based in Dublin) are taking some of the pain out of wrangling textures from 3D scans, and can often intelligently remove seams on tricky surfaces or fill in missing detail without manual help.
I think the machine-learning/AI approach is going to be a game changer. Imagine scanning an outdoor surface composed of grass, mud, asphalt, stone, etc., having AI identify each component material, then looking up albedo/roughness values from a library and automatically creating PBR textures. Tools are key here: they keep control in the hands of artists. Although Substance, Artomatix, and Quixel have made inroads into making the huge datasets that come with 3D scans friendlier to work with, I’m not sure there is yet a fully robust, streamlined end-to-end solution for taking raw geometry and outputting PBR-correct, seamless materials.
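To make that idea concrete, here is a minimal sketch in Python/NumPy of just the lookup step, assuming a segmentation model has already produced a per-pixel class-label map. The material names, albedo, and roughness values below are purely illustrative, not measured data:

```python
import numpy as np

# Hypothetical per-material PBR values, as might come from a measured
# reference library; every number here is illustrative only.
MATERIAL_LIBRARY = {
    "grass":   {"albedo": (0.08, 0.20, 0.04), "roughness": 0.85},
    "mud":     {"albedo": (0.22, 0.15, 0.09), "roughness": 0.95},
    "asphalt": {"albedo": (0.05, 0.05, 0.05), "roughness": 0.75},
    "stone":   {"albedo": (0.35, 0.34, 0.32), "roughness": 0.60},
}

def labels_to_pbr(label_map, class_names):
    """Turn a per-pixel class-label map (e.g. from a segmentation model)
    into albedo and roughness textures by library lookup."""
    h, w = label_map.shape
    albedo = np.zeros((h, w, 3), dtype=np.float32)
    roughness = np.zeros((h, w), dtype=np.float32)
    for idx, name in enumerate(class_names):
        mask = label_map == idx
        albedo[mask] = MATERIAL_LIBRARY[name]["albedo"]
        roughness[mask] = MATERIAL_LIBRARY[name]["roughness"]
    return albedo, roughness

# Tiny 2x2 example: top row classified as grass, bottom row as stone.
labels = np.array([[0, 0], [3, 3]])
alb, rough = labels_to_pbr(labels, ["grass", "mud", "asphalt", "stone"])
```

The hard part, of course, is the segmentation itself; the point of the sketch is that once per-pixel classes exist, filling out a PBR set is a mechanical lookup.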
Creating seamless materials
Outside of my day job I tend to explore and write about the areas that interest me, and right now I’m experimenting with extracting textures from 3D scans and making them usable across arbitrary geometry. In other words, I take a texture that perfectly fits a unique piece of scanned, UV-mapped geometry and fix it so that it can be applied to other modular 3D meshes, essentially creating versatile world textures from props. The usual issues involve wrangling heavy high-poly geometry and large textures, preserving detail scale throughout, removing repeating features and extreme displacement protrusions, and removing seams at the texture edges.
There are already some solutions for removing seams on materials, but I haven’t found anything to match Artomatix. It uses an example-based approach, rather than a procedural one, to remove seams, and it can also mutate textures, creating new textures from existing inputs. For example, I used Artomatix on the Craggy Cliffs:
It has given a new lease of life to some 3D scans I’ve done that were too problematic to be used in production, such as textures that were too blurry or geometry that was stretched or had holes in it. Artomatix can fill in missing detail, letting you paint out areas you’d like to ignore and producing new, viable textures. Occasionally I’ll still remove seams by hand, the old-fashioned way, using the Clone Stamp tool and the Offset filter in Photoshop, as with the Tree Bark Study.
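The Photoshop offset trick can be sketched in a few lines of NumPy: wrapping the texture around by half its size moves the tile seams into the center of the image, where they can be clone-stamped away, and wrapping again restores the original layout. `offset_half` and `seam_error` are hypothetical helper names for illustration:

```python
import numpy as np

def offset_half(tex):
    """Equivalent of Photoshop's Filter > Other > Offset with 'wrap
    around': shift by half the size so the tile seams land in the
    image center, where they can be clone-stamped out."""
    h, w = tex.shape[:2]
    return np.roll(np.roll(tex, h // 2, axis=0), w // 2, axis=1)

def seam_error(tex):
    """Rough measure of seam visibility: mean absolute difference
    between opposite edges of the texture (0 means edges match)."""
    horiz = np.abs(tex[:, 0].astype(float) - tex[:, -1].astype(float)).mean()
    vert = np.abs(tex[0, :].astype(float) - tex[-1, :].astype(float)).mean()
    return horiz + vert
```

For even-sized textures, applying `offset_half` twice is a no-op, which is why you can safely offset, retouch the center, and offset back.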
I’m also making use of hardware tessellation and displacement, so even on low- to mid-poly geometry you can still add larger detail forms to the model. As long as you have a good UV layout, you can tile the materials appropriately, like this:
You can see how some of the larger forms are kept, while smaller detail is added by the material. It’s important to avoid sharp edges in the low- or mid-poly models you apply the materials to, and to keep the topology regular so that it displaces well.
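As a rough sketch of what the hardware does, displacement pushes each (tessellated) vertex along its normal by a value sampled from the height map, which is why regular topology matters: the offsets are per-vertex. A minimal NumPy version, where the function name and the 0.5 midlevel convention are my assumptions rather than any particular engine's API:

```python
import numpy as np

def displace(vertices, normals, heights, scale=0.1, midlevel=0.5):
    """Push each vertex along its unit normal by a sampled height
    value. 'midlevel' keeps the base surface at height 0.5, matching
    the typical displacement-map convention where mid-grey = no
    offset; heights above it push out, below it push in."""
    offset = (heights - midlevel) * scale
    return vertices + normals * offset[:, None]

# Flat quad facing +Z; a height of 0.5 leaves it untouched.
verts = np.zeros((4, 3))
norms = np.tile(np.array([0.0, 0.0, 1.0]), (4, 1))
```

On irregular topology the per-vertex offsets vary unevenly across a face, which is what produces the faceting artifacts on displaced low-poly meshes.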
Here is an example of a tiling 4K texture set created from a smaller section of a 3D scan (photographed with a Samsung Galaxy S8 and processed in Capturing Reality). The original model looks like this:
I liked how the rock was layered and folded in places. I thought it’d make an interesting material so I set up a render in 3ds Max at 4k and cloned/rotated the entire model a few times to fill up the render window with the layered rock area. I rendered out albedo, AO, normal and z-depth maps and took them into Artomatix. Here is the resulting material:
Here is some quick smooth low-poly geometry I put together to test it in a scene context:
This is the same geometry with proper lighting and the material applied; the ground is, of course, a separate sand material:
Working with Artomatix
As Artomatix is based in Dublin, I’ve chatted with the team there a few times over the past few years and have been on the alpha test for a while now. They’re really into what they do and value feedback at all stages. As you’d expect from an alpha product, it’s still rough around the edges, but there are regular updates and the team fixes bugs quickly. Currently there is a node-based workflow, and seam removal is as simple as creating a material from imported textures, connecting it to a seam removal node, setting an edge radius, and hitting calculate. There is also an analysis pass before the seamless textures are created. The new detail is not procedural; instead, it samples the input textures and intelligently decides where those samples can be placed and blended to create a seamless image. It works across all textures at once, so you get a full PBR set from your inputs. For more control you can add a mask-painting node that lets you paint the areas you want to remove, either on a textured 3D model or in 2D, see below:
After seam removal and mutation:
And the final result (Toolbag 3):
I know there are other material creation features in there such as “material from photos” but I haven’t explored much further. At the moment I only work with those features that help me with my scanning experiments, and they have become an important part of my workflow.
3D Production future
My texturing workflow has already changed. I’m doing less baking from high-poly models to low-poly ones, instead rendering albedo, normals, AO, and z-depth (to use as displacement) straight out of the viewport in 3ds Max. This texture output goes into Artomatix and comes out tiling, after which I create a roughness texture using the albedo as a base and referencing PBR charts. Sometimes I’ll paste the seamless normal map into Knald and pull out curvature, pixel-based height, and AO, blend them with the originals, and test in Marmoset Toolbag. The material above was created this way.
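Two of those steps can be sketched in NumPy, with the caveat that they are crude stand-ins for what I’d actually tune by eye: a first-pass roughness map made by remapping inverted albedo luminance (a common starting point, not a physical rule), and a simple weighted multiply for blending baked AO with AO generated from the height map. Function names and default ranges are illustrative:

```python
import numpy as np

# Rec. 709 luma coefficients for converting linear RGB to luminance.
LUMA = np.array([0.2126, 0.7152, 0.0722])

def roughness_from_albedo(albedo, lo=0.4, hi=0.95):
    """First-pass roughness guess: darker, dirtier areas tend to be
    rougher, so invert albedo luminance and remap into a plausible
    range. The result still needs hand-tuning against PBR charts."""
    lum = albedo @ LUMA
    return lo + (1.0 - lum) * (hi - lo)

def blend_ao(baked_ao, generated_ao, weight=0.5):
    """Blend baked AO with AO derived from the height map (e.g. out
    of Knald): multiply darkens, and the weight lerps the generated
    layer toward neutral white so it can be dialed in gradually."""
    return baked_ao * (generated_ao * weight + (1.0 - weight))
```

With `weight=0` the baked AO passes through untouched, so the generated layer can be faded in until the crevices read correctly.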
Something with the immediacy of Quixel Mixer-style texture blending, the power of forgiving AI image cleanup tools, and tight Photoshop/Substance Painter integration would be a dream, and I think we’re heading in the right direction.