We talk with Piers Coe (BigBig Studios, Freestyle Games) about the production of high-quality hard surface models. With a huge portfolio of games (Guitar Hero Live being the latest one), Piers knows everything there is to know about great 3D content for games. In this particular post he talks in detail about the creation of materials with Substance Painter.
Hi, I’ve been working in the games industry for a little over 12 years now, having graduated from Cumbria University.
I started out at the then-independent developer BigBig Studios (I was the ninth employee), where I stayed for 8 years as the company grew. I was working initially on PSP titles: Pursuit Force, Pursuit Force: Extreme Justice, Motorstorm: Arctic Edge, and later on various concepts and IP pitches for the PlayStation Vita. I worked on various asset types including props and vehicles, but I was principally involved in environments, which I developed from concept all the way through to final polish.
I then moved to Freestyle Games, where I currently work. I’ve had the pleasure of working on a variety of titles here, and a very broad spread of work. I’ve done everything from audio-driven player feedback systems, rhythm-based VFX, assets & HUD artwork for mobile titles, environment assets for real-time strategy games, particle systems, VFX & motion graphics, all the way through to high-res pre-rendered sequences. Certainly the biggest project I have worked on is the most recent one: Guitar Hero Live. This was a behemoth of a project with a very large team, and certainly some unique challenges. I was part of the team that created the CGI assets that were composited with the live-action backplates.
High-poly Detailed Models
I think with high-res assets, certainly if they’re realistic in style, reference & a methodical approach are key. I generally build up a mood-board of applicable reference material, both for the shape & form of the asset, and for the surface finishes and imperfections. Then it’s a case of blocking out the asset, which is especially important if it’s to interact with other assets or particular parts of an environment. Once I’m happy with the size, scale, shape & form of the blockout, I usually just work in passes over the asset, first building in the larger details, then the smaller ones, and so on. Once you have the large details modeled, it’s a good idea to decide how fine you’re going to go with polygonal detail, and what you’ll leave to the normal map & textures. Knowing where the asset is going to be used and how much screen space it’s going to occupy is key here.
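That screen-space budget can be estimated with simple camera maths before committing to a detail level. A minimal sketch, assuming an idealised pinhole camera with a known vertical field of view (the function name and numbers are illustrative, not from Piers’ pipeline):

```python
import math

def projected_height_px(object_height_m, distance_m, vfov_deg=60.0, screen_height_px=1080):
    """Rough on-screen height of an object under an idealised pinhole camera."""
    # Full world-space height visible at this distance (vertical frustum extent).
    visible_height = 2 * distance_m * math.tan(math.radians(vfov_deg) / 2)
    return object_height_m / visible_height * screen_height_px

# A 2 m crate seen from 10 m, 60-degree vertical FOV, on a 1080p screen:
print(round(projected_height_px(2.0, 10.0)))  # -> 187
```

If the asset only ever fills ~200 pixels, fine polygonal detail is wasted and belongs in the normal map instead.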
I generally do all my modelling in Maya as I find it fastest, but have recently been using Fusion 360 more and more for particularly mechanical assets that are full of intricate details.
Building with Substance Painter
Substance Painter has been great for the texturing/material workflow… it’s hard to imagine now how we managed without it for so long! Once the modelling phase is done, sadly there’s still the unavoidable time-sink of unwrapping to do… the less said about that the better (although Modo does have some great tricks up its sleeve for some of this). Once that’s out of the way I do a basic material assignment on the various parts of the asset and export the various elements for use in Substance Painter. As for building the materials, again a good selection of reference material is really useful, and I usually have a mood-board up on a second monitor. I then start by setting up the fundamental layers in groups, with associated masks, in the appropriate order: base material (metal/wood/plastics etc.), decals & detail textures, then damage, dirt, grime, and so on.
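The ordering of that layer stack boils down to masked blending: each layer covers what lies beneath it in proportion to its mask. A minimal single-channel sketch of the idea (not Substance Painter’s actual compositing code):

```python
def composite(layers):
    """Blend an ordered (value, mask) layer stack bottom-up: each layer
    covers what is beneath it in proportion to its mask (0..1).
    Single channel for simplicity."""
    result = 0.0
    for value, mask in layers:
        result = result * (1.0 - mask) + value * mask
    return result

# Base metal fully covering, then a dirt layer masked to 30% coverage:
stack = [(0.8, 1.0), (0.2, 0.3)]
print(round(composite(stack), 2))  # -> 0.62
```

Because each layer only contributes through its mask, re-balancing a single layer (more grime, less damage) never disturbs the layers beneath it.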
The best thing about using Substance Painter is that because so much can be done procedurally, it’s very easy to go back and balance the textural detail at any stage. Once I’ve done a first pass on all of this, I usually do an export of the maps using the relevant preset for the renderer or game engine the asset is going into. With the maps exported, I set up the asset in-engine or in a test render-scene, plug all the maps in and set up the shader so I can make sure it responds the way I want in its final situation. With these tests done, I go back and work into all the fine details of the materials and textures in Substance Painter. From here it’s so fast and easy to update the textures and assess them in the final scene/engine, and I find this feedback loop really useful.
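Plugging the exported maps into the right shader slots is the kind of step that’s easy to script on the engine side. A minimal sketch, assuming hypothetical filename suffixes — the real suffixes depend on the export preset chosen in Substance Painter:

```python
import os

# Hypothetical suffix -> shader-slot names; the actual suffixes depend on
# the export preset selected in Substance Painter.
SLOT_SUFFIXES = {
    "_BaseColor": "base_color",
    "_Normal": "normal",
    "_Roughness": "roughness",
    "_Metallic": "metallic",
}

def assign_maps(texture_files):
    """Group exported texture paths into shader slots by filename suffix."""
    slots = {}
    for path in texture_files:
        stem = os.path.splitext(os.path.basename(path))[0]
        for suffix, slot in SLOT_SUFFIXES.items():
            if stem.endswith(suffix):
                slots[slot] = path
    return slots

maps = assign_maps([
    "export/crate_BaseColor.png",
    "export/crate_Normal.png",
    "export/crate_Roughness.png",
])
print(maps["normal"])  # -> export/crate_Normal.png
```

Automating the hookup keeps the export-and-assess feedback loop fast: re-export from Painter, re-run the assignment, and the test scene picks up the new maps.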
Perfect Lighting for Material Production
Most of the assets I’ve been working on recently have been for use in pre-rendered sequences where HDR image-based lighting is the norm. Substance Painter comes with a decent selection of HDRs out of the box, so there’s usually something to suit the target result. If the assets will be going into a particularly unique environment, it’s worth bringing in a matching HDR so the materials can be judged under representative lighting.
I find it’s useful to switch between a few HDRs when balancing the materials anyway, usually one with nice soft lighting and another with sharper light sources that really show off specular highlights & glossiness maps.
I don’t find the size of the objects really affects my approach. Screen-space is key. It’s very easy to get absorbed in the process of modeling tiny details (this is my favorite part anyway!), but you have to keep in mind how far the object will be from the camera. On larger objects the clustering of details is important to give the right sense of scale. Scale of texture is also important, especially when using tiled or procedural elements, as scratches on a metal surface, for example, can completely alter the interpretation of the object’s scale.
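One common way to keep texture scale consistent across assets is to check texel density, i.e. how many texture pixels each metre of surface receives. A minimal sketch, assuming a square texture and measuring in texels per metre (the numbers are illustrative):

```python
import math

def texel_density(texture_px, uv_coverage, surface_area_m2):
    """Texels per metre for a surface.

    texture_px: square texture edge length in pixels
    uv_coverage: fraction of UV space (0..1) the surface's shells occupy
    surface_area_m2: world-space area of the surface in square metres
    """
    texels = (texture_px ** 2) * uv_coverage
    return math.sqrt(texels / surface_area_m2)

# A 4 m^2 panel given half of a 2048 texture:
print(round(texel_density(2048, 0.5, 4.0)))  # -> 724
```

If neighbouring assets differ wildly in texels per metre, tiled scratches or grain on one of them will read at the wrong physical scale.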
The production process is largely the same between personal projects and game studio work. The key difference is that I can generally spend a lot longer experimenting with design ideas on a personal project. With game studio work there is usually a pre-existing concept, or at least an established idea for a given asset, by the time you come to actually start work on it. Other than that, there are just the technical constraints applied by whatever engine or rendering solution you’re using, such as poly counts, texture sizes, format of the various maps for the shaders you’ll be using, etc. Quite often the workflow is still flexible, as long as you end up with an asset in a particular format. We’re lucky in that our pipeline allows artists to work in more-or-less whichever packages they like, and the assets can be fed back into the system at the end.