100 Substances in 100 Days Challenge: Goals
The challenge started very simply: somebody dared me to do it! Of course, that wasn’t the main reason. I am a huge fan of self-improvement plans and use them constantly. As luck would have it, I was in the middle of planning my next long-term plan when Substance Alchemist came along. It was a perfect opportunity to improve my material skills and be a useful beta tester – those were the two main goals. I had plenty of side quests as well, like testing my work consistency, trying new workflows and strengthening my relationship with the Substance team. So, like many things in life, it was about being prepared and ready when the lucky chance presented itself.
How Was It?
The challenge was sometimes daunting and, to be honest, I jumped into it with a lot of enthusiasm but little idea of how challenging it would be. Completing it would not have been possible without Alchemist, as some things there are unfairly easy. Coming from Substance Designer, where I am used to building everything from the ground up, this felt more like material kit-bashing: you bring in your parts, start blending, and very quickly out comes something that feels nice. Since I enjoy tinkering very much, this was perfect for me – it reminded me of when I was younger and played with Lego.
What Input Is Needed?
The thing is, you can’t really start from a blank canvas in Substance Alchemist as you can in Designer. You need to feed the machine a starting point from somewhere. Luckily, Substance Alchemist is a hungry animal, and an omnivore at that. You can import sbsar materials or textures, but the most interesting option for me was extracting material data from images – that was the first step for most of my materials. With that option, everything becomes input: your phone’s camera, every texture library, the entire internet. I’ve also seen people use it for processing scanned data with superb results, but I didn’t try that myself.
How Blending Works in Substance Alchemist
Blending is super easy, barely an inconvenience! You just drop layers and choose how and what you want to mix. Even out of the box you have plenty of options for mixing – height, curvature, AO, color, mask… all the basics are there. It differs from SD in that it is a minimal set with a focus on usability and feedback. Sure, you can do even more in SD, with a much higher level of precision and control, but in Substance Alchemist anybody can do it.
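To give a feel for what a height-based blend actually does, here is a minimal per-pixel sketch. This is only my own illustration of the general technique, not Alchemist’s actual implementation; the function name and the `softness` parameter are assumptions for the example.

```python
def height_blend_pixel(h_a, h_b, c_a, c_b, softness=0.05):
    """Blend one pixel of two material layers by comparing their heights.

    Wherever layer B rises above layer A, B's color wins; `softness`
    widens the transition band so the seam fades instead of snapping.
    Heights are floats in 0..1, colors are (r, g, b) tuples in 0..1.
    """
    # Signed height difference, remapped to a 0..1 blend factor.
    t = (h_b - h_a) / max(softness, 1e-6) * 0.5 + 0.5
    t = max(0.0, min(1.0, t))
    color = tuple((1.0 - t) * a + t * b for a, b in zip(c_a, c_b))
    # The combined surface keeps whichever layer is taller.
    return max(h_a, h_b), color
```

Run over every pixel of two height/color map pairs, this produces the familiar “taller layer shows through the cracks of the other” look that height-driven blending gives.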
I’m not entirely sure what the underlying tech is, but it feels like a simplification-wrapper around the SD engine. Which is great! You give the usability to your average artist and the expandability to the advanced users. Win-win. And yes, it still is procedural and non-destructive.
I didn’t use the Inspire tab that much, as I was building one-off materials and didn’t really need variations. But it feels like magic! I poked around a bit and it’s really interesting how fast you can get a new color scheme. I particularly enjoyed recoloring materials with color schemes extracted from famous paintings. Like anything in Substance Alchemist, it’s very easy to play with and experiment.
Extracting Height Data
Alchemist has an improved version of Bitmap2Material, and I was really surprised by what you can extract from a single source. The results are not perfect every time, but with a little massaging and preparation you can usually get something really decent. You can also choose what to keep – many times I kept only the extracted height data and re-surfaced it with other materials.
Expect magic, but don’t expect miracles. The height data you extract from a single photo usually comes out softer than it should, though that works well for organic shapes. For other cases I wish we had more control over how the height channel is generated, but I guess you can make your own height filters in SD – I might even try that! Nevertheless, it’s a great tool for building that first step of your materials, and I see huge potential for AI to help with this.
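A crude way to see why single-photo height tends to come out soft: the simplest extraction treats pixel brightness as a height proxy and then smooths it to suppress noise, and that smoothing also rounds off hard edges. A minimal sketch of that idea, assuming nothing about Alchemist’s actual algorithm (the function name and the box-blur choice are mine):

```python
def luminance_height(image, blur_radius=1):
    """Naive single-image height extraction: treat luminance as height,
    then box-blur it, which is what makes the result come out soft.

    `image` is a 2D list of (r, g, b) tuples with channels in 0..1;
    the return value is a 2D list of floats in 0..1.
    """
    rows, cols = len(image), len(image[0])
    # Rec. 709 luma weights as the brightness-to-height proxy.
    height = [[0.2126 * r + 0.7152 * g + 0.0722 * b for (r, g, b) in row]
              for row in image]
    # Box blur: average each pixel with its in-bounds neighbours.
    blurred = [[0.0] * cols for _ in range(rows)]
    for y in range(rows):
        for x in range(cols):
            total, count = 0.0, 0
            for dy in range(-blur_radius, blur_radius + 1):
                for dx in range(-blur_radius, blur_radius + 1):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < rows and 0 <= nx < cols:
                        total += height[ny][nx]
                        count += 1
            blurred[y][x] = total / count
    return blurred
```

Real extractors are far smarter than this, but the sketch shows the trade-off: the more you blur to hide photo noise and baked-in shading, the softer the recovered relief becomes.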
Including Substance Alchemist into the Pipeline
I think Substance Alchemist is aimed at completely different users than SD, but I also think SD users will surely adopt it as well. Of course, every project and pipeline will find what works best for them, but what looks like the perfect approach for me is using SD to make base materials and then Substance Alchemist for blending, variations and, potentially, library management.
And if your work does not require uniquely crafted materials, then you will probably never have to leave Substance Alchemist at all. Just utilize Substance Source and blend what you need. For the average user, Substance Alchemist will immensely increase the mileage you can get from the Source library. Add the out-of-the-box filters to the equation and the possibilities skyrocket.
Combining Substance with Photogrammetry
My experience with photogrammetry is very limited, but I can draw some parallels with single-photo extraction – and it works pretty well. You have a nice arsenal at your disposal for tackling tiling. The tiling nodes from SD are there, so you can get smooth blends that use height/color/normals and behave fairly smartly. On top of that, the Equalizer, Clone and Patch filters are ideal for final touches.
But, and there is a but, it really depends on what you want to tile. Ambiguous surfaces like ground or dirt work fantastically, but if you want to tile very specific patterns, it gets trickier. For my materials I managed to get a decent tile directly in Alchemist around 80% of the time; for the rest, I just spent some time manually tiling the images in Photoshop.
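The basic trick behind offset/blend tiling can be shown in one dimension: cross-fade the tail of a strip into its head, so the wrap point lands on samples that were adjacent in the original. This is my own simplified illustration, not how Alchemist tiles internally:

```python
def make_tileable(strip, overlap):
    """Return a shorter strip of floats that tiles seamlessly.

    The first `overlap` samples cross-fade from the strip's tail into
    its head, so when the result is repeated end to end, the last
    sample is followed by the sample that originally came right after
    it in the source strip -- no seam.
    """
    n = len(strip)
    m = n - overlap          # length of the tileable result
    tail = strip[m:]         # samples folded back onto the start
    out = list(strip[:m])
    for i in range(overlap):
        t = i / overlap      # 0 at the wrap point, 1 toward the interior
        out[i] = (1.0 - t) * tail[i] + t * strip[i]
    return out
```

The same cross-fade applied along both axes of an image, ideally weighted by height instead of a plain linear ramp, is the core idea behind many make-it-tile filters – and also why distinct, regular patterns break: the fade averages mismatched features instead of aligning them.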
The Delighter is real magic! I’m amazed at how well it works in most cases, and since it’s ML-based it will only get better and better. This truly is the future, and it’s so effortless – but like every new tech, it needs some time to become production-proven. I’m sure we will see more and more AI/ML tech in Substance Alchemist as it evolves; there are so many places where it can be useful – better height extraction, smarter tiling, more precise recoloring or even full material synthesis from input samples.
Using Materials in Games
I have 100 materials and their quality varies a lot, but I am sure some of them are worthy. My goal was not to make game-ready materials, so many of them depend on height maps and tessellation, which is not really game-friendly. Another issue is that I was making them for appeal, not usability, so they usually have very distinctive features – which is also not ideal for games.
That being said – I’m sharing all of the materials for free so why don’t you get them and try them out in your game!
Nikola Damjanov, Lead Game Artist at Nordeus
Interview conducted by Kirill Tokarev