Ghislain Girardot did a very interesting deep dive into the production of tile environments for Shardbound with Blender, World Machine, GeoGlyph, Substance and Unreal Engine 4.
Hi! My name is Ghislain Girardot, I come from France and I’m a self-taught freelance video-game artist.
Honestly, my career so far has been a big struggle, so I did not work on anything major until Shardbound came up. For the past 6/7 years I’ve done a lot of small gigs for private clients and tiny indie studios, during pre-production or for prototyping projects that mostly never saw daylight, unfortunately. It’s a tough world. Shardbound is my biggest professional experience to date and I’m very glad for it. It was a sick year.
I’m very curious when it comes to CG, so I’ve experimented with many different domains in the past 10/11 years: pre-rendered architectural visualization and car renderings, rigging, character and creature animation for video games, even a bit of UI design at some point, hand-painted environments, props and characters, PBR characters and most recently environments, a domain that grew on me a lot. I love discovering new software, and recently learning Unreal Engine to make environments brought me a ton of passion. It seems to be where my strength lies, so I’m going to focus on that particular domain in the future.
I was in charge of the environments: modeling, sculpting, texturing, and setting everything up in Unreal Engine, including materials and lighting. Pretty much everything, except the scripting/integration part to make the map actually playable; although I was supposed to be involved in that as well at some point during the production, we did not have time to make it happen. The various optimizations done for lower-end hardware, just like the integration part, were left to team-mates and tech artists Justin Prazen and Ira Goeddel. Thank you guys!
It was hard for so many reasons. To begin with, I’d say that it was my first big project with Unreal Engine, so it was a constant technical challenge. There’s just so much to learn with a game engine like that, it’s astounding (and awesome!). I wasn’t necessarily technically directed or constrained, so I had to use my best judgment to approach any artistic challenge with a good enough technique and keep my work as optimized and efficient as possible (which I did with more or less success as I learned my way through Unreal).
Obviously, it was always a big artistic challenge as well, Shardbound being intended to be a very high-quality game. It had to look good, even though we left room for greater optimization and polishing later on (the game is in Early Access, so it was important to implement the work ASAP as well). The process for making a map was never set in stone; they were all challenging in their own ways, bringing us props we had never tackled before and new atmospheres to translate into 3D in a very specific art style, all for the first time in the production. I had to experiment and be independent, and I had to be very flexible with my work to let us toy with the art direction and the overall composition.
Each map started with a great piece of concept art that gave me an initial mood and overall art direction, but it wasn’t something to duplicate precisely. Occasionally, concept artists stepped in during the mid-production of the first two maps and drew paint-overs to help bring art director Josh Nadelberg and me new ideas. Most of the time we had to improvise and see for ourselves what worked in 3D and what didn’t, and Josh was of tremendous help through all this. Mad props to him. It was a daily battle, a constant urge to better myself, and I loved every minute of it.
So as you can see, there are four different types of tiles in Shardbound: low-grounds, dead-zone (neutral low-grounds that come by four in the center of the map), high-grounds and blockers, each having their own distinctive look and gameplay mechanics. Each map always starts with its own unique tile arrangement, but things can evolve during play: certain cards can destroy a blocker and change it into a low-ground, for instance, meaning an adjacent high-ground can then require an additional set of stairs. Fortunately, that is handled by the engine; we just have to feed it every possible stair combination (i.e. no stairs at all, a single set of stairs, dual, triple, quadruple sets and so on), like below:
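As an aside, you can count how many unique stair meshes are actually needed if the engine is allowed to rotate them to match a tile. This little Python sketch (my own illustration, not tooling from the project) folds the 16 possible edge arrangements of a square tile together under rotation:

```python
from itertools import product

def canonical(sides):
    """Smallest rotation of an (N, E, S, W) stair pattern, so rotated
    duplicates collapse to one representative."""
    return min(tuple(sides[(i + r) % 4] for i in range(4)) for r in range(4))

# Every way stairs can appear (or not) on a square tile's four edges,
# merged whenever one arrangement is just a rotation of another.
patterns = {canonical(p) for p in product((0, 1), repeat=4)}
print(len(patterns))  # 6 unique stair meshes are enough
```

Six classes come out: no stairs, one set, two adjacent sets, two opposite sets, three, and four.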
Each tile has to be a separate object in Unreal Engine and has to be set as a movable object (you can’t have baked lighting on them). Tiles can also change material and color during the game to directly give the player information about the actions they intend to perform (where they are able to cast a unit, for instance), so each tile has the constraint of using only one single material, on top of being a single, merged mesh. It is not necessarily troublesome with a simple rocky-looking tile, but a complex grassy tile with opacity on the borders, grass on the ground, grass blades, all mixed with dirt and pebbles, can start giving you a headache.
Here are the three other tile meshes used for HighLands, including two variants for the blockers and one unique single mesh for all low-ground and dead-zone tiles. I could get away with having only one single low-ground mesh for HighLands because its very particular treatment and complex material (covered further below) gave me all the variety I needed once duplicated and moved around. Otherwise, for other maps, I had to create enough variants for low-ground and dead-zone tiles so that once randomly positioned and oriented, no tiling pattern emerged.
Tiles also came with artistic constraints, the biggest one being readability. Shardbound is all about gameplay and bringing new, interesting ways to play a collectible card game, so it is super important that the player clearly understands what’s going on on screen at any given time. Tiles need to be easily differentiated from one another; a high-ground can’t be mistaken for a blocker, for instance. The play-field boundaries need to be easily understood as well, and the overall focus has to remain on units so their positions are not misread. It all sounds obvious, but it’s really a challenge when you try to push the visuals to the next level and tell a story. It would be really easy to make the tiles drastically different from the environment and from one another, but then it would not make sense nor look any good. It was a constant battle between making the tiles blend with the environment while remaining unique enough to stand out from it.
Rocks and land
My sculpting techniques for both the more natural/organic assets and the man-crafted structures are mostly what I’ve described in my Shrine breakdown, using masking and trimming brushes a ton. I like how stylized you can get with TrimSmoothBorder. Again, it was all about iterating, experimenting, seeing what works and what doesn’t, being willing to start over and over until it had the correct look. I don’t consider myself an expert at rock sculpting and I think it is particularly difficult, so I’m happy if you like the result!
Here is a little breakdown of the cliffs: the first layer is a simple plane using a tiling material. I then added modular rocks that were quickly sculpted in ZBrush and textured in Substance Painter using custom procedural materials. It’s definitely not perfect, especially the grass (blended in using vertex painting and a World Aligned Blend in Unreal Engine), but these were mostly seen in shaded areas, so it was good enough for what we needed. Then, to polish things up a bit, I added some transparent/masked curved planes to ease some of the main transitions between the modular rocks and the cliff plane, and later added some foliage and grass.
The key thing for me was to have the same setup in Blender as in Unreal Engine. Having the exact same cameras and scale allowed me to build the scene in both applications simultaneously and know very precisely what was going on through both player 1’s and player 2’s eyes, live, in Blender. I knew exactly what to expect when I was modeling and blocking out props, so it was quite easy to check the overall composition and make sure the tiles remained in clear view. Getting the right composition was then just a matter of trial & error, trying to convey the atmosphere and scale depicted in the concept art as best as possible.
The trees, for instance, weren’t part of the initial concept art, at least not shown like that, but we found out it was a good way to frame the picture and bring some depth and shadows to the scene. Figuring out the leaves was the most challenging part, always trying to find the sweet spot between sharp, dense details and simplicity for the best readability possible.
Here’s my basic workflow for the leaves: I first created various branches and leaves in Blender and rendered the result as a mask texture containing leaves, branches, and tip-to-root gradients for both branches and leaves, all in different RGB channels, to later use in my leaf material inside Unreal Engine (that’s what the psychedelic reddish/purple-looking texture below is about).
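To illustrate the idea of channel packing, here is a minimal Python sketch (pure illustration with made-up pixel data, not the actual texture pipeline) that stores three grayscale masks in the R, G and B channels of one texture, the way the leaf masks and gradients were combined:

```python
def pack_rgb(leaf_mask, branch_mask, branch_gradient):
    """Pack three grayscale masks (0-255 each) into one list of RGB pixels."""
    return list(zip(leaf_mask, branch_mask, branch_gradient))

def channel(pixels, index):
    """Read one channel back out, the way a material would in-engine."""
    return [p[index] for p in pixels]

# Three tiny 2x2 "renders", flattened to pixel lists (placeholder values).
leaves   = [255, 0, 128, 0]
branches = [0, 255, 64, 0]
gradient = [10, 20, 30, 40]
texture  = pack_rgb(leaves, branches, gradient)
```

One texture fetch in the material then gives you all three masks at once, which is why this trick is so common for foliage.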
Once I had the texture, I cut each branch into tiny individual planes and built a dozen clusters of branches and leaves, which were grouped.
I linked these groups into another Blender scene where I built the trees, and using empties (like dummies in 3ds Max) and dupli-groups, I could choose which cluster of branches and leaves I wanted and move it around as a single object, which was quite welcome considering each of these is made of dozens and dozens of planes. The thing is, I had to retain the original location and rotation of all these individual planes to later bake their pivots and use Epic’s Pivot Painter node and complex vertex animation setup. I hope it makes sense. It was a bit tricky.
Almost all materials were done in Substance Painter, using mostly custom procedural materials. Some black & white masks in Substance Painter were eventually modified by hand, and rare details were painted by hand as well, but it was mostly an automated workflow all the way. We were constantly trying new assets and experimenting with different material treatments for the assets and the whole map, so I had to use a procedural workflow to be fast & efficient. Just fooling around with the Dirt and Metal Edge Wear generators along with filters can give you amazing results in Painter. The hard part, one I was not necessarily always great at, I must say, was to find the right balance between a simplified stylized material and a more realistic, detailed one. I had to constantly balance the amount of detail put in the albedo, roughness and/or normal map to avoid having assets look too busy and to let the stylized sculpts speak for themselves.
Surprisingly, most materials in the scene are quite simple: the classic albedo/roughness/normal combo, plus emissive maps for those glowing structures. Most of the detail was pushed further by directional rim lights and static point lights to push albedo values and specular to the limit. Big thanks to team-mate and senior tech artist Ira Goeddel for sharing his lighting workflow with me. He gloriously revisited the lighting on Sinners Mire, the first map I did for Shardbound, and sharing some of his tips helped me a ton for HighLands!
The grass, however, is a bit trickier, but nothing too crazy. I found that the key thing was to make grass blades not cast any shadows; that way they were more subtle. Another important thing is to use custom normals on the grass blades to match the ground normals. The last key thing is to use world-projected textures on both the terrain and the grass blades, so it’s then easy to make them both share the exact same albedo/roughness values where they intersect. If you do all that, you should have a seamless and super smooth transition between the ground and the grass blades. Then you can add a gradient on the grass blades from root to tip to push the albedo value at their tips and voilà, stylized grass blades!
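A rough Python sketch of two of those tricks, the world-projected UVs shared by ground and blades, and the root-to-tip albedo boost; the function names and values are mine, for illustration only, not the actual material logic:

```python
def world_uv(world_pos, tile_size=200.0):
    """Project the XY world position into tiling UV space. The ground and
    the grass blades both sample the texture with these same UVs, so their
    colors match exactly where a blade meets the terrain."""
    x, y, _ = world_pos  # Z is ignored: blade tip and ground share UVs
    return ((x / tile_size) % 1.0, (y / tile_size) % 1.0)

def blade_albedo(base_albedo, root_to_tip, tip_boost=0.25):
    """Brighten the shared ground color toward the blade's tip,
    where root_to_tip goes from 0 (root) to 1 (tip)."""
    return tuple(min(1.0, c * (1.0 + tip_boost * root_to_tip))
                 for c in base_albedo)
```

At the root (`root_to_tip = 0`) the blade color is identical to the ground beneath it, which is what makes the transition seamless.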
The grass on tiles is a bit trickier, due to that one-material constraint I was talking about above, but in essence it’s the same process. Here’s a very, very rough breakdown of how the material works:
And here’s the final material in Unreal Engine, which would be a bit complex to explain in great detail. One interesting thing I did was to dynamically hide grass blades where the dirt material was blending in. For that, I used the World Position Offset material input to push the blades below ground level to hide them (it’s shown in one of my GIFs). The dead-zone tiles (the 4 muddier-looking low-ground tiles in the middle of the map) also used the same low-ground mesh and the same material as the other, normal low-grounds: I just added parameter switches in the material to alter the way the dirt was processed and turn it into mud.
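The hiding trick boils down to a tiny branch on the dirt mask; here it is as an illustrative Python sketch (the threshold and depth values are placeholders, not the shipped numbers):

```python
def grass_blade_offset(dirt_mask, hide_depth=30.0, threshold=0.5):
    """World Position Offset Z value for a grass blade's vertices:
    where the dirt blend mask dominates, sink the whole blade
    below ground level so it disappears without any draw-call cost."""
    return -hide_depth if dirt_mask > threshold else 0.0
```

Because it is just a vertex offset inside the one allowed material, it respects the single-material constraint on tiles while still letting dirt "erase" the grass.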
So those background floating rocks weren’t even part of the initial plan. We didn’t work on backgrounds for the first two maps because we intended to add them later on, either with matte paintings or with 3D elements. I was really passionate about the project, though, and loved doodling in my own time on ways to improve my work. I took the initiative and toyed a bit with a temporary background. Josh liked it, and we agreed that it should be pushed further and take the form of distant floating shards.
Honestly, the shards were uber-hacked, a super long and tedious process, and if I had had to create many of those back then, with the need to tweak their final look more easily and freely, I would definitely have thought of another way to do it. But I just rolled with it that one time and it worked. It’s a weird mix of Inkscape, World Machine/GeoGlyph, Blender and ZBrush, which would take very long to explain in great detail, but here’s the basic principle…
I started in Inkscape and drew the shape of my shards as SVGs. Now, because importing SVGs into a 3D application is often fairly arbitrary (there is no sense of scale or boundaries), a key thing was to draw a square to manually define the bounding box of the texture. That way you can import the SVG into Blender and use the bounding box to make some sense out of it. The reason I did this was to provide both a black & white PNG mask to use in World Machine AND a way to cut the terrain in Blender using the SVG curves as a boolean operator later on. Hacky, yeah, you’ll see. I wanted the edges of each shard to be eroded, like it’s smoothly transitioning to mountains and higher peaks in the center, so I used that black & white PNG mask in World Machine to flatten my terrain near the borders. Here’s an example of how to achieve that.
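The bounding-square trick can be shown in a few lines of Python (the coordinates are made up for illustration): once you know the square’s corners, every imported SVG point can be remapped into a predictable 0..1 space, which is what lets the World Machine mask and the Blender boolean cut line up:

```python
def normalize_to_bounds(points, square_min, square_max):
    """Remap raw SVG coordinates into 0..1 space using the corners of
    the hand-drawn bounding square, so the same outline lines up with
    both a heightmap mask and a 3D cutting plane."""
    (x0, y0), (x1, y1) = square_min, square_max
    return [((x - x0) / (x1 - x0), (y - y0) / (y1 - y0)) for x, y in points]

# An imported SVG lands at arbitrary coordinates; the drawn square
# tells us the true extents (placeholder numbers).
square_min, square_max = (12.0, -7.0), (112.0, 93.0)
shard_outline = [(12.0, -7.0), (62.0, 43.0), (112.0, 93.0)]
uvs = normalize_to_bounds(shard_outline, square_min, square_max)
```

The same normalized outline can then drive the black & white mask export and the curve-based boolean in Blender.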
The result inside World Machine:
Once I had generated both a heightmap for the surface and one for the underside (using the same process, just with a different terrain treatment for the underside), I went into Blender and cut a flat plane. This is where drawing the boundaries of the SVG directly inside it was useful, because it was then very easy to make the somewhat randomly imported SVG curves match my plane. Once it was cut, I duplicated and flipped it to make the underside surface, then bridged both together.
Shards are also beautified by clouds and other effects. The clouds were generated in Blender using Quick Clouds:
I made two renders of those, one containing the main directional light, the second containing the blue glowing light coming from the gaseous planet underneath, both embedded (opacity as well) in a single texture.
I then embedded two types of information in the vertex colors for later use in my material. In the red channel I stored the cloud’s distance from the camera, and in the green channel I stored whether or not the cloud is on top of a shard (and so whether or not it should display that bluish light coming from the planet underneath). I also used a directional light in Blender that matched the main one in Unreal Engine to bake shadows both from and onto the shards and clouds. Here is the final material used for the clouds:
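Here is that vertex-color encoding as a small illustrative Python sketch. The channel meanings are as described above, but the distance normalization and the exact values are my assumptions, not the project’s numbers:

```python
def cloud_vertex_color(dist_to_camera, max_dist, over_shard):
    """Encode per-cloud data into an RGBA vertex color that the
    material reads back at runtime."""
    r = min(1.0, dist_to_camera / max_dist)  # red: normalized camera distance
    g = 1.0 if over_shard else 0.0           # green: catches the planet's glow?
    return (r, g, 0.0, 1.0)                  # blue/alpha unused here
```

In the material, the red channel can then fade or scale distant clouds, while the green channel switches between the two baked lighting renders.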
So yeah, most of these animated props weren’t necessarily part of the initial plan either. I just loved taking the initiative and thinking about ways to bring some life to the scene. I’m also interested in programming (I learned C/C++ a few years ago), so it was only natural that I started playing with Unreal Engine’s Blueprints sooner or later. It’s just mind-blowing how powerful it is… I toyed with interactive events in my own time, Josh thought it was cool, and it got implemented in the game (although using a different triggering mechanism, I think, but still, cool!)
Grass, foliage and trees were all animated using vertex animation and Epic’s Pivot Painter. It’s quite nice, although a bit tedious to set up, I think.
Flickering lights were done super easily with Blueprints and Timelines, like so:
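A Blueprint Timeline is essentially a keyframed float track that gets sampled every tick and fed into the light’s intensity. A minimal Python equivalent (my own sketch with made-up curve data, not the actual Blueprint values) looks like this:

```python
def sample_timeline(keys, t):
    """Linearly interpolate a (time, value) track, like a Blueprint
    Timeline float track driving a light's intensity."""
    keys = sorted(keys)
    if t <= keys[0][0]:
        return keys[0][1]
    if t >= keys[-1][0]:
        return keys[-1][1]
    for (t0, v0), (t1, v1) in zip(keys, keys[1:]):
        if t0 <= t <= t1:
            return v0 + (t - t0) / (t1 - t0) * (v1 - v0)

# One quick dip in intensity = one flicker of the lamp.
flicker = [(0.0, 1.0), (0.1, 0.2), (0.2, 1.0)]
```

Looping the Timeline (or re-triggering it at random intervals) then gives an irregular flicker without any per-frame scripting.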
The bird is one of the interactive events available in HighLands. It was super quickly modeled and animated in Blender over a day during a weekend. That little dude lands every now and then on top of the ring structure and flies away on its own after a while. You can also send it away with a unique animation by clicking on it at the right time, and you can call it if it’s not there yet. It was quite fun to do. Here’s the blueprint I ended up doing for it:
There’s also a ring event. It starts emitting light for a brief period of time and if you click on it at the right moment, a pulse of energy sparks and travels all the way around the destroyed ring. Here’s the blueprint:
Most of the man-made structures also feature what we call “Chromite”, that inner grid-looking structure, its innermost core still being active and emitting a bluish light. For each asset, I baked an emissive map in Blender using Cycles (to include bounced & indirect lighting). The result was then used to blend in a world-projected panning HDR texture fed into the material’s emissive channel.
I had to fight many battles, technical ones first. I was always figuring out how to technically approach all those artistic challenges while keeping the map optimized. It was mostly an artistic battle, though: I was trying to achieve something that looked good, detailed but not too much, and most importantly readable; stylized, yet looking natural and alive, all while conveying a story and without hurting the gameplay. It wasn’t an easy task, but I was greatly directed by Josh, and in the end I couldn’t be more stoked about what we achieved together on each of these three maps.