Recreating the Tidal Basin Location from The Division 2 in UE4

Bence Blazer did a breakdown of the Tidal Basin environment recreated from The Division 2 during his studies at Vertex School.

Introduction

Hello everyone! My name is Bence Blazer, and I'm an Environment Artist from Hungary currently based in the UK. Ever since I was young, I always aimed toward some kind of creative career. One day my uncle showed me an early version of Maya, which got me hooked and started me on this career path.

After graduating from Norwich University of the Arts, I started working full-time in hospitality, which didn't leave much time to work on my portfolio. I enrolled in Vertex School (previously Games Art Institute), led by Ryan Kingslien, to bring my portfolio up to industry level. As someone who had no insight into studio life, I thought this would be an excellent opportunity to eliminate possible misconceptions that I might have picked up over the years.


Tidal Basin Base (The Division 2 Fan Art): Start of the Project

Over the course of the program, we worked on two projects: we started with a prop and then moved on to the capstone project, which was either an environment piece or a character depending on the specialisation you chose. When I was researching my environment, I found some concept art made by Tony Tran for Tom Clancy's The Division 2. At the time, I was playing the game, and I really enjoyed the look and feel of the environment. This project felt like a great opportunity to work on something from a recent release, and it set the bar for the quality I wanted to reach. I hadn't reached that area in the game yet, so I only had the concept art to base my work on. The concept was perfect as it showed a relatively closed space, and I didn't want to over-scope the project.

To keep my references organised, I used PureRef, which was always next to me on one of my screens.

Modeling

After I had my references lined up, I started blocking out the level in Unreal using BSP brushes. With BSPs, I could easily set the exact size of each element, and I had the added benefit of being able to cut in the doors and windows.

Tip: Dragging on the middle of the pivot with MMB allows you to reposition the pivot temporarily. You can alternatively click with MMB while holding Alt+V to snap it to a selected vertex.

After the blockout was done, I exported the building out of Unreal to Maya and started breaking the exported mesh up into modular chunks. To accommodate the use of tiling materials with a medium-poly approach, I tried to keep my meshes clean and think about how I wanted to texture them down the line. At this point, I went a bit overboard with variations: I started creating pieces that weren't in the concept but seemed like they would be useful if I ever wanted to build new structures. As I was planning to stay true to the concept, I dropped these assets to keep my workload reasonable for the deadline I had set for myself. I recommend planning out the assets first, as it can save time down the line. By planning, you can focus on what is essential and not waste time on unnecessary items.


While in Maya, I also created blockouts for the smaller pieces. I like to work in one "master" file, so I have all the work laid out in front of me at the same time. To simplify iteration, I used Maya's Game Exporter, which I set up at the beginning of the project; it allowed me to quickly update my assets as long as the object names stayed consistent.
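As a rough illustration of that kind of name-driven export, a minimal Maya Python sketch might look like the following; the "SM_" prefix and the output folder are assumptions, not the actual Game Exporter settings used in the project.

import os
import maya.cmds as cmds

# Export every top-level group whose name starts with "SM_" to its own FBX,
# so re-exports only depend on consistent object names (a scripted stand-in
# for the Game Exporter preset described above).
EXPORT_DIR = "D:/TidalBasin/Export"  # hypothetical output folder

def export_modular_pieces(prefix="SM_"):
    cmds.loadPlugin("fbxmaya", quiet=True)
    for node in cmds.ls(assemblies=True):
        if not node.startswith(prefix):
            continue
        cmds.select(node, replace=True)
        fbx_path = os.path.join(EXPORT_DIR, node + ".fbx")
        # exportSelected writes each piece as a separate, re-importable FBX
        cmds.file(fbx_path, force=True, exportSelected=True,
                  type="FBX export", preserveReferences=True)
        print("Exported: " + fbx_path)

export_modular_pieces()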

Alchemist and Material Work

As the environment started to come together, I needed to see what my surfaces were going to look like. Since Substance Alchemist was a fairly new piece of software at the time, I wanted to give it a try and do a quick material pass with it. I thought it would be interesting to see how quickly I could iterate.

First, I looked through Substance Source and the Megascans library (which became free for Unreal Engine users around the same time) to see if I could find a few materials that would be a good start for what I had in mind. For most of my materials, I found something to begin with. To try out the "Bitmap to Material" option, I made a diamond plate material using a non-tiling texture from Textures.com.

With some quick adjustments, I was able to populate the scene within minutes, as my UVs were already set up for tileable materials. My next step was setting up two materials in Unreal that allowed me to vertex paint on the building and the ground plane (the two are different, as the ground was a landscape object with more layers). When these materials were ready, I jumped back into Alchemist to generate some variations. From the mud, I created a drier version, one with puddles, and a grassy one that helped me blend the foliage into the muddy ground.
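The blend materials themselves were built in Unreal's Material Editor; purely as an illustration of the same idea, a minimal two-layer vertex-paint blend can be sketched through the editor's Python API as below (asset names and texture paths are placeholders, and the real material does more than a single base colour lerp).

import unreal

# Build a material that lerps two base colour textures by the painted red
# vertex channel — a bare-bones version of a vertex-paint blend material.
lib = unreal.MaterialEditingLibrary
tools = unreal.AssetToolsHelpers.get_asset_tools()

mat = tools.create_asset("M_Mud_VertexBlend", "/Game/Materials",
                         unreal.Material, unreal.MaterialFactoryNew())

mud = lib.create_material_expression(mat, unreal.MaterialExpressionTextureSample, -600, -200)
grass = lib.create_material_expression(mat, unreal.MaterialExpressionTextureSample, -600, 200)
mud.set_editor_property("texture", unreal.load_asset("/Game/Textures/T_Mud_BaseColor"))
grass.set_editor_property("texture", unreal.load_asset("/Game/Textures/T_Grass_BaseColor"))

vtx = lib.create_material_expression(mat, unreal.MaterialExpressionVertexColor, -600, 0)
lerp = lib.create_material_expression(mat, unreal.MaterialExpressionLinearInterpolate, -300, 0)

# Blend the two layers by the red vertex colour channel
lib.connect_material_expressions(mud, "RGB", lerp, "A")
lib.connect_material_expressions(grass, "RGB", lerp, "B")
lib.connect_material_expressions(vtx, "R", lerp, "Alpha")
lib.connect_material_property(lerp, "", unreal.MaterialProperty.MP_BASE_COLOR)

lib.recompile_material(mat)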

Creating the variations took some back and forth to get the values right, but Alchemist made them simple to set up and easy to organise and maintain.

Tire Tracks

To create the tire tracks, my initial idea was to use the landscape tools and either sculpt them or stamp them onto the surface with heightmaps. These proved ineffective for many reasons, so I ended up using Jack McKelvie's spline blueprint technique (see below). As I had already used spline blueprints in the scene for the HESCO barrier wall, this was super easy to set up. The mesh component needed some experimentation, as I used a more complex profile which had some issues when I tried to tessellate it. After tweaking the shape of the asset and the material, I managed to get the result I was shooting for. This is a great way to add directional detail to your scene. It's a good idea to look out for stretching, as the splines deform the geometry.
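The technique itself lives in a Blueprint construction script that stretches SplineMeshComponents along a spline. Just to make the per-segment maths concrete, the sketch below (an assumption, not the actual Blueprint) walks a selected spline actor at a fixed step and prints the start/end positions and tangents that each segment's SetStartAndEnd call would receive.

import unreal

SEGMENT_LENGTH = 200.0  # assumed segment length in Unreal units

# Sample the first selected actor's spline at fixed intervals and print the
# values a SplineMeshComponent segment would be fed via set_start_and_end().
actor = unreal.EditorLevelLibrary.get_selected_level_actors()[0]
spline = actor.get_component_by_class(unreal.SplineComponent)
space = unreal.SplineCoordinateSpace.LOCAL

length = spline.get_spline_length()
dist = 0.0
while dist < length:
    end = min(dist + SEGMENT_LENGTH, length)
    start_pos = spline.get_location_at_distance_along_spline(dist, space)
    start_tan = spline.get_tangent_at_distance_along_spline(dist, space)
    end_pos = spline.get_location_at_distance_along_spline(end, space)
    end_tan = spline.get_tangent_at_distance_along_spline(end, space)
    print(dist, start_pos, start_tan, end_pos, end_tan)
    dist = end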

Supply Boxes Generation in Houdini

I had some experience with Houdini before I started this project. The concept of creating assets through a procedural setup that can be altered for quick variations has always interested me. When I started the course, Ryan briefly talked about Houdini, which made me want to use it in the project. As I was creating the blockout meshes, I realised that I could make a "generator", as most of the supply boxes had a similar shape with just different sizes. It took me about half a day to come up with a proof of concept that worked. There were a few things I wasn't happy about, so I made some changes to the initial graph.

The main reason to use Houdini was to cut down on variation and iteration time. I ended up creating two outputs for the graph: one was the high-poly model with colour IDs, and the other produced the game-ready mesh with procedural UVs. To build on this further, I created a smart material in Substance Painter. This enabled me to rapidly create boxes with variable sizes and detail levels, bake them in Painter, apply the smart material, adjust the texture details, add some stickers and decals, and export them into Unreal. The whole process takes about 10-15 minutes per box.
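The actual generator lives in the Houdini graph described above; as a hedged, minimal sketch of the same idea, a few lines of Houdini Python can already build a parametric box, bevel it, and write it out as FBX (node choices, parameter values, and the output path here are illustrative).

import hou

def build_supply_box(width, height, depth, out_path):
    # A parametric box whose size drives each variation, lightly bevelled
    # and exported as a game-ready FBX.
    obj = hou.node("/obj")
    geo = obj.createNode("geo", "supply_box", run_init_scripts=False)

    box = geo.createNode("box")
    box.parmTuple("size").set((width, height, depth))

    bevel = geo.createNode("polybevel")
    bevel.setInput(0, box)
    bevel.parm("offset").set(0.02)  # soften the hard edges a touch

    rop = geo.createNode("rop_fbx")
    rop.setInput(0, bevel)
    rop.parm("sopoutput").set(out_path)
    rop.parm("execute").pressButton()  # write the mesh to disk

build_supply_box(1.0, 0.6, 0.8, "$HIP/export/SM_SupplyBox_A.fbx")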

"Outsourced" Assets

As I mentioned, the Megascans library had just become free for all Unreal users. We had a long discussion with Ryan Kingslien about using these assets. I had always felt that, as an artist, I had to make everything in the scene myself. He argued that in a studio environment, production is shared between teams, so the use of "outsourced" assets shouldn't be an issue as long as they are appropriately credited. Also, now that Quixel has made all of its content available to everyone, the level of competition is going to skyrocket. All in all, it is just a tool in an artist's arsenal, similar to any other piece of software.

With this in mind, I wanted to keep the focus of the scene on the building and the ground plane, so I spent most of my time working on the assets in these areas. This meant that I had a section in the background that needed some additional detail. I had some Dekogon assets in my library, so I adapted some of those along with a few Megascans assets to fill in the empty areas. These assets were a great start, but here and there I needed to adjust them to match the colour palette and tone of the scene. This was easy to do in Substance Painter using some masking and HSL adjustments.

For the Megascans assets, I would recommend thinking about how each object will be used, as they can be quite high-resolution. Chances are the camera never gets close enough to the asset to tell the difference between LOD0 and LOD2 or LOD3 in the final renders.
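One quick way to keep an eye on that (just a small editor-scripting sketch, with the "/Game/Megascans" folder as a placeholder for wherever the assets land) is to list how many LODs each imported static mesh carries:

import unreal

# Print the LOD count of every static mesh under a folder so heavy LOD
# chains are easy to spot and trim.
for asset_path in unreal.EditorAssetLibrary.list_assets("/Game/Megascans", recursive=True):
    asset = unreal.EditorAssetLibrary.load_asset(asset_path)
    if isinstance(asset, unreal.StaticMesh):
        lods = unreal.EditorStaticMeshLibrary.get_lod_count(asset)
        print(asset.get_name(), "LODs:", lods)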

The Sunset

At the beginning of the project, I was unsure whether I wanted to use static or dynamic lighting, but as I had never used dynamic lighting before, I thought this would be the perfect opportunity to try it out. Since I wanted to keep the space realistic and possibly playable, I didn't want to over-complicate the lighting by adding fake lights such as rim lights. Even though the scene uses a simple lighting setup, it went through multiple iterations until I found the exact values I was going for.

At first, I used the default lighting setup with the "BP_Sky_Sphere", a directional light, a skylight, and a Post Process Volume with "Min" and "Max Brightness" set to 1. While the default sky sphere is a great start, I couldn't achieve the result that I was going for, so I created my own using the "EditorSphere" that can be found in the Engine Content folder, scaled up to 10 000. For the sky texture, I used an HDRI from HDRI Haven and mirrored the top half down vertically so that only the sky remained, as I didn't want the colours from the bottom of the image to appear in the lighting. This was a tip from Tilmann Milde's (aka 51Daedalus) Unreal 4 Lighting Academy. My sky material setup was a bit complicated at first, which I later learned can be simplified with the UVtoLongLat node.

For the skylight, I matched the "Sky Distance Threshold" to the scale of the sky sphere, dialled in the intensity scale, and set the "Lower Hemisphere Color" to a desaturated brown to match the colour of the mud.

At this point, I added the directional light back in, rotated it to match the custom sky, changed it to use Temperature, and set it to 4200 Kelvin. This started to look like a sunset, but it felt a bit flat, which was solved by adding an Exponential Height Fog with Volumetric Fog turned on and by switching on light shafts on the directional light.
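The lighting itself was dialled in by hand in the editor; for reference only, the settings mentioned above could be applied to already-placed lights with an editor-scripting sketch like the one below (the numeric values other than the 4200 K temperature, and the exact "desaturated brown", are assumptions).

import unreal

# Apply the sunset values described above to the lights already in the level.
for actor in unreal.EditorLevelLibrary.get_all_level_actors():
    if isinstance(actor, unreal.DirectionalLight):
        sun = actor.get_component_by_class(unreal.DirectionalLightComponent)
        sun.set_editor_property("use_temperature", True)
        sun.set_editor_property("temperature", 4200.0)  # warm sunset colour
        sun.set_editor_property("enable_light_shaft_bloom", True)
    elif isinstance(actor, unreal.SkyLight):
        sky = actor.get_component_by_class(unreal.SkyLightComponent)
        sky.set_editor_property("sky_distance_threshold", 10000.0)  # match the sky sphere scale
        sky.set_editor_property("lower_hemisphere_color",
                                unreal.LinearColor(0.18, 0.14, 0.10, 1.0))  # desaturated brown (placeholder)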

I tried to get as close as I could with lighting alone, without relying heavily on Post Process. But to get a more cinematic look, I made a LUT with some curve adjustments and small colour tweaks in the shadows.

Conclusion

The biggest challenges I had to face were managing my time and not getting lost in small details.

Every now and then, I got lost in minor details that made no difference to the final result. There are always going to be small details that the artist is aware of but that won't show up in the final renders unless the camera is right next to them, and it is easy to waste a significant amount of time working on them. My advice for portfolio work is to focus on the big picture and on making the scene look perfect in the renders. Having the mentors at Vertex School check the work every few days and redirect my attention to areas of the scene that needed more work was super helpful in keeping me on schedule. On top of that, Henry Kelly, who was the environment art mentor, was a great source of knowledge and helped a lot in overcoming technical difficulties throughout the project.

Bence Blazer, Environment Artist

Interview conducted by Ellie Harisova
