Game Dev Under Limits: Tools, Solutions & Sacrifices

Artemiy Bulgakov shared an article about video game environment production under limited resources and time, the setup of helpful tools, performance profiling, and more.

In the article, Artemiy uses ShellShock, a game under development, as an example to describe how a more technical approach to video game environments can benefit the production pipeline and lighten the artists' workload.


My name is Artemiy Bulgakov and I am a graduating senior at Ringling College specializing in pipeline tools, procedural content, and performance profiling. Originally from Russia, I've had the chance to live in a variety of places and speak multiple languages, which taught me to find solutions for a wide variety of situations on both the professional and personal level.

I've always had a passion for solving technical problems through resourcefulness and non-standard approaches. In this article, we are going to look at how a handful of creative solutions can help develop a project with just one person, limited resources, and an extremely tight schedule. This breakdown will go over the steps and setup for some of the tools while discussing the hurdles and pitfalls that had to be addressed for the project to run responsively while sacrificing as little artistic merit as possible. It is also important to remember that a tool without any assets would be of little use: a large chunk of development time was spent on asset creation, which will not be covered in this article.

If you have any questions, suggestions, or feedback, I can be found via LinkedIn or my Website. I am always open to job offers and work opportunities, so don't hesitate to contact me. With that in mind, it's time to jump right in and get lost in the maze of meandering trenches and the haunting performance costs of dynamic objects.


The Backbone

In order to start creating any video game environment, it is important to understand exactly what drives its operating efficiency and whether that may be reached procedurally. Not every environment benefits from a procedural approach. Repetitive, similar static hallways and buildings, as well as large biomes, are typical examples of locations that can first be universalized and then randomized. From something as simple as spawning a handful of shrubs to creating complex landscapes, procedural content can reduce stress on the art team and create sprawling environments with fewer artists or a smaller selection of assets. It is important to remember, however, that no matter how intricate the tools and systems are, there can be little environmental storytelling done through complete randomness. Therefore, it is important to equip any procedural component with an override tool that allows users to assert full control over the process and manually tweak the generated product. In the case of the trench spline, it was the ability to detach single parts of the trench structure and replace them with specific assets, adding storytelling and a unique setting to the created scenes.

The majority of the trench setup is done inside the Construction Script, which was inspired by the powerline spline and subway spline videos. For the spline to work correctly, it should be able to pull static meshes from an array of actors, but first we need to let the Blueprint know whether an actor assigned to the array is actually going to be splined or simply placed along the spline as a singular object. This is controlled by a "Separate" function which, when enabled, prevents the asset from being stretched along the length of the spline and places it on the spline instead. The script cycles through every asset in the array with a ForEachLoop and then checks whether the randomizer has been enabled; if so, it regenerates that section of the spline.

This approach acts as a failsafe which lets users reassemble the spline in case the current asset placement doesn't work well. Actor placement is then defined and retrieved from an array, with the addition of a small rotational offset that makes objects appear more hand-placed. All objects can be detached from the spline when manual tweaking or the addition of unique scenarios is necessary. Armed with a working tool, we can then address the trench filler.
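The placement loop described above can be sketched in Python (the actual tool is a Blueprint Construction Script, so the function, asset fields, and offset range below are illustrative assumptions, not the author's exact setup):

```python
import random

def populate_spline(spline_length, assets, seed=None):
    """Sketch of the Construction Script's placement loop: walk along
    the spline, pulling assets from an array, honoring the "Separate"
    flag and adding a small rotational offset to each placement."""
    rng = random.Random(seed)  # re-seeding models the "randomizer" re-roll
    placements, distance = [], 0.0
    while distance < spline_length:
        asset = rng.choice(assets)
        placements.append({
            "name": asset["name"],
            "distance": distance,
            # "Separate" assets sit on the spline as single objects;
            # everything else is stretched along its section.
            "stretched": not asset["separate"],
            # Small rotational offset so objects read as hand-placed.
            "yaw_offset": rng.uniform(-5.0, 5.0),
        })
        distance += asset["length"]
    return placements
```

Re-running with a different seed mirrors the failsafe: the whole spline reassembles with a new asset arrangement, while individual entries can still be pulled out and replaced by hand.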

A Tale of Buttons and Bows

To create an appropriate sense of dread, the trenches were to be filled with the deceased. For each of the dead to look unique every time it was replicated in the level, a Blueprint was made to cycle through gear arrays. For this approach to work, soldiers had to be stripped down to the base mesh, and every asset had to have its pivot adjusted. This meant some gear might be incompatible with other pieces due to size, pivots, or the soldier's actual rank. To ensure correct application, the Blueprint first selects from the list of gear that can pose restrictions.
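The restriction-aware selection can be sketched as follows (a Python stand-in for the gear-cycling Blueprint; all field names, slots, and exclusion rules here are hypothetical):

```python
import random

def pick_gear(slots, gear_pool, rank, seed=None):
    """Sketch of the gear-cycling Blueprint: fill each slot only with
    items compatible with the soldier's rank and with gear already
    chosen (size/pivot conflicts)."""
    rng = random.Random(seed)
    chosen = []
    for slot in slots:
        candidates = [g for g in gear_pool
                      if g["slot"] == slot
                      and rank in g["ranks"]
                      # Respect incompatibilities declared by pieces
                      # already on the soldier.
                      and not any(g["name"] in c["excludes"] for c in chosen)]
        if candidates:
            chosen.append(rng.choice(candidates))
    return chosen
```

Selecting the restriction-carrying pieces first, as the article describes, is what makes the later picks simple: every subsequent candidate only has to be checked against gear that is already on the soldier.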

All steps taken above do not take into account any animation that might be added to the character. If left as is, the armor pieces and ammo pouches will simply float. To avoid this, each piece of gear should be imported with its pivot already adjusted to spawn exactly where needed and then attached to a respective socket inside the Skeletal Mesh Blueprint. Once socketed, the gear will follow each character's motion. After that, all that's left is to populate the level while tweaking any individual mishaps or visual inconsistencies.

Although procedural assets greatly facilitate environment creation, they come at a significant price. Nothing done inside the Construction Script can be baked, and as a result everything remains a movable asset. That is why great care must be taken to optimize the level as much as possible. As indicated before, each and every procedural tool requires additional tweaking and therefore override options that allow objects to be singled out and deleted in order to be replaced with more appropriate ones.

However, if care is taken and enough unique scenarios are created, it is possible to produce procedural environments that don't look generated while saving time for the artists. Once the environmental heavy lifting is complete, it is time to optimize the level and remove the GPU bottlenecks associated with construction scripts.

Performance Profiling in UE4

It all starts with an artistic vision. Beautiful screenshots, massive textures, and incredibly detailed assets. Then reality kicks in: you look at the GPU profiler as the numbers crawl higher, and an optimization pass becomes more and more necessary. When surrounded by environment artists, one might forget that tight performance control is even a thing. It is equally hard to remember that far from everyone has the luxury of benchmarking their project on a GTX 1080 Ti. As a starting point, we are going to take a project that has been approached strictly from an artistic standpoint so far and optimize it to the best of our ability with as little loss of artistic merit as possible.

The main variable to track here is how long it takes the GPU to render a frame. The goal by the end of profiling was to see a drastic improvement in performance on the same hardware, as well as to run the project on a machine with far more realistic end-user specs. After a heavy profiling session, the project was to be benchmarked on a mobile NVIDIA Quadro M620, but first one needed to see the difference on its native machine.
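For reference, frame time and GPU cost in UE4 are usually inspected with the engine's built-in console commands (standard engine functionality, not something specific to this project):

```
stat unit      -- frame, game, draw, and GPU times per frame
stat gpu       -- per-pass GPU timings (UE 4.14+)
ProfileGPU     -- capture a detailed single-frame GPU profile
```

`stat unit` is the quickest way to see whether the bottleneck is the GPU or one of the CPU threads before diving into a full `ProfileGPU` capture.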

While getting dangerously close to the limit of the texture streaming pool, the choice was to address the less important first section. To begin with, I locked most textures to a much lower LOD level. Yet, as I was going through my grunge decal textures, I realized that some of them were not power-of-two. A quick trip to Photoshop and some cropping dramatically decreased the texture pool.
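Non-power-of-two textures can't generate full mip chains and therefore can't stream or downsample properly, which is why they bloat the pool. Hunting them down is a trivial check (a Python sketch; the texture list here is illustrative):

```python
def is_power_of_two(n):
    """True if n is a positive power of two (256, 512, 1024, ...)."""
    return n > 0 and (n & (n - 1)) == 0

def flag_npot_textures(textures):
    """Return names of textures whose width or height is not a power
    of two -- candidates for cropping or resizing in Photoshop."""
    return [name for name, (w, h) in textures.items()
            if not (is_power_of_two(w) and is_power_of_two(h))]
```

Note that the dimensions don't have to match each other (512x1024 is fine); each side just has to be a power of two on its own.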


Step 2 was addressing the procedural nature of the level. Although it allowed for quick and fairly seamless creation of the level, it was all being done through the Construction Script, and due to its nature static lighting couldn't be baked into it. To address the issue, I started combining all spline sections and converting them into static meshes. A similar action had to be taken with the procedurally generated dead soldiers: all Blueprints were merged and converted.

After converting all splines into static meshes, the next step was making sure that all the lighting in the level was static and could be baked. While fiddling with light baking, I stumbled upon an article by Leszek Godlewski discussing optimization for The Vanishing of Ethan Carter.

This made me realize that as long as all my lighting is baked, I do not need Unreal to use Translucent Lighting Volumes or Ambient Occlusion, so I added the following lines to the Engine.ini file.
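The article's screenshot of those lines isn't reproduced here; given the context (baked lighting, no need for SSAO), they were most likely the standard UE4 console variables below, though the exact values are an assumption on my part:

```ini
[SystemSettings]
; Skip translucent lighting volumes -- all lighting is baked.
r.TranslucentLightingVolume=0
; Disable screen-space ambient occlusion entirely.
r.AmbientOcclusionLevels=0
```

Both cvars can also be toggled from the in-game console first, to verify the visual impact before committing them to the ini.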

The very next thing was to aggressively crank down on LODs. For the most part, Unreal's built-in Simplygon system works reasonably well with certain types of objects; however, it doesn't do a very good job with screen size and distance thresholds. To avoid setting these up by hand for every object or converted soldier Blueprint, I ventured once more into the Engine.ini to set up a new custom LOD group.
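The article doesn't show the group itself, but custom static mesh LOD groups in UE4 follow the `[StaticMeshLODSettings]` entries found in BaseEngine.ini. A sketch of what such a group could look like (the group name and values are hypothetical, and the exact keys vary between engine versions):

```ini
[StaticMeshLODSettings]
; Hypothetical group for trench props and converted soldier meshes.
TrenchProps=(NumLODs=4,LightMapResolution=32,PixelError=10,Name=LOCTEXT("TrenchPropsLOD","Trench Props"))
```

Once defined, the group shows up in every static mesh's LOD Group dropdown, so assigning it is a one-click operation per asset rather than a per-LOD setup.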

After the LOD groups were set up and the process simplified it was just a question of assigning a proper group to each object and selecting simplified materials for each group.

As a final broad-strokes step, I set all lightmaps to an appropriate size and then tied them to the LOD groups of each respective object.

Once I dropped the GPU render time below 10 ms, it was time to get more creative and specific. Using the "FreezeRendering" command, I was able to spot objects that were still being rendered through tessellation cracks or behind LOD silhouettes. All trenches were laid out the same way they were during WWI: meandering labyrinths with twists and turns to avoid clear lines of sight and defend against trench raids. I used this to my advantage and set up culling volumes around every corner where the player would be entering a new area.

Those volumes forcefully cull out unnecessarily rendered meshes, reducing the number of draw calls and triangles on the screen. To take it a step further, I started merging similar meshes that appeared in clusters or stacks to reduce draw calls even more.

Past this point, I decided to focus on shader complexity and the number of instructions versus their visual impact and importance. One of the first offenders was the material used for the snow particle effect. The snow particle system spawns 1,500 GPU sprites and is seen in every frame of the game. By removing some of the more fanciful effects and nodes, I reduced the material from 85 instructions to 65, while also disabling fogging and switching the lighting mode to "VolumetricPerVertexDirectional". This reduced the Vertex Shader from 113 instructions to 52 and Volumetric Lightmaps from 112 to 76 instructions. Lightmap Directionality was set to Off. The final result was indistinguishable from the original version.


The Road of Sacrifices

Unfortunately, nothing happens without a couple of sacrifices. By default, Unreal's rendering performs only a partial Z-pass, which doesn't allow for DBuffer decals. Enabling DBuffer decals was necessary since all lighting had to be baked down: unless DBuffer decals and a full Z-pass are enabled, each decal would need a movable light over it, which would contradict disabling Ambient Occlusion.

Enabling a full Z-pass was a viable solution as long as there would only be a few DBuffer decals in the frame. A full Z-pass can increase the fidelity of overdraw-heavy scenes, but at a performance cost. As long as there are few alphas on screen and overdraw is minimal, it is a justifiable cost.
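In UE4 this combination is typically switched on with the following console variables (standard engine settings rather than the article's verbatim config, so treat the exact lines as a likely reconstruction):

```ini
[SystemSettings]
; Full depth prepass for opaque and masked geometry.
r.EarlyZPass=2
r.EarlyZPassMovable=1
; Let decals write into the DBuffer so they pick up baked lighting.
r.DBuffer=1
```

The trade-off discussed above lives in `r.EarlyZPass`: value 2 pays for an extra geometry pass up front in exchange for cheaper shading where overdraw would otherwise be heavy.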


Before Optimization:

After Optimization:

Artemiy Bulgakov, Technical Artist

