In this detailed breakdown of the Other Spring project, Cecil Boey shares the methods and techniques he used to improve efficiency and enhance the visual richness of this grand, spectacular sci-fi scene.
Introduction
Hello! I am Cecil Boey, a game developer who specializes in Technical Art and CG content. Born in Guangzhou and raised in Shanghai, China, I am currently pursuing my graduate studies at USC in Los Angeles. My interest in 3D began a long time ago, as a child watching sci-fi films and early pre-rendered game cutscenes (which looked miles beyond the actual gameplay). I was fascinated by the intricacy of digital environments and felt a strong pull to turn the wild imagery in my head before I went to sleep into something tangible.
During my undergrad studies at Parsons in NYC, I officially dipped my foot into the bottomless ocean of game development. Developing games with Unity and Unreal nurtured my interest and skills in Technical Art, mainly because I really enjoy understanding what is behind the curtain. By utilizing a combination of software and workflows to realize an outcome, you also get to stay current on the latest and greatest tips and tricks.
From that point on, I have managed to work on projects spanning many mediums: an internship at NetEase Games, interactive VR experiences, my first indie game published on Steam, and the virtual worlds of “Other Spring”.
About the Other Spring Project
“Other Spring” is a theory-fiction project developed by the fantastic media art duo ZZYW (Zhen Zhen and Yang Wang). They approached me to be their Unreal developer just after I had finished a photorealistic environment piece inside UE5, having switched over from offline renderers like Arnold. I was immediately excited to be a part of this project for two major reasons.
The first is that I was able to use Unreal Engine 5.1 for its strengths: real-time capabilities and proceduralism for making massive scenes! Using Nanite and Lumen throughout the project showed their major benefits for small-scale productions like ours in achieving high-fidelity results!
The second is that the concept of “Other Spring” is unprecedented. It is a hypothesis imagining an alternate fictional world that challenges the dominant paradigms and possibilities of human-technology-environment interaction, and a true testament to exploring the cosmotechnics of East Asian heritage, as it recasts the ancient Chinese fable “The Peach Blossom Spring” in sci-fi. How could I say no to that? Eventually, “Other Spring” came out better than we expected, and the short film was shown around the globe in awesome places! Here is the link to the official 5th VH Award archive for the complete film.
Production
As the sole Unreal developer on the team, I had to wear multiple hats simultaneously. Thankfully, my workflow was supported by our amazing architect teammates: the base architecture design and kitbashed 3D model components were provided to me by Mehrdad Ranjbar. My role was to transform all these models into a complete scene that achieved the visual results the directors envisioned.
This is where UE5 shines the brightest: collaborating with team members and showcasing iterations. I will be going through the following workflow: organizing and assembling the assets, blocking out the scenes, surfacing/set dressing/VFX, and final renders.
There were two worlds that I needed to create:
- UNO — Universal Network Observer. A homogeneous data transparent interconnected society.
- OS — Other Spring. A hidden village in the mountain range isolated from UNO.
Assembly
To begin, I laid out all the models in an empty scene.
Then I assembled them into a single cubic structure that satisfied the design decisions. Since the UNO cubes in “Other Spring” are meant to be homogeneous, I only needed one complete cube to duplicate en masse with a similar look. Therefore, I created a Level Instance. Think of this as a prefab: a root cube that your other cubes reference. Any changes you make to the root cube, such as adding or removing static meshes, moving geometry, or changing surface details, will be reflected on all the other cubes.
I also color-coded each section with a basic material for better visual reference and to organize surfacing needs later in the process.
Houdini Magic
Additional meshes were needed to manifest a strong sense of interconnection between the UNO cubes, so I had to create more data cables between the structures. Poly-modeling each cable would not be ideal, because the UNO cube structures might be adjusted into different configurations. Therefore, I made a procedural tool that exports two clusters of cables from random points on the specified cube structures.
I exposed a few parameters for fast iterations, letting me customize the scatter points and the cables' XYZ curve strengths.
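To make the idea concrete, here is a minimal Python sketch of the same logic my Houdini tool follows, not the actual HDA: scatter random attachment points on two structures and connect them with curves whose midpoints are pushed by an XYZ strength parameter. All names and numbers here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(seed=7)

def cable_points(start, end, curve_strength, n=32):
    """Sample n points along a cable from start to end.

    A quadratic Bezier whose middle control point is pushed by
    curve_strength (XYZ) approximates the sag/drift of a hanging cable.
    """
    start, end = np.asarray(start, float), np.asarray(end, float)
    mid = (start + end) / 2.0 + np.asarray(curve_strength, float)
    t = np.linspace(0.0, 1.0, n)[:, None]
    return (1 - t) ** 2 * start + 2 * (1 - t) * t * mid + t ** 2 * end

def scatter_cables(src_bounds, dst_bounds, count, sag=(0, 0, -150)):
    """Connect random points on one structure to random points on another."""
    cables = []
    for _ in range(count):
        a = rng.uniform(*src_bounds)           # random point on source structure
        b = rng.uniform(*dst_bounds)           # random point on target structure
        jitter = rng.uniform(-30, 30, size=3)  # vary each cable's curve strength
        cables.append(cable_points(a, b, np.add(sag, jitter)))
    return cables

# Two cube structures, 400 units apart; export these point lists as curves.
cables = scatter_cables(([0, 0, 0], [100, 100, 100]),
                        ([500, 0, 0], [600, 100, 100]), count=12)
```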
Before and after implementing procedural cables:
I used the same process for assembling OS as well.
Blocking Out with Sequencer
Once the buildings were ready, I began blocking out the scene. Using proxy geometry to block out the scene is crucial for fast iterations, as these UNO models are extremely heavy in polycount. By placing one or two proper UNO cubes alongside proxy cubes, I retained a decent reference for lighting setups and camera compositions without sacrificing too much performance, and the rest could easily be replaced later.
Although our final goal was to render film sequences, where optimization is less of a concern even when dumping heavy assets into the scene, make sure to turn on Nanite for your static meshes!
I then used Sequencer extensively to organize the different shots. It is extremely useful, especially in my case, where I needed to track various shot-specific adjustments. For example, I organized the different lighting conditions, camera compositions, and other dynamic parameters (animations, material parameters, and post-processing settings) for the exterior and interior shots into correspondingly named folders to ensure the correct elements are displayed during the render.
During most interior shots, I disabled unnecessary and non-visible UNO cubes for better performance.
For sky lighting, I recommend that all Unreal developers get the one tool I have used on all of my UE5 projects: Ultra Dynamic Sky.
Although this is not a free plugin, it is definitely worth the price. You get instant, visually stunning sky lighting and weather conditions, and it comes with great customization capabilities to tailor it to your specific artistic and technical needs.
Surfacing and Adding Juice
Texturing, set dressing, and adding VFX are the processes I spent most of my time on, and they are also the parts I enjoyed the most! This is where I brought in tons of detail, dynamic elements that make the world feel alive, and adjustments that I could play with and tweak for instant, unexpected variations.
There are 3 major areas of workflow that I incorporated for both UNO and OS:
- Tileable textures for repetitive application across large areas.
- Proceduralism:
  a. Animated textures for additional dynamism.
  b. Procedural assets for articulated in-engine set dressing.
- Niagara particles and post-processing.
Tileable Textures
I created different sets of tileable textures in Substance 3D Designer. I took this approach mainly because the sheer number of 3D models across the two worlds made creating painted textures for each model unmanageable. On top of that, painted textures are much more destructive in terms of flexibility and variation, and we wanted fast iteration and mass application for this project.
For the building facades, I created variations of patterns and properties to manifest concrete, brutalist, and reflective variants. In short, I used a combination of the Tile Sampler, Tile Generator, and pattern nodes to create the final textures. For optimization, I combined all the texture maps into their corresponding color channels (a packing sketch follows the list):
- ARMA packing:
  - Ambient Occlusion - Red Channel
  - Roughness - Green Channel
  - Metallic - Blue Channel
  - Alpha/Greyscale Maps - Alpha Channel
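As an illustration of the packing itself, here is a small Python/Pillow sketch. It is not part of my actual Designer graph (I exported the packed maps straight from Substance 3D Designer), and the filenames are placeholders.

```python
from PIL import Image

def pack_arma(ao_path, rough_path, metal_path, alpha_path, out_path,
              size=(2048, 2048)):
    """Pack four greyscale maps into the R/G/B/A channels of one texture."""
    channels = []
    for path in (ao_path, rough_path, metal_path, alpha_path):
        channels.append(Image.open(path).convert("L").resize(size))
    Image.merge("RGBA", channels).save(out_path)

# Placeholder filenames; in the material, read each channel back out
# (R = AO, G = Roughness, B = Metallic, A = greyscale mask).
pack_arma("ao.png", "roughness.png", "metallic.png", "mask.png",
          "T_Concrete_ARMA.png")
```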
A quick tip: to non-destructively layer additional details and maps onto your texture, such as the additional lights on the concrete texture, I created a greyscale light map that can be added on top of the existing concrete texture. Using a Static Switch Parameter, I can then control whether the additional lights are shown. The greyscale nature of the map allows me to further articulate the coloring in the Material Editor and to apply really fast per-object variation via the SpeedTree Color Variation node.
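For reference, here is a rough numpy approximation of what that material logic computes; the function and parameter names are made up for illustration, and the per-object variation is reduced to a single scalar.

```python
import numpy as np

def layer_lights(base_rgb, light_mask, tint, lights_on=True, variation=1.0):
    """Additively layer a greyscale light map over a base texture.

    base_rgb:   H x W x 3 base color (the concrete texture)
    light_mask: H x W greyscale map of the additional lights
    tint:       RGB color applied in the material; variation scales it
                the way a per-object color-variation node would.
    lights_on mirrors the Static Switch Parameter.
    """
    if not lights_on:
        return base_rgb
    emissive = light_mask[..., None] * np.asarray(tint, float) * variation
    return np.clip(base_rgb + emissive, 0.0, 1.0)
```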
Vertex Painting
For OS, I implemented a slightly different approach. Since the base mountain model itself is a static mesh and not a landscape, I needed an alternative method to paint the massive mountainscape. As mentioned before, texture painting using third-party software such as Substance 3D Painter is not an efficient solution for this setting.
Luckily, Unreal Engine has a Mesh Paint workflow that allows me to apply vertex colors directly onto the static mesh. This workflow enables proceduralism within the engine itself. I created a shader that maps each vertex color channel to a tileable Megascans texture, with each channel having its own adjustable material properties.
This is a snippet of the node networks for the Blue vertex channel.
I assigned four different layers of textures to cover the visual needs of the mountain itself (a sketch of the blending logic follows the list):
- Base Material Layer (Without Vertex Painting) - Base mountain rock
- Red Layer - Grass and moss
- Green Layer - Chipped rock cliff
- Blue Layer - River bedrock
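Here is the sketch mentioned above: a numpy approximation of the blend the vertex paint shader performs. The real version samples tileable Megascans textures with their own adjustable properties per channel; everything here is simplified for illustration.

```python
import numpy as np

def blend_layers(base, red_tex, green_tex, blue_tex, vertex_color):
    """Blend four tileable layers by per-vertex RGB paint weights.

    Each *_tex is an H x W x 3 color sample; vertex_color is H x W x 3
    (the interpolated vertex colors). The base layer shows wherever no
    channel is painted; each channel lerps its own layer on top.
    """
    r = vertex_color[..., 0:1]  # grass and moss weight
    g = vertex_color[..., 1:2]  # chipped rock cliff weight
    b = vertex_color[..., 2:3]  # river bedrock weight
    out = base
    out = out * (1 - r) + red_tex * r
    out = out * (1 - g) + green_tex * g
    out = out * (1 - b) + blue_tex * b
    return out
```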
Animated Textures
Moving on to dynamically animated textures: these are shaders created directly in-engine using the Material Editor. I will swiftly showcase a few with their nodes, because it is mostly just playing around with texture maps, blending noises, and a little bit of math. These materials are designed to vary during play to enrich the environment, which also means they are extremely flexible to customize down to the smallest detail.
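Since the nodes really are mostly UV math, here is a hedged numpy sketch of the typical building blocks (a time-based panner plus a noise-driven UV warp); the sine field merely stands in for a noise texture, and none of this is my exact node network.

```python
import numpy as np

def animated_uv(uv, time, pan_speed=(0.05, 0.0), distortion=0.1):
    """Typical animated-material math: pan UVs over time, then warp them
    with a second, slower-moving noise lookup (approximated here with
    a sine field standing in for a noise texture)."""
    u = uv[..., 0] + time * pan_speed[0]
    v = uv[..., 1] + time * pan_speed[1]
    noise = np.sin((u + time * 0.02) * 12.0) * np.cos(v * 9.0)  # stand-in noise
    return np.stack([u + noise * distortion, v + noise * distortion], axis=-1)

# Sample your texture with the returned UVs each frame to get the motion.
```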
Proceduralism
This is the most fun part of surfacing. These procedural shaders act almost like a procedural mesh tool that reacts dimensionally to its surroundings. There are two main procedural shaders that I created:
- Auto-wrapping volumetric iridescent fog
- Auto-wrapping iridescent waterfall
Both materials use Distance Fields through the Distance To Nearest Surface node. A distance field, in short, generates a greyscale value from the proximity of one surface to another. With this value open to manipulation, I can displace vertex positions along with various other shader properties.
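As a rough illustration of the displacement idea (not the actual material graph), here is a numpy sketch where an array of per-vertex distances plays the role of the Distance To Nearest Surface output; max_dist and amplitude are invented parameters.

```python
import numpy as np

def displace_by_distance_field(positions, normals, distances,
                               max_dist=200.0, amplitude=50.0):
    """Push vertices along their normals based on distance to nearby geometry.

    distances: per-vertex distance to the nearest surface, 0 at a surface
    and growing with separation. Normalizing against max_dist gives a
    greyscale falloff, so vertices near other geometry are displaced the
    most, which is what makes the fog/water appear to wrap around it.
    """
    falloff = 1.0 - np.clip(distances / max_dist, 0.0, 1.0)  # 1 near surfaces
    return positions + normals * (falloff * amplitude)[..., None]
```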
Using this shader saved me a great deal of time compared with building complex simulations in a packaged DCC workflow; even in-engine tools like Niagara Fluids require repeated trial and error and make the scene very heavy. Now I can populate the scene by simply dragging and dropping a subdivided plane or box into the world, then making an abundance of adjustments with instantaneous visual feedback, while all these effects look as visually stunning as if they were simulated!
For the displacing volumetric fog effect, I referenced this amazing tutorial by Asher, which goes into great depth on creating volumetric shaders in UE and 3D noises using Houdini:
On top of that, I added various networks to allow articulations on volumetric properties, noise controls, and iridescent appearances.
For the wrapping waterfall, I used a very similar distance field approach. On top of that, as usual, I added various adjustments to create the digital waterfall effects.
These are the core setups for realizing wrapping surfaces and edge ripples.
This fantastic Unreal dev created an amazing tutorial showcasing this effect on falling sands:
This is one of the setups for the digital banding effects on the waterfall using UV noise and other adjustments.
Fog Cards
Fog cards are great for covering large or very specific areas of the composition, and it is much easier to define their shapes and silhouettes. I added Depth Fade to the card, which gives the 2D plane a depth occlusion effect in 3D space. However, since a flat plane is 2D, we need specific cloud textures as masks to create a realistic shape, rather than just noise in a rectangle. This creates a complication: if we want animated fog cards, we need realistically animated mask sequences for the motion. We could achieve this by rendering a series of looping simulations as masks, but that was too deep a rabbit hole for the amount of time we had.
I eventually developed a fast solution that creates flowing cloud motion from static cloud textures. The shader distorts the UVs of the mask textures and interpolates between different static cloud masks. This method creates a completely procedural, adjustable, flowing mask that gives a rather decent look of deforming clouds; if you are looking for finer fidelity, more input will be needed! I got my static mask textures from William Faucher's Easy Fog pack, but you can create your own masks in Photoshop with brushes or extract them from photos you take of the sky.
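Here is a simplified numpy sketch of that mask trick, assuming two static greyscale cloud masks and an optional offset field standing in for the distortion noise; the real shader does this per-pixel in the Material Editor, and all parameters below are illustrative.

```python
import numpy as np

def flowing_cloud_mask(mask_a, mask_b, time, period=10.0,
                       warp=None, warp_amt=4.0):
    """Fake cloud motion from static masks: warp the lookup and
    crossfade between two cloud textures on a loop.

    mask_a/mask_b: H x W greyscale cloud masks.
    warp: optional H x W x 2 offset field standing in for a noise texture.
    """
    h, w = mask_a.shape
    yy, xx = np.mgrid[0:h, 0:w]
    if warp is not None:
        xx = np.clip(xx + (warp[..., 0] * warp_amt).astype(int), 0, w - 1)
        yy = np.clip(yy + (warp[..., 1] * warp_amt).astype(int), 0, h - 1)
    a = mask_a[yy, xx]
    b = mask_b[yy, xx]
    t = 0.5 - 0.5 * np.cos(2.0 * np.pi * time / period)  # ping-pong blend, loops cleanly
    return a * (1.0 - t) + b * t
```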
Iridescence
For the iridescent property, I incorporated a combination of methods to achieve the final look. There are two main approaches to iridescent surfaces:
- Hue Shift
- Zucconi 6 gradient function
Both of these functions are well referenced in this article, where the author also shared the precise formula and nodes needed.
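For reference, here is a Python port of the Zucconi 6 approach, with the bump constants as published in Alan Zucconi's "Improving the Rainbow" articles (worth verifying against the original); in-engine, the same math lives in material nodes.

```python
import numpy as np

def bump3y(x, yoffset):
    """Clamped parabola; six of these approximate the R/G/B response."""
    y = 1.0 - x * x
    return np.clip(y - yoffset, 0.0, 1.0)

def spectral_zucconi6(wavelength):
    """Visible-spectrum RGB color for a wavelength in nm (400-700)."""
    x = np.clip((wavelength - 400.0) / 300.0, 0.0, 1.0)
    c1 = np.array([3.54585104, 2.93225262, 2.41593945])
    x1 = np.array([0.69549072, 0.49228336, 0.27699880])
    y1 = np.array([0.02312639, 0.15225084, 0.52607955])
    c2 = np.array([3.90307140, 3.21182957, 3.96587128])
    x2 = np.array([0.11748627, 0.86755042, 0.66077860])
    y2 = np.array([0.84897130, 0.88445281, 0.73949448])
    return bump3y(c1 * (x - x1), y1) + bump3y(c2 * (x - x2), y2)

# Drive the wavelength with a view-angle term in the material to get the
# angular color shift, e.g. spectral_zucconi6(400 + 300 * fresnel).
```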
In my case, each function reacts differently to the opaque, transparent, and volumetric surfaces needed for “Other Spring”. Therefore, I added extensive modifications to the coloring and noise for the desired results. Overall, I found that the following two adjustments provided fast and flexible outcomes:
- Adding color blend to the result of the functions. Although less physically accurate, it allows for custom spectrum gradient coloring. In my case, I tinted the gradient to cyan and purple.
- Blending both functions with noises.
VFX and Tricks
Lastly, for the underwater scene, I needed to create an environment depicting a large structure consuming a massive amount of energy for data interchange. I aimed to render something that resembled powerful momentum. To achieve it, I used Unreal's Niagara particle system to create an air bubble vortex. This setup primarily uses a vortex force and dynamic material parameters on the bubble shader to achieve the final outcome.
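The actual effect is assembled from Niagara modules, but as a sketch of what a vortex force does to the particles, here is a minimal CPU-side integration step; the parameter values are made up for illustration.

```python
import numpy as np

def step_vortex(positions, velocities, dt=1/60, axis=(0, 0, 1),
                strength=8.0, buoyancy=2.0):
    """One integration step of a simple vortex force, the same idea a
    Niagara vortex force applies: push velocity tangentially around an
    axis, plus an upward drift for the air bubbles.

    positions/velocities: N x 3 float arrays, modified in place.
    """
    axis = np.asarray(axis, float)
    radial = positions - (positions @ axis)[:, None] * axis  # vector from axis
    tangent = np.cross(axis, radial)                         # swirl direction
    speed = strength / (1e-3 + np.linalg.norm(radial, axis=1)[:, None])
    velocities += (tangent * speed + axis * buoyancy) * dt
    positions += velocities * dt
    return positions, velocities
```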
A cheat code for filmmakers: usually, an underwater effect is created with a post-processing material plus an overlay of volumetric fog noise for that foggy underwater look. However, this approach caused a translucency sorting problem, where none of the transparent materials would render. Due to time constraints, I looked for a solution in the nature of the film medium itself, where interchangeable shots can complete a coherent sequence, and went for a completely separate level built specifically for the underwater shots.
In this level, instead of creating a post-processing volume, I took inspiration from the underlying logic of the shader and used height fog with a tinted sun color that made the entire scene dark blue. Now all the materials function as usual, and I made sure the camera captured no elements above the water. I then passed both the above-water and underwater footage to our editor. Stitching the two shots together created a coherent, logical progression that naturally persuades the viewer it is underwater. Movie magic!
Quixel Library
Finally, to bump up the set dressing, utilize the Quixel library! It offers a range of very high-quality surfaces and photo-scanned models, especially foliage with Nanite support. Combined with the advantages Unreal offers, these assets can help bring in the details.
Rendering
Rendering for “Other Spring” was very interesting. Our work was displayed on the screen at Hyundai Vision Hall in Seoul, Korea, which has a massive, non-traditional aspect ratio. The final video output was therefore 14400 by 2160 pixels. In addition, we also needed a 16:9 variant in 4K for the judges.
The biggest problem with such a drastic difference in aspect ratios is how to display the content. My solution for preserving the core compositional elements was to design the cinematography in 16:9 first. Then we could use the flexibility of a virtual camera to produce results beyond the capabilities of existing camera hardware. I first stretched the image through the Squeeze Factor lens setting. This mirrors a real property of anamorphic lenses, which squeeze the incoming rays to fit a more cubic sensor; the image is then stretched back out in post-production to the correct, wider aspect ratio.
Inside Unreal, we can simply think of this option as expanding the horizontal field of view. We can expand the image further by increasing the Sensor Width. By doing this, I preserved all the vertical visuals without needing to crop, while also expanding the horizontal screen real estate to display more stunning visuals. It is like filming a movie in a mobile format for your phone screen and then expanding the shot for an IMAX theater. Crazy, right?
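The arithmetic behind that expansion is simple; here is a worked example in Python. How the 3.75x factor is split between the squeeze factor and the sensor width below is illustrative, not the project's exact values.

```python
# How much wider the wall format is than the 16:9 design frame.
uhd_w, uhd_h = 3840, 2160       # 16:9 4K judges' cut
wall_w, wall_h = 14400, 2160    # Hyundai Vision Hall screen

wall_aspect = wall_w / wall_h                  # ≈ 6.67:1
expand = (wall_w / wall_h) / (uhd_w / uhd_h)   # = 3.75x horizontal expansion

# Split the expansion between the two camera settings (illustrative split):
squeeze_factor = 1.5                     # anamorphic squeeze
sensor_scale = expand / squeeze_factor   # = 2.5x wider sensor
print(wall_aspect, expand, sensor_scale)
```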
Movie Render Queue
I used Movie Render Queue for my final renders. The settings I used are rather simple and straightforward, and there are definitely more intricate settings and tutorials you can find online! Here are some console commands I was using that gave me desirable results, shown in the picture below. For those who do not want to render at cinematic scalability when using MRQ, you can add the Game Overrides module and uncheck the Cinematic Quality Settings option.
If you want to render outputs at a massive resolution, MRQ has a module called High Resolution. In short, this module splits your final image into tiles and renders them individually. I did not dive too deep into the settings for this module, as it worked as-is for my renders. Feel free to play around with it, check out the official Unreal documentation, and let me know if you find anything interesting!
Summary
This is by far the most ambitious project I have participated in. I spent around 2-3 months from opening up Unreal Engine 5 to hitting the render button. Although I was the only developer operating inside UE5 (and I would be lying if I said I did not crunch hard, obsessed as I was with the tiniest details), I had great support from our small but amazing team, all the way from creative direction to post-production editing.
During the entire process, I was constantly learning and applying new skills to the work, which I find satisfying and fulfilling. Before I jumped onto the project, I was much less familiar with UE5 than with the rest of my toolkit, such as Unity. However, I was able to shuffle my skills around and effectively apply transferable knowledge to achieve the desired outcomes. From this experience, here is the single biggest tip I can offer those who want to get into developing with UE5 or any DCC software:
There is NO one right answer to any problem!
Be resourceful and imaginative in finding a solution. When you have something working for your needs, you may be onto something that others have neglected. It may not be the best solution, but it is certainly not wrong! Be bold, always keep the passion to attempt, and have fun!
If you would like to follow more of my works, here are a few links that you might find interesting:
Cecil Boey, Technical Environment Artist
Interview conducted by Gloria Levine