Creating Stylized Combat VFX in The Legend of Zelda Style

Soenke Seidel discussed step-by-step how he made a stylized combat effect in Unity based on the swirling cloud dissolution FX in The Legend of Zelda: Breath of the Wild.

Introduction

My name is Soenke Seidel, but online you can find me as "Warby". I am a self-taught Technical Artist: I have no formal education in graphics, computer science, or anything like that, but since this industry values your portfolio over any kind of degree, this has never been an issue for me.

I started my career over 20 years ago as a Level Designer. When the Environment Art/Level Design split started to take hold of game studios around the world, I switched to environments because I always enjoyed the process of creating beautiful graphics more than setting up gameplay, even if the final results are just as satisfying.

But my real passion has always been performance optimization. With that in mind, it was inevitable that I also got to work a lot with FX and shaders because that is usually what most of your GPU cycles will be spent on. And when the term Technical Artist grew in prominence it became clear to me: "This is what I have always been... there just wasn't a word for it".

I have contributed to over 50 games with varying levels of involvement; the most famous of those projects are no doubt the two Kane and Lynch games by IO Interactive. I worked on a bunch of multiplayer maps and also got to do some FX and a lot of performance optimization. My most well-known indie effort is probably Codename: Gordon, a 2D Half-Life fan game made in Flash. Valve actually published it on Steam at a time when no third-party games were present on the platform. If you downloaded Steam and didn't have a Half-Life CD key, it was the only game you could play there.

I am currently working at Deep Silver Fishlabs on the sci-fi space-shooter Chorus and I always have at least one unannounced indie project cooking.

Stylized Combat VFX: Project Idea

Showing performance optimization work in one's portfolio is rather difficult, and getting a foot in the door with some bombastic visuals will always be a lot easier. All the FX artists I know and love have these "2 spheres fighting" demo scenes in their portfolios/FX reels, and I felt pressure to abide by this newly forming standard to keep myself employable. I also keep a document with tech art topics I want to dip my toes into. There is stuff in there like:

  • Try out Quixel Megascans
  • Make a flipbook with EmberGen real-time fluid simulation
  • Get better at stylized effects
  • Create a flow map shader from scratch without using out-of-the-box material functions
  • Learn how to make parallax mapping shaders 

And one entry was: "Reverse engineer Zelda: Breath of the Wild's twirl cloud dissolution. It looks like a hand-animated flipbook but it's not!"

So over a year ago when I started the Combat VFX project, my mission objective was to make a kickass stylized effect that would get me hired again in the future and that uses flow maps and the same technique Nintendo utilizes for its BotW clouds. Kill four birds with one rock... because time is a more valuable resource than money, don't ever let anyone tell you otherwise!

To gather reference, I recorded some Breath of the Wild footage from YouTube with Nvidia Shadowplay and ripped individual frames from the video with Avidemux so that I could easily cycle through the cloud dissolution frame by frame.

I also dove into Overwatch and recorded videos of all the combat FX for every hero. Why Overwatch? Well, apart from this game looking amazing and featuring some of the most beautiful color compositions in gaming, I think the whole "2 spheres fighting" thing started with an FX competition for League of Legends. Overwatch is the closest analog that I actually know and like! Top-down camera perspectives and indirect controls are not for me.

Analyzing Overwatch and The Legend of Zelda Effects

The Overwatch effects all turned out to be a lot simpler than I would have expected, no doubt to keep the framerate high and the gameplay readable.

The Zelda cloud swirl has a lot of stuff going on: there seem to be two different texture layers per sprite, the first revealing the second during its dissolution. What is interesting is that the first texture gets distorted with a flow map/UV noise before dissolving, and it all reads as one continuous motion. There is masterful timing in that effect. I have no doubt that whoever authored these flow maps and dissolve masks has a deep understanding of traditional 2D animation and could easily create an identical-looking flipbook texture. I assume they opted for a shader-based solution because the Wii U and Switch are heavily memory-constrained when it comes to a game this large.
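Since I never saw Nintendo's actual shader, here is only a minimal sketch of how such a two-layer flow map dissolve could be set up in Unity's built-in pipeline. All texture and parameter names are my own placeholders, and the normalized particle age is assumed to arrive in TEXCOORD0.z via the particle system's Custom Vertex Streams:

    Shader "Sketch/FlowMapDissolve"
    {
        Properties
        {
            _CloudA ("Cloud Layer A", 2D) = "white" {}
            _CloudB ("Cloud Layer B", 2D) = "white" {}
            _FlowMap ("Flow Map (RG)", 2D) = "gray" {}
            _DissolveMask ("Dissolve Mask", 2D) = "white" {}
            _FlowStrength ("Flow Strength", Float) = 0.2
        }
        SubShader
        {
            Tags { "Queue"="Transparent" "RenderType"="Transparent" }
            Blend SrcAlpha OneMinusSrcAlpha
            ZWrite Off
            Pass
            {
                CGPROGRAM
                #pragma vertex vert
                #pragma fragment frag
                #include "UnityCG.cginc"

                sampler2D _CloudA, _CloudB, _FlowMap, _DissolveMask;
                float _FlowStrength;

                struct appdata
                {
                    float4 vertex : POSITION;
                    float3 uv : TEXCOORD0; // xy = sprite UVs, z = normalized age (custom stream)
                    fixed4 color : COLOR;
                };

                struct v2f
                {
                    float4 pos : SV_POSITION;
                    float3 uv : TEXCOORD0;
                    fixed4 color : COLOR;
                };

                v2f vert (appdata v)
                {
                    v2f o;
                    o.pos = UnityObjectToClipPos(v.vertex);
                    o.uv = v.uv;
                    o.color = v.color;
                    return o;
                }

                fixed4 frag (v2f i) : SV_Target
                {
                    float age = saturate(i.uv.z);

                    // Remap the flow map from 0..1 to -1..1 and push harder as the
                    // particle ages, so the distortion reads as continuous motion.
                    float2 flow = (tex2D(_FlowMap, i.uv.xy).rg * 2 - 1) * _FlowStrength * age;

                    fixed4 layerA = tex2D(_CloudA, i.uv.xy + flow);
                    fixed4 layerB = tex2D(_CloudB, i.uv.xy);

                    // Reveal the second layer wherever the dissolve mask falls
                    // below the age threshold.
                    float mask = tex2D(_DissolveMask, i.uv.xy).r;
                    return lerp(layerB, layerA, step(age, mask)) * i.color;
                }
                ENDCG
            }
        }
    }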

What's quite unconventional in that effect is that the BotW clouds speed up their rotation with time. Usually, with explosions or smoke puffs, movement speed would be fast at the beginning and then drop almost to a standstill while rotation would either slow down over time or be constant. Nobody speeds them up over time... except Nintendo.
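To illustrate, here is a fragment-shader excerpt (reusing the inputs from the sketch above) that rotates the sprite UVs with an angular speed that increases over the particle's life; _BaseSpeed and _Accel are hypothetical parameters:

    // Rotate UVs around the sprite center.
    float2 RotateAroundCenter (float2 uv, float angle)
    {
        float s = sin(angle), c = cos(angle);
        uv -= 0.5;
        return float2(uv.x * c - uv.y * s, uv.x * s + uv.y * c) + 0.5;
    }

    // A constant rotation would be _BaseSpeed * age; the added age-squared
    // term makes the swirl accelerate over time instead of easing out.
    float angle = _BaseSpeed * age + _Accel * age * age;
    float2 rotatedUV = RotateAroundCenter(i.uv.xy, angle);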

I was happy to see in my frame-by-frame analysis that they used flow maps. That meant I could kill another bird from my to-do list.

I always hear people, including masters like Hayao Miyazaki, lamenting the fact that so many artists just use existing art as reference and everything is becoming homogeneous in the process. They say that one should go outside more often and use nature as reference... that said, I am not sure where one would go to see explosions in real life, especially of the sci-fi/fantasy/supernatural kind.

Creating Combat FX

The plan was to have a muzzle flash, a projectile, an impact, and an explosion because those are fundamental and omnipresent in any action game and thus would be applicable to pretty much any studio and project that I would want to work with. The little charge-up intro and the fiery aftermath were added later.

Every indie developer is always being advised to "make the toy first", meaning whatever makes your "thing" special, start with that and give it the most time and attention. For me, that was the explosion because it had the most interesting tech in it that I had never used before.

Next, I made the projectiles, which are just four different curved strips of polygons with an alpha-masked texture scrolling/panning through them. They spawn with a random rotation around their forward axis; I reduced the range of random rotation values a bit because sometimes the arches would clip into the ground. Two of the strips actually intersect like a cross. Thinking about it now, this pretty much disqualifies the effect from being used in a first-person game. If this had to be implemented in an actual game and not just a YouTube video, the distance that a projectile can cover would have to be handled very differently: either by scaling the meshes on the forward axis to cover the variable distance to the impact location, or by moving them forward in sync with the time it takes for the texture to reach the end of the strip. Here I am just enjoying the simplicity of a stationary, fixed-distance target. I suspect a lot of these "2-sphere FX" reel videos only work with a fixed distance, which is fine, but if somebody asked you in a job interview what you would need to change to make it work under real-world circumstances, you had better have some answers ready.
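For illustration, a fragment-shader excerpt of how such a panning alpha mask could look, assuming the strips are UV-mapped so that V runs along their length; _ProjectileTex and _PanSpeed are placeholder names:

    fixed4 frag (v2f i) : SV_Target
    {
        // Scroll the mask along the strip; with the wrap mode set to repeat,
        // the streak travels the arc continuously.
        float2 uv = i.uv.xy - float2(0, _Time.y * _PanSpeed);
        fixed4 col = tex2D(_ProjectileTex, uv);
        return col * i.color; // alpha-masked: transparent outside the streak
    }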

Next was the muzzle flash. I reused the flow map dissolve shader from the explosion cloud to give it a bit of a flipbook feel and have the hand-painted muzzle texture deform over a couple of frames. It's subtle but I think it works.

The missile barrage originates in one spot but immediately flies off in different directions, while the muzzle flash assumes one strict forward angle. This looked quite bizarre, so I solved the issue by copy-pasting a bunch of muzzle flashes around with angles that roughly corresponded to the possible random missile directions.

By default, Unity uses randomly generated seeds every time an effect plays. Because this project was for a video and I didn't want to leave anything to chance, I hand-picked the seeds for all the FX. Making the seeds of the muzzle flashes play along with the seed for the projectiles was quite time-consuming, but it paid off. The alternative would have been to capture hundreds of videos and pick the one with the best random seeds, but who has time for that? Again, it's not something you could do for an actual game.

I don't know why, but shells being ejected from a gun are something I usually do as part of the muzzle flash. Unity allows particles to collide with the world (or just a ground plane, which is MUCH cheaper) and lets stuff bounce around "semi-physically". The shells also have a smoke trail and a heat glow attached to them, which is something I had never done before. This was inspired by a guy named Gunship MK2 who pimps the old Half-Life games with cooler, more cinematic FX; I think it's called "MMod".

Next were the impacts, because the projectiles and the explosion didn't feel causally connected. I took a screenshot of the sphere and hand-painted the cracks over it to find a shape for the remains that I liked. Then I cut up the sphere in Maya, separated it into different chunks, and made a mesh to represent the little light beams coming through the cracks from the inside. Once again, this is something you probably can't get away with if your impact and damage feedback FX have to fit 100 different animated skeletal meshes. You will probably have to accept that the light glow will peek through in random places.

The actual sparks and spikes are done the exact same way as the muzzle flash, just with a different texture and different spawn angle. Always reuse what you have! Or you will never finish anything!

At this point, all the "must-have" features were done. I did some test builds of the Unity project and captured Shadowplay videos. There might be more professional ways to get videos out of Unity, but this works for me. I felt that the timing of the whole thing was off somehow. The video started, and before the viewer had any chance to orient themselves in the scene, hell was already breaking loose; at the same breakneck pace it started with, everything was over and VLC jumped to a black frame at the end of the video. I instantly knew I needed way more breathing room on both sides, but since this is an FX video, there just can't be 5-10 seconds of FX-unrelated nothingness.

This is where the introduction/charge-up and fiery aftermath idea was born. I once again looked at portfolios of a lot of other VFX artists to see how they visualize a charge-up and the universal consensus seems to be that abstract stuff needs to be flying towards the subject matter that is charging and punctuated by some sort of inverse shockwave circle. And who am I to argue with that? It definitely visually sells the idea that the subject is soaking something up like a sponge.

I also made a new shader that looks like it distorts the frame buffer, but what's actually going on is that a sprite uses the already existing pixels as a texture and then applies some UV noise or a flow map to distort them. This is a trick as old as programmable shaders. But one thing I still see even in high-end AAA games is people using the values (0 to 1) of a single-channel texture for the offset. It gets the job done most of the time, but your distortion is then limited to pushing all the pixels in a single direction, probably to the right (positive U). "Pushing" is really the wrong analogy to use here: the pixel to sample from is being offset, so it is more like a pull than a push. If you use two channels and remap the values in the shader to the -1 to 1 range, you can "pull" pixels up, down, left, right, and at any angle in-between, which looks way better and is so simple to do!
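A sketch of how such a distortion sprite could look in Unity's built-in pipeline, using a GrabPass to capture the already existing pixels; the property names are my own placeholders:

    Shader "Sketch/ScreenDistortion"
    {
        Properties
        {
            _FlowMap ("Distortion (RG)", 2D) = "gray" {}
            _Strength ("Strength", Float) = 0.05
        }
        SubShader
        {
            Tags { "Queue"="Transparent" }
            GrabPass { } // captures the screen behind the sprite into _GrabTexture
            Pass
            {
                CGPROGRAM
                #pragma vertex vert
                #pragma fragment frag
                #include "UnityCG.cginc"

                sampler2D _GrabTexture, _FlowMap;
                float _Strength;

                struct v2f
                {
                    float4 pos : SV_POSITION;
                    float2 uv : TEXCOORD0;
                    float4 grabPos : TEXCOORD1;
                };

                v2f vert (appdata_full v)
                {
                    v2f o;
                    o.pos = UnityObjectToClipPos(v.vertex);
                    o.uv = v.texcoord.xy;
                    o.grabPos = ComputeGrabScreenPos(o.pos);
                    return o;
                }

                fixed4 frag (v2f i) : SV_Target
                {
                    // Two channels remapped from 0..1 to -1..1: pixels can be
                    // pulled in any direction instead of only one.
                    float2 offset = (tex2D(_FlowMap, i.uv).rg * 2 - 1) * _Strength;
                    i.grabPos.xy += offset * i.grabPos.w;
                    return tex2Dproj(_GrabTexture, UNITY_PROJ_COORD(i.grabPos));
                }
                ENDCG
            }
        }
    }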

Now that I had this shader I felt the urge to use it everywhere. I added little shockwaves on the ground around the spheres and a large one triggered by the explosion.

The last missing piece was the fire and smoke. Usually, I would do stuff like this with a pre-rendered Maya fluids flipbook or pull an already existing texture out of the archive. But since the Zelda cloud had this strong "hand-painted traditional 2D look without flipbooks" theme going on, I wanted to double down on that. So I opted to resurrect a favorite trick of mine that I learned ages ago from analyzing Guacamelee!:

Make a texture that has three different masks in the three RGB channels: the bottom of your flame in one, and the left and right "sides" of the flame, authored as vertical strips, in the other two. Then use shader code to rotate both those sides inwards until they meet at the top. Make them pan/scroll at different speeds and... instant fire. The masks also have a gradient towards the inside; sampled with a different cut-off threshold, it adds a brighter inner flame.
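This is only my reading of the trick, not Guacamelee!'s actual shader: a fragment-shader excerpt where _FireMasks packs the three masks and every other name is a placeholder:

    // Rotate UVs around an arbitrary pivot.
    float2 Rotate (float2 uv, float2 pivot, float angle)
    {
        float s = sin(angle), c = cos(angle);
        uv -= pivot;
        return float2(uv.x * c - uv.y * s, uv.x * s + uv.y * c) + pivot;
    }

    fixed4 frag (v2f i) : SV_Target
    {
        float2 pivot = float2(0.5, 0.0); // bottom center of the flame

        // Rotate the two side strips inwards until they meet at the top,
        // then pan them vertically at different speeds.
        float2 uvL = Rotate(i.uv.xy, pivot,  _SideAngle) + float2(0, _Time.y * _SpeedL);
        float2 uvR = Rotate(i.uv.xy, pivot, -_SideAngle) + float2(0, _Time.y * _SpeedR);

        float baseMask = tex2D(_FireMasks, i.uv.xy).r;
        float sides = tex2D(_FireMasks, uvL).g * tex2D(_FireMasks, uvR).b;
        float flame = baseMask * sides;

        // The inward gradient in the masks means a second, tighter
        // threshold on the same data yields the brighter inner flame.
        float outer = step(_OuterCutoff, flame);
        float inner = step(_InnerCutoff, flame);

        fixed4 col = lerp(_OuterColor, _InnerColor, inner);
        col.a *= outer;
        return col;
    }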

If you are an FX/Shader/Tech artist, I am sure you are familiar with the concept of using UV noise to make stuff wobble by adding/scrolling random texture noise values over your UV coordinates before sampling from a texture. The downside is that this is something you have to do at the beginning of your shader network; you can't do it at the end when everything is combined. But you can achieve something similar by using a tessellated plane instead of a sprite and having the noise drive a vertex offset. That's how I created the wind here.
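A vertex-shader excerpt of that idea, assuming a tessellated plane and a placeholder _WindNoise texture (sampling a texture in the vertex stage needs #pragma target 3.0 or higher):

    v2f vert (appdata_full v)
    {
        v2f o;

        // Sample scrolling noise per vertex. A single quad has too few
        // vertices for this to read as wind, hence the tessellated plane.
        float2 noiseUV = v.texcoord.xy * _NoiseScale + _Time.y * _ScrollSpeed;
        float wind = tex2Dlod(_WindNoise, float4(noiseUV, 0, 0)).r * 2 - 1;

        // Offset sideways, stronger towards the top of the flame/smoke card.
        v.vertex.x += wind * _WindStrength * v.texcoord.y;

        o.pos = UnityObjectToClipPos(v.vertex);
        o.uv = v.texcoord.xy;
        return o;
    }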

Smoke is just three hand-painted layers panning upwards with the same vertex wind trick. I tried using Unity's ribbons, but just like Unreal's, they break apart immediately if there is any kind of randomness in the velocity of the individual control points. They're just not suitable for anything that is supposed to look like it's being affected by wind.

At this point, I started making a lot of small-scale polish changes to things that had bugged me the entire time but that I doubt anyone else would even notice. For example, the seed I picked for the explosion clouds left some small areas empty and didn't create enough volume, so I made a new emitter that spat out a single sprite just to fill in that tiny space on the screen. I really started playing directly to the camera here and introduced a bunch of little forced-perspective tweaks for better screen composition. The two spheres, for example, are not actually at the same height.

I was practically done, but I had released enough content onto the web to know that as soon as you hit the release button, you will notice some horrible mistakes. Instead, I opted for one more round of feedback and then sat on that feedback for a week to really stew on it. I sent the video to a bunch of my FX-savvy friends and people I know who actually work at Blizzard (some even on Overwatch). The feedback was clear: the effect looked pretty cool, but that explosion cloud was plain bad!

Necessary Sacrifices

The biggest challenge was reverse engineering a shader and a bunch of texture maps based on nothing but compressed video frames. Even though I am sure I am on the right track, without actually seeing the shader code and maps that were used I will never know if the delta in quality is me doing something fundamentally wrong or just me lacking traditional 2D animation skills.

All the improvement suggestions that people threw at me would have moved it even further away from my reference. What you do in a situation like this is "kill your babies"! The explosion cloud was my beloved centerpiece, and everything else was just there to support it, yet now it was the weakest element of the whole video. All I could do was minimize the damage... short of starting over from scratch.

Everyone said it was too obvious where one sprite ends and another one starts, and nobody "got" the speed-up of the rotation, even though the timing was frame-perfect identical to Zelda's. I scaled it all down, halved its lifetime, faded it out quicker, and made it all more transparent and softer around the edges. I de-emphasized the clouds as much as I could.

If You Want to Create Zelda Effects

If there were a tutorial out there on a Zelda effect and its source were anyone other than Nintendo, you might as well try to reverse engineer the effect yourself. We are all just shooting in the dark here; your approach is as valid as anyone else's.

For beginner FX/Shader/Tech artists I would highly recommend Simon Trümpler's blog. I am an occasional guest contributor there, appropriately enough with some Zelda analysis content (see Wind Waker). If you are looking for inspiration, I like to follow Harry Alisavakis's Technically Art Issues.

Soenke Seidel, Technical Artist

Interview conducted by Arti Sergeev
