Gregory Silva talked about his experiments with Unity and Shader Forge, which helped him create amazing visual effects similar to those in The Legend of Zelda: Breath of the Wild.
Hi, I’m Gregory, a VFX Artist from Brazil. I am 26. I was born in Manaus, a city in the middle of the forest, literally. I have 3 years of experience with game development.
Since I was a child, I have had a great interest in video games; not only did I like playing them, but I also enjoyed creating things. I used to experiment with RPG Maker and Lunar Magic. At the same time, I was always the artist in my class. I enjoyed drawing a lot.
As time went by, I started college, studying Computer Science. I graduated and even earned a Master's degree in Information Retrieval and Machine Learning. At that point, I was working as a Systems Analyst, developing Android applications. That was when the company I was working for decided to try making games, and I was the guy who could both program and do the art.
With a very low budget and a small team, we made some very simple mobile games, and I started building my portfolio. That gave me the opportunity to start working at a big studio with over 40 other game developers. I started as a game programmer, but I was introduced to a bunch of new things, like 3D, game design, animation, and VFX.
I started learning from my new friends. First, I tried to learn 3D; it looked so awesome to be able to model my own characters and all. But I didn't want to throw away everything I had gained from my developer experience. That was when I found out what VFX was all about. It was the perfect combination of art and tech, where I could learn a bunch of new things.
When the main VFX artist at the studio announced he was leaving the company, I got the opportunity to start working on that officially. It has been a year since then, and in doing so, I could see that VFX is really my vocation.
The VFX in The Legend of Zelda
The Zelda games have never looked as great as they do now. The first gameplay footage from E3 came out right when I was starting to learn VFX. I remember that the explosions were so eye-catching, and those were the first things I tried to study. At that time, I had no idea how to do that.
When I got enough experience with particles and shaders, I took a second look at those explosions, and the way they looked cartoonish and stylized was something that I wanted to incorporate into my work. I also found myself thinking about the overall timing and the way the explosion stretches out from a core point but suddenly starts behaving like camera-facing billboards. I ended up trying to replicate those things as a challenge to myself.
Using Shader Forge
Writing shaders from scratch takes some time, and when you need to write a bunch of them alone, having a visual editor tool that compiles and tests the shader on the fly is really helpful.
Shader Forge was already the tool used in the studio I work for, so I started learning it. But what makes it such a good tool is the readability of the shader code after it is compiled.
Sometimes we have to get our hands dirty to do some fancy stuff. Often, we need to actually write Cg code to achieve certain effects, and having clean code come out of the graph editor is key for that.
After watching some gameplay footage many times, I analyzed the shapes used in the textures of the original effect and tried to replicate them in Photoshop. They are basically circular shapes and star/cross shapes. Here is a comparison between the original effect and my particles:
Those particles are mostly simple standard alpha-blended textures, but I had to do something a little fancier for that energy arc with the bright lines inside. For that, I used two smoothsteps on the texture alpha, one with a slightly bigger min value than the other and a sharp falloff, then subtracted one from the other while shifting the min value using vertex color information from the particle system. These lines are added to the texture color at the end. The result is as follows:
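A minimal Python sketch of that double-smoothstep trick, outside any engine. The edge values here are invented for illustration; in the real shader they would come from material parameters, and the shift comes from the particle's vertex color:

```python
def smoothstep(edge0, edge1, x):
    # Standard GLSL/Cg smoothstep: clamped cubic Hermite interpolation.
    t = max(0.0, min(1.0, (x - edge0) / (edge1 - edge0)))
    return t * t * (3.0 - 2.0 * t)

def bright_lines(alpha, shift):
    """Subtract two sharp smoothsteps of the texture alpha.

    'shift' (e.g. from particle vertex color) moves both min edges,
    so the thin bright band slides across the alpha gradient over
    the particle's lifetime.
    """
    outer = smoothstep(0.30 + shift, 0.35 + shift, alpha)  # lower min edge
    inner = smoothstep(0.40 + shift, 0.45 + shift, alpha)  # slightly bigger min edge
    return outer - inner  # non-zero only in the narrow band between the two steps

# Mid-gradient alpha falls inside the band -> bright line
print(bright_lines(0.38, 0.0))  # 1.0
# Alpha well past both edges -> both steps saturate, the band vanishes
print(bright_lines(0.90, 0.0))  # 0.0
```

The sharp falloff comes from the two edges of each smoothstep being close together; the subtraction leaves only the thin strip between the two thresholds, which is then added on top of the texture color.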
The stripes of energy use another custom shader. They use a simple geometry in the shape of a curve, and I alternate the scale on different axes and the rotation to create variety. For the shader, I created a mask using the mesh's own UVs to erase the borders, then used a scrolling noise to erode the particle, again using smoothstep and vertex colors.
I found a simple solution for the shape of the beam. I use a sphere with scale animations, still using the particle system, positioned strategically to look like it's going towards the target. As for the positioning, since it was only for the video and not an actual game, doing everything by hand was doable. However, in an actual implementation of the effect, you can subtract the particle's position from the target's position to get a direction, and then use the distance as the scale on a given axis to do the correct positioning. The particle's rotation can also be set using the previously calculated direction.
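The in-game positioning math described above is just a direction and a length; a rough sketch (names are illustrative):

```python
import math

def beam_transform(particle_pos, target_pos):
    """Aim a stretched sphere from the particle toward the target.

    Returns the unit direction to orient the particle along, and the
    distance to use as its scale on the stretch axis.
    """
    # Direction: target position minus particle position.
    direction = [t - p for p, t in zip(particle_pos, target_pos)]
    distance = math.sqrt(sum(d * d for d in direction))
    unit_dir = [d / distance for d in direction]
    return unit_dir, distance

# Example: beam from the origin aimed at a target 3 units along x, 4 up
unit_dir, length = beam_transform((0, 0, 0), (3, 4, 0))
print(unit_dir)  # [0.6, 0.8, 0.0] -> sets the particle's rotation
print(length)    # 5.0 -> scale on the stretch axis
```

With the sphere scaled by `length` along its stretch axis and rotated to face `unit_dir`, it spans exactly the gap between emitter and target.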
Here is the scale animation of the beam:
To create that final erosion effect on the beam, I use a similar shader to the one used on the energy stripes, but as the sphere has a different UV layout from a plane, I had to use a Fresnel term multiplied by the noise texture to erode the opacity. Here is the breakdown:
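The Fresnel-driven erosion can be sketched like this in plain Python (the exponent and threshold are made up for the example; a real shader would likely soften the cut with a smoothstep rather than a hard step):

```python
def fresnel(normal_dot_view, power=2.0):
    """Classic approximate Fresnel term: 0 facing the camera, bright at grazing angles.

    normal_dot_view is dot(N, V) for a unit surface normal and view direction.
    """
    return (1.0 - abs(normal_dot_view)) ** power

def eroded_opacity(normal_dot_view, noise, threshold):
    # Multiply the Fresnel rim by the scrolling noise, then cut against a
    # threshold so the surface dissolves from the center outward.
    value = fresnel(normal_dot_view) * noise
    return 1.0 if value > threshold else 0.0

# Facing the camera head-on (dot = 1) the rim term is 0, so the surface erodes away
print(eroded_opacity(1.0, noise=0.8, threshold=0.1))  # 0.0
# At a grazing angle (dot near 0) the rim survives the threshold
print(eroded_opacity(0.1, noise=0.8, threshold=0.1))  # 1.0
```

Because the Fresnel term depends only on the view angle, not the UVs, this sidesteps the sphere's awkward UV layout entirely.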
I do the same trick for the beam deflection, but with a slightly different curve for the scales. I hold the sphere with a small scale in the direction of the target for longer and make it wider, to give a sense of being compressed by the shield bash; then I increase the scale quickly and make it thinner, to give a sense of acceleration.
The lens flare has a tricky shader. I spent some time on it, as it is one of the main parts of the Ancient Arrow effect and is just used at a smaller scale in this case. It is basically a mask made using a quad's UV layout, plus a scrolling caustics texture, again masked by the UVs but with different opacity values. The trick to making it appear in front of the opaque meshes is to use the negated view direction to offset the vertices toward the camera. It looks like it's in place, but it's actually coming towards you, avoiding clipping on the Guardian. Here is the lens flare isolated from the rest of the effect:
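The vertex offset at the heart of that trick is simple vector math; a sketch of the idea outside the shader (positions and the offset amount are invented for the example):

```python
def pull_toward_camera(vertex_pos, camera_pos, offset):
    """Offset a vertex along the negated view direction (toward the camera).

    The quad still *looks* like it sits at its original position on screen,
    but it is physically closer to the camera, so opaque meshes between the
    two no longer clip it.
    """
    # View direction runs from camera to vertex; negating it moves the
    # vertex back toward the camera.
    view_dir = [v - c for v, c in zip(vertex_pos, camera_pos)]
    length = sum(d * d for d in view_dir) ** 0.5
    unit = [d / length for d in view_dir]
    return [v - u * offset for v, u in zip(vertex_pos, unit)]

# A flare quad 10 units in front of the camera, pulled 2 units closer
print(pull_toward_camera([0, 0, 10], [0, 0, 0], 2.0))  # [0.0, 0.0, 8.0]
```

Since the movement is exactly along the line of sight, the flare's on-screen position and apparent placement are unchanged; only its depth changes.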
And here is the atlas for the Guardian Beam effect:
Most of the explosion particles are simple textures with alpha erosion using scale animations. There are five stages to the explosion: the spiky ball that grows; the shockwave, which consists of parts of a circular mesh facing the camera and scaling fast; the fragments or debris, which give the sense of punch as fast particles are ejected from the bright ball at the beginning; the heated smoke; and the dissipation of the smoke. The first three stages happen really fast, whereas the last two drag on longer, creating an ease-out sensation. Here is a comparison between the original explosion and mine, showing the stages:
There are two tricks for the smoke, one with the emission and the other with the shaders. Somehow, in the original effect, the smoke stretches away from the center and then starts behaving as a billboard. My approach to replicating this was using stretched billboards together with regular billboards. The stretched billboards must not be emitted in the direction of the camera, so they won't look as thin as paper. So, I use two half-torus meshes, kept always facing the camera via script, as emitters:
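That per-frame facing script boils down to building a rotation whose forward axis points at the camera. A rough, engine-agnostic sketch of the math (the function name and positions are illustrative):

```python
def face_camera(emitter_pos, camera_pos, world_up=(0.0, 1.0, 0.0)):
    """Build an orthonormal basis whose forward axis points at the camera.

    In an engine, these three axes would be fed into the emitter's rotation
    every frame so the half-torus stays camera-facing.
    """
    def sub(a, b): return [x - y for x, y in zip(a, b)]
    def norm(v):
        length = sum(x * x for x in v) ** 0.5
        return [x / length for x in v]
    def cross(a, b):
        return [a[1] * b[2] - a[2] * b[1],
                a[2] * b[0] - a[0] * b[2],
                a[0] * b[1] - a[1] * b[0]]

    forward = norm(sub(camera_pos, emitter_pos))  # emitter -> camera
    right = norm(cross(world_up, forward))
    up = cross(forward, right)
    return right, up, forward

# Camera straight ahead on +z: the basis stays axis-aligned
right, up, forward = face_camera((0, 0, 0), (0, 0, 5))
print(forward)  # [0.0, 0.0, 1.0]
```

With the torus oriented this way, particles emitted from its surface start out traveling roughly perpendicular to the view direction, which is exactly what keeps the stretched billboards from being seen edge-on.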
The smoke was all hand-drawn, including the noise for the erosion. The texture for the smoke uses four channels along with a custom shader: a color gradient in the red channel to set the shading of the smoke, which lerps between two colors and is then multiplied by the vertex colors; a hand-drawn noise in the green channel to make the fire, which is multiplied by a color and a brightness value and then added to the particle's color; another hand-drawn noise for the erosion in the blue channel; and finally, a mask for the opacity, with the shape of the smoke, in the alpha.
Another texture is used to create the tornado effect on the smoke's dissipation. For that, I use a flow map with a twirl, made in Photoshop, to distort the UVs of the noise in the blue channel. In order to rotate only the noise, I have to use two samplers for the texture.
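A hand-painted flow map stores per-texel UV offsets in its red and green channels; a twirl flow map encodes offsets that spin the UVs around the texture center. A small analytic stand-in for the idea (computing the twirl offset directly instead of sampling it from a texture):

```python
def twirl_flow(uv, center=(0.5, 0.5), strength=1.0):
    """Analytic stand-in for a Photoshop twirl flow map.

    A real flow map would store these offset vectors in a texture's
    RG channels; here we compute the same kind of twirl offset directly.
    """
    dx, dy = uv[0] - center[0], uv[1] - center[1]
    # The perpendicular vector pushes UVs tangentially around the center.
    return (-dy * strength, dx * strength)

def distorted_uv(uv, strength):
    offset = twirl_flow(uv, strength=strength)
    return (uv[0] + offset[0], uv[1] + offset[1])

# Only the noise channel is sampled with the twirled UVs; the rest of the
# texture is sampled a second time with plain UVs - hence the two samplers.
print(distorted_uv((0.5, 0.75), strength=0.5))  # (0.375, 0.75)
```

Animating `strength` over the particle's lifetime is what makes the noise, and only the noise, appear to swirl like a small tornado.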
All the controls for the fire, erosion, and distortion are done using custom data and custom vertex streams. Unity's particle system gives you the option to create your own data stream by passing particle data to the shader through UV channels. In this case, I use TEXCOORD1.xyzw to control each part of the effect: X controls the fire erosion, which I use to start the particle as a fireball and then create the effect of cooling down as the fire fades away; Y controls the cutout of the shading, meaning I control how much shadow the smoke has in its color; Z controls the alpha erosion; and W controls the rotation angle of the erosion noise.
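On the shader side, each of those channels typically ends up driving a threshold or cutoff. A hedged Python sketch of how the four values might be consumed per texel (the thresholds and the exact mapping are invented for illustration, not taken from the actual shader):

```python
def smoothstep(edge0, edge1, x):
    # Standard GLSL/Cg smoothstep: clamped cubic Hermite interpolation.
    t = max(0.0, min(1.0, (x - edge0) / (edge1 - edge0)))
    return t * t * (3.0 - 2.0 * t)

def smoke_controls(custom1, fire_noise, shade, erosion_noise):
    """Illustrative mapping of the TEXCOORD1.xyzw custom stream.

    custom1 = (x, y, z, w) as described above:
      x - fire erosion, y - shading cutout, z - alpha erosion,
      w - rotation angle for the erosion noise (it would rotate the
          noise UVs before sampling, so it is not used in this snippet).
    """
    x, y, z, w = custom1
    fire = smoothstep(x, x + 0.1, fire_noise)      # fire fades as x rises
    shading = 0.0 if shade < y else 1.0            # hard cutout on the shading
    alpha = smoothstep(z, z + 0.1, erosion_noise)  # alpha erosion driven by z
    return fire, shading, alpha

fire, shading, alpha = smoke_controls((0.9, 0.5, 0.2, 0.0),
                                      fire_noise=0.5, shade=0.7,
                                      erosion_noise=0.6)
print(fire)     # 0.0 -> the fire has cooled down
print(shading)  # 1.0 -> this texel keeps its lit color
print(alpha)    # 1.0 -> noise well above the erosion threshold
```

Animating the x/y/z/w curves in the particle system's custom data module then drives the whole cool-down and dissipation sequence without touching the shader again.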
For the lens flare, I used the same shader I described before for the Guardian's beam lens flare. To create the effect of a bright core, I used two lens flares, one in front of the other: one of them with a small scale on the Y-axis, while the other covers almost the entire screen.
The portal opens up in a similar way to the explosion. I actually reused most of the particles from the explosion for it. When it opens up, I use the same shockwaves and debris, with a different color, to create that punch effect together with the lens flares.
Then, I reused the energy stripes from the Guardian Beam effect and the embers from the explosion (again, changing the colors), but this time moving toward the center of the effect instead of being ejected. In contrast with the portal opening, the swallowing effect runs more slowly. The slow particles moving to the center work together with the portal shader, which has a texture scrolling toward its center. I also modeled a vortex shape, which also has a scrolling texture moving toward the center. Here is the geometry used for the portal and the vortex:
The climactic moment of the opening, followed by the slow ending with the vortex, is completed by the anticipation created by the arrow charge effect. This is the moment when the arrow itself charges up before being launched towards the Guardian.
Scale and dynamics
For the scale of the effect, I used the sphere representing Link as a reference. It has a diameter of 1 unit, and with that in mind, I set the scale of the Guardian in the scene. I tried to make the effects with a scale as close to the original as possible, so I watched footage of beams being deflected to see the explosions happening on the Guardian. That gave me a better notion of scale, which I replicated in my effects.
For the animations, I tried to create a climax curve for each effect: anticipation, followed by the highest climax, and ending at a slower pace. Aside from the deflection effect, which has two climaxes, all the effects followed this pattern.
I also animated the time scale in some moments to draw the viewer's attention to a particular event. For example, when the Guardian Beam is deflected, there is a brief slow-motion moment when the shield hits the beam. It emphasizes the shield bash, giving the hit a feeling of weight. The movement of the Guardian also helps with the overall flow of the sequence. When he gets hit, he is thrown back, spends some time recovering, and then walks back to his original position.
A complex effect like the explosion or the Ancient Arrow can take from 3 to 5 days, but it depends on how much time I have to tweak the timing and the textures, or how many adjustments I have to make based on feedback from an Art Director. In this particular case, the explosion took a lot of time, as I was studying the original explosion, trying to figure out an emission technique to mimic its behavior, drawing different textures for the smoke, and so on.
The Ancient Arrow effect took far less time, as I already had some textures and shaders I could use. And I did the Guardian's death in one night.
The biggest challenge for me was the learning curve and the lack of time to work on it. On my first attempt to recreate those effects, I was still very much a novice at VFX. So I had to take some time and work on other projects until I felt confident enough to keep working on it.
As for the learning curve, it was a big challenge because it is hard to find good material for studying real-time FX. I was lucky enough to have a teacher show me the background I needed to keep going, and the rest came from watching a lot of videos of effects and experimenting with different approaches to replicate them until something worked out.
If you are attempting to start learning VFX, I'd recommend first developing some background in computer graphics, at least the basics, to understand what a shader is and the mindset needed to program one. It really helps when trying to find ways to achieve certain visuals. I know some YouTube channels and video courses that are really helpful; here is a list: