Yanni Tripolitis, a winner of the Riot Creative Contest 2017 (Realistic VFX category), talked about his amazing entry created in Unreal Engine 4.
Hello, my name is Yanni Tripolitis and I’m from Athens, Greece. I’ve been an FX artist for the past five years and I’m currently working on the Avengers project at Crystal Dynamics. Previous work experience includes Rocksteady (Batman: Arkham Knight) and Epic Games (Paragon).
I’m not one of those people who knew what they wanted to do from an early age or even had an art background. Up until 10 years ago, I thought I wanted to be a Naval Architect, and it had never crossed my mind that I could make games, and specifically FX, for a living. I’ve always been a gamer but I never thought about making them. Then during my university studies, I had a module where I had to create a 3D turbo engine in AutoCAD – that was the first time I did anything in virtual 3D space. From there I started experimenting with After Effects and then 3ds Max in my free time, then I changed career paths and joined a different university to learn 3D animation and VFX for films. In my final year, I started learning particles in UDK through tutorials and I realized that I enjoyed real-time FX the most out of everything I had done so far. It took me a while to figure it out.
VFX: stylized and realistic
Even though we may not have the technologies yet to easily and efficiently create film-like FX that can run in a game, that doesn’t mean that all real-time VFX are stylized. A realistic effect is one that intentionally replicates reality without any stylistic input, both in the way it simulates and looks. On the other hand, a stylized effect is one that has an intentional artistic style that’s reflected in its motion and/or its aesthetics.
Stylization to an extent can be considered subjective, but in any case, it’s not a failed attempt to reach the quality of the FX that we see in pre-rendered films. It’s an art style in its own right, with its own principles; there is simply good and bad stylization.
When we talk about stylization in FX, it often comes from the colors and textures you use, the animation and timing, and of course the elements of the FX. By elements, I mean the shapes and components you use in your effects that can help push your work away from realism. Achieving stylization through the use of color is a matter of value and saturation. For textures you often need bold and dominant shapes, a sort of “less is more” approach.
In FX animation, similarly to traditional animation, exaggeration in motion and timing is key to achieving stylization. In realism, by contrast, you usually want more detail, with textures and colors that resemble the real world.
I knew I wanted to make realistic FX as soon as I read the first word in the VFX guidelines, especially as I kept reading and saw that there was room for combining stylization. That sort of artistic freedom to choose which direction to lean towards and how to approach it is what I love as an FX artist, as much as I greatly appreciate both far ends of the spectrum. So the fact that my effects are not full-on realistic or full-on stylized is very intentional.
Creating VFX in Unreal Engine 4
One of the things I love about Cascade is that it can dynamically communicate with the Material Editor. You can use up to four parameters, or even up to eight, and edit them dynamically. I used a lot of meshes in my work and most of them change over time on a vertex level, driven by Cascade. Considering this was a real-time cinematic effect, I wanted to tailor every single detail and have full control over my effects.
Another thing that Cascade offers is Random Seed, and I was using it all the time. Random Seed is basically the safest way to have randomization. In VFX we love randomization; it’s how we make the same effect look slightly different every time. In combat FX, especially those that the player sees a lot, you want a great amount of randomization so the FX look less repetitive. Naturally, a lot of randomization (randomizing many different values and elements in your FX) can create results that you don’t want.

Since this was a cinematic with a static, pre-decided camera, I tried to limit unwanted random results as much as possible. UE4 allows you to do that by letting you cycle through an endless number of random results and pick the ones you want. You can do that for pretty much everything in Cascade: color, velocity, size, parameters, location, everything. That way you do get a random result, but you get the same random result every time, which technically makes it systematic. I used that a lot to avoid unwanted surprises. It’s a small feature but it helped tremendously.
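What a fixed seed buys you can be sketched outside the engine. This is plain Python, not UE4 code; the `spawn_burst` function and its value ranges are made up for illustration:

```python
import random

def spawn_burst(seed, count=5):
    """Emit 'count' particles with randomized size and lifetime.

    A fixed seed makes the "random" values identical on every run,
    which is the idea behind Cascade's Random Seed: random once,
    then repeatable forever.
    """
    rng = random.Random(seed)  # seeded generator, independent of global state
    return [
        {
            "size": rng.uniform(0.5, 2.0),      # randomized, but deterministic
            "lifetime": rng.uniform(1.0, 3.0),
        }
        for _ in range(count)
    ]

# Same seed -> the exact same "random" burst every time.
assert spawn_burst(42) == spawn_burst(42)
# Different seed -> a different, but equally repeatable, variation.
assert spawn_burst(42) != spawn_burst(7)
```

In Cascade you would scrub through seeds until one reads well on camera; here you would just try different integers until you like the result.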
Typically there are three primary stages in combat FX animation: Anticipation, Climax, and Dissipation (similarly to the Twelve Principles of Animation by Disney).
Jason Keyser from Riot has made these videos explaining in detail the principles of VFX.
These stages are a crucial aspect of the game as they can directly communicate gameplay feedback to the player. Anticipation is meant to communicate to the viewer/player that something is about to happen. In games, the animation of the character is sometimes meant to transmit that message, especially if the ability is a burst – almost instant – one. In our case where the spheres were stationary, only the VFX could transmit that message.
Climax is the peak of that effect, its highest point, generally when the damage is applied.
Dissipation is what comes right after the climax. For example, in my case dissipation comes after the energy has reached its highest point and starts calming down and gradually disappears.
These three stages are dependent on each other and in an actual game, their timing is directly linked with the gameplay design. Note that depending on the effect/ability, you need to give the player some time to anticipate the effect – and satisfy their anticipation with a delightful climax and a dissipation which reflects the energy and forces that took place.
The runes were an important aspect of the effect; I wanted them to have a purpose, almost a sort of “personality”. They interact with the forces that visually take place in the scene, and to achieve that, they had to be 3D.
Starting with the layers, there are three sets of three layers in total: one layer for the circles, one for the fires and one for the rune shapes. All three use a single mesh with a bespoke shader on each one. The mesh itself is a very thin circle that uses a material node called SplineThicken, which is basically a vertex offset that allows you to grow its faces larger or smaller with the added benefit of being camera-facing. This way I can control the overall size of the mesh via Cascade’s Size by Life, and the size of the surface area of the circles via a Dynamic Parameter that drives the mesh’s vertices, while keeping the faces of the mesh facing the camera. The three sets are the same mesh used three separate times for three separate stages: one set is triggered when the runes appear, one when they have fully appeared and just hover in space, and one when they dissipate. By splitting their animation into three separate sets I had direct control over the timing. During production, their timing changed multiple times, and I figured that splitting it into the three stages would allow me to easily maintain them. I then timed them in such a way that they give the impression they are all one.
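The idea behind SplineThicken (not its actual implementation, which lives in UE4’s material graph) can be sketched as offsetting each strip vertex sideways along a camera-facing direction. Everything here, function names included, is an illustrative assumption:

```python
import math

def cross(a, b):
    """Cross product of two 3D vectors (tuples)."""
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def normalize(v):
    """Unit-length copy of v (returns v unchanged if zero-length)."""
    l = math.sqrt(sum(c*c for c in v)) or 1.0
    return tuple(c / l for c in v)

def thicken(vertex, tangent, view_dir, half_width, side):
    """Offset a strip vertex sideways so the face stays camera-facing.

    'side' is +1 or -1 depending on which edge of the thin strip the
    vertex sits on; 'half_width' is the kind of value a Dynamic
    Parameter would drive from Cascade.
    """
    # Perpendicular to both the view direction and the strip's tangent:
    # offsetting along it widens the strip while keeping it facing the camera.
    right = normalize(cross(view_dir, tangent))
    return tuple(p + side * half_width * r for p, r in zip(vertex, right))
```

Growing or shrinking `half_width` over a particle’s life is what lets the same thin circle mesh serve as anything from a hairline ring to a broad band.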
Their motion is generated through the Vector Noise node that UE4 offers in the Material Editor. It’s basically a noise that (in this instance) moves the vertices in world space. I use it to get the ambient, ethereal-like motion they have and the dissipating motion when they fade out. By using Dynamic Parameters I could time and control how much that noise affects the meshes. The recoil animation you see (when the muzzle flash is triggered) is done via a module in Cascade called Direct Location. Direct Location does exactly what the name says: it’s the highest controller in the hierarchy, overriding any other location module and directly positioning the particles at the local-space location you choose. Basically, it takes the particles and puts them XYZ units away from the center of the particle system over a certain amount of given time – unlike all other location modules, it works in particle time. Then by manipulating the curves you can get interesting animations and make it look like their motion is generated by forces, when it’s just manual relocation.
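The curve-driven relocation that Direct Location performs can be sketched like this; the recoil curve values and the piecewise-linear sampling are illustrative assumptions, not Cascade’s internals:

```python
def lerp(a, b, t):
    return a + (b - a) * t

def sample_curve(keys, t):
    """Piecewise-linear lookup: keys is [(time, value), ...] sorted by time."""
    if t <= keys[0][0]:
        return keys[0][1]
    for (t0, v0), (t1, v1) in zip(keys, keys[1:]):
        if t <= t1:
            return lerp(v0, v1, (t - t0) / (t1 - t0))
    return keys[-1][1]

# A hypothetical recoil curve on X, keyed in normalized particle time:
# the rune snaps back fast, then eases home. Pure relocation, no forces.
recoil_x = [(0.0, 0.0), (0.1, -30.0), (0.4, -22.0), (1.0, 0.0)]

def direct_location(age, lifetime):
    """Place the particle at the curve's value for its normalized age,
    overriding any other motion, which is the essence of Direct Location."""
    t = min(max(age / lifetime, 0.0), 1.0)
    return (sample_curve(recoil_x, t), 0.0, 0.0)
```

Shaping the keys (a fast snap followed by a slow settle) is what sells the illusion that a physical force produced the motion.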
The projectile is actually animated within UE4’s Sequencer, I didn’t use particle velocity or acceleration. In an actual game, the code would move it from point A to point B. Since this is a real-time cinematic, I chose to animate it and play around with the animation curve to get a more dynamic and powerful trajectory.
I knew from the beginning that I wanted the projectile to grow and change as it’s traveling ahead. By changing its speed and size over time you instantly make the projectile more interesting to the viewer. The growth and change are also greatly supported by an increase in brightness and the addition of extra elements during its travel.
My main focus when making the projectile was its silhouette and shape. I wanted it to be larger than the sphere itself but not blobby or chunky, and to achieve that the trail of the projectile is always significantly longer to make up for the bulkiness of the lead. At its longest part, the projectile occupies about three quarters of the travel distance, but the viewer can’t tell due to the perspective in the primary camera angle. To integrate the projectile better into the world I used particle lights and a smoke flipbook texture created in Houdini for a dust kick up on the ground – this also helps tip the scale slightly towards realism.
Similarly to the rest of the particle systems, everything is 3D. Rather than working with Cascade’s Ribbons, I used planes that utilize the Vector Noise and SplineThicken nodes and behave pretty much like a Ribbon would, but without any of the issues that you may run into with ribbons. Then alpha masks in Cascade and Sequencer control when everything appears and disappears – timing those masks, as well as the size of the meshes, correctly was very important to me. The trajectory distance is actually quite short and everything had to fade out nicely and not all simultaneously.
As a part of the task, they wanted us to end the ability with a looping status effect – and I went for a looping fire. It was important to me that the transition between the impact and the looping effect was seamless and felt natural. I had this strong, fast impact which moves on the X axis, and then the relatively slow-moving fire which moves on the Z axis, so I needed a natural transition from one axis to the other.
What instantly came to mind was the fires you see on those oil torches on the beach that try their best to stay lit on a windy night. The fire in that case transitions from vertical to horizontal and that’s exactly what I wanted to happen in my effect.
So again with Dynamic Parameters, I transitioned between two vertex noises by using gradients and masks to control which vertices will move and how much. One noise is used when the fire is pushed away from the impact, with Y and Z masked out, and the other when it’s vertical and subtly waving left and right.
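That mask-and-blend setup can be sketched per vertex; the specific mask values and the weight gradient are assumptions, not the actual material:

```python
def lerp(a, b, t):
    return a + (b - a) * t

def displaced_vertex(pos, push_noise, wave_noise, blend, weight=1.0):
    """Cross-fade two masked vertex-noise displacements.

    push_noise displaces only along X (Y and Z masked out): the fire
    shoved away from the impact. wave_noise is the upright flame's
    gentle sway. 'blend' is the Dynamic Parameter going 0 -> 1 over
    the transition, and 'weight' stands in for a gradient mask (e.g.
    height-based, so the flame's base stays anchored).
    """
    push_mask = (1.0, 0.0, 0.0)   # Y and Z masked out
    wave_mask = (1.0, 1.0, 0.3)   # hypothetical: mostly sideways sway
    return tuple(
        p + weight * lerp(n_p * m_p, n_w * m_w, blend)
        for p, n_p, m_p, n_w, m_w
        in zip(pos, push_noise, push_mask, wave_noise, wave_mask)
    )
```

Driving `blend` from Cascade over the impact’s timeline is what carries the fire from horizontal to vertical without a visible seam.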
The look of it is achieved by using a texture and a gradient mask raised to two different powers, LERPing between them and eroding the texture as it moves up, which gives me those flame wisps.
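One plausible reading of that erosion setup, per pixel, with assumed exponents (the real graph may combine the terms differently):

```python
def clamp01(x):
    return min(max(x, 0.0), 1.0)

def lerp(a, b, t):
    return a + (b - a) * t

def flame_alpha(tex, grad, erode, p_soft=1.0, p_hard=4.0):
    """One pixel of the erosion trick.

    grad is a 0..1 gradient mask (1 at the flame's base, 0 at the tip).
    Raising it to two different powers gives a soft and a hard falloff;
    LERPing between them with 'erode' eats the texture away toward the
    tip, so only the brightest streaks survive as wisps.
    """
    falloff = lerp(grad ** p_soft, grad ** p_hard, erode)
    return clamp01(tex * falloff)
```

As `erode` rises, mid-gradient values collapse toward zero, which is what makes the upper flame break apart instead of fading uniformly.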