Kevin Leroy talked about the way he creates amazing visual effects in the style of League of Legends in Unity.
Well, for most of my League fan arts, I first picked a theme I thought was fun and interesting. I was limited by my own scripts, which cannot do everything League can, so some champions were pretty much impossible for me to make. But over the course of a year, I managed to grow my script into a monster that could at least handle some of it.
VFX can cover a lot of stuff. I’ll only speak about spells, because that’s my domain.
Let’s take a projectile as an example, as it’s often the most complete kind of effect. For most projectiles, you need three elements: a cast effect, so that the player knows something is about to be (or has just been) cast; the projectile itself, which is often an object moved by code; and then a hit effect.
The cast varies depending on your needs, of course, but let’s take a basic one:
Just to make sure: this is a fan skin I made before I started working at Riot.
As you can see here, it is very simple (even simplistic), but it communicates the most important thing: directionality. The main concentric circles move in the direction the projectile will be shot and strengthen its readability, while the sparks add some style and power to the effect, still reinforcing that directionality through their perpendicularity.
The projectile is the most important part of the effect, at least in this case. It’s the object that the player will need to dodge, or suffer the consequences of. It needs to be readable and show what projectile it is. Is it the famous Nidalee spear that will divide your health bar in two, or is it Elise’s cocoon, which won’t damage you but will stun you for some time? Things get even more complicated when you throw skins into all this. The VFX still need to read as their base counterparts, but also deliver the skin’s fantasy, which can get pretty complex when there are as many champions and skins as in League.
So, here’s the projectile for SG Vel’Koz:
The Hot Point is the most important. This is where the projectile’s end is, and this is what you need to dodge. It needs to be visible, clearly identifiable and especially accurate. You don’t want your hot point to be offset compared to the actual collision box, because that’d make it close to impossible for players to accurately dodge it, or hit it, for that matter. Then, you have the trail. I think this is a very important one, because it shows the directionality and where it’s coming from very effectively. Then, you have the Secondary Elements. Those are very specific to the fantasy you’re going for. For example, if you have a Project skin, those could be some digital squares or sparks. For SG, those are often stars and shimmering sparkles. Those secondary elements tend to linger a bit longer than the trail.
Finally, you have the glow and shadow. You’ll ask me, “But Kevin, why have a shadow when you shoot such a bright star?”, and you’ll be right to ask. The shadow, in this case at least, is only there to amplify the glow. A glow alone is often not enough. Think of it as math: imagine you have a clamped space, where things cannot get brighter than 1 and cannot get darker than 0. If your base (the background) is already at 0.5 brightness, you only have room to add 0.5 of glow. But if you lower the background’s brightness, by adding a shadow for example, it might sit at around 0.2, leaving you room to add 0.8 of glow, which makes the effect look a lot more powerful and bright. Of course, this is a fan-made effect, and I do not think it is a perfect one (far from it), but I thought it was a simple yet good showcase to explain some basics.
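The headroom argument above can be written out as a tiny calculation. This is plain illustrative Python, not engine code, just to make the 0.5 and 0.2 numbers from the example concrete:

```python
# Brightness is clamped to [0, 1], so the room left for an additive
# glow is whatever the background hasn't already used up.
def glow_headroom(background_brightness):
    """Room left for additive glow before the pixel clips at 1.0."""
    return 1.0 - background_brightness

# Bright background (0.5): only 0.5 of glow registers before clipping.
bright_room = glow_headroom(0.5)
# Darken it first with a shadow (0.5 -> 0.2): now there is room for 0.8.
shadowed_room = glow_headroom(0.2)
```

Same glow particle, but over the shadowed background it can contribute 60% more perceived brightness before the pixel clips.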
Then, you have the hit effect. It is generally triggered when the projectile’s collision box comes into contact with an enemy unit.
As you can see here, I’ve separated this hit into several pieces. The first and most important one is the hit frame. It is supposed to be very fast (my favorite duration is 0.1s) and grabs the player’s attention: the projectile has hit you (or your target)! Then, the main glow disappears and only the lingering effects stay. In this case, the lens flare disappears almost as fast as the glow, followed by the sparks a bit later, leaving only the bokeh dots at the end.
So, that’s the very basics for a projectile. You can always go wild and try new stuff, but if your game requires players to dodge stuff, make sure the spells are easily identifiable, that their hot point corresponds to their actual collision box, and that players can directly see what direction they are coming from, so that they can predict their movement.
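The “hot point must match the collision box” rule can be sketched as a simple overlap test. This is a hedged, minimal illustration in plain Python (not engine code; the function and names are mine): the game only ever tests the collision shape, so if the visual hot point drifts away from it, what players dodge and what the game checks diverge.

```python
import math

def projectile_hits(proj_pos, proj_radius, unit_pos, unit_radius):
    """True when the projectile's collision circle touches the unit's.
    The visual hot point should sit exactly at proj_pos, so that what
    players see and dodge is what the game actually tests."""
    dx = proj_pos[0] - unit_pos[0]
    dy = proj_pos[1] - unit_pos[1]
    return math.hypot(dx, dy) <= proj_radius + unit_radius

# The hit effect would be spawned on the first frame this returns True.
```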
I usually make absolutely everything in Unity, as I mostly only use particles and shader effects. For the textures, I use Photoshop, and for meshes, Maya. For the particles, I use the integrated Shuriken particle systems, which I find really easy to use and more than enough for my needs. I had started a VFX course to explain how I make stuff, but never got the time to finish it. There are still 30 pages’ worth of content, so if that can help, here it is: RT VFX Course.
I usually start my effects by making a super simplistic version. If it’s a projectile, I just create a new PS (particle system), set the basic shape I want the particles to emit from, and their color, just to get the feel. From there, it’s all incremental. I add and remove stuff following what I need: if I want to test a lens flare, I make a new PS that spawns only one particle displaying a lens flare. If I want a trail, I add a trail, set it up, then apply a texture I think would work. If none of my pre-existing textures work well, I just create a new one and go through some trial and error until I find the perfect one. Honestly, I think VFX is a lot of trial and error. I try something and see how it goes. If it doesn’t work out, I try something else. If it works, I leave it there until it doesn’t work anymore or until I think it needs to be replaced – or I just leave it there forever ‘cause it looks great. I think this is a great way to work, personally. It allows you to test a lot of different stuff very easily (and very quickly), and once you’re set on a style/direction, you can refine it.
For my textures, I paint them all in Photoshop. Painting isn’t my forte, so I always try to rely on simple textures. Here are a few examples of textures I use all the time:
As you can see, most of them are in black and white, and that’s because this allows me to have full control over the particle’s color over lifetime. If I suddenly want to change a particle’s color, I don’t have to modify the texture, or even create a new one, in Photoshop. I can directly change the color in Unity, using the Color over Lifetime (CoL) property of the particle system. Also, yes, those textures are super simple, but when mixed into a complete effect, I feel like they work out pretty well.
So, in VFX, there are three main ways to animate stuff. The one I use the most is through the Particle Systems, which requires you to use their properties to make them do what you want.
For example, the Start Speed parameter allows you to give your particle an initial impulse. Let’s say it’s set to 20: the particle will travel 20 units per second. If you add Velocity over Lifetime, you can control that initial speed over the lifetime of your particle. Add Limit Velocity over Lifetime on top of that, and you can apply a drag, to make it look like the particle is being slowed down by friction.
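To make those numbers concrete, here is a rough plain-Python sketch of how a start speed and a friction-like drag combine over a particle’s lifetime. This is not Unity’s Shuriken API; the function and parameter names are my own, and the drag model is a simple approximation:

```python
# Conceptual sketch: integrate a particle's motion step by step,
# removing a fraction of its speed each step to imitate drag.
def simulate_distance(start_speed, drag, lifetime, dt=0.01):
    """Distance travelled by a particle slowed by friction-like drag."""
    speed, distance, t = start_speed, 0.0, 0.0
    while t < lifetime:
        distance += speed * dt
        speed = max(0.0, speed - drag * speed * dt)  # drag bleeds off speed
        t += dt
    return distance

# With no drag, a Start Speed of 20 covers roughly 20 units in one second;
# with drag applied, the particle falls short and visibly decelerates.
```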
You then have other properties, like Color over Lifetime, which allows you to set a gradient defining the color of the particle at each point during its lifetime. Let’s say you have an alpha gradient that goes 0 – 0.5 – 0: your particle will initially be invisible, then gradually appear until it reaches 0.5 opacity, and then gradually fade out until it completely disappears.
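A toy version of that alpha gradient can be written as a piecewise-linear curve. This is plain illustrative Python, just to show how the 0 – 0.5 – 0 example evaluates over normalized lifetime; Unity evaluates its own gradient keys internally:

```python
# Keys are (normalized lifetime, alpha) pairs, interpolated linearly.
def eval_alpha(keys, t):
    """Piecewise-linear alpha at normalized lifetime t in [0, 1]."""
    for (t0, a0), (t1, a1) in zip(keys, keys[1:]):
        if t0 <= t <= t1:
            f = (t - t0) / (t1 - t0)
            return a0 + f * (a1 - a0)
    return keys[-1][1]

# Invisible -> half opacity at mid-life -> invisible again.
fade = [(0.0, 0.0), (0.5, 0.5), (1.0, 0.0)]
```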
There are a lot of other properties, but I think it’d be best to link one of my tutorials, where I go over them one by one:
Another way of animating would be through animated shaders, just like this example:
This allows me to have complex movement, generally moving over the surface of a mesh I modeled in Maya. This can be very useful for making speed capsules, shockwaves, beams, and all sorts of things that would otherwise be impossible, or would at least require a whole lot of particles. I often use this directly with particle systems, using mesh particles. I simply spawn one particle that is the mesh (or several if needed) and “animate” it using the particle properties over its lifetime (speed, color, scale, etc). I find it a lot faster and easier than using the timeline and creating an extra animation file. This is not always the right thing to do, but I’d say it’s what I use 95% of the time.
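A hedged sketch of that “one mesh particle, animated purely through its properties” idea: the curves below drive scale and opacity from the particle’s normalized age, with no separate animation file. Plain Python standing in for the property curves, not engine code:

```python
# Illustrative curves for a single mesh particle (e.g. a shockwave ring):
# it grows while fading out, driven only by its age within its lifetime.
def mesh_particle_state(age, lifetime, start_scale=0.2, end_scale=2.0):
    """Return (scale, alpha) for a one-particle mesh effect."""
    t = min(max(age / lifetime, 0.0), 1.0)   # normalized age, clamped
    scale = start_scale + (end_scale - start_scale) * t  # ring expands
    alpha = 1.0 - t                                      # ...while fading
    return scale, alpha
```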
And finally, there’s traditional mesh animation. This is often used in conjunction with the previous option to create interesting movements. I do not use this very often, and do not have an example I can share in mind, but one good use case would be a complex mesh (rigged or not) that you animate directly in Maya or 3ds Max, then import into Unity, before applying your material and, if needed, editing the imported animation to add some material changes (like changing the color/opacity to make your mesh appear/disappear, or changing the UV offsets if you want precise control over your radial shockwave). A very interesting and simple post about this is this one (not from me, but from a colleague): RTVFX.
Reusing materials and textures
I’ve always been told that, in VFX, 50% of the work is using something that you made in the past, then adapting it to your needs. The other 50% is actually creating new stuff.
When you start, you don’t really have anything, so you just make stuff from A to Z all the time. But as you progress, you start building a library. Textures, meshes, prefabs, particle systems, etc. One burst of sparks can be reused as a starting point for most bursts of sparks. Just change the velocity values, the colors and maybe the texture, and you can easily adapt it to any kind of sparkly effect. No need to create one from scratch all over again. Same with textures. Once you have a good dozen different noise textures, you very rarely need to create a new one specifically. Just pick the one that best fits your needs and apply it to your material. It saves space and time.
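The library idea can be sketched as base presets plus a handful of overrides. Illustrative Python only; the field names are invented for the sketch and are not Unity properties:

```python
# A reusable "spark burst" preset from the library.
BASE_SPARK_BURST = {
    "count": 24,
    "start_speed": 6.0,
    "color": (1.0, 0.8, 0.2),    # golden sparks
    "texture": "spark_soft.png",
}

def variant(base, **overrides):
    """New effect settings = old preset + only the values that change."""
    return {**base, **overrides}

# Adapt the same burst to a cold, faster effect without rebuilding it.
blue_sparks = variant(BASE_SPARK_BURST, color=(0.3, 0.6, 1.0), start_speed=9.0)
```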
Now, this is all good when you’re working in a single engine, like I personally do with Unity. It’s a little different if you always switch between, say, UE4 and Unity. You cannot use the particle systems or materials you made in Unity and transfer them over to another engine, so your library is often engine-specific. Textures are global, though.
Is it necessary to do programming in order to make it work appropriately?
This is a question I get very often. And the answer is both no, and yes. TL;DR: It depends on what kind of VFX Artist you want to be. It’s very good to be able to code, but I do not think it is absolutely mandatory.
So, I’ll start by saying that “VFX Artist” is a very broad title. To me, a VFX Artist is simply someone who makes magic happen, figuratively and literally. There are VFX Artists who specialize in stylized spells and game effects, like me, but there are also some who specialize in realistic environment effects, some in shader creation, some specifically in particles, some in offline rendering, etc. Some of those require very little to no coding at all, while some require advanced stuff. I can only speak from my own experience, though.
Professionally, I never had to write a single line of code. Mostly because we already had the right tools, and because if any code was needed, we had tech artists or programmers for it. Now, this doesn’t mean I never needed to know about coding. I think it is very important to know at least a bit about how coding works in general, to be able to understand the requirements of a game, or to understand your programmer when he tells you that feature X cannot work in the game you’re working on, or when he asks you to make an effect specifically for one spell. It is very important to understand that the spell is fired at one specific frame of an animation, or that the projectile can be interrupted by shields and proc a secondary effect, or that it disappears after a certain amount of time instead of a certain distance. It might not seem like programming, but to a certain extent, it heavily rests on it. Bugs can happen real fast.
How would you advise approaching the way to work with VFX?
I only use Unity, so I’ll only talk about that one engine.
The only plugin I really use is ShaderForge, which is a node-based shader creator, as seen below:
This allows me to create both simple and complex shaders super easily without requiring any kind of coding. All you need is logic. Also, there are tons of SF tutorials online, but I always recommend checking the basic official ones first.
As for the rest, I really do not use anything else than vanilla Unity.
Now, if you want to start VFX, I would recommend one thing very strongly: references. I believe this is the most important thing ever. If you want to make a projectile like the one I explained above, you can start from scratch, sure. But that won’t get you very far, unless you’re incredibly talented. No, if you want to make an effect, look up stuff about it, about its style, about what kind of effect it is. Play games, watch anime and movies, look at art, watch some Slow Mo Guys videos, check explosion footage. It’s all fun stuff, and it’s for work! Say you have to make a laser beam. Let’s also say you’re free to do whatever style you want, and you choose to make it anime-style, ‘cause anime is cool. Just watch some mecha or magical girl anime, where beams happen often. Or check some games that could have beams in that style. Now, if you want to make a realistic explosion, well, there aren’t a lot of options, but there is a TON of explosion footage out there. Nuclear explosion? Grenade? Underwater? RPG? Electric? No matter what, you’ll find it. We are incredibly lucky to live in an era where everything can be found in just a few seconds. Take advantage of this. If you want, here’s my Pinterest page, where I gathered a ton of references of all kinds.
Another thing I find extremely important is the ability and desire to ask people for feedback. When I first got interested in VFX, I had absolutely no idea there were groups of people doing VFX all day, every day, and sharing stuff with everyone, for everyone to learn and improve, for free most of the time! Now, there’s somewhere even better: RTVFX. Go there, ask questions, ask for feedback. The community is amazing and always willing to help. It’s also a huge gold mine with tons of resources and references. As I always say: feedback is progress. Don’t be afraid to ask people for some. It’ll help you greatly!
- League Fan Arts – Unity Package
- Unity VFX Basics Tutorials
- RT VFX Course (never had the time to finish it, so it’s only the first chapter for now)