Real-time VFX Artist Sjors de Laat kindly shared the way he works on awesome VFX: meshing, texturing, creating shaders and particle systems and more!
I discovered 3D when I was 14. I was really into mixed media art at the time, and got introduced to Cinema4D and 3ds Max to create abstract art with. About a year later I discovered the art collective Depthcore, which at first inspired me to become a matte painter. Once college came along, my interest in art for games and movies really started to develop. I knew I wanted to do something within that field, but I wasn't sure yet what exactly. It took a long time to figure out, and in that time I explored as much as I could. During my final year of college, I needed to define myself and settle on something I wanted to focus on. Initially, that goal was too vague and not easily definable, though it leaned towards the environment artist role. Then one of my teachers told me to set a clear, defined goal for what I would like to specialize in. So I started thinking about what I had enjoyed doing the most during the projects I was able to take part in. In the end, that was VFX. VFX allows me to still be diverse when I want to be, and to apply all the different things I've learned over the years, but with a much clearer direction. It is such a satisfying and fun discipline that I decided to commit to it fully. It is fun to still be at the beginning of something you are really passionate about, knowing there is so much out there that you can still learn. Which is all part of the process!
Where do I start when looking for something new to make? The good old gathering of reference and inspiration is the first thing that comes along. During this, there are several questions you can ask yourself: How complex will this effect be? What purpose does this effect serve? What style shall I go for? What target platform does it need to hit? What type of motion do I want? While gathering inspiration and reference, look beyond other FX for visual cues and motion that look interesting. Look outside your subject type and search for an interesting hook that you can incorporate into your effect: how silk moves in the wind, or what a wave does when it crashes onto a rock, for example. In the end, real life will always be the best form of inspiration, no matter the style and subject matter you are working in.
After you feel the reference is in a good place, the blockout/concepting phase comes. During this phase, you mainly focus on getting a feel for the major motion and actions that need to happen. This also allows you to gauge what type of resources you will need. Although this will most likely change throughout your project, you can get a good idea of the scope, and you will be able to answer a lot of the questions beforehand. After this, I personally like to create a document that outlines all of the elements I think I might need. This doc is also kept as a feedback tracker, so you know what you still need to fix and what you might want to add. Then make sure you've got everything in a good place so you can start with production.
Personally, I like to start by building all the elements first individually. Meshes are overall pretty quick to make so I will do those first. Then the textures will follow, after these are in place I start with the production of the shaders, and a rough setup of the particle systems. I’d like to see it as a gradual expansion of your building blocks to make the bigger pieces with.
When you are planning to start the learning process, begin by getting to know the tools you are working with. The biggest learning comes from understanding the underlying principles of FX and gradually applying them to projects over time. Start small, find something you would like to improve on, and keep your goals deliberate. I feel the aspects one can focus on when learning VFX can be broken down into the following.
Meshing: Meshes in VFX will most of the time be used as a carrier, or for movement that cannot be easily achieved with flat cards. Learn what you can do with the UVs of your mesh and what they can do to help you reach a certain effect. Learn how resource-heavy meshes can be: how many polys is too many, how few is too few, and how you can make sure your meshes do not cause unnecessary overdraw.
Dynamics: FX is all about the movement of natural phenomena (and the unnatural ones, of course). Get to know what causes these phenomena to happen, and see how they work, behave, and interact with each other; it is all part of understanding your subject matter. Do, however, apply these principles from an animation point of view, because sometimes something might look right but not feel right within your effect. In terms of software to recommend: Houdini is stronger than ever and still keeps on growing, but it is one of the hardest programs to learn and master, so start by identifying what you will be using it for. Like I said before: start small! Engines also allow for the use of some basic physics, but these will most likely be approximations.
Textures: For textures, make sure that you get good values into your texture. See what combines well, reads well, and see if it can work well with gradient mapping and erosion.
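Gradient mapping, mentioned above, is just using a grayscale value to look up a color along a ramp. A minimal sketch of that lookup, with an arbitrary placeholder "fire" ramp (the stops and colors are illustrative, not from the article):

```python
# Gradient mapping: a grayscale value in [0, 1] indexes into a color ramp,
# the same trick a gradient-map node performs in a shader.

def lerp(a, b, t):
    """Linearly blend between two RGB tuples."""
    return tuple(x + (y - x) * t for x, y in zip(a, b))

def gradient_map(value, ramp):
    """Map a grayscale value through a sorted list of (position, rgb) stops."""
    value = min(max(value, 0.0), 1.0)
    for (p0, c0), (p1, c1) in zip(ramp, ramp[1:]):
        if p0 <= value <= p1:
            return lerp(c0, c1, (value - p0) / (p1 - p0))
    return ramp[-1][1]

# A simple placeholder "fire" ramp: dark red -> orange -> near-white.
FIRE_RAMP = [(0.0, (0.1, 0.0, 0.0)),
             (0.5, (1.0, 0.4, 0.0)),
             (1.0, (1.0, 0.9, 0.7))]

print(gradient_map(0.0, FIRE_RAMP))  # darkest stop
print(gradient_map(0.5, FIRE_RAMP))  # mid orange
```

This is why good value separation in the grayscale texture matters: if all the values bunch up, they all land on the same part of the ramp.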
Shaders: This aspect can go really deep if you would like it to, but in the beginning, familiarise yourself with node-based editors and see what each function does. Most depth will come from how flexible you would like your shaders to be: do I want all energy-based effects to be controlled from one shader? Will I simply split them up by subject? In any case, look for ways to build functionality into your shaders that allows for reuse. Instancing materials will always be better than creating a new one.
Design: The design behind your effect is one of the most important factors that can make or break it. In the game context, it is essential that the effect communicates to the player what purpose it serves to the gameplay. It also is the keystone that gives your effect visual appeal. Graphic design is a great inspiration source for this, and most of its principles could be directly applied to FX. It is all about communicating visually, evoking an emotional response, and also leading the eye.
Animation: FX is all about animating energy. The 12 principles of animation pretty much apply to effects as well. These principles help you create more believable and realistic-looking effects. Whether you are working on stylised or realistic FX, these rules still apply.
Implementation: Be sure to get to know the implementation process and learn how to do it yourself. Unreal's Blueprint system is a great starting point. This could, for example, be applied to how bullets are made, how an effect is triggered by a key-press, or how an effect is attached to a certain bone, just to name a few. Understanding how the implementation process works allows you to understand what aspects you need to take into account. And in case you need to hand things over to someone who will be responsible for integration, you will make their life just a bit easier.
Cascade (Unreal Engine’s particle editor) is a modular, emitter-based system used to create your particle systems in. Although Niagara just came around the corner, Cascade will still be a good tool to start with to get an idea of the core concepts, or if you feel that Niagara is too intimidating to start off with right away. Cascade consists of a list of rulesets that comprise the particle system itself. You have the option to create standard emitters, mesh emitters, ribbon emitters, trail emitters, and GPU emitters. The window that shows your list of emitters also gives you a representation of how the particle system in question is sorted. The first emitter in the list will be the last one to get rendered. Each module within your emitter contains a set of rules that can be changed to your liking. What each module contains is a list of points on a timeline that runs from 0 to 1.
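That 0-to-1 timeline is the particle's normalized lifetime. A minimal sketch of how a module's curve gets sampled against it (the piecewise-linear interpolation here is a simplification of the engine's distribution types):

```python
# A Cascade-style module curve: (time, value) points keyed over the
# particle's normalized lifetime, where 0 = birth and 1 = death.

def sample_curve(points, t):
    """Sample a piecewise-linear curve given as sorted (time, value) points."""
    if t <= points[0][0]:
        return points[0][1]
    if t >= points[-1][0]:
        return points[-1][1]
    for (t0, v0), (t1, v1) in zip(points, points[1:]):
        if t0 <= t <= t1:
            return v0 + (v1 - v0) * (t - t0) / (t1 - t0)

# Size over life: grow quickly for the first 20%, then shrink back to zero.
size_over_life = [(0.0, 0.0), (0.2, 1.0), (1.0, 0.0)]
print(sample_curve(size_over_life, 0.1))  # 0.5, halfway through the growth
print(sample_curve(size_over_life, 1.0))  # 0.0, fully shrunk at death
```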
If it is possible to change the distribution type of a certain parameter within your module, that parameter can be accessed via Blueprint, Sequencer, or in-engine. In the engine, this allows you to create subtle variations of a particle system if you need that, so you don’t have to make a new system for every instance where you want a similar-looking system with a slight difference in behavior.
(Setting the distribution to Particle Parameter allows you to adjust this parameter during runtime).
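The idea can be sketched as one shared system asset with per-instance parameter overrides. The class and parameter names below are illustrative placeholders, not Unreal's actual API:

```python
# One shared particle-system "asset" with per-instance overrides, mimicking
# what a Particle Parameter distribution enables: tweak an instance at
# runtime instead of authoring a near-duplicate system for each variation.

class ParticleSystemAsset:
    """Shared asset holding the authored default parameter values."""
    def __init__(self, defaults):
        self.defaults = dict(defaults)

class ParticleSystemInstance:
    """A placed instance: falls back to the asset unless overridden."""
    def __init__(self, asset):
        self.asset = asset
        self.overrides = {}

    def set_float_parameter(self, name, value):
        self.overrides[name] = value

    def get_parameter(self, name):
        return self.overrides.get(name, self.asset.defaults[name])

muzzle_flash = ParticleSystemAsset({"SpawnRate": 50.0, "InitialSize": 1.0})

big_flash = ParticleSystemInstance(muzzle_flash)
big_flash.set_float_parameter("InitialSize", 2.5)
small_flash = ParticleSystemInstance(muzzle_flash)

print(big_flash.get_parameter("InitialSize"))    # 2.5, the override
print(small_flash.get_parameter("InitialSize"))  # 1.0, the asset default
```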
The preview window allows you to visualize the number of particles per emitter, your bounds, and how the system will look in motion, plus check your collisions and light influence. Every module within your emitter allows you to visualize the area the particles will spawn from. This allows you to check how the emitters relate to each other.
This window allows you to edit your curves. Clicking the button next to a module visualizes that module’s curve in the window. Pressing the yellow square in this window hides that curve. The x, y, and z curves (the red, green, and blue squares) can be toggled off separately if they show up next to the yellow square, so you can edit them individually if you want to. CTRL-clicking the curve adds a point to it.
As mentioned a bit earlier, the way I personally like to approach the creation of effects is to first assess what type of elements you need for your effect(s), and make those first. During the meshing process, I mostly use Maya, and ZBrush for all the sculpting work. The meshes, in this case, are for the most part pretty simple. For all of the meshes that I would like to use scrolling textures on, I make sure to lay out my UV islands within the full 0 to 1 space. But if a texture needs to loop over a mesh with a break in it (like a bullet trail, for example), I will scale it down a little so no noticeable looping is present.
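Scrolling a texture over such a mesh is just offsetting the UVs by speed times time and wrapping back into the 0-1 range, which is why the islands want to fill that full space. A minimal sketch of the math behind a panner node:

```python
# UV scrolling ("panning"): offset the UV by speed * time and wrap it back
# into the 0..1 range. A UV island laid out across the full 0-1 space will
# see the texture sweep over it seamlessly.

def pan_uv(uv, speed, time):
    """Return the panned UV, wrapped into [0, 1) on both axes."""
    u, v = uv
    su, sv = speed
    return ((u + su * time) % 1.0, (v + sv * time) % 1.0)

# Scroll upward along V at 0.25 UV units per second.
print(pan_uv((0.5, 0.0), (0.0, 0.25), 2.0))  # (0.5, 0.5)
print(pan_uv((0.5, 0.0), (0.0, 0.25), 6.0))  # wrapped back to (0.5, 0.5)
```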
In case you want to add a gradient to your mesh, it could be done via a shader, but a handier way is to add some vertex color to your mesh. This can then be multiplied on top of your textures to get the same fade-out effect. Vertex colors also allow you to control how a certain part of a mesh should behave: for example, a soap bubble, or the controlled deformation of a steel barrel that is getting crushed. For certain instances, you might want to spawn particles on a certain surface or part of a surface. In this case, turn it into a skeletal mesh, and then skin the area you want to spawn particles from to a bone or bones. The skin weights will then be used as a mask for where particles will spawn on that particular surface.
For the textures, I like to use a combination of Photoshop, Substance Designer, and Substance Painter. Photoshop is used to create all of the hand-painted elements, and Substance Designer is used for most of the tileable noises and simple elements that I would like to use within the scene. For the texturing of the assets, I like to use a combination of both Designer and Painter. I will always use channel packing; in this case, it will be categorized by element or subject if possible. For example, different types of trails, or an element that requires multiple textures working together to get the desired effect. The textures will then be tested to see if they read and work well within the engine. I also check how they work together with gradient mapping and adjust them accordingly to get to the desired result.
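Channel packing means three related grayscale masks share one RGB texture, so the shader needs only one texture fetch for all of them. A sketch of the idea, with the per-channel assignments as illustrative examples:

```python
# Channel packing: three grayscale masks packed into one RGB texture,
# e.g. R = base shape, G = erosion noise, B = detail mask (assignments
# are illustrative). Images are modelled as flat lists of float texels.

def pack_rgb(shape, erosion, detail):
    """Zip three equally-sized grayscale images into a list of RGB triples."""
    assert len(shape) == len(erosion) == len(detail)
    return list(zip(shape, erosion, detail))

def channel(packed, index):
    """Pull one grayscale channel back out (0 = R, 1 = G, 2 = B)."""
    return [px[index] for px in packed]

shape   = [1.0, 0.5, 0.0]
erosion = [0.2, 0.8, 0.4]
detail  = [0.0, 1.0, 1.0]

packed = pack_rgb(shape, erosion, detail)
print(packed[1])           # (0.5, 0.8, 1.0), one packed texel
print(channel(packed, 1))  # the erosion mask, recovered from G
```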
Then the shaders will be built. Most of them have been outlined in the planning phase, but I’d still like to keep the setup pretty rough and get the elements I will be using in. The setup will evolve down the line and will be adjusted if I think something does not work well or requires a different setup to get to the desired result. I’d like to build most of these shaders myself. But over time, I have learned a lot of techniques and setups from other resources and people that I’d like to incorporate when needed.
Then I will construct the particle systems with the elements I have, layer by layer. Get the behavior down for each emitter and tweak it until you get the motion you want. Then build on top of that with another emitter and make it work with the previous element. Once all the emitters are implemented, adjust and tweak as needed. Also regularly check how costly your shader complexity is, and make sure you have no unnecessary overdraw. I like to keep the final optimization until the end, once I know all the elements work well together. Once that point is reached, see how far you can scale back your elements while maintaining the original look of the effect as closely as possible. It will all be a balance between fidelity and the optimal cost of your effect.
Animating your textures can be done in a multitude of ways. The approaches will vary on a case-by-case basis, but here are some of the ways one can animate them. Shaders will always be at the heart of this, but it is a matter of which method will do most of the heavy lifting, and how flexible you would like to make your shaders. You also have the possibility to access particle parameters outside of the particle system and animate them that way. Dynamic parameters allow you to animate up to 4 different parameters that can be fed directly into Cascade. Every parameter you create within your particle system that has a System-Update function can be animated (in Blueprints, Sequencer, and C++, for example). Parameter collections are a set of values that can be accessed and changed during runtime. A parameter collection can be fed directly to the right parameter within your shader, but can also be accessed via Sequencer and Blueprints.
(Example: Alpha erosion testing. One of the ways you can suggest motion within your texture)
For scrolling textures, I like to combine them in the shader by means of addition, subtraction, and adjustments to get interesting shapes. In case I'd like to erode a texture over the particle's lifetime, I will test this both in Photoshop/Substance and within the engine. I check whether the erosion happens in an interesting fashion and make sure the values of the texture are contrasty enough to allow the texture to erode smoothly. If you would like more control over how a certain element is manipulated over its lifetime, the Dynamic Parameter will help you out the most. It allows you to break up the constant movement of these elements and adjust it to your liking within Cascade itself.
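Alpha erosion boils down to comparing a grayscale texture against a threshold that rises over the particle's lifetime; texels below the threshold go transparent, so the darkest areas burn away first. A sketch with made-up texel values showing why value contrast matters:

```python
# Alpha erosion: texels whose grayscale value falls below a threshold
# driven by the particle's normalized lifetime become transparent, so the
# shape "burns away" darkest areas first. Texel values are illustrative.

def erode(texels, life_fraction):
    """Return per-texel visibility (1.0 visible, 0.0 eroded away)."""
    return [1.0 if t > life_fraction else 0.0 for t in texels]

# A strip with good value contrast erodes in distinct stages...
contrasty = [0.1, 0.4, 0.7, 0.95]
print(erode(contrasty, 0.0))  # [1.0, 1.0, 1.0, 1.0] everything visible
print(erode(contrasty, 0.5))  # [0.0, 0.0, 1.0, 1.0] darker half gone

# ...while a flat, low-contrast strip pops out almost all at once.
flat = [0.48, 0.5, 0.52, 0.51]
print(erode(flat, 0.49))
print(erode(flat, 0.53))
```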
You have been working on something amazing, but then the age-old question comes: will it run? Overdraw can be one of the biggest contributors to making things run like a slideshow. You need to look at how to create the largest amount of screen fidelity with the least amount of particles. Spawn only the particles you actually need, and spawn them in such a way that it all still feels like one coherent volume. Also, make sure you reduce wasted space in your particle cards. When possible, use trim textures to trim away some of the wasted space in your particle. One thing I also like to do quite a lot: if you fade away a particle by reducing its opacity over its lifetime, check whether the particle is completely faded out within its lifetime. If that is the case, quickly shrink it down to a scale of 0 after the particle has completely turned transparent.
(On the left the unedited, on the right the edited. Around the same result, but less unnecessary overdraw).
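The payoff of that scale-to-zero trick can be sketched with a toy overdraw proxy: an invisible quad still rasterises at full size, so zeroing its size once the fade ends removes that wasted fill (the sampled curves and cost metric below are illustrative, not engine measurements):

```python
# Overdraw trim: once a particle's opacity reaches zero, its quad still
# rasterises and costs fill for the rest of its life. Snapping the size to
# zero right after the fade-out ends removes that wasted overdraw.

def fill_cost(size_curve):
    """Rough overdraw proxy: sum of sizes over all non-zero-size samples."""
    return sum(s for s in size_curve if s > 0.0)

opacity = [1.0, 0.5, 0.0, 0.0, 0.0]     # fully faded from the 3rd sample on
naive_size = [2.0, 2.0, 2.0, 2.0, 2.0]  # quad stays full size while invisible

# Derived size curve: keep the size while visible, zero afterwards.
trimmed_size = [s if a > 0.0 else 0.0 for a, s in zip(opacity, naive_size)]

print(fill_cost(naive_size))    # 10.0
print(fill_cost(trimmed_size))  # 4.0, same visible result for less fill
```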
Also, make sure not to waste too much space within your textures themselves. The less wasted space, the less trimming one needs to do. If possible, see what you can create procedurally: UE4 offers options to create a wide variety of noises, shapes, and gradients.
Make sure that your particle emissions are within reason: not too many and not too few, though this will all depend on how big they show up on your screen. As your systems increase in size, the tick cost for these systems, meaning how long it takes to update a particle system's behavior, will increase as well. How often a particle system is reused within the same frame of the game can also be a factor to take into account.
Setting bounds for your system helps determine whether or not a particle should be culled. Also, get into the habit of setting up LODs for particle systems that will be constantly active throughout the game. It might be the case that some of the elements in your system are not visible from far away, so it will be a lot easier to just turn them off in your LODs.
Thank you very much for reading! Although I feel like I am still in the beginning phase of learning everything about VFX and have shared things from my personal perspective, I hope you were able to get something out of it.
Sjors de Laat also prepared a list of useful resources that will help you to start learning VFX, and we are going to publish it later this week. Keep an eye on the new articles!
Sjors de Laat, Real-time VFX Artist at Axis Animation
Interview conducted by Kirill Tokarev.