
VFX for Games in UE4

Nathan Huang shared some techniques he uses to build beautiful visual effects for his projects.

Hey, my name is Nathan Huang. I was raised in New Zealand and came over to Los Angeles to study at Gnomon School of Visual Effects. I come from a more traditional artistic background; throughout my childhood I was drawing and painting, and I started using Photoshop in my high school years. During my time at Gnomon I originally wanted to do VFX for film and commercials, but I was introduced to the amazing world of realtime VFX through my instructor at the time, Keith Guerrette. I am currently working as a VFX artist at Survios and have worked on their VR games Raw Data and Sprint Vector.


When I start on an effect, I like to gather a reference board, keep my reference videos on another screen, and make quick thumbnails of the different stages of the effect. This gives me a better understanding of the different elements I will have to create, and a basic checklist of what I need, which I constantly update as the effect is refined.


I like to nail the timing before focusing on the textures themselves, so I use placeholder textures to get the initial timing right, then replace them with the actual textures later on. If your engine allows it, it is good practice to view the effect in slow motion. Try to create randomness in everything: size, number of particles, lifetime. That way, the effect does not look the same each time it is activated.
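As an illustrative sketch of that randomness principle (the parameter names and ranges below are my own placeholders, not values from any of Nathan's effects), randomizing spawn parameters might look like this:

```python
import random

def spawn_particle(rng):
    # Randomize each of the key parameters mentioned above.
    # The ranges are illustrative placeholders.
    return {
        "size": rng.uniform(0.8, 1.4),        # scale multiplier
        "lifetime": rng.uniform(0.6, 1.2),    # seconds
        "velocity": (rng.uniform(-1.0, 1.0),  # lateral drift
                     rng.uniform(2.0, 4.0)),  # upward speed
    }

def spawn_burst(rng, min_count=8, max_count=14):
    # Randomize the particle count too, so two activations of the
    # same effect never produce an identical burst.
    count = rng.randint(min_count, max_count)
    return [spawn_particle(rng) for _ in range(count)]

rng = random.Random()
burst_a = spawn_burst(rng)
burst_b = spawn_burst(rng)
```

In an engine you would express the same idea with per-parameter random ranges on the emitter rather than code, but the effect is identical: no two plays of the effect line up exactly.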

The steps I take when tackling any effect are to go for the primary shapes and actions first (for a molotov, the initial explosion and fire), then work down to secondary shapes and actions (smoke and fire puddles), and then to tertiary details (sparks, debris, and heat distortion).

Visual Effects for Games

Game effects, in essence, are anything that moves which is not character animation. They can be split between gameplay effects and environment effects. Gameplay effects should convey clear game design feedback and information to the player: is it a healing spell or an explosive bolt? Does it damage you over time or slow you down? Was it a slight hit or critical damage? Colour, shape, and timing should all convey the same message. These should be clear for players to understand; don't try to reinvent the wheel.

Environment effects bring life and motion to a static environment and make it feel more immersive. They can also be used to guide the player through the world; this could be light shafts leading to the destination or distant smoke plumes marking a significant area.

My personal opinion on flipbooks is to use a maximum of two per effect and to make the most of the texture space you have. (Some game engines are able to make a hexagonal crop out of the texture so texture space is not wasted; in Unreal it's called Particle Cutouts.)
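For context, a flipbook works by cropping the particle's UVs down to one frame of an N×M sprite sheet each tick. A minimal sketch of that sub-UV math (my own illustration, not engine code):

```python
def flipbook_uv(u, v, frame, cols, rows):
    # Map a particle's [0, 1] UV into the sub-rectangle of one flipbook
    # frame, assuming frames run left-to-right, top-to-bottom.
    frame = int(frame) % (cols * rows)
    col = frame % cols
    row = frame // cols
    return ((col + u) / cols, (row + v) / rows)

def frame_from_age(age, lifetime, cols, rows):
    # Advance through every frame exactly once over the particle's lifetime.
    t = min(max(age / lifetime, 0.0), 0.999999)
    return int(t * cols * rows)
```

A cutout feature like Unreal's then tightens the particle geometry around the opaque part of each frame, so the wasted transparent border is never rasterized at all.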

What software you use comes down to whatever you feel comfortable with. For my reel I have been using a variety of software, including Maya Fluids, FumeFX, Phoenix FD, Krakatoa, Houdini, and After Effects.

Unreal 4 Shaders

I primarily try to build my effects using shaders before deciding whether they need to be a simulation. There are three techniques I constantly use in my materials to fake some sort of animation without the need for flipbooks.

The first uses a texture for the initial mask, with panning tiled textures layered on top at varied panning speeds and differently sized UVs, all multiplied against each other. This can be stacked indefinitely, but keep in mind that the more layers you multiply in, the less opaque the result gets.
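Reduced to per-pixel math, this is just a base mask multiplied by noise samples taken at panned, rescaled UVs. A hedged Python sketch (the tiny textures, tiling scales, and pan speeds are invented for illustration):

```python
def pan(u, v, time, speed_u, speed_v):
    # Pan the UVs over time and wrap, like frac() in a shader.
    return (u + time * speed_u) % 1.0, (v + time * speed_v) % 1.0

def sample(tex, u, v):
    # Nearest-neighbour sample of a tiny grayscale "texture" (rows of floats).
    rows, cols = len(tex), len(tex[0])
    return tex[int(v * rows) % rows][int(u * cols) % cols]

def layered_mask(mask_tex, noise_tex, u, v, time):
    # Initial mask multiplied by two noise layers panning at different
    # speeds and tiling scales; more layers can be stacked the same way.
    base = sample(mask_tex, u, v)
    n1 = sample(noise_tex, *pan((u * 2) % 1.0, (v * 2) % 1.0, time, 0.10, 0.03))
    n2 = sample(noise_tex, *pan((u * 5) % 1.0, (v * 5) % 1.0, time, -0.07, 0.11))
    return base * n1 * n2  # each extra multiply can only darken the result

MASK = [[1.0, 0.5], [0.25, 0.0]]
NOISE = [[0.9, 0.6], [0.3, 0.8]]
```

The last comment is the opacity caveat from the text in miniature: since every layer is in [0, 1], each multiplication can only pull the result toward zero.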


Alpha threshold is when you crush the blacks of the texture. This gives a more interesting way of dissipating an effect compared to simply fading it away. I usually drive the parameter that controls the animation through either the opacity or a dynamic parameter that goes from 0 to 1.
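In shader terms the crush is a single saturated remap. Here is a Python sketch of one common form of it; the exact remap in Nathan's materials is not specified, so treat this as an assumption:

```python
def alpha_erode(alpha, threshold):
    # Remap alpha so everything below `threshold` is cut to zero and the
    # remainder is renormalized back to [0, 1]. Animating `threshold`
    # from 0 to 1 makes the texture erode away from its darkest values
    # first, instead of fading uniformly.
    if threshold >= 1.0:
        return 0.0
    return max(0.0, min(1.0, (alpha - threshold) / (1.0 - threshold)))
```

Driving `threshold` from a dynamic parameter over the particle's lifetime gives the dissolve-style dissipation described above.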


And lastly there is UV distortion: panning textures added onto the UVs of the main texture. This can generate movement in a texture that might otherwise be too static, and can be used to create more randomness.
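In code terms, the distortion is a panning noise sample remapped from [0, 1] to [-1, 1] and added to the main texture's UVs before sampling. The scales, speeds, and tiny noise texture below are illustrative assumptions:

```python
def sample(tex, u, v):
    # Nearest-neighbour sample of a tiny grayscale "texture" (rows of floats).
    rows, cols = len(tex), len(tex[0])
    return tex[int(v * rows) % rows][int(u * cols) % cols]

def distorted_uv(u, v, time, noise_tex, strength=0.05):
    # Sample a panning noise texture once per axis, remap to [-1, 1],
    # and nudge the main UVs by at most `strength` in each direction.
    nu = sample(noise_tex, (u + time * 0.13) % 1.0, v)
    nv = sample(noise_tex, u, (v + time * 0.07) % 1.0)
    return ((u + (nu * 2.0 - 1.0) * strength) % 1.0,
            (v + (nv * 2.0 - 1.0) * strength) % 1.0)

NOISE = [[0.2, 0.7], [0.9, 0.4]]
```

Keeping `strength` small matters: the offset only needs to be a few percent of the UV range to make a static texture feel alive.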



Lastly, optimization: not the most fun part, but absolutely necessary, since visual effects are usually the biggest suspect when performance drops, especially when working in VR and trying to hit 90 fps. This is because most effects are translucent materials, and with particles overlapping each other, the cost of rendering that translucency stacks with every layer.

Here are some options to optimize your effects:

  • Spawn fewer particles and use LODs
  • Use particle cutouts
  • Convert particles from lit to unlit modes
  • Use alpha-masked instead of translucent blend modes
  • Make your textures use the maximum amount of space available
  • Take a look at your shaders and simplify them


The best learning materials for realtime VFX out there really are the Real Time VFX site and the GDC Vault. There are also some helpful YouTube channels for both starting out and advanced tips:

Nathan Huang, VFX Artist

Interview conducted by Kirill Tokarev
