Real-time VFX Production Tips
18 May, 2017
Interview
VFX artist Andreas Glad talked about his approach to visual effects production, showed how effects are built for games and where they are used. Andreas worked on a bunch of cool games, including Battlefield 1 and Battlefield 4. He's also the author of some great courses, which you can check out here.

Introduction

My name is Andreas Glad and I'm a realtime VFX artist. Stating that sounds like I'm suffering from some kind of addiction, and thinking about it, maybe that's not too far off the mark. I've been doing VFX for games since 2008 at a couple of different studios. My first "real" job in the industry was at Eurocom in the UK, where I started by hand-placing godray meshes in the canopies of Ice Age: Dawn of the Dinosaurs – The Video Game. Eurocom did a lot of smallish games, so I got to jump around and try different things. After my last day on Ice Age, where I had just made a leafy flower destruction effect, I got thrown onto Dead Space: Extraction and asked to decapitate a guy. Weird contrasts…

The last few years I've worked on larger games though, namely Battlefield 4 and 1. That was an amazing experience. The first time I could tell my fiancée while viewing the E3 stream, "Look, the skyscraper is going to collapse! I worked on that!" was a really cool feeling. And seeing the view count on the BF1 reveal trailer climb over 50 million, and knowing that many people have seen my explosion/gas cloud/blood effects, is still hard to grasp.

After BF1 though, I left DICE and started my own company, Partikel, where I provide remote contract work and try to train the next generation of realtime VFX artists. Oh, and I’m studying mechanical engineering at university as some sort of a masochistic hobby.

Peculiarities of VFX for Games

Well, if I put it in film terms, I'm an FX TD with no budget. Where a film simulation can take minutes or hours per frame, and the rendering even longer, I have 2 milliseconds to do both. That means the way I do it is quite different. I can't afford to run an epic smoke simulation, but I can afford to render one out and use it as a texture on some camera-facing sprite particles. Everything needs to be hacked and cheated in some way. On top of that, game effects need to look good from all angles, so you can't tailor them to just one camera and ignore the rest. Effects in games are very dynamic, as any number of players could trigger effects at the same time, and you must build them in such a way that they interact believably and don't kill the framerate. Not an easy task when you have 64 players equipped with gas grenades…
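To put that 2-millisecond budget in perspective, here is a quick back-of-the-envelope check (the 60 fps target is an assumption for illustration, not a figure from the interview):

```python
# Frame budget at an assumed 60 fps target vs. the ~2 ms VFX slice.
frame_ms = 1000.0 / 60.0        # ~16.67 ms for the whole frame
fx_budget_ms = 2.0              # simulation + rendering of all effects
share = fx_budget_ms / frame_ms
print(f"VFX get {fx_budget_ms} ms of {frame_ms:.2f} ms: {share:.0%} of the frame")
```

So effects get roughly an eighth of the frame, shared with everything else that has to be "hacked and cheated" into it.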

There are many different types of effects in games, but they can be summarized into three categories.

  • Gameplay effects – Muzzleflashes, explosions, destruction, powerups and so on.
  • Environment effects – Weather, backdrops, wildlife.
  • Cinematic effects – Custom events that happen at only one time in the game. These are the most similar to film as they often have a locked or at least controlled camera.

Combining Houdini and UE4

Houdini allows me to build setups that I can reuse. I've used Maya a lot in the past, and a lot of the time you have to start over from scratch if you want to tweak something early in the workflow. Take the example of a destruction simulation. Let's say you are simulating the destruction of a barrel to bake and bring into the game. In Maya you go ahead and split it up, build the constraints, simulate it and bake it. In Houdini you do the same. So far, so good. Now, the AD comes in and says: "Sorry, it's going to be a crate, not a barrel. My bad." In Maya, you start the process over, curse the AD and work late. In Houdini, you plug the crate in, bake and go home. The setup stays, and if you took care when making it the first time, you don't need to redo it when creating similar effects.
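The principle is easy to sketch outside of Houdini. Below is a toy Python stand-in for such a procedural network; the `fracture` and `simulate` functions are hypothetical placeholders, not Houdini API calls, but they show why swapping the input is a one-argument change:

```python
def fracture(geo: str, pieces: int):
    """Stand-in for a Voronoi fracture node: split the input into pieces."""
    return [f"{geo}_piece_{i}" for i in range(pieces)]

def simulate(fragments):
    """Stand-in for the downstream rigid-body sim, constraints and bake."""
    return {frag: "baked_animation" for frag in fragments}

def bake_destruction(geo: str, pieces: int = 4):
    """The whole downstream network is fixed; only the input geometry varies."""
    return simulate(fracture(geo, pieces))

barrel = bake_destruction("barrel")
# The AD changes their mind: plug the crate in, re-bake, go home.
crate = bake_destruction("crate")
```

Everything downstream of the input is untouched by the swap, which is exactly what makes the Houdini workflow resilient to late art-direction changes.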

In my course I show how to set up a looping smoke ball to use as a texture in Unreal. Say that when you are done with the course, you realize the settings I used for the simulation were crap: "I want a much thicker smoke!" You simply fix the simulation, and the rest of the setup, like the looping and masking, is already done. I'm a very strong believer in being able to reuse assets to keep a high velocity.

Smoke Ball and Explosion Material

The course is aimed at people who aren't super experienced with Houdini, so it relies a lot on using presets and the shelf tools to get the setups up quickly. That way I can point at things to tweak instead of spending lots of time trying to get the basic setups up. So in the course you start by simulating a smoke pillar. Then you make the volume loopable, just like you would in After Effects, except this is in 3D, so you can relight and shade it afterwards. Then you mask out the top and bottom of the pillar and you are left with a ball that fits nicely on a sprite.
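One common way to make a baked sequence loop is to crossfade the second half of the frames into the first half, and masking the pillar down to a ball is then just a vertical falloff. Here is a minimal NumPy sketch of both ideas; it is my own illustration of the general technique, not the course's actual setup:

```python
import numpy as np

def make_loop(frames: np.ndarray) -> np.ndarray:
    """Crossfade the second half of a sim into the first half so the
    sequence wraps seamlessly. `frames` has shape (2N, H, W)."""
    n = frames.shape[0] // 2
    t = (np.arange(n) / n)[:, None, None]          # blend weight per frame
    return (1.0 - t) * frames[n:2 * n] + t * frames[:n]

def height_mask(h: int, w: int, fade: float = 0.25) -> np.ndarray:
    """Fade density to zero at the top and bottom of the frame, leaving
    a 'ball' that fits on a sprite."""
    y = np.linspace(0.0, 1.0, h)[:, None]
    def smoothstep(e0, e1, x):
        u = np.clip((x - e0) / (e1 - e0), 0.0, 1.0)
        return u * u * (3.0 - 2.0 * u)
    m = smoothstep(0.0, fade, y) * (1.0 - smoothstep(1.0 - fade, 1.0, y))
    return np.broadcast_to(m, (h, w))
```

Multiplying the looped frames by the mask gives a sequence that both tiles in time and sits cleanly on a camera-facing sprite.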

This can then be used as the base for an explosion. You use it for smoke and fire by applying some shader love. Combine this with a healthy helping of Cascade and you end up with the base for a pretty cool large-scale explosion.

Exporting Houdini Work into UE4

Yes and no. You can't use Houdini in the traditional sense to make realtime effects. There are always steps in between Houdini and the engine, same as with any 3D package. Houdini is just a very good way of creating the building blocks you need to create effects in engine. That said, with the new vertex animation tools, that's about to change. Now you can bake out some really impressive-looking stuff and pretty much play it back in engine. The technique works by recording the delta position of all vertex points and storing it as a colour in a texture. In engine you simply do the opposite to get the 3D shape back. Oversimplified, yes, but that's about all you need to know to start using it these days. This is a really good tool for creating the cinematic effects I talked about. You can get some really impressive stuff done with the technique. The drawback is that it's prebaked, and therefore not as dynamic as you often need gameplay and environment effects to be.
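The encode/decode round trip described above can be sketched in a few lines of NumPy. This is a simplified illustration of the general vertex-animation-texture idea, ignoring texture quantization and the engine-side shader details; it is not the actual tool's format:

```python
import numpy as np

def encode_vat(rest: np.ndarray, anim: np.ndarray):
    """Pack per-frame vertex offsets from the rest pose into a 0-1
    'texture': rows = frames, columns = vertices, channels = xyz."""
    deltas = anim - rest[None, :, :]               # (frames, verts, 3)
    bound = float(np.abs(deltas).max()) or 1.0     # symmetric bounds
    tex = deltas / (2.0 * bound) + 0.5             # remap [-b, b] -> [0, 1]
    return tex, bound

def decode_vat(rest: np.ndarray, tex: np.ndarray, bound: float, frame: int):
    """What the engine-side shader does: sample the row for `frame`
    and undo the remap to recover the animated positions."""
    deltas = (tex[frame] - 0.5) * 2.0 * bound
    return rest + deltas
```

The bounds scalar travels with the texture so the shader knows how to undo the remap; in a real pipeline the 8- or 16-bit quantization of the texture is what limits precision.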

The End Result

The end result is everything. You always need to keep the bigger picture in mind. A sprite-based effect on its own VERY rarely looks good. But if you add all the necessary details, put it in the environment it's built for, add some camera shake, some nice lighting, and suddenly it can start looking really convincing. A common mistake I see juniors make is that they create their effect in a vacuum. Even worse, on a black background. You can take a cube, give it a bright color and put it on a black background, and it will look good. But once it's placed in game, where the player will see it, it will lose a bit of that impact. My lead at DICE once told me that when I did my work test during the application process, I was neck and neck with someone else. The test was to create an explosion. What got me the job was my use of camera shake. I did this in an engine where I didn't know how to add camera shake, so I had to hand-animate the camera move to get the effect I wanted. Watch a film with explosions in it and count how many are shown with a perfectly still camera. The shake makes the effect feel a lot more powerful.
It's the same with decals and debris. They ground the effect and make it feel part of the world. Perhaps the player was looking the wrong way when the explosion happened. But if there's a scorch mark on the ground and debris bouncing around, he'll understand what just happened behind him.

Of course there can be too much of a good thing. We had an instance during the development of Battlefield 1 where the camera shake from artillery impacts had no distance limit so whenever there was an explosion anywhere on the map, everybody shook. Guess how often there are explosions happening somewhere on the map in a game like that… It’s a balancing act. You want the effect to feel impactful, but you don’t want to overwhelm the player. Same with debris. It’s cool when there are pebbles raining down around you after a big explosion. But if it does so continuously for ten minutes you are going to start looking for an umbrella.
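The fix for that kind of bug amounts to attenuating shake by distance and cutting it off entirely beyond a radius. A minimal sketch of the idea (the function name and falloff curve are my own illustration, not Frostbite code):

```python
def shake_amount(base: float, distance: float,
                 max_radius: float, falloff: float = 2.0) -> float:
    """Scale camera shake by distance to the explosion and cut it off
    entirely beyond max_radius, so a blast across the map doesn't
    rattle every player."""
    if distance >= max_radius:
        return 0.0
    t = 1.0 - distance / max_radius   # 1.0 at the impact point, 0.0 at the edge
    return base * t ** falloff        # ease-out so nearby hits still dominate
```

Tuning `max_radius` and `falloff` is exactly the balancing act described above: big enough that a shell landing nearby feels violent, small enough that the rest of the map stays still.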

Advice

Start out simple and iterate your ass off. Make a quick and dirty simulation, render it out and stick it on a sprite in Cascade. Does it look good? No? Why not? Analyze what's not working and what is, and start tweaking. Finish the setup before you start polishing. A common mistake is to spend hours on getting the simulation perfect in Houdini, and then realize that it doesn't work in game for one reason or another. Don't be scared to mix in real elements as well. Grab some stock footage and make sprite textures from it. Maybe that can work as a base, and you only need to simulate that one little piece that's missing. Mix and match techniques. A lot of cool stuff can be done in shaders. A lot of games rely on a single texture frame and then create the movement with UV distortion or flow maps in engine. Your effect is done when it looks good (and performs) in the situation it will be used in in the game. Don't fall in love with it in the preview lighting/setting. Always be prepared to kill your darlings.
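The single-frame-plus-UV-distortion trick mentioned above can be illustrated on the CPU with NumPy. This is a rough sketch of the idea; in a real game it lives in a pixel shader, and the sampling would be filtered rather than nearest-neighbour:

```python
import numpy as np

def uv_distort(tex: np.ndarray, noise: np.ndarray, time: float,
               strength: float = 0.05, scroll: float = 0.3) -> np.ndarray:
    """Animate a single static frame by offsetting its UV lookup with a
    scrolling noise texture -- movement without a flipbook."""
    h, w = tex.shape
    ys, xs = np.mgrid[0:h, 0:w]
    u = xs / w
    v = ys / h
    # sample the noise with a time-scrolled, wrapping lookup
    nv = (v + time * scroll) % 1.0
    n = noise[(nv * h).astype(int) % h, (u * w).astype(int) % w]
    # push the main texture's UVs by the (centred) noise value
    du = (n - 0.5) * strength
    su = ((u + du) % 1.0 * w).astype(int) % w
    sv = ((v + du) % 1.0 * h).astype(int) % h
    return tex[sv, su]
```

Rendering this for increasing `time` values makes the static frame appear to roil and drift, which is why a single well-chosen texture can carry a whole smoke or fire effect.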

Andreas Glad, VFX Artist

Interview conducted by Kirill Tokarev

Follow 80.lv on Facebook, Twitter and Instagram

Comments


markus (Guest)
I have an essential question: I have trouble making particle effects where the impact feels strong. For example, a weapon slash is quite easy to make, but how do I make it feel strong and impactful? I always struggle to get impactful particles. Any ideas, videos, tutorials or tips?

Roy's half brother (Guest)

Why mechanical engineering?

Jed (Guest)

I thought camera shakes were automatically handled by the LFE (sub audio) channel in Frostbite. Were you manually tweaking those camera shakes only for cinematics, or for in-game real-time effects as well?

Andreas Glad (Guest)
Roy, I’m flattered you are looking out for me, but I did leave very much by my own choice. I handed in my resignation in March (when I applied to university) but was talked into staying until my university semester started in September. In Sweden you can quite easily see if you’ll be accepted or not so I was sure of it and had planned a nice summer of leisure, before starting everything up in the fall. That didn’t really happen 🙂 I’ve been approached by DICE several times since, as they wonder when I’m going to “give up” this…
Roy (Guest)

Yeah, he totally “left” Dice to start his own company, which means he got let go after the game was done, which is pretty common in the industry. Just no need to try to make it sound like he left by choice.
