Real-Time VFX: Overview from Keith Guerrette

Keith Guerrette, the founder of Beyond-FX and RealTimeVFX, talked about the peculiarities and evolution of real-time VFX, the technical approach to effects work, the knowledge VFX artists need, and the RealTimeVFX community.


Self-introductions are always the hardest, but here we go… My name is Keith Guerrette, and I’m a real-time visual effects artist, the founder and a principal artist at Beyond-FX, Inc., and the founder of the RealTimeVFX community.

I come with a fairly long history of developing games (13 years), all specifically as a visual effects artist. During that time, I’ve been fortunate enough to be a part of the creation of some truly inspirational developments—Uncharted 2, 3, & 4, The Last of Us, What Remains of Edith Finch, and many others. But more than that, I’ve been able to watch our medium of games mature enough to allow us to tackle those types of complex narrative experiences.

I started like most of my peer group — at a technical school specializing in 3D arts. At the time, Full Sail University had the educational approach of teaching the full 3D pipeline for films, so I was introduced to most art skill sets of a production and absolutely fell in love with compositing and visual effects. Both require the artist to have an understanding of math and art in such a way that every task is its own puzzle, waiting to be solved in unique and rewarding ways.

I took my first steps into the video game industry at High Moon Studios during an opportune time for my career. Console developers had just jumped up to the PS3 and, for the first time ever, had spare processing power to spend on visual effects. This was the era that games started to feel awesome. This was also the dawn of the “VFX Artist” in games — there was suddenly enough available work for the profession to be a full-time job. I didn’t know it at the time, but I stumbled into an industry niche that was and still is extremely in-demand.

That demand allowed me to step into a leadership role at Naughty Dog as a fairly inexperienced artist, right as the studio (and industry as a whole) was learning to use this extra power to tell stories through visuals and cinematography. Though often subtle, the role FX plays in these stories is important, adding texture and layers that give greater depth and make a world more engaging.

Today I’m at the helm of Beyond-FX, a studio comprised of some of the most inspiring visual effects artists and game developers in the world. Our goal is to fundamentally change the way our industry works by partnering with creative developers and working as a fully integrated, but external, part of their art and engineering teams. Together, we’re creating remarkable real-time experiences that leave a lasting impact. We’re also the admins, or perhaps rather caretakers, of the RealTimeVFX community.

Real-Time VFX Peculiarities 

The “real-time” requirement simply means that the full, final image seen by the viewer — from simulation to integration to render — has to be produced in less than 1/60th of a second (if you want 60 fps). If the visuals can update that quickly, they can respond in real-time to the user’s input, whether through a game controller, a headset, or whatever other feedback loop is at play. To compare with movies, most of the effects we see on the big screen take hours if not days to simulate, then as long again to render and composite. The creation of a single frame, not including artists’ time, can frequently take several days.
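To put that budget in concrete terms, here is a quick back-of-the-envelope sketch; the frame rates shown are just common targets, not figures from any particular engine:

```python
# Frame-time budgets at common target frame rates: everything the
# player sees -- simulation, integration, rendering -- must fit
# inside this window on every single frame.
for fps in (30, 60, 90, 120):
    budget_ms = 1000.0 / fps
    print(f"{fps:>3} fps -> {budget_ms:.2f} ms per frame")
```

At 60 fps the entire frame must land in roughly 16.67 milliseconds — the contrast with a film frame that takes hours or days is the whole point of the “real-time” constraint.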

That explanation doesn’t necessarily do justice to how different our approaches have to be. With real-time, our general rule is if it doesn’t run at the frame rate, it might as well not exist, because we can’t ship the product with it. This changes our focus from making something simply visually beautiful to figuring out how to make something that runs on whatever the target hardware is and meets the design purpose of the effect. Then, and only then, can we turn our attention to making it beautiful and compelling.  

This almost always leads to creative cheats and hacks, which is where our jobs begin to feel like solving puzzles. I can make a cool-looking explosion in Unreal Engine in a white box, but maybe once it’s in the level, my computer just can’t maintain that 60 fps. So then we start looking for our hacks, which typically go something like this:

  • Does the camera actually only see it from one angle? I bet I can reduce the number of particles to only look good from that angle.

  • Is the light direction locked too? Perfect, let’s disable real-time lighting and bake the light directly into the assets.

  • Is the ground plane flat? Let’s get rid of geometry collisions and use just a ground plane primitive.

  • Does the camera turn before the smoke dissipates? That means I can reduce the fidelity of my textures and focus on the fiery explosion part.

  • Does it need more art-directed shapes? I guess I’m going to paint those by hand.

  • Needs smoother motion without increasing the texture memory? I’ll add some motion through a shader.

The other exciting part of this field is also a direct result of this need to constantly solve these puzzles — everyone around me is innovating all the time. I’ll be so proud of something I’ve created, but then a few days later someone else will come up with a different technique that absolutely blows mine away. At this point in my career, I’ve made thousands of effects, but I’d be hard-pressed to find two that followed the exact same process and workflow.

Technical Approach to VFX for Games

I’m having a hard time picking out technical approaches that are universal. As I described in the last paragraph, every project is bound entirely by the platform that it has to run on. As a developer, if you’re targeting the widest audience possible, you need to make a product that runs on some seriously archaic devices. Let’s take League of Legends as an example — I wouldn’t be surprised if it ran on machines that are slower than the refrigerators in some of your homes right now. Because there are so many variables across the industry — platform, engine, art style, game type, etc. — exactly how you achieve your effects can vary wildly.

These types of limitations cause us to take a step back and think through this question: What is the purpose of this effect? Sometimes, as with many effects in LoL, it’s actually to deliver information, such as, “What is the radius of damage from this attack?” or, “Was that a heal or a damaging spell?”

Sometimes the effect is there just to let the player know they did something by providing visual feedback like, “Congratulations, you wasted a bullet and shot the wall.”

In each of these scenarios, when you understand the actual purpose of the art, you start to see the most effective ways to achieve it, both technically and visually.

Now as to the art itself — that is where universal principles come in, and frequently where our premature technology actually gets in the way. Visual effects, regardless of their techniques, are by-products of classic art and animation principles — composition, color, shape, timing, follow-through, secondary motion, etc. These are the skills that will universally make your effects impress.

Real-Time VFX Nowadays

I’ve frequently said in the past that real-time VFX tools are 10 years behind the film industry. I started building my effects out of more and more complex shaders around 2011 — roughly 10 years after the tricks of the film industry were largely fancy shaders on particles, through tools such as Afterburn. Today, games are starting to utilize simple but beautiful approximations of volumes, whether through 2D slices, low-res voxels, etc. The parallel in film is that this once again roughly matches the growth of real volumetric fluid simulations in productions through tools like FumeFX circa 2009.

While that does give us a neat roadmap, and a starting year to search through SIGGRAPH’s archives, I’ve also come to realize that it’s a fairly restrictive way to look at the potential of real-time tools. Many film simulations spend a vast majority of their computational time achieving the last 5%-10% of picture fidelity. By and large, games are not striving to look even 90% as good as their film counterparts, yet our tools have traditionally veered away from even trying film techniques — this leaves a MASSIVE opportunity for innovation, and we’re slowly starting to see the fruits of it.

We’re on the cusp of some fundamental paradigm shifts in our tools from every major game engine in the industry, notably Niagara in Unreal and the upcoming particle tools in Unity. Both of these are geared around the philosophy of allowing more flexibility and control for developers to invent their own solutions, rather than providing restrictive pipelines. While many of these tools are in beta right now, it’s a regular occurrence for me to see new art created by a particularly tech-savvy artist and have no idea how it was made. That thought is both infectiously exciting, because it’s like the wild west for us, and terrifying for an old artist like me stressing over keeping up.

Essential Knowledge for VFX Artists

Our world is changing so rapidly that I struggle to say which tools are essential to learn. The first crucial step, if you’re trying to enter the real-time industry, is to open a game engine, any engine, and learn how to develop effects in that environment. I’m still surprised how often I see VFX artists from film or other mediums who have never opened a game engine applying for real-time positions.

Past that (hopefully) obvious statement, it gets a little more ambiguous. What I can say is that there are two sets of skills you can develop that will always be useful and make you stand out (the order matters here):  

  • Have a solid foundation in artistic principles—composition, color, shape, timing, follow-through, secondary motion, etc.  If you’re strong in these skills, you’ll know how to create exciting, fresh art with whatever programs you’re working in.

  • Learn how to use the tools in a game engine to make a simple but diverse set of visual effects. Start with the fundamentals of understanding emitters, textures, and particle motion. Then start to explore the cool materials/shaders people are using. In most engines, you can get quite far without needing to become extremely technical, and then you can begin to explore the more fringe solutions as you get comfortable — dig into writing shaders, whether with nodes or through HLSL. Get familiar with the design scripting that will be used to trigger your effects. Explore offline animation, rendering, or simulation tools such as Adobe Animate, Houdini, FumeFX, Thinking Particles, etc. — each is extremely useful, but ultimately a small tool in your toolbox compared to the game engine itself.
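To make the first of those fundamentals concrete, here is a minimal, engine-agnostic sketch of what an emitter and particle motion boil down to under the hood. The class names, spawn rate, and lifetime values are illustrative assumptions, not taken from any specific engine:

```python
import random
from dataclasses import dataclass, field

@dataclass
class Particle:
    x: float                # position
    y: float
    vx: float               # velocity
    vy: float
    life: float             # seconds remaining before the particle dies

@dataclass
class Emitter:
    rate: float             # particles spawned per second
    gravity: float = -9.8
    particles: list = field(default_factory=list)
    _spawn_debt: float = 0.0

    def update(self, dt: float) -> None:
        # Accumulate fractional spawns so the rate holds at any frame time.
        self._spawn_debt += self.rate * dt
        while self._spawn_debt >= 1.0:
            self._spawn_debt -= 1.0
            self.particles.append(Particle(
                x=0.0, y=0.0,
                vx=random.uniform(-1.0, 1.0),
                vy=random.uniform(2.0, 5.0),
                life=random.uniform(1.0, 2.0)))
        # Integrate motion, then age out dead particles.
        for p in self.particles:
            p.vy += self.gravity * dt
            p.x += p.vx * dt
            p.y += p.vy * dt
            p.life -= dt
        self.particles = [p for p in self.particles if p.life > 0]

emitter = Emitter(rate=100.0)
for _ in range(60):              # one simulated second at 60 fps
    emitter.update(1.0 / 60.0)
print(len(emitter.particles))    # roughly 100 particles still alive
```

Every real engine layers textures, shaders, and GPU simulation on top of this, but the core loop — spawn by rate, integrate velocity, kill by lifetime — is the same idea you are learning when you first open a particle editor.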

No matter which tools you pick up, just know that there isn’t any single correct solution or approach to creating an effect, and there will always be a better way. If you get into visual effects, it means you’re signing up to be a student of the industry for the rest of your career.

The RealTimeVFX Community

When I stepped away from my role at Naughty Dog, I was fortunate enough to have several amazing opportunities to help other productions around the world, each with their own workflows and tech. Alongside these unique, customized methods of producing work, I found that each place had its own language as well. It was exciting, but also baffling — what one VFX artist would call an emitter at one studio meant something entirely different at another, and it made it nearly impossible for me, an outsider, to convey ideas without stumbling over basic differences in vocabulary. At the same time, I also realized that so many of these productions had engineers who wanted to build cool tools, but it was a free-for-all of well-intentioned ideas. In other words: the industry as a whole wasn’t communicating.

Tutorial by Andreas “Partikel” Glad

I’d initially spent time thinking that providing educational content was a great way to help relieve some of these issues. I talked with my peers, trying to discern the best medium for educational content delivery, and learning A TON about web hosting. Through all of those conversations, what I saw repeatedly was that everyone in our industry wanted to share, but had no place to do it. Through the help and, honestly, pep talks of fantastic people like Jason Keyser, David Johnson, Jennie Kong, and Drew Skillman (along with a brutal self-education in web administration), I was able to pivot quickly and launch the forum for relatively little cost. The most exciting part: it’s fully indexed by Google. I can’t begin to tell you how thrilled I was the first time I searched Google for an answer and it led me straight to a forum post from another artist, halfway around the world, willing to share her techniques.

VFX by Leroy “Sirhaian” Kevin

Today we have around 2300 active users, with several hundred unique visitors each day. That seems small until I consider there are probably more active users on the forum than there are real-time visual effects artists in the world. While my studio, Beyond-FX, funds the still-small server fees, we’ve built out a team to actually care for the site — Nathan Lane of Riot Games is the heart and soul of the community, running all of the monthly sketches and crafting fantastic dialogue. Jason Keyser of Riot Games fosters education and runs the “Jobs” board as both an attempt to funnel recruiters to a very helpful channel and a way to drive some economy through the site. The rest of the success of the community is entirely due to the wonderful and inspiring artists who are generously contributing, sharing, and challenging each other to grow.

LoL VFX by Shannon Berke

Our desire since the site’s inception has been this: we want the site to be a place for any artist or business to productively reach other real-time visual effects artists around the world — after all, this can only benefit all of us. And it already has! The sharing happening on the forum, at the VFX Bootcamp at GDC, in our Facebook group, Discord channel, and everywhere else is unifying our industry and directly driving exponential improvements in our tools, techniques, and art.

Keith Guerrette, Founder & VFX Director at Beyond-FX

Interview conducted by Kirill Tokarev
