Hi! My name’s Greg Hird-Rutter and I’ve been a VFX Artist for the last 18 years. I started working in TV production and after several years I moved to game development, then to film and then back to game dev, where I've been ever since. In my time working in the film and TV industries, my projects ranged from CG animated shows like Hot Wheels and MTV's Spider-Man, to live action films like Life Of Pi. In game dev, I've largely worked on Action/Adventure projects like Assassin's Creed 4: Black Flag, For Honor, Mass Effect: Andromeda and Star Wars: Battlefront 2. I've been working at EA Motive in Montreal for the last 2 years and I'm currently the Lead VFX Artist on an unannounced project.
What Tools to Learn for VFX?
It’s important to learn fundamentals in a 3D package like Maya or 3ds Max, but Houdini is quickly becoming the go-to package for VFX in both the Film/TV and Game Development industries. It has a steep learning curve, but once the user is comfortable with it, it opens up a ton of possibilities for authoring real-time VFX. On the game engine side, it’s valuable to learn either Unreal or Unity (or a bit of both). These engines are fairly user-friendly and commonly used throughout the industry, and even the proprietary software some studios use tends to have a lot of similarities to both. As for how tools can be improved, that’s never-ending. For VFX specifically, a lot of game engines are trying to improve the usage of GPU particles. GPU particles allow for much more complex effects, but the workflows can also be complex and as a result aren’t always ideal for a production environment.
Film/TV VFX vs Game VFX
Because VFX for games have to run in real-time, they can’t, by their very nature, be as complex as what can be achieved in film or TV. Game VFX also have to effectively communicate the gameplay to the player, so they have to be authored with the player’s experience in mind. This makes for a more collaborative and iterative approach when creating content for a game.
For Film/TV, there are far fewer constraints, so FX can be far more complex. There's more opportunity to concentrate on quality without being concerned with the constraints of a real-time engine.
Both film and game dev present their own unique challenges, but each involves recreating natural phenomena or fantastical/sci-fi elements and trying to create "wow" moments for the player or audience.
VFX for Games: Workflow
When creating any game effect, the first place to start is with reference. Look at film, game and YouTube footage and break down the FX into their separate elements. Something simple like a bullet impact will still contain a number of different components, such as the initial flash, sparks, debris, dust and finally the damage to the surface that was impacted.
When you begin to author the effect, keep it simple and concentrate on creating these elements with simple particles that use basic textures, meshes and shaders. Be sure to test everything in context. It should be viewed the way the player will see it and not just in a particle editor. Ideally, this can be tested with a designer so you can receive feedback on how effectively you're communicating the intended gameplay. For this bullet impact example, you should be asking: is it clearly showing what surface is being hit, does it differentiate between impact types, and does it communicate the power of the weapon firing it?
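To make the layering idea concrete, here's a minimal sketch of a bullet impact broken into timed elements. The element names, delays and durations are illustrative guesses, not values from any particular engine or project:

```python
from dataclasses import dataclass

@dataclass
class EffectElement:
    name: str
    delay: float     # seconds after impact before the element spawns
    duration: float  # how long the element stays alive

# Hypothetical breakdown of a bullet impact into separate components.
BULLET_IMPACT = [
    EffectElement("flash",         0.00, 0.05),
    EffectElement("sparks",        0.00, 0.30),
    EffectElement("debris",        0.02, 0.60),
    EffectElement("dust",          0.05, 1.50),
    EffectElement("surface_decal", 0.00, 10.0),
]

def active_elements(effect, t):
    """Return the names of elements visible t seconds after impact."""
    return [e.name for e in effect if e.delay <= t < e.delay + e.duration]
```

Timing out each element separately like this makes it easy to review the effect's rhythm with a designer before any textures or shaders get polished.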
When you're happy with the overall look and timing, it's time to polish. You can implement more complex textures and shaders as they're needed. Performance should be a priority throughout the creation process, but at this stage you can also optimize more aggressively. Certain elements can be disabled or reduced based on distance from the player or when offscreen.
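The distance-based reduction mentioned above can be sketched as a simple budget function. This is an illustrative linear falloff, not the scalability system of any specific engine, though engines like Unreal and Unity expose similar per-emitter LOD settings:

```python
def particle_budget(base_count, distance, on_screen,
                    full_detail_dist=10.0, cull_dist=50.0):
    """Scale a particle count down with distance; cull it offscreen.

    Linear falloff between full_detail_dist and cull_dist (both in
    world units). The thresholds here are arbitrary example values.
    """
    if not on_screen or distance >= cull_dist:
        return 0  # offscreen or too far away: skip the element entirely
    if distance <= full_detail_dist:
        return base_count  # close enough for full detail
    t = (distance - full_detail_dist) / (cull_dist - full_detail_dist)
    return int(base_count * (1.0 - t))
```

For example, an emitter budgeted at 100 particles up close would spawn around 50 at the midpoint of the falloff range and none beyond the cull distance.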
When you're happy with things, be sure to test and view the effect in as many different environments and situations as possible. The last thing you want is an effect that looks beautiful in one environment and awful in another.
This is the general process I follow when authoring VFX whether it's something simple or complex.
Mentoring at CG Spectrum
I became a mentor at CG Spectrum when I saw how the process works. I really appreciated that the mentors get to communicate directly with students. There's a lot of online information out there, but I feel that having a mentor to guide students through it all is a far more effective way to learn than for students to just parse through tutorials on their own. I also like the opportunity to work with students and mentors worldwide.
Houdini Program at CG Spectrum
Ben Fox, Houdini FX Department Head:
The Houdini courses at CG Spectrum were created by VFX professionals with a focus on teaching the concepts anyone needs to get started in a career as a VFX artist, in any industry. The majority of the course work in the Houdini program focuses exclusively on learning the fundamentals of Houdini and how to use it to create a wide variety of FX. While we recommend students who enroll in the Houdini program have a basic knowledge of 3D content creation tools, no previous Houdini training is required. One of the big things we discuss in the Houdini program is how to use Houdini in a full production environment. This includes working with industry standard file formats such as Alembic and FBX for geometry and multi-channel EXR images for compositing in Nuke. While we really focus on creating film-style VFX in Houdini, this is the most useful way to learn the software, even if your end goal is to create FX for a game engine. If students can’t make a great-looking explosion in Houdini, there is no way they could use it to make a great-looking element for a real-time engine. That is why we focus on using Houdini fully from the ground up, not just a couple of step-by-step lessons to make cookie-cutter elements.
Let's look at how one kind of effect is taught as an example. In the full Houdini course, we dig into creating Pyro FX in term 2. This starts with an overview of volume data and how it is handled in 3D packages. This information is standard throughout most FX packages, so it helps with most volume work. Then we move into sourcing our Pyro FX. To really refine pyro, we need to create and shape attributes on the points that we are planning to source our pyro from. In term 1 we cover tons of ways to create and refine point attributes, so we tap into that knowledge to refine our pyro source attributes.
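As a rough illustration of what "shaping source attributes" means, here is a plain-Python stand-in for the kind of per-point work a Houdini attribute wrangle would do. The attribute names follow Houdini's pyro-sourcing conventions, but the falloff curve is an illustrative choice, not a SideFX default:

```python
import math

def source_attributes(points, center, radius):
    """Per-point density and temperature attributes for a pyro source.

    Points nearer the center emit denser, hotter smoke; the squared
    falloff on temperature is an arbitrary example of shaping one
    attribute differently from another.
    """
    out = []
    for p in points:
        d = math.dist(p, center)
        falloff = max(0.0, 1.0 - d / radius)  # 1 at center, 0 at the edge
        out.append({"P": p, "density": falloff, "temperature": falloff ** 2})
    return out
```

In production you'd author these attributes procedurally inside Houdini, but the idea is the same: the simulation only looks as good as the attributes feeding it.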
Once sourcing is covered, we move into the fundamentals of the Pyro Solver, the main tool that performs the simulation in Houdini. By the end of the six-week Pyro section, students will have a fundamental understanding of creating smoke, flames and explosions. These could be rendered out as sprite sheets for real-time projects, composited into a live-action plate, or used as the basis for a velocity field in countless other effects and programs.
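The sprite-sheet route is worth a quick sketch: a real-time engine plays a rendered simulation back as a flipbook by offsetting UVs one frame at a time. This helper assumes frames laid out left-to-right, top-to-bottom with frame 0 in the top-left, and UV (0,0) at the bottom-left (OpenGL-style); conventions differ between engines, so flip as needed:

```python
def flipbook_uv(frame, columns, rows):
    """UV offset and scale for one frame of a sprite-sheet flipbook."""
    col = frame % columns
    row = frame // columns
    scale = (1.0 / columns, 1.0 / rows)
    # Frames are counted from the top-left, so flip the v offset.
    offset = (col * scale[0], 1.0 - (row + 1) * scale[1])
    return offset, scale
```

For a 4x4 sheet, frame 0 maps to the top-left quarter of the texture and frame 15 to the bottom-right; a shader would sample `uv * scale + offset` while advancing `frame` over the particle's lifetime.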
FX is challenging and has a steep learning curve. It's always good to start small and make sure you really understand the concepts and techniques before diving too deep. Never be afraid to try something, have it not work out, and start over with a new approach. Once students are more comfortable with the workflows and processes, a lot of possibilities open up.
Creating FX is rewarding, as it's an opportunity to create impactful moments in a game or film. It's a domain that is always changing and evolving, so the learning will never stop.