My name’s Leo Lucien-Bay. I’m an animator who worked in the gaming industry as a Cinematic Designer on the Mass Effect and Dragon Age franchises. I started doing animation way back in the early days of Machinima, before it was a website or YouTube channel, even before it had that name. I started making Quake movies as a hobby and over time, I moved outside of games to tools like iClone and Motionbuilder that enabled me to pursue animation as a career.
The Problem with Cinematics
From the very beginning, I found that there was tension between games and cinematics. Games and films are inherently different art forms, but for decades their concepts have been combined to enhance our gaming experience. In Machinima, the games often provided a huge range of 3D assets we could use to build our worlds and stories. It was great, but we often lacked the right tools to easily create and import characters and animations.
In the gamedev world, you have professional tools and a lot of artistic talent working alongside you, but ultimately the cutscene is often made outside of the game environment. This can be a particular disadvantage when you’re trying to give the impression that the characters are from a certain world and supposed to be interacting with it. With Reallusion’s tools and iClone Unreal Live Link, I feel like a much-needed bridge has finally been built between the creation of characters, animation, and their integration into the game environment.
What is iClone Unreal Live Link?
Live Link is an idea that existed before its integration with iClone. The Live Link plugin solves the problem of digital content creation (DCC) tools always being separated from the games they help make content for.
With iClone Unreal Live Link you can animate a character in a DCC tool and see the results immediately in Unreal, removing the need to create and iterate in an environment that’s far removed from the game world. That said, things can still fall apart in the complexity of large, multi-disciplinary tools like Maya (which seems to be the tool most commonly associated with Live Link).
For solo developers or small teams, the streamlined nature of Reallusion tools like Character Creator 3 and iClone makes Live Link an even more attractive prospect, and in this article I’ll go over exactly what some of those benefits are.
This is how it works:
It’s possible to transfer characters, cameras, and lights; however, it’s the character transfer that’s most significant due to its general difficulty. Anyone who has ever imported characters into Unreal will know this can be a laborious process, especially if, due to iteration, it needs to be done multiple times per character model. With the Live Link plugin, the transfer of characters into Unreal is a one-click process, like an automation script that takes care of exporting and importing the mesh, blendshapes, poses, textures, and materials. This way, neither the character creation nor the setup process needs to be retreaded in the engine.
Once the transfer of assets is complete, animations and, in some cases, object property changes are then reflected in Unreal in real time, removing the disconnect between animation and environment.
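The real-time streaming described above follows the general live-link pattern: the DCC tool serializes per-frame transform data and the engine applies it as it arrives. Purely as an illustrative sketch (this is not Reallusion’s actual protocol, and every name here is hypothetical), one frame of bone transforms might be packed and unpacked like this:

```python
import json

def encode_frame(frame_index, bone_transforms):
    """Pack one frame of bone transforms into a JSON message.

    bone_transforms maps bone name -> (position xyz, rotation quaternion xyzw).
    """
    payload = {
        "frame": frame_index,
        "bones": {
            name: {"pos": list(pos), "rot": list(rot)}
            for name, (pos, rot) in bone_transforms.items()
        },
    }
    return json.dumps(payload).encode("utf-8")

def decode_frame(message):
    """Unpack a frame message back into (frame_index, bone_transforms)."""
    payload = json.loads(message.decode("utf-8"))
    bones = {
        name: (tuple(data["pos"]), tuple(data["rot"]))
        for name, data in payload["bones"].items()
    }
    return payload["frame"], bones

# One frame for a single bone: sent by the DCC tool, applied by the engine.
msg = encode_frame(42, {"head": ((0.0, 1.7, 0.0), (0.0, 0.0, 0.0, 1.0))})
frame, bones = decode_frame(msg)
print(frame, bones["head"][0])  # → 42 (0.0, 1.7, 0.0)
```

The point of the sketch is only that the engine side never re-imports anything per frame; it just keeps applying small transform updates as they stream in.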
Unreal Sequencer is a fairly robust cutscene creation tool that allows for the import and blending of character animations via animation clips. But it doesn’t offer a simple way to make adjustments to these clips. Way back in the Mass Effect 3 days, we used to wish we had a limited ability to animate characters within Matinee. Not so that we could have Matinee be our animation tool, but so that we could more easily offset bones on animations we’d imported.
Despite our many custom changes to the engine and tools, we never quite got further than blending animations and applying offsets as a fix for this problem. Not an ideal solution, but better than having a whole bunch of characters salute with their hands halfway into their heads.
But what about when the changes need to be more complex or are needed more frequently?
Even now with the advances made in Sequencer, it’s difficult to make significant changes to animation clips. This is where being able to effectively stream the animation controls from a DCC tool really comes in handy. Being able to animate to other characters, objects, or even events in the game world not only removes the guesswork from cinematic animation but significantly cuts down on the back-and-forth iteration time between tool and engine.
It’s a huge added convenience to be able to animate Unreal cameras from iClone. It’s not uncommon for animation in cutscenes to be done according to what’s visible on camera. Why clean noise or fix weirdness in the legs if the shot only shows the upper body? So maybe it's not too surprising that iClone Unreal Live Link supports camera animation. What’s nice in this case is the ability to also stream properties like depth of field, allowing camera work done in iClone to much better represent the final version.
This is also helpful since we’re actually seeing the game world accurately in Unreal, rather than relying on the often grey, textureless level geometry that serves as a proxy while working in DCC tools. I found it much better doing the camera work in one place, instead of once in the DCC tool and again in the engine to finalise it. With Live Link I was also able to see the level with its VFX active in the background, as well as the full extent of the level, so I could be fully confident that the objects and effects I was seeing would be the same in the final version.
Depth of Field video snippet:
I think for some this one might be a head-scratcher. Unreal already has the ability to add and animate lights! But for me, the convenience here wasn’t much different than with cameras. Yes, the majority of the lights you need will already be in your Unreal scene but it’s the ones that need to interact specifically with your characters and specific camera work that make this a useful feature.
Again, it wasn’t just the fact that I could animate the location and rotation, but that all three main light types in iClone are seamlessly tied to their Unreal equivalents, giving the best approximation without me having to think about how to match them between tool and engine. Overall, the more I was able to control from iClone and see the results in Unreal, the smoother the animation process was.
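The seamless matching described above presumably comes down to the three standard light types having direct counterparts in Unreal’s built-in light actors. A hedged sketch of that kind of lookup (the plugin’s actual internal mapping is Reallusion’s detail; these pairings are my assumption based on the standard light types):

```python
# Assumed pairing of the three main light types between tool and engine;
# the plugin's real internal mapping may differ.
ICLONE_TO_UNREAL_LIGHTS = {
    "Directional": "DirectionalLight",
    "Point": "PointLight",
    "Spot": "SpotLight",
}

def unreal_light_for(iclone_type):
    """Return the Unreal light actor class name for an iClone light type."""
    try:
        return ICLONE_TO_UNREAL_LIGHTS[iclone_type]
    except KeyError:
        raise ValueError(f"No Unreal equivalent known for {iclone_type!r}")

print(unreal_light_for("Spot"))  # → SpotLight
```

A fixed one-to-one table like this is exactly why the user never has to think about the conversion: there’s nothing ambiguous to decide per light.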
My knowledge of the Maya version of Live Link isn’t extensive. The reason is that I’m mostly a one-man band, and I find it better to use more streamlined solutions for my work.
The benefits of the iClone Unreal Live Link plugin weren’t just in the plugin itself but the combination of Character Creator, iClone, and Live Link. From character creation to the fast animation process in iClone, there are so many steps taken care of under the hood that end up saving me time and allowing me to focus on the bigger picture.
It seems this philosophy very much carries over to iClone’s Live Link plugin. I didn’t have to worry about different versions of characters going between the two tools; the plugin actually converted materials, rather than the limited transfer of textures you’d usually get when exporting FBX files; and I could stream individual cameras or just the final camera track that switched between all of them.
All in all, the process was simple, and I’d even say it filled in some of the knowledge gaps I have on Unreal’s technical side. In the end, the less I have to worry about the journey from DCC tool to engine, the more time I can spend working on what the audience actually sees.
Learn more about iClone Unreal Live Link.
Learn more about iClone.
Learn more about Character Creator.
Leo Lucien-Bay, 3D Artist