Breakdown: Creating a Virtual Ragdoll Puppet in Unreal Engine

3D Artist and Animator Peter Javidpour shared a breakdown of the Digital Puppeteering demo, explained the working process in Unreal Engine, and told us how a virtual camera rig was used to create the project.

In case you missed it, a couple of days ago, 3D Artist and Animator Peter Javidpour shared Digital Puppeteering, an incredible demo showcasing an Unreal Engine-powered system for virtual ragdoll puppets that the artist developed a year ago.

Luckily for us, the creator agreed to share an extended breakdown of the project, explaining the working process in Unreal Engine, discussing the engine tools used for the system, and describing how a virtual camera rig was used to create the demo. You can read the full breakdown below and check out more of Peter's projects here.

Peter Javidpour: I made this demo a year ago to implement some ideas around animating puppets with Unreal Engine and off-the-shelf VR equipment. I’ve had a longstanding interest in using real-time tech to build workflows for performance-driven animation. This is the same custom toolset I eventually used to animate the titular character in my recent short film, My Breakfast With Barf.

I chose to build the demo in Unreal because of its toolkit for character animation (Control Rig, Physics Actors, Animation Blueprints), as well as built-in support for filmmaking (namely Sequencer and Virtual Cameras).

The limbs of the character (named Burd in the project files) are driven by ragdoll physics, while the body is tracked to the movement of a Vive motion controller. The physics system generates floppy, chaotic animation whenever the character is moved, which is perfect for the playful aesthetics and tone of the demo. Burd’s beak flaps open and shut using the trigger on the Vive controller, and the neck turns using input from the controller’s trackpad. These interactions are controlled with a simple Blueprint script.
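The Blueprint itself isn’t shown in the demo, but a rough C++ equivalent of that setup might look like the sketch below. Everything project-specific here is an assumption: the `spine_01` bone, the `head_socket` socket, and the `BeakOpen`/`NeckTurn` axis mappings stand in for whatever names Peter’s actual project uses.

```cpp
// BurdPuppet.h -- hypothetical C++ sketch of the controller-driven puppet
// described above; not taken from Peter's project files.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Pawn.h"
#include "MotionControllerComponent.h"
#include "Components/SkeletalMeshComponent.h"
#include "Components/StaticMeshComponent.h"
#include "BurdPuppet.generated.h"

UCLASS()
class ABurdPuppet : public APawn
{
    GENERATED_BODY()

public:
    ABurdPuppet()
    {
        // The puppet's root follows the right-hand Vive controller.
        Hand = CreateDefaultSubobject<UMotionControllerComponent>(TEXT("Hand"));
        SetRootComponent(Hand);
        Hand->MotionSource = TEXT("Right");

        Mesh = CreateDefaultSubobject<USkeletalMeshComponent>(TEXT("Mesh"));
        Mesh->SetupAttachment(Hand);

        // Treating the beak as a separate mesh on a head socket keeps it
        // out of the ragdoll simulation. "head_socket" is a placeholder.
        Beak = CreateDefaultSubobject<UStaticMeshComponent>(TEXT("Beak"));
        Beak->SetupAttachment(Mesh, TEXT("head_socket"));
    }

    virtual void BeginPlay() override
    {
        Super::BeginPlay();
        // Let the limbs ragdoll while the root stays pinned to the
        // controller. "spine_01" is a placeholder bone name.
        Mesh->SetAllBodiesBelowSimulatePhysics(TEXT("spine_01"), true, false);
    }

    virtual void SetupPlayerInputComponent(UInputComponent* Input) override
    {
        Super::SetupPlayerInputComponent(Input);
        // "BeakOpen" and "NeckTurn" are assumed axis mappings bound to the
        // Vive trigger and trackpad X in Project Settings > Input.
        Input->BindAxis(TEXT("BeakOpen"), this, &ABurdPuppet::FlapBeak);
        Input->BindAxis(TEXT("NeckTurn"), this, &ABurdPuppet::TurnNeck);
    }

private:
    void FlapBeak(float TriggerValue)
    {
        // Map trigger pull (0..1) to a beak opening angle.
        Beak->SetRelativeRotation(FRotator(-30.f * TriggerValue, 0.f, 0.f));
    }

    void TurnNeck(float PadX)
    {
        // Accumulate a yaw value from the trackpad; in practice this would
        // be fed into the Animation Blueprint or Control Rig to rotate the
        // neck bone.
        NeckYaw = FMath::Clamp(NeckYaw + PadX * 2.f, -90.f, 90.f);
    }

    UPROPERTY() UMotionControllerComponent* Hand;
    UPROPERTY() USkeletalMeshComponent* Mesh;
    UPROPERTY() UStaticMeshComponent* Beak;
    float NeckYaw = 0.f;
};
```

The key idea is the split between simulation and tracking: the root body stays kinematic and glued to the controller, while everything below the chosen bone simulates, which is what produces the floppy follow-through.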

I wanted the puppet to traverse a fairly large set while performing, so I wrote a custom Blueprint for toggling between a closeup perspective for finer performance control and a zoomed-out view for traveling across the stage. This system allows me, as a performer, to move the puppet some distance while zoomed out, then “snap to” the puppet’s new position without getting motion-sick.
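Peter doesn’t detail how the toggle works. One plausible way to get this “giant performer” zoom in VR is to change the world-to-meters scale and teleport the pawn on snap, as in the hedged sketch below; the class name, the `ToggleZoom` action mapping, and the scale constants are all assumptions.

```cpp
// PerformerPawn.h -- a guess at the zoom/snap Blueprint, not Peter's
// actual implementation.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Pawn.h"
#include "GameFramework/WorldSettings.h"
#include "PerformerPawn.generated.h"

UCLASS()
class APerformerPawn : public APawn
{
    GENERATED_BODY()

public:
    virtual void SetupPlayerInputComponent(UInputComponent* Input) override
    {
        Super::SetupPlayerInputComponent(Input);
        // "ToggleZoom" is an assumed action mapping (e.g. a grip button).
        Input->BindAction(TEXT("ToggleZoom"), IE_Pressed, this,
                          &APerformerPawn::ToggleZoom);
    }

private:
    void ToggleZoom()
    {
        AWorldSettings* Settings = GetWorld()->GetWorldSettings();
        bZoomedOut = !bZoomedOut;

        if (bZoomedOut)
        {
            // Raising world-to-meters shrinks the stage relative to the
            // performer, so small real-world steps cover large distances.
            Settings->WorldToMeters = 500.f;
        }
        else
        {
            Settings->WorldToMeters = 100.f; // engine default scale
            if (Puppet)
            {
                // Snap instantly to the puppet's new position; an instant
                // cut avoids the nausea a smooth camera move would cause.
                const FVector Offset(-60.f, 0.f, 40.f); // arbitrary framing
                TeleportTo(Puppet->GetActorLocation() + Offset,
                           GetActorRotation());
            }
        }
    }

    UPROPERTY(EditAnywhere) AActor* Puppet = nullptr;
    bool bZoomedOut = false;
};
```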

Another fun piece of this tech demo is the virtual camera rig. I mounted a Vive tracker to a simple shoulder rig. The position and orientation of the real-world rig drive the camera in Unreal. The attached phone displays the virtual camera view in real-time, using Unreal’s built-in LiveLink camera.
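For reference, wiring a tracker to a cinematic camera in C++ can be as small as the sketch below. The LiveLink phone preview is configured separately in the editor, and the `Special_1` motion source name is an assumption based on how SteamVR trackers are commonly exposed in Unreal.

```cpp
// VirtualCameraRig.h -- hypothetical sketch of a tracker-driven camera.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "MotionControllerComponent.h"
#include "CineCameraComponent.h"
#include "VirtualCameraRig.generated.h"

UCLASS()
class AVirtualCameraRig : public AActor
{
    GENERATED_BODY()

public:
    AVirtualCameraRig()
    {
        // The Vive tracker mounted on the shoulder rig drives this
        // component's transform every frame.
        Tracker = CreateDefaultSubobject<UMotionControllerComponent>(TEXT("Tracker"));
        SetRootComponent(Tracker);
        Tracker->MotionSource = TEXT("Special_1");

        // The virtual lens simply rides on the tracked transform.
        Camera = CreateDefaultSubobject<UCineCameraComponent>(TEXT("Camera"));
        Camera->SetupAttachment(Tracker);
    }

private:
    UPROPERTY() UMotionControllerComponent* Tracker;
    UPROPERTY() UCineCameraComponent* Camera;
};
```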

Ideally, I would capture performance and camera moves at the same time, but since I was doing both performance and cinematography for this demo, I used the Sequencer tool in post to layer the takes I liked best. I was able to customize the set design after recording animation, placing dynamic props that can collide with Burd’s body based on my preferred takes. Finally, I rendered the demo in Unreal using the Movie Render Queue and Sequencer.

I’m excited to continue playing in this space, and to eventually build out these tools into a package I can share with others. I’m happy to say I’ve already been able to use it to animate a character in a completed film, My Breakfast With Barf, that’s been shown at Pictoplasma and a few other festivals. I started down this path with the intention of putting together an interactive live show, and that’s still on the table as well!

Peter Javidpour, 3D Artist and Animator
