Brett Ineson and Athomas Goldberg, co-founders of Shocap Entertainment, discussed how the studio delivered the world’s first broadcast XR jazz show.
Generating Visuals in Unreal Engine
Epic Games’ Unreal Engine is winning wide applause across the media and entertainment world, and for good reason – it’s the driving force behind the industry’s shift towards virtual production. Shocap Entertainment leaned heavily on Unreal Engine 4.25 during the production of Jill Barber’s BBC Click show, designing and building a CG recreation of the now-demolished Palomar Supper Club.
Shocap drew inspiration from old souvenir photographs of the venue in its prime, referencing band shots and pictures of the audience to piece the venue back together inside the engine. The souvenir photographs themselves also appear on tables and walls throughout. Inside the venue, Shocap then created CG band members, an audience, and dancers in the form of ghosts, their movements driven in real time by motion capture. Jill Barber’s video image was composited into the venue using Unreal.
Athomas Goldberg, Co-Founder: In the 1950s film noir period, a number of movies and stories featured ghosts, and we wanted to capture that style. We needed the ghosts to be clearly visible yet transparent, so it was unmistakable that they were spirits. Part of this was an aesthetic choice and part of it was practical: when combining digital avatars with a live performer, you want a clear distinction in the presentation of body language. Giving them this look also tied them more to the environment of the resurrected nightclub. Jill is like a living performer who’s visiting.
The team was also able to bridge the 7,600 km between Vancouver and London and virtually seat both Jill Barber and the BBC presenter, Paul Carter, at the same table. Using green screens, Shocap visually transported Paul into the Palomar, producing a strikingly realistic bar-side interview – the first time the presenter had ever conducted a virtual interview in this manner.
Capturing the Performance
Shocap Entertainment combines the technology and expertise of Lifelike & Believable and Animatrik Film Design. As such, Animatrik’s globally renowned Burnaby studio housed the physical performance of the musicians. The studio was set up with 70 OptiTrack cameras from NaturalPoint to capture the musicians, who all wore marker-based motion capture suits – a setup known as passive optical motion capture. The GIANT software solution then retargeted their movements onto the characters in real time.
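Real-time retargeting like this maps each joint of the capture skeleton onto the corresponding joint of the character rig, frame by frame. The sketch below illustrates the core idea in Python; the class names, bone names, and mapping are hypothetical and do not reflect GIANT’s actual API.

```python
from dataclasses import dataclass

@dataclass
class Joint:
    name: str
    rotation: tuple = (0.0, 0.0, 0.0)  # Euler angles in degrees (illustrative)

# Hypothetical mapping from capture-skeleton joint names to character-rig joints.
BONE_MAP = {
    "mocap_hips": "char_pelvis",
    "mocap_spine": "char_spine",
    "mocap_head": "char_head",
}

def retarget(source: dict, target: dict) -> None:
    """Copy each mapped joint's rotation from the capture skeleton onto the
    character rig. A production solver also handles translation, bone-length
    differences, and filtering; this shows only the name-mapped copy."""
    for src_name, dst_name in BONE_MAP.items():
        if src_name in source and dst_name in target:
            target[dst_name].rotation = source[src_name].rotation

# One frame of capture data applied to a fresh character rig.
source = {n: Joint(n, (10.0, 0.0, 5.0)) for n in BONE_MAP}
target = {n: Joint(n) for n in BONE_MAP.values()}
retarget(source, target)
print(target["char_head"].rotation)  # → (10.0, 0.0, 5.0)
```

In a live pipeline this copy runs once per captured frame, which is why a simple name-keyed mapping is preferred over anything that allocates or searches per joint.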
Throughout the shoot, a number of static cameras recorded the performance, with one close-up camera using NCAM to track all of Jill’s movements. To maintain the illusion of Jill Barber performing in a CG nightclub, accurate compositing was vital: during moving camera shots, the live image plate had to stay perfectly synchronized with the virtual plate at all times.
Brett Ineson, Co-Founder: One of the most interesting aspects of shooting mocap for this project was that Jill was live video while the musicians were fully rendered. With this in mind, we could offset the musicians in the space, placing them on the opposite side of the room from Jill. Practically, this made it easy to key Jill out against the green screen, but it also changed the dynamics of the performance. In the livestream, the band is clustered together behind Jill on the tiny CG stage, but in reality they were facing Jill and could take cues from her in ways that would have been more difficult in a traditional stage setup.
A challenge for Shocap came in tracking the instruments as well as the musicians, and with everything rendered in CG, marker placement became even more important – special care was taken when tracking the cymbals and drums because of the effect the markers have on the sound. Shocap also had to mask out the reflective parts of the drum kit so they would not disrupt tracking by the motion capture cameras. Additionally, while synchronizing the movements of real cameras with virtual cameras using NCAM, Shocap delayed the motion capture data stream by three frames, ensuring it stayed locked to the video and audio streams.
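A fixed frame delay like this is commonly implemented as a small ring buffer: the newest mocap frame goes in, and the frame from N frames ago comes out, so the lower-latency mocap stream lines up with the tracked video. A minimal sketch, assuming a three-frame delay as described above (the class and its interface are illustrative, not Shocap’s actual code):

```python
from collections import deque

class FrameDelay:
    """Delay a data stream by a fixed number of frames using a ring buffer."""

    def __init__(self, delay_frames: int = 3):
        # Hold delay_frames + 1 entries so the oldest is exactly N frames old.
        self.buffer = deque(maxlen=delay_frames + 1)

    def push(self, frame):
        """Feed in the newest mocap frame; return the frame from
        `delay_frames` ago, or None while the buffer is still filling."""
        self.buffer.append(frame)
        if len(self.buffer) == self.buffer.maxlen:
            return self.buffer[0]
        return None

delay = FrameDelay(delay_frames=3)
for t in range(6):
    out = delay.push(f"mocap_frame_{t}")
    print(t, out)
# Frames 0-2 emit None while the buffer fills; at frame 3 the output is
# mocap_frame_0, at frame 4 it is mocap_frame_1, and so on.
```

Because `deque` with `maxlen` drops the oldest entry automatically, the delay stays constant without any manual bookkeeping.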
Maximum Control with Ringmaster
Shocap Entertainment’s primary objective was to deliver a seamless live stream in Unreal. This required controlling camera changes and transitions, along with lighting effects that kept real lights in the studio synchronized with virtual lights on the CG stage. The production space included four physical cameras and over a dozen virtual cameras. To maintain complete control over every facet of the performance in real time, Shocap deployed its proprietary virtual production software, Ringmaster.
Ringmaster acts as a live cueing system, enabling directors to create cues that can control any aspect of the Unreal scene. Those cues can either sit on a palette to run independently, or be built up into cue sheets – much like a theatre or music production. It’s possible to run through the cues manually or schedule cues to fire at specific times, in any combination. Centralizing control over both the physical and virtual elements of Jill’s performance made the final stream in Unreal possible, transforming the engine into a functional, live-performance application.
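The theatre-style cue sheet described above can be sketched as an ordered list of named actions with an operator-driven “go” and a step-back for recovery. This is an illustrative Python sketch of the concept, not Ringmaster’s actual implementation; the class and method names are assumptions.

```python
class Cue:
    """A named action that mutates some aspect of the scene when fired."""

    def __init__(self, name, action):
        self.name = name
        self.action = action

    def fire(self, scene):
        self.action(scene)

class CueSheet:
    """An ordered list of cues advanced manually by an operator."""

    def __init__(self, cues):
        self.cues = cues
        self.position = 0

    def go(self, scene):
        """Fire the next cue in order (the operator presses 'go')."""
        if self.position < len(self.cues):
            self.cues[self.position].fire(scene)
            self.position += 1

    def back(self, scene):
        """Step back and re-fire the previous cue if something goes wrong."""
        if self.position > 0:
            self.position -= 1
            self.cues[self.position].fire(scene)

scene = {"camera": None, "lights": None}
sheet = CueSheet([
    Cue("camera: wide shot", lambda s: s.update(camera="wide")),
    Cue("lights: full up", lambda s: s.update(lights="full")),
])
sheet.go(scene)  # camera cue fires → scene["camera"] == "wide"
sheet.go(scene)  # lighting cue fires → scene["lights"] == "full"
```

Independent “palette” cues would simply be `Cue` objects fired directly, outside any sheet; scheduled cues would call `fire` from a timer instead of an operator.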
Athomas Goldberg: When setting up for a song with a cue list containing three or four camera and lighting transitions, you might imagine that they would run the same every time. But when you get to the end, there might be two to three minutes of banter between the performers and the crowd. The performer may walk around the stage, and during that time you want to be able to decide when to bring up lights, effects, or camera changes – you need that live control. Then when they change songs you might fire off the next cues, but if things go wrong, you can jump back and replay. Those tools just don’t exist for Unreal yet – we’re trying to take all of the functionality of the Sequencer and move it into a live control context.
The final performance was broadcast globally and online via BBC iPlayer. Here's one of the songs: