Interview: How Dead as Disco Synchronizes Combat to Player-Owned Music
The developers behind Dead as Disco break down their custom BeatWarping system, BPM sectioning, animation syncing, and the technical challenges of building rhythm combat around player-owned music.
Building a rhythm-action game around a curated soundtrack is already a major technical challenge. Building one around player-owned music introduces an entirely different level of complexity. That's the core idea behind Dead as Disco, a music-driven action title recently released in Early Access that allows players to bring their own songs into combat while still maintaining tightly synchronized gameplay, animations, and enemy behavior.
Rather than relying solely on traditional beat detection, the team at Brain Jar Games developed a custom system called BeatWarping, which dynamically adjusts animation playback to align impactful moments with music in real time.
To learn more about it, we talked to several members of the development team.
Dead as Disco allows players to use their own music while still feeling tightly synced to gameplay. At a high level, how does your system achieve that?
Milcho Milchev, Senior Gameplay Engineer: Most of the gameplay works by synchronizing actions (animations, SFX, visuals, enemy attacks) to the beat. We ask the player to input a BPM (beats per minute) for their song and give them tools to help determine that BPM. In the future, we hope to add automatic BPM detection.
Jon McEvoy, CTO: Once we have the BPM, we have a number of ways we synchronize actions (including environments and enemy behavior), but the most interesting one is how we line up our animations with the music.
Our animation team annotates all of our animations when they're created with information about how playback can be scaled, as well as when the impactful moments in the animations happen. We call this tech BeatWarping, and it allows animations not just to scale with BPM, but also to match special moments in the song.
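The core idea Jon describes can be sketched in a few lines: given a BPM and an annotation marking where the animation's impact frame falls, choose a play rate so that the impact lands on the next beat. This is a minimal illustrative sketch, not Brain Jar's actual implementation; all names (`impact_offset`, etc.) are assumptions.

```python
import math

def beat_interval(bpm: float) -> float:
    """Seconds between beats at the given tempo."""
    return 60.0 / bpm

def next_beat_time(song_time: float, bpm: float) -> float:
    """Time of the first beat at or after song_time (song starts on a beat)."""
    interval = beat_interval(bpm)
    return math.ceil(song_time / interval) * interval

def warp_play_rate(song_time: float, impact_offset: float, bpm: float) -> float:
    """Play rate that makes the impact frame (impact_offset seconds into the
    animation at 1x speed) land exactly on the next beat."""
    time_to_beat = next_beat_time(song_time, bpm) - song_time
    if time_to_beat <= 1e-6:  # already on a beat: aim for the one after
        time_to_beat = beat_interval(bpm)
    return impact_offset / time_to_beat
```

For example, at 120 BPM (one beat every 0.5 s), a punch started at t = 0.2 s whose impact sits 0.4 s into the clip would play at 0.4 / 0.3 ≈ 1.33x so the hit lands on the beat at t = 0.5 s.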
For instance, our story levels have much more information packed into the music tracks. This allows us to have animations, environments, and enemies react to the music, and not just the beat. Our audio and design teams go through each track and add events for moments in the song like drumrolls, vocals, and other instrumentals. We intend to release those tools to players so that they can also build fully featured song experiences with our tech.
What are the biggest technical challenges when trying to synchronize gameplay systems to user-owned tracks that you don’t control?
Milcho: Players love to push systems to their extremes almost immediately. Many would crank the tempo as high as it could go - above 200, sometimes all the way to 666 BPM (that's over 10 basic attacks per second). We designed many of our systems to run well up to around 180 BPM, but quickly realized we needed to support far more extreme speeds for player experiments.
How do you handle tracks with inconsistent rhythms, tempo changes, or more ambient structures where a clear beat isn’t always present?
Destiny Treptau, Software Engineer: We handle this through what we call BPM sections. This allows the user to place a beat exactly where they want to in the song and change the BPM. We then calculate the beat from the start of the section until either we hit another section or we hit the end of the song. This allows users to modify the beats of the song as much as they want. Since the start of a section is a beat, a user could even hand-author every beat of the song if they wanted to.
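The BPM-sections approach Destiny describes can be sketched directly: each section pins a beat at an exact timestamp and sets a new tempo, and beats are generated from that anchor until the next section or the end of the song. Names and structure here are assumptions for illustration, not the game's actual data model.

```python
def section_beats(sections, song_length):
    """sections: list of (start_time, bpm) tuples, sorted by start_time.
    Returns all beat timestamps. The start of each section is itself a beat,
    so hand-authoring every beat is just one section per beat."""
    beats = []
    for i, (start, bpm) in enumerate(sections):
        # A section runs until the next section starts, or the song ends.
        end = sections[i + 1][0] if i + 1 < len(sections) else song_length
        interval = 60.0 / bpm
        t = start
        while t < end:
            beats.append(round(t, 6))
            t += interval
    return beats
```

For a 5-second song with a 120 BPM section at 0.0 and a 60 BPM section at 2.0, this yields beats at 0.0, 0.5, 1.0, 1.5, then 2.0, 3.0, 4.0 - the tempo change takes effect exactly at the section boundary.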
From a gameplay design perspective, how do you ensure that actions still feel responsive and satisfying even if the music analysis isn’t perfectly precise?
Milcho: As part of our BeatWarping setup, we annotate every single animation in our game with sections that can stretch or shrink. Each section has settings for how it should sync to the music, including what beat patterns it is valid for.
This is what makes every punch land on beat. We can sync to a variety of sync points, including the midpoint between two beats, every ¼ beat, or a completely custom marker placed in the song.
For example, an annotation might read: this 0.7-second region of animation should look ahead, choose the first beat from now, and stretch so that its end lands on that beat.
After the animation playback is synced, the SFX and VFX align on top of it naturally.
Jon: One of the advantages of the BeatWarping tech is that we can align our animations with most music regardless of when the animation starts playing at runtime.
When we were first brainstorming the idea that became Dead as Disco, one of our concerns with rhythm games in general was how punishing they felt if the player’s input was not in time to the music. Most rhythm games will wait until the next appropriate moment to start the animation, so there’s a feeling of input lag if you’re not hitting the button on beat. With BeatWarping, we can start the animation immediately - Charlie is always ready to jump into action, and he’s always moving to the music.
Players are still rewarded for hitting inputs on beat with higher scores, more impactful attacks, and special animations and effects, but Charlie Disco always feels cool to play, no matter how bad your timing is.
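The "rewarded, never punished" model above can be sketched as a simple timing check: the attack always starts immediately, and only the reward tier depends on distance from the nearest beat. The window sizes and tier names below are made up for illustration.

```python
def timing_score(input_time, bpm, perfect_window=0.05, good_window=0.12):
    """Classify an input by its distance from the nearest beat.
    The attack plays regardless; only the reward tier changes."""
    interval = 60.0 / bpm
    phase = input_time % interval
    error = min(phase, interval - phase)  # distance to the nearest beat
    if error <= perfect_window:
        return "perfect"
    if error <= good_window:
        return "good"
    return "ok"  # still lands and still looks cool, just less impactful
```

At 120 BPM an input at t = 1.02 s is 0.02 s off the beat ("perfect"), while one at t = 1.25 s sits exactly between beats ("ok").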
What tools, middleware, or custom tech are you using for audio processing and synchronization?
Milcho: We use FMOD to play the music, but we don't do a lot of fancy things to the music playback. Instead, we make the gameplay logic do the work. For that, we only need the BPM of the song and for the track to be cut so that it starts on a beat.
Unreal handles animation playback, and our BeatWarping tech handles setting the correct play rate and firing combat events (e.g., "punch landed now").
Jon: We have an internal tool we call SongCrafter that allows us to add data tracks to all of our story mode music. Using these data tracks, we can do more complex things like the moveset syncs, lighting changes, and enemy spawns in time to the music.
Inside the tool, we can preview animation syncs and even jump into the game at the selected moment in the song. We want to get a user-friendly version of this tool out to our players. We intend to keep expanding our My Music functionality until it has the full feature set we use internally.
Were there any unexpected discoveries or breakthroughs while developing this system?
Milcho: We have this tech, which we internally call "moveset syncs". These are cool moments of a song that we want to match with cool animations, marked by a symbol that comes up on screen. Basically, an enemy is about to punch you right before a cool song moment, and if you counter them, you trigger this cool sequence of animations.
The way they work is that we manually annotate some regions in the song to say “here’s a percussion pattern that matches this moment in the music,” and then we have a somewhat involved system which tries to determine:
- At what point should enemies start attacking you so their attack lands as the sync starts
- Who should attack to give you the best chance of countering
- How do we fit the animations we have made into the annotated markers on the song
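The first two decisions in that list can be sketched as simple backwards scheduling: subtract each enemy's windup from the sync's start time and check whether it still fits, then prefer the attacker whose windup leaves the player the most readable counter window. Everything here (names, the "longest viable windup" heuristic) is an illustrative assumption, not the studio's actual logic.

```python
def schedule_attack(sync_start, windup_duration, now):
    """Song time at which the enemy should begin its attack so the hit
    lands exactly as the sync starts, or None if the windup no longer fits."""
    start = sync_start - windup_duration
    return start if start >= now else None

def pick_attacker(sync_start, enemies, now):
    """enemies: list of (name, windup_duration) pairs.
    Prefer the viable attacker with the longest windup, on the assumption
    that a slower telegraph gives the player the best chance to counter."""
    viable = [(name, w) for name, w in enemies
              if schedule_attack(sync_start, w, now) is not None]
    if not viable:
        return None
    return max(viable, key=lambda e: e[1])[0]
```

With a sync starting at t = 10.0 and enemies `[("brawler", 1.2), ("knife", 0.4)]`, the brawler is chosen at t = 8.5 (its 1.2 s windup starts at 8.8), but by t = 9.0 only the knife enemy can still land on time.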
Another thing we learned along the way was that it feels best to KO enemies at the right point in the music. Originally, you could fluidly enter and exit each segment depending on how much HP your opponent had. It turned out that if Charlie's final hit didn't land at the very end of the sync - which generally coincides with the most impactful part of the music - the player felt out of sync as well.
So now we can enter the sync at any point, but always guarantee to play it until the end with a KO attack, lining up with the last annotated marker for extra emphasis.
Finally, what advice would you give to developers trying to build music-driven systems that work with dynamic or user-generated content?
Milcho: Use the tools that you're going to ship to build your content. That's the best way to guarantee your tools are feature-rich, optimized, and working well. Visual Studio crashes a lot less than Photoshop for a reason.
Jon: Find ways to match your game to the music, not just the beat. From the moment you start development, build anything you want to be music-matched to be synchronized to dynamic events, not a clock timer. That way, when you modify your music, either to a different song entirely or just because you needed to add a new chorus, you don’t end up rebuilding your whole level with new timings.
Destiny: For user-generated content, don’t be afraid to give users as much control as possible. You should always have an easy option for players who just want to play the game to their song. But for the players who want to dive deeper, they should be given the tools to create exactly what they want, even if it breaks the limit of our designs. User-generated content is all about creativity, and we should never limit that.
Brain Jar Games, Game Development Studio
Interview conducted by David Jagneaux