The Sagans team shared a comprehensive breakdown that explains how they created a cyberpunk music video using AI, Unreal Engine 5, and MetaHuman.
Sagans, a team of musicians, graphic designers, and researchers, has shared an in-depth breakdown of Coherence, an Akira-inspired cyberpunk music video. In it, the team explains how it used AI programs to generate most of the video's shots, how MetaHuman was used to create the main character, and how Niagara and Lumen were used in Unreal Engine 5 to set up rain particles and lighting, among other topics.
According to Sagans' 3D Specialist Aurelien, the team blended elements of three faces from the MetaHuman database to create the music video's main character and used the Live Link Face app to animate her facial expressions.
As for the environments, the team combined Megascans assets, Disco Diffusion 5, and three separate AI algorithms to achieve the final Akira-like style. It also used Lumen and Niagara in Unreal Engine 5 to set up lighting and add small particles such as rain.
"When it came to creating environments, Lumen was crucial," commented Aurelien. "With it, we were able to work on lighting in real-time, without using a lot of hardware resources. Together with Niagara, which we used for particles like rain, we knew we could generate a render that corresponded exactly to what we saw in our UI in just seconds."
"We wanted a solution that was both simple to use and fast enough to get our vision realized very quickly, and Unreal Engine didn’t disappoint," added Aurelien. "Within a few clicks, we could easily create an extremely detailed character and city environment, and animate our heroine without a motion capture suit."
And here's the music video itself:
You can read the full breakdown here.