How a Solo Developer Created a Cold War Animated Series With UE5
Lewis Roscoe discussed The War At Home project, explaining how he began animating it in Maya, why he later moved to Unreal Engine, and how he now works with his assets while tackling the challenges of a real-time pipeline.
Long-Form Cinematic Series in UE5 as a Solo Creator
Lewis Roscoe is a New Zealand-based solo creator developing The War At Home, a long-form animated Cold War series produced inside Unreal Engine 5. Originally built using a traditional Maya pipeline, the project has since transitioned into a real-time production workflow.
In this article, Lewis shares how he sustains cinematic-scale production as a single creator, along with the technical lessons learned from pushing UE5 through a long-running episodic pipeline.
Background & Project Origins
I remember playing Massive Entertainment's World in Conflict in the mid-2000s and being completely captivated by it. Not just the gameplay, but the storyline. It made me think: what if I created my own Cold War-turned-hot story, but set it in New Zealand?
Around 2014, I started loosely planning what would eventually become The War At Home. By 2018, I concluded that if it was going to exist, I would have to make it myself. Before that, I had spent a few years making small 2D animated pieces on YouTube using Flash as a hobby.
Over time that gradually transitioned into the 3D realm. I seem to have an ADHD-driven need to tell stories, and I really enjoy the freedom of working without production schedules or logistics slowing things down. Running at my own speed has always suited me, so being a solo creator became the natural default.
Transitioning from Maya to Unreal Engine
From 2018 to 2025, I was working primarily in Maya 2014. The reason was fairly simple: a rigging tool I relied on, Anzovin, had been discontinued and was never updated for newer versions of Maya. Let's just say that wasn't the most efficient setup.
While powerful GPUs helped prevent crashes during scene navigation and animation, they did absolutely nothing for the render cycle itself. Everything came down to CPU rendering. I could build elaborate scenes that ran smoothly while animating, only for the render stage to slow to an absolute crawl.
At one point, my home office had seven computers running simultaneously just to distribute renders and try to move things along. I always knew I would eventually move to Unreal Engine. The real question was when. The scale of converting assets across felt potentially astronomical, and I didn't fully understand what the transition would involve.
Eventually, an episode came up that required relatively few assets and environments. That felt like the perfect opportunity to make the jump. Thanks to the excellent YouTube channel Bad Decisions Studio, I crash-coursed myself into Unreal Engine 5 and immediately dove in. I began importing assets and recreating scenes from Maya inside UE5.
Fortunately, most of my assets were already exported as FBX files, which made the basic geometry transfer relatively painless. The bigger challenge was rebuilding textures, pivot points, and scene hierarchy from scratch. UV maps transferred correctly about 90% of the time, but my somewhat messy Maya workflow meant there were occasionally extra UV channels hiding in places I didn't realize. Cleaning up assets became a necessary part of the migration process.
My Maya pipeline typically worked like this:
- Set up characters and environments
- Animate each shot in its own file
- Generate playblasts for editing
- Assemble the episode edit
- Finally, render each shot individually once the edit is locked
Unreal Engine works very differently. The project contains all assets in a single environment, so creating separate files per shot isn't practical. Instead, I set up the scene, animate a shot, render it, reset characters if needed, and move on to the next shot. There are pros and cons to this. Occasionally, you can lose animation data if you don't bake things properly before resetting characters.
Scene housekeeping becomes important, which, admittedly, is probably my weakest skill. The difference overall, however, is night and day. Being able to see lighting, VFX, fluids, and other effects update in real time dramatically reduces the need to render test frames just to verify that something works. Leveraging GPU rendering also meant I could go from a seven-computer render farm to a single workstation that renders faster.
Depth of field is a great example. In Maya, I used to fake it in After Effects using Z-depth passes because enabling it directly in renders could bring the system to a grinding halt. In Unreal Engine, depth of field is practically effortless. Lip syncing also became dramatically easier. In Maya, every phoneme had to be shaped frame-by-frame manually.
UE5's audio-driven lip sync tools completely changed that process. My tenth episode, the last produced in Maya, took three months just to render, despite being only ten minutes long. With Unreal Engine, I can now produce an episode of the same length in roughly two months total, including asset creation, animation, and rendering. For viewers, that means episodes are released more frequently and with improved visual quality.
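The Z-depth trick from the Maya days boils down to mapping each pixel's depth to a blur radius around a focal plane. A minimal sketch of that mapping (the function and parameter names here are purely illustrative, not Unreal's or After Effects' actual model):

```python
def blur_radius_from_depth(depth, focal_depth, focal_range, max_radius):
    """Map a Z-depth sample to a blur radius: zero inside the in-focus
    band around focal_depth, ramping linearly up to max_radius beyond it.
    (Illustrative only -- not an engine's actual depth-of-field model.)"""
    distance = abs(depth - focal_depth)
    t = min(max((distance - focal_range) / focal_range, 0.0), 1.0)
    return t * max_radius

# A subject at the focal plane stays sharp; a distant background blurs out.
sharp = blur_radius_from_depth(10.0, focal_depth=10.0, focal_range=2.0, max_radius=8.0)
soft = blur_radius_from_depth(30.0, focal_depth=10.0, focal_range=2.0, max_radius=8.0)
```

Doing this per pixel in a compositor is exactly why the fake was cheap but fiddly; the engine's in-camera depth of field removes the extra pass entirely.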
Sustaining Long-Form Production as a Solo Creator
The episodic workflow is still fairly intense, but Unreal Engine has made it far more manageable. The process generally looks like this:
- Expand the story outline into a full script
- Send dialogue to voice actors
- Record my own parts and process them with ElevenLabs to create additional character voices
- Build environments, characters, and assets inside Unreal Engine and MetaHuman
- Begin animating and iterating on shots
Often, the first shot emerges simply from experimenting with a setup that gradually evolves into the scene. One of the biggest advantages of working in Unreal Engine is that every episode adds more reusable assets to the library. As more of my older Maya content gets converted across, production becomes progressively faster.
That said, it's definitely not without problems. Crashes are a frequent reality. I have a habit of pushing tools right to their limits, which means VRAM usage in some scenes is extremely high. Sometimes a crash is minor, and Unreal Engine simply restarts. Other times, the entire computer needs to reboot, followed by a long wait while the project reloads. To mitigate this, I've started paying more attention to optimization. For example:
- Using 2K textures on background MetaHumans
- Splitting scenes into sub-levels
- Disabling hardware ray tracing when necessary
Sometimes you simply have to sacrifice certain features if it means the shot will render successfully.
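For what it's worth, some of those optimizations can be pinned project-wide in DefaultEngine.ini rather than toggled per scene. A sketch using stock Unreal console variables (assuming current UE5 names; the pool size value is just an example, tune it to your VRAM):

```ini
[SystemSettings]
; Disable hardware ray tracing for the whole project
r.RayTracing=0
; Cap the texture streaming pool (in MB) to keep VRAM headroom
r.Streaming.PoolSize=3000
```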
Technical Challenges in UE5
Even though Unreal Engine has dramatically improved the production process, it comes with its own set of technical challenges. One persistent issue for me involves MetaHuman groom hair. Under indirect lighting, the groom can sometimes blow out visually, resulting in extremely bright eyebrows or hair highlights.
Workarounds include switching to hair cards or ensuring the groom receives some direct light. It's a fairly common issue, so hopefully future engine updates will improve this. VRAM limitations are another challenge. Some environments in The War At Home take place in real-world locations, albeit in an alternate 1989 timeline, so I often use the Cesium plugin to import large-scale terrain and city data.
It's an incredible tool for aerial shots or naval scenes, but it can add significant memory overhead. Ship wakes are another area I'm still refining. In Maya, I relied on animated 2D cards created in After Effects. These worked reasonably well but were tied to fixed animation speeds. If the ship moved faster or slower, the wake animation often looked wrong.
Inside Unreal Engine, I've experimented with the Waterline plugin, which provides dynamic ship wakes. It works well but can be finicky, especially since previewing wake behavior requires running the simulation. Adjusting settings during simulation can also be frustrating because changes may reset once playback stops.
Niagara systems have helped a lot here. Niagara essentially allows you to use 2D cards within a 3D particle system, which makes it perfect for things like explosions, fire, smoke, and, in my case, supplemental ship wake effects.
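The fixed-speed problem that plagued the After Effects wake cards has a natural fix in a particle-based setup: drive emission from the ship's speed instead of a constant rate. A toy sketch of that relationship (the names and numbers are hypothetical placeholders, not the Niagara API):

```python
def wake_spawn_rate(ship_speed, reference_speed=7.5, base_rate=50.0, min_rate=5.0):
    """Particles per second for a wake emitter, scaled so foam density
    tracks the hull's speed rather than a fixed animation rate.
    All constants are illustrative placeholders."""
    return max(min_rate, base_rate * ship_speed / reference_speed)

# At the reference cruise speed the emitter runs at its base rate;
# a stationary ship still churns out a trickle of foam.
cruising = wake_spawn_rate(7.5)
stopped = wake_spawn_rate(0.0)
```

In a Niagara graph the same idea would be wired as a user parameter feeding the spawn-rate module, so a faster ship automatically lays down a denser wake.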
Designing Naval Interiors Without Complete References
Reference material can be surprisingly difficult to find, especially when recreating military environments. Even older ships sometimes have limited publicly available documentation. This becomes even more challenging when dealing with smaller navies like New Zealand's.
For Episode 16, I needed to create the Operations Room inside a Leander-class frigate circa 1989. The only useful references I could find were scattered images from Royal Navy archives, a few Falklands War documentaries on YouTube, and clips from the BBC series Warship, often in very low resolution.
That meant much of the environment had to be reconstructed from partial information. To fill in the gaps, I ended up kitbashing elements from U.S. Navy CIC environments I had built for earlier episodes, combined with British equipment references from the Leander bridge, where slightly better images existed.
In this case, the real hero was lighting. Unreal Engine handles low-light environments beautifully, which works perfectly for operations rooms that traditionally operate under dim lighting conditions. Combining that with shallow depth of field and tight framing helps sell the illusion of authenticity even when reference material is incomplete.
Lessons Learned
Switching to Unreal Engine has opened up enormous possibilities for the series, but it has also reinforced the importance of production discipline. I've had moments where I needed to completely redo the animation because I forgot to bake it properly before resetting characters. Organization and housekeeping are skills I'm still actively working on.
Optimization is also becoming increasingly important as scenes grow more complex. Another tool that has become surprisingly valuable is ChatGPT. When I run into technical issues, it's often the fastest way to troubleshoot ideas or narrow down possible solutions.
Episodic storytelling also demands a reliable pipeline. Maintaining scene continuity, managing large project files, and keeping redundant backups, both locally and online, have become essential. One of the hardest lessons to accept as a creator is that perfection is unattainable. The temptation to endlessly refine shots can easily delay production.
Over time, I've realized that it's far better to release an imperfect episode than to endlessly polish something that never reaches the audience. The War At Home is currently in production on Episode 16, and the goal is simply to keep building the series one episode at a time.
Lewis Roscoe, Developer