
How Iconic Games Designs Voice-Driven Player Interactions for Dynamic Game Stories

Iconic Games discusses its approach to voice-driven interaction, dynamic storytelling systems, and building new types of player experiences from the ground up.

As game design continues to evolve beyond traditional input systems and branching dialogue trees, developers are exploring new ways to make player interaction feel more natural and expressive. Rather than relying on predefined responses or rigid systems, some teams are experimenting with more flexible approaches that allow players to engage with characters and worlds in ways that better reflect how people actually communicate.

In this interview, the team at Iconic Games breaks down how they approach building these systems, from voice-driven interaction and responsive character behavior to the tools and pipelines required to support them.

Iconic Games describes its titles as “AI-native” experiences rather than traditional games augmented with AI features. How does that philosophy shape the way you approach gameplay systems and development pipelines from the ground up? 

Andrew Bowell, CEO: We focus on the player experience and try to unlock new types of games and storytelling that weren’t possible before. We never want to tack on AI to an experience that would be better without it. We start with the experience. Once we've defined the kind of experience we want to create, we then deploy our R&D team to build the technology that makes it possible.

Working on the bleeding edge of new technology means there are often no existing tools or workflows to follow, so our R&D, Core Tech, and Game teams collaborate closely to create our own. Those tools and workflows are then put into practice as we build and develop our experiences. In effect, our product team becomes the first customer for the technology we create, which allows us to identify and iron out any issues early. That process helps us refine the tools as we go and ultimately deliver new experiences for players. The Oversight Bureau is our first demonstration of this technology, and it was very well received at the most recent Gamescom.

Your platform focuses on voice-driven interaction rather than traditional controller inputs or dialogue trees. From a technical standpoint, how does your system interpret and structure open-ended player speech into actionable gameplay logic? 

Mikel Bober-Irizar, Co-founder & CTO: We have a patent-pending set of systems that interpret what the player says and use it in the context of the game and level to keep the experience focused. We think of our characters as improv actors: they'll roleplay with the player to lead them through the game, but the game designers are always in control of the experience.
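Iconic has not published details of these patent-pending systems, but the general pattern of constraining open-ended speech to a designer-defined set of gameplay intents can be sketched as follows. Everything here — the intent names, the keyword-overlap matching, the fallback behaviour — is an illustrative assumption, not Iconic's implementation; a production system would use a trained classifier or language model in place of keyword matching.

```python
# Illustrative sketch: mapping free-form player speech to a
# designer-defined set of gameplay intents. Keyword overlap stands in
# for a real intent model; the fallback keeps the experience on track.

from dataclasses import dataclass

@dataclass
class Intent:
    name: str            # gameplay action the designer authored
    keywords: set[str]   # trigger vocabulary for this sketch

@dataclass
class IntentResolver:
    intents: list[Intent]
    fallback: str = "nudge_back_to_story"  # improvise, but stay on the storyline

    def resolve(self, transcript: str) -> str:
        words = set(transcript.lower().split())
        # Pick the intent whose vocabulary overlaps the transcript most.
        best = max(self.intents, key=lambda i: len(i.keywords & words))
        if not best.keywords & words:
            return self.fallback
        return best.name

resolver = IntentResolver(intents=[
    Intent("ask_about_case", {"case", "file", "evidence", "suspect"}),
    Intent("accuse_character", {"accuse", "guilty", "liar", "lying"}),
])

print(resolver.resolve("show me the case file"))   # ask_about_case
print(resolver.resolve("tell me about your day"))  # nudge_back_to_story
```

The key design point the interview describes survives even in this toy version: the player can say anything, but every utterance resolves to an action the designers authored, so authorial control is never surrendered.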

The idea of memory-aware AI characters that evolve alongside the player is interesting. How are you designing persistent character memory systems that can maintain narrative coherence over long play sessions?

Martin Connor, Co-founder and Chief Creative Officer: The answer is simply with close collaboration. It really is the magic of our setup. We sit our narrative designers with the research scientists and develop the features necessary to deliver temporal relationships with characters in our worlds. Conversational memory is one of those features, but there are a whole host of others that we’re currently collaborating on, and we look forward to sharing more on that soon.

Running AI systems directly on-device is a major technical challenge. What breakthroughs or architectural decisions have made it possible to deliver AI-driven characters locally rather than relying entirely on cloud inference? 

Andrew Bowell, CEO: We take advantage of the latest innovations and our own work in model quantisation and distillation to enable our models to run as efficiently as possible. We can specialise our models for specific games and experiences, rather than needing general chatbot behaviour. In The Oversight Bureau, we leverage large-scale precomputation to allow the game to run totally offline, even on a Steam Deck. Hardware improvements are allowing bigger and better models to run on-device and will continue to improve the level of experience we can deliver.
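Iconic hasn't detailed its quantisation pipeline, but the core trade the answer refers to — precision for memory footprint — can be illustrated with symmetric int8 post-training quantisation: a float32 weight tensor is stored as int8 values plus a single scale factor, cutting memory roughly 4x. The helper names and numbers below are illustrative, not Iconic's code.

```python
# Illustrative sketch of symmetric int8 weight quantisation. Each float
# weight is stored as an int8 plus one shared scale factor; dequantising
# recovers the weight to within one quantisation step.

def quantize_int8(weights: list[float]) -> tuple[list[int], float]:
    scale = max(abs(w) for w in weights) / 127.0  # map max |w| to 127
    q = [round(w / scale) for w in weights]
    return q, scale

def dequantize(q: list[int], scale: float) -> list[float]:
    return [v * scale for v in q]

weights = [0.42, -1.27, 0.05, 0.9]
q, scale = quantize_int8(weights)
restored = dequantize(q, scale)
# Every restored weight is within one quantisation step of the original.
assert all(abs(a - b) <= scale for a, b in zip(weights, restored))
```

Distillation is complementary: a small "student" model is trained to mimic a larger one on the narrow distribution a specific game needs, which is why specialising per game (rather than supporting general chatbot behaviour) makes on-device deployment tractable.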

Latency and reliability are critical when voice becomes the primary interface. How does Iconic approach speech processing, intent recognition, and response generation in a way that keeps interactions feeling natural and responsive? 

Mikel Bober-Irizar, CTO: We always prioritise the player experience, and low interaction latency is paramount to that. The same interaction feels far better when characters respond fluidly, so we account for every millisecond. To achieve this, we use a customised speech recognition system developed with NVIDIA. Depending on the game, we'll deploy a different set of models; The Oversight Bureau uses what we call a conversation model, which runs extremely fast.
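Iconic doesn't publish its latency figures, but "accounting for every millisecond" is commonly reasoned about as a per-stage budget across the voice loop. The stages and the millisecond targets below are assumptions for the sketch, not Iconic's numbers.

```python
# Illustrative end-to-end latency budget for a voice interaction loop.
# Stage names and targets are assumptions, not Iconic's figures.

budget_ms = {
    "speech_recognition": 150,   # transcribe the player's utterance
    "intent_and_context": 60,    # interpret it in the game context
    "response_generation": 250,  # conversation model produces a reply
    "speech_synthesis": 180,     # voice the character's line
}

total = sum(budget_ms.values())
print(f"end-to-end: {total} ms")  # 640 ms
```

Framing latency this way makes it clear why a fast, specialised conversation model matters: response generation typically dominates the budget, so shrinking it has the largest effect on how fluid the character feels.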

AI-driven dialogue raises interesting narrative design questions. How do you maintain authorial intent and story structure when characters can respond dynamically to player input?

Andrew Bowell, CEO: This is something we think about a lot. It’s important to us that we give players freedom to express themselves while staying faithful to the story and gameplay progression. All our internal tooling is focused on giving writers creative control of how characters act and respond, all the way to defining the exact phrasing of dialogue if needed. At the same time, our characters need to improvise based on how players act, and can gently nudge the player along the main storyline, allowing them to actually experience the whole game without getting stuck in an unintended rabbit hole.

Ethical concerns around generative AI remain a major topic in the industry, particularly when it comes to training data and artist rights. How is Iconic approaching responsible AI usage when building these systems? 

Mikel Bober-Irizar, CTO: We take care to source and license training data responsibly. At the end of the day, games are human and artistic endeavours, allowing the best artists, writers, designers and creatives to tell meaningful stories. It's important to us that the creative direction of our games is human-led and that any AI-powered features are there to fulfil that vision, rather than constraining or replacing it.

Related to that, how do you think about copyright and ownership when AI-generated dialogue or character interactions emerge dynamically during gameplay? 

Andrew Bowell, CEO: We believe that the human inputs into AI systems (the lore, character designs, game design, and programming) should remain copyrighted elements. The model outputs themselves may not be copyrightable because they are not human-created works, except where they may be considered derivative works of those human inputs. In practice, this points toward a model where creative ownership remains anchored in the human contribution. For us, that reinforces the importance of maintaining the human touch at the core of our work, with AI as a tool that supports and extends creativity rather than replacing it.

With a $13M seed round supporting development, what parts of the technology stack are you prioritizing most right now? AI models, tooling for designers, the player-facing experience, or something else? 

Andrew Bowell, CEO: We’re continuing to focus on our conversational game stack, where we have a large number of improvements already over what we shipped in The Oversight Bureau. The focus is always on gameplay first, and we build models and tooling that allow us to achieve that vision. As well as this, we’re investing in a number of efforts to build systems that go beyond speaking to characters, allowing them to interact with the world more dynamically and have the world react to the player as well. We’re excited to share more on this soon.

Looking ahead, how do you see AI-native interaction changing the way developers think about gameplay design compared to traditional branching narratives or scripted dialogue systems?

Martin Connor, Chief Creative Officer: Looking ahead, I think AI-native interaction will push developers to think less in terms of branching dialogue trees and more in terms of truly non-linear narrative. Instead of players choosing from a small set of pre-written responses, they can participate more naturally and express themselves in their own words. That creates the possibility for stories that feel more personal, because players are shaping how the experience unfolds through the way they speak, persuade, question, or connect with characters. At the same time, we don't believe this will ever replace authored storytelling. Great narrative still needs structure, pacing, and intent. I think what will change is the design mindset: creators move from scripting every possible exchange to building characters and story systems that can respond in a believable way to unexpected player input.

Martin Connor (Chief Creative Officer), Borja Gonzalez Leon (Chief Scientist), John Lusty (Founder & President), Mikel Bober-Irizar (CTO), Andrew Bowell (CEO), Johnny Venables (Chief Product Officer)

Iconic Games, Game Development Studio

Interview conducted by David Jagneaux
