Inworld AI: Building AI-Driven Virtual Characters

Inworld AI's Ilya Gelfenbeyn and Kylan Gibbs have told us about the tech behind their new developer platform for building AI-driven virtual characters, shared how to integrate their solution into games, and spoke about various use-cases for the platform.

Introduction

80.lv: Please introduce yourself. Where did you study? What companies have you worked for? What projects have you contributed to?

Ilya Gelfenbeyn: I’ve been in the conversational agents space for about 20 years now, since my university days when I worked on chatbot experiments as part of my research. In 2010, I co-founded a startup called Speaktoit, which was one of the first and most highly rated consumer voice assistant apps. We saw a lot of interest from companies looking to leverage our tech, so in 2014 we launched API.AI – a platform for building conversational AI agents.

Mike, our current CTO, joined us then to spearhead our AI/ML efforts. In 2016, we were acquired by Google to power the Google Assistant developer platform. We later renamed API.AI to Dialogflow, and it is now used in both Google Assistant and Google Cloud Conversational AI. Kylan joined us when we started Inworld.

Kylan Gibbs: I’ve always had a deep interest in communication and AI. I studied at McGill University as an undergrad and then at Cambridge for grad school, during which I realized most business and political processes emerge from messy communication between biased humans. AI/ML seemed a much better way to analyze larger quantities of data and draw more reliable conclusions. I helped businesses improve decision-making with AI/ML while at Bain & Company and then transitioned to DeepMind.

At DeepMind, I became interested in how we could more directly communicate with AI and helped establish our product efforts in Conversational AI and generative models. I worked to integrate new models into Google products, which is ultimately how I met Mike, our CTO. When Ilya then shared the vision for how much deeper the interaction with AI could be within immersive worlds, I realized the potential and was sold.

Inworld AI

80.lv: Could you tell us about the story of Inworld AI? How did it all begin? What was the original idea?

Ilya Gelfenbeyn: Inworld AI, founded in 2021, is our new developer platform for building AI-driven virtual characters. We had been watching the growth of virtual worlds such as Meta Horizons and Roblox and realized these worlds would benefit hugely from a virtual population. Virtual characters in these worlds could play different roles and, in general, make them more immersive.

While it may take some time for these worlds to become mainstream, there is another market that already exists and has a huge need for AI character disruption – gaming. While the visual and performance aspects of games have improved dramatically, players are still interacting with the same basic NPCs that have been around for the better part of 30 years.

Therefore, we started to think about how we could build a platform that would allow anyone to build intelligent in-world characters that could be integrated into any number of rich, immersive experiences, such as video games.

The Technology

80.lv: Please discuss brains for games/the metaverse. How does your tech work? What are the main strengths?

Kylan Gibbs: Our tech stack takes its inspiration from how humans interact with the world and relies on three main functions that our brains perform automatically: perception, cognition, and behavior. Perception covers inputs from other agents and the environment around us. Cognition refers to our internal state – things like our memory, emotions, and personality. Behavior is how we react: everything we output in response to a given input and our unique internal state. This can include speech, gestures, body language, and any form of motion.

Obviously, copying the human brain is a challenge. Still, our platform approximates these functions by using a combination of AI technologies, including large language models for natural language understanding and generation, various models for computer vision, and tools such as reinforcement learning and rules-based conversational AI.
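To make the perception/cognition/behavior split concrete, here is a minimal sketch of what one tick of such a loop could look like. Every class and function name below is a hypothetical illustration of the idea, not part of Inworld's actual stack.

```python
# Minimal sketch of the perception -> cognition -> behavior split described above.
# All class and field names here are hypothetical illustrations, not Inworld's API.
from dataclasses import dataclass, field


@dataclass
class Percept:
    """An input event the character notices (speech, a visible object, etc.)."""
    source: str
    content: str


@dataclass
class CognitiveState:
    """The character's internal state: personality, mood, and memory."""
    personality: str
    mood: str = "neutral"
    memory: list[str] = field(default_factory=list)

    def update(self, percept: Percept) -> None:
        # Remember what was perceived; a real system would also adjust emotions here.
        self.memory.append(f"{percept.source}: {percept.content}")


def behave(state: CognitiveState, percept: Percept) -> str:
    """Produce an output (here just text) from the input plus the internal state."""
    return (f"[{state.mood} | {state.personality}] "
            f"responding to '{percept.content}' from {percept.source}")


# One tick of the loop: perceive, update internal state, act.
state = CognitiveState(personality="gruff but kind innkeeper")
event = Percept(source="player", content="Do you have a room for the night?")
state.update(event)
print(behave(state, event))
```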

Inworld Engine

80.lv: The platform has an AI-based core, right? What previous frameworks did you use to develop your algorithm? How did you train it? How much time did it take? What were the main challenges?

Ilya Gelfenbeyn and Kylan Gibbs: Characters powered by the Inworld Engine have human-like perception, cognition, and behavior that allow them to interact naturally. Intelligent in-world characters can perceive their virtual environments through audio and visual inputs. Based on the personalities and cognitive profiles provided by their developers, they can formulate characteristic responses, feel emotions, and develop memories.

Ultimately, intelligent in-world characters can engage in complex interactions through speech, facial expressions, body language, and motion. This experience is powered by an integrated architecture that includes natural language processing, emotion recognition, reinforcement learning, and goal-oriented conversational AI.
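As an illustration of the kind of cognitive profile a developer might provide, here is a small, hypothetical sketch; the field names are invented for this example and do not reflect Inworld's actual character schema.

```python
# Illustrative sketch of a "cognitive profile" a developer might supply for a
# character. Field names are hypothetical, not Inworld's schema.
from dataclasses import dataclass, field


@dataclass
class CharacterProfile:
    name: str
    core_description: str                                  # free-text personality brief
    motivations: list[str] = field(default_factory=list)
    knowledge: list[str] = field(default_factory=list)     # facts the character can draw on
    emotional_baseline: str = "calm"                        # starting emotional state


guard = CharacterProfile(
    name="Gate Guard",
    core_description="A weary city guard who follows orders but can be reasoned with.",
    motivations=["keep the gate secure", "finish the shift without trouble"],
    knowledge=["the gate closes at dusk", "merchants need a stamped permit"],
)
print(guard)
```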

Integration into Games

80.lv: How difficult is it to integrate your solution into a game? What are the steps? Can the solution be customized and upgraded for different needs?

Kylan Gibbs: Our platform is meant to be highly accessible while offering the capacity for deep customization and integration. The journey begins with Inworld Studio, which lets anyone create a character simply by describing it in natural language through a simple interface.

This allows anyone – writers, narrative designers, or other creatives – to jump in and start building. You can then talk to your character right in the studio, or use the Inworld App (currently on Oculus, with desktop and mobile coming soon) to have a deeply immersive interaction right away.

Then, it’s time to integrate. We already have robust integrations with Unity and Unreal Engine, as well as a bunch of cross-platform integrations that make it easy to import and configure our solution for any experience. From end to end, our suite of tools should allow you to create your perfect character and deeply customize the integration to suit whatever experience you’re building.
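To show the overall shape of such an integration, here is a deliberately simplified, hypothetical sketch of a game-side character session; none of the class or method names below come from the real Inworld SDK, they only illustrate the flow of creating a character in the studio and routing player input to it from the game.

```python
# Hypothetical sketch of the game-side flow: a character configured in Inworld
# Studio is exposed to the game through a session that exchanges dialogue.
# These names are invented for illustration and are not the real Inworld SDK.
class CharacterSession:
    """Stand-in for a session with a cloud-hosted character."""

    def __init__(self, character_id: str):
        self.character_id = character_id
        self.history: list[tuple[str, str]] = []

    def send(self, player_text: str) -> str:
        # A real integration would stream text/audio to the character service
        # and receive speech, emotion, and gesture data back.
        reply = f"({self.character_id}) I heard you say: '{player_text}'"
        self.history.append((player_text, reply))
        return reply


# In-game usage: open a session for the character built in the studio,
# then route player input to it and play back the response.
session = CharacterSession(character_id="innkeeper-01")
print(session.send("Any rumors from the road?"))
```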

Partnerships

80.lv: Did you consider partnering with Epic, Unity, or other companies to make your solution a part of their bigger frameworks? How do you approach the business side of things?

Ilya Gelfenbeyn: Yes, sure. As Kylan mentioned, we have both Unity and Unreal integrations, but we are also talking to both companies directly to find the best ways to offer our solutions to their developers. Inworld Studio will be offered through these engines' asset stores and optimized for tight integration with both of them. We are also working with a number of independent engines and virtual world-building systems to make it easy for their users to integrate intelligent in-world characters.

From the business side of things, there are two approaches: first, we are targeting independent developers through the asset stores and our self-serve platform; second, we are working with large studios and virtual world platforms through a more hands-on approach, offering them services to build and integrate characters into their worlds.

Ilya Gelfenbeyn and Kylan Gibbs: We’ve seen some great support across industries. We have amazing angel and venture investors, such as BITKRAFT, CRV, and Kleiner Perkins, as well as founders/executives at Oculus, Twitch, Roblox, Riot Games, and The Sandbox. Just to name a few!

Major game studios are taking notice. Inworld AI is working with Meta and other major metaverse builders on Inworld character integrations, and developers can easily integrate intelligent NPCs built in Inworld Studio with Unreal and Unity.

Use-Cases

80.lv: What are some of the use-cases you are seeing?

Ilya Gelfenbeyn and Kylan Gibbs: So far, we’ve seen compelling use-cases across three main verticals: metaverse experiences, gaming, and enterprise.

For metaverse platforms, it’s all about building characters for the worlds themselves. Imagine an AI-powered onboarding character that can greet new users, answer questions, or show users around. A positive first experience greatly increases the chances that a user comes back the next day. Similarly, a lot of these worlds are pretty empty in their initial stages, until they reach a critical mass of users. We make it easy to create a virtual population that gives life to the world in its early days.

For games, the onboarding use case also applies. But there are other key benefits to smarter NPCs. Designers can create much more interesting narratives using entirely new in-game mechanics, such as having to convince an NPC guard to let you pass. Developers can quickly build entire batches of NPCs, all with a shared set of knowledge. This all translates into a much more immersive and compelling experience for players.
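As a rough illustration of these two ideas – a batch of NPCs drawing on one shared knowledge set, and a goal-gated interaction like persuading a guard – here is a hypothetical sketch; the names and logic are invented for this example rather than taken from Inworld's platform.

```python
# Hypothetical sketch of two mechanics mentioned above: a batch of NPCs sharing
# one knowledge base, and a "convince the guard" goal. Names are illustrative only.
SHARED_TOWN_KNOWLEDGE = {
    "festival": "The harvest festival starts at sundown.",
    "gate": "The east gate is closed for repairs.",
}


class NPC:
    def __init__(self, name: str, role: str, knowledge: dict[str, str]):
        self.name = name
        self.role = role
        self.knowledge = knowledge          # shared, so every NPC answers consistently

    def ask(self, topic: str) -> str:
        return self.knowledge.get(topic, "I wouldn't know about that.")


class GuardGoal:
    """A simple goal check: the guard opens the gate only if persuaded."""

    def __init__(self, required_argument: str):
        self.required_argument = required_argument

    def attempt(self, player_argument: str) -> bool:
        return self.required_argument in player_argument.lower()


npcs = [NPC("Mara", "baker", SHARED_TOWN_KNOWLEDGE),
        NPC("Tomas", "guard", SHARED_TOWN_KNOWLEDGE)]
print(npcs[0].ask("festival"))

goal = GuardGoal(required_argument="permit")
print("Gate opens!" if goal.attempt("I have a stamped permit from the merchant guild") else "Not today.")
```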

As for enterprise, these use-cases actually transfer over quite well. Companies can create onboarding characters imbued with company knowledge, such as FAQs, values, and company history. They can give tours of virtual offices and maybe finally explain the company benefits in a way that doesn’t make your head explode. We also see a lot of potential to power virtual brand reps. Imagine a shopkeeper in a virtual mall.

Roadmap

80.lv: What’s your current roadmap? What did you plan for 2022? When will we hear from you again?

Ilya Gelfenbeyn: Well, just a couple of weeks ago we launched the Beta version of our self-serve platform, Inworld Studio, and we’ve already seen great traction across many types of users and use-cases. We also released a cool, high-fidelity technical demo built entirely on Unreal, using their MetaHuman technology. It’s part of a larger, playable experience that we plan to release in the coming months.

We plan to spend the next two quarters scaling the platform and preparing for a public launch. We’re also working on several bigger projects with large companies that are looking to use our tech to build custom solutions, and we hope to bring some of them to market by the end of the year. 

Ilya Gelfenbeyn and Kylan Gibbs, Chief Executive Officer and Chief Product Officer at Inworld AI

Interview conducted by Arti Burton
