Meta Taught AI to Make Virtual Worlds from What You Say

Builder Bot can create 3D objects as well as play sound effects.

Meta CEO Mark Zuckerberg has shown off the Builder Bot prototype, an AI proof of concept that can build virtual worlds from what you say.

In the video, Zuckerberg demonstrated how it works by giving commands like "let's go to the beach," "let's add an island," and "let's play some tropical music." The bot builds the environment from both simple and more specific commands, such as "add altocumulus clouds" and "let's get some sounds of waves and seagulls." Unfortunately, Zuckerberg didn't explain how the technology works, or whether Builder Bot generates the objects itself or pulls them from a library.
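Since Meta hasn't published any details, the sketch below is purely illustrative: it assumes one plausible approach in which transcribed voice commands are matched against a pre-built asset library and applied to a scene. The asset names, the `Scene` structure, and the keyword matching are all hypothetical, not Meta's actual method.

```python
# Hypothetical sketch only: Builder Bot's internals are not public.
# This illustrates a keyword-to-asset-library approach, one of the
# possibilities the article mentions.
from dataclasses import dataclass, field


@dataclass
class Scene:
    environment: str = "empty"
    objects: list[str] = field(default_factory=list)
    sounds: list[str] = field(default_factory=list)


# Illustrative asset library; the real system might instead generate assets.
ASSET_LIBRARY = {
    "beach": ("environment", "beach"),
    "island": ("object", "island"),
    "altocumulus clouds": ("object", "altocumulus_clouds"),
    "tropical music": ("sound", "tropical_music"),
    "waves": ("sound", "waves"),
    "seagulls": ("sound", "seagulls"),
}


def apply_command(scene: Scene, command: str) -> Scene:
    """Match known asset names inside a transcribed voice command."""
    text = command.lower()
    for name, (kind, asset) in ASSET_LIBRARY.items():
        if name in text:
            if kind == "environment":
                scene.environment = asset
            elif kind == "object":
                scene.objects.append(asset)
            else:
                scene.sounds.append(asset)
    return scene


scene = Scene()
for command in [
    "let's go to the beach",
    "let's add an island",
    "add altocumulus clouds",
    "let's get some sounds of waves and seagulls",
]:
    apply_command(scene, command)

print(scene)
# Scene(environment='beach', objects=['island', 'altocumulus_clouds'],
#       sounds=['waves', 'seagulls'])
```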

Meta doesn't plan to stop there and intends to support more complex interactions in the future.

"You’ll be able to create nuanced worlds to explore and share experiences with others with just your voice," said Zuckerberg.

Additionally, Meta plans to work on a universal translator, build AI models that can learn from less common languages, and design new ways to translate speech in real time.

Watch the full presentation here.
