Stable Diffusion's Inpainting Brought to Augmented Reality

Bjørn Karmann showed how a combination of AI and AR can be used to alter reality.

Last week, developer and AI enthusiast Bjørn Karmann demonstrated a mind-blowing setup showing how a combination of artificial intelligence and augmented reality can be used to alter reality. Meet We See!, Bjørn's proof-of-concept experiment, which uses Stable Diffusion's inpainting capabilities to change real-life objects in AR. According to a short demo shared by the creator, We See! lets you select the area you would like to change and then alter it using voice commands as text prompts.
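To give a sense of the mechanics, the "select an area, then prompt" workflow maps onto how Stable Diffusion inpainting is typically driven: the selected region becomes a binary mask, and the mask plus a text prompt are fed to an inpainting pipeline. The sketch below is purely illustrative and is not Bjørn's actual implementation; it only builds the kind of mask such a pipeline expects, with the region names and box coordinates chosen as examples.

```python
# Illustrative sketch: build the binary mask an inpainting pipeline expects.
# White (255) pixels mark the selected region to be repainted from the text
# prompt; black (0) pixels are left untouched.
from PIL import Image, ImageDraw

def make_inpaint_mask(size, box):
    """Return an L-mode mask with the (left, top, right, bottom) box in white."""
    mask = Image.new("L", size, 0)                 # keep everything by default
    ImageDraw.Draw(mask).rectangle(box, fill=255)  # repaint only this region
    return mask

# Example: mark a 200x200 region of a 512x512 frame for replacement.
mask = make_inpaint_mask((512, 512), (100, 100, 300, 300))
```

In a setup like the one shown, this mask and a prompt transcribed from speech would then be handed to an inpainting model, for example via the `diffusers` library's `StableDiffusionInpaintPipeline`, which takes `prompt`, `image`, and `mask_image` arguments.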

You can find Bjørn's original post here. Also, don't forget to join our Reddit page and our Telegram channel and follow us on Instagram and Twitter, where we share breakdowns, the latest news, awesome artworks, and more.


Comments (1)

  • Anonymous user

    Please make it clearer that he mentions it is a proof of concept. It's a good concept, but this is not actually happening on the phone, not in real time, and not in AR.
    Consider the lack of parallax movement between the trees, and it's clear he's running this offline and editing it together to demonstrate his idea; it is not actually functional.



