AR Thing: Transform the Entire World into a Game Environment

Creative Technologist Shay Segal shared a detailed breakdown of the AR Thing project, explained the Unity and Blender workflows, and showed how multiplayer was set up.

Introduction

My name is Shay Segal and I'll do whatever it takes to turn my thoughts into tangible results.

I am a Creative Technologist, Designer, and Developer specializing in XR experiences and games. I currently work as the Head of XR at Resight and teach 3D and game design at Shenkar College of Engineering, Design, and Art.

The AR Thing Project

This application demonstrates how you can control a virtual character and interact with it in the real world. The character I chose is Thing from The Addams Family/Wednesday. My main goal was to enable full, real-time interaction with the physical world, turning it into a game environment without any restrictions on location. To accomplish this, the entire space had to be transformed into a 3D mesh with physics and collisions, creating a natural feeling of movement and interaction. The aim was to make the experience as seamless as playing a computer game.

The process began with a proof of concept (POC) in which I used a digital joystick to move a cube on a surface in Unity. The joystick worked well, so the next step was to convert it into an augmented reality experience to ensure that the control of the cube would also work well in the physical world. To verify that, I used the plane detection feature of AR Foundation and carried out a small experiment on my desk. The cube was fully controllable and appeared to work well, so I moved on to the character model.
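
The article doesn't include the project's source, but a minimal sketch of that first POC might look like the following. The joystickInput field is a stand-in for whatever on-screen joystick feeds it; the field names and the speed value are my assumptions.

    using UnityEngine;

    // Minimal POC sketch: map a 2D joystick vector onto the XZ plane.
    public class CubeMover : MonoBehaviour
    {
        [SerializeField] float speed = 1.5f;  // meters per second (assumed)
        public Vector2 joystickInput;         // fed by the on-screen joystick (assumed)

        void Update()
        {
            // Joystick X/Y become world X/Z so the cube slides along the surface.
            Vector3 move = new Vector3(joystickInput.x, 0f, joystickInput.y);
            transform.position += move * speed * Time.deltaTime;
        }
    }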

This was a 24-hour hackathon project, so I used an existing model of Thing and worked on it in Blender. I mainly focused on the textures and the animation; I adjusted the colors so that they would stand out against the real world and modified the movement to make it easy to control. Finally, I baked the textures and imported the model into Unity, replacing the original cube with the revised version of Thing.

Working with the 3D Thing model in Unity presented several challenges, including:

  • Ensuring synchronization between the animation and movement.
  • Creating smooth transitions between animations.
  • Adjusting Thing’s movement to match the camera’s POV (see the sketch after this list).
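
The camera-POV challenge is the easiest to illustrate. The standard approach, which I'd assume a project like this uses in some form, is to project the camera's forward and right vectors onto the ground plane and drive both the movement and the walk animation from the result. The component below is a hedged sketch; the "Speed" animator parameter and all field names are illustrative, not the project's actual code.

    using UnityEngine;

    // Sketch: camera-relative movement plus animation/movement sync.
    public class CameraRelativeMover : MonoBehaviour
    {
        [SerializeField] Transform cameraTransform; // the AR camera
        [SerializeField] Animator animator;         // drives the walk cycle
        [SerializeField] float speed = 1.5f;
        public Vector2 joystickInput;               // from the on-screen joystick

        void Update()
        {
            // Flatten the camera axes so "up" on the joystick always means
            // "away from the player", regardless of where the phone points.
            Vector3 fwd   = Vector3.ProjectOnPlane(cameraTransform.forward, Vector3.up).normalized;
            Vector3 right = Vector3.ProjectOnPlane(cameraTransform.right,   Vector3.up).normalized;
            Vector3 move  = fwd * joystickInput.y + right * joystickInput.x;

            transform.position += move * speed * Time.deltaTime;
            if (move.sqrMagnitude > 0.0001f)
                transform.rotation = Quaternion.LookRotation(move); // face travel direction

            // Keep the walk animation tied to actual movement ("Speed" is
            // an assumed animator parameter).
            animator.SetFloat("Speed", move.magnitude);
        }
    }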

After completing these tasks in Unity, I moved on to the main part of the application: allowing Thing to move freely in the real world without the need to pre-scan each location. I wanted Thing's response to the environment to be immediate and real-time, so I used Resight Engine to achieve this. The engine has a convenient drag-and-drop component that enables quick mesh scanning to create a dynamic, live model of the world. This scan can be made visible with shaders or kept invisible and used solely for physics and collision. The beauty of this feature is that the room never needs to be scanned in any deliberate way: simply walk around with a mobile phone, and the mesh is generated automatically and quickly.
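
I won't reproduce Resight's API here, so the sketch below shows only the generic Unity side of the idea: every scanned mesh chunk gets a MeshCollider so the character can stand on and collide with the live geometry. The scanRoot reference and the per-frame polling are illustrative assumptions; the engine's MeshManager may well handle this internally.

    using UnityEngine;

    // Sketch: make scanned mesh chunks physical. Assumes the chunks appear
    // as MeshFilter children under some root; Resight's real layout may differ.
    public class ScanColliderEnsurer : MonoBehaviour
    {
        [SerializeField] Transform scanRoot; // hypothetical parent of mesh chunks

        void LateUpdate()
        {
            // Polling is crude but fine for a sketch; a real implementation
            // would react to a mesh-updated event instead.
            foreach (MeshFilter mf in scanRoot.GetComponentsInChildren<MeshFilter>())
            {
                if (mf.GetComponent<MeshCollider>() == null)
                {
                    MeshCollider col = mf.gameObject.AddComponent<MeshCollider>();
                    col.sharedMesh = mf.sharedMesh; // collide with the scanned geometry
                }
            }
        }
    }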

After adding Resight's components (LibResight and MeshManager), I began testing Thing's movement in the real world, and I was happy to see that everything worked as expected. Next, I wanted to enable Thing to jump, so I added that ability and tested it on various surfaces such as the floor, couch, chairs, street, and even my dogs. It all went smoothly! I also tested Thing's jumping ability on different floors of the building where I live, and it performed well.
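
The article doesn't show how the jump was implemented; a plausible minimal version is a grounded check against whatever collider the scanned mesh provides, followed by a vertical impulse. All values and names below are guesses for illustration, not the project's code.

    using UnityEngine;

    // Sketch: jump only when standing on something in the scanned world.
    [RequireComponent(typeof(Rigidbody))]
    public class ThingJump : MonoBehaviour
    {
        [SerializeField] float jumpVelocity = 2.5f; // assumed value
        Rigidbody body;

        void Awake() => body = GetComponent<Rigidbody>();

        // Hook this up to a UI button or the joystick's jump control.
        public void Jump()
        {
            // Short downward raycast: are we on the floor, a couch, a stair...?
            bool grounded = Physics.Raycast(
                transform.position + Vector3.up * 0.05f, Vector3.down, 0.1f);

            if (grounded)
                body.velocity = new Vector3(body.velocity.x, jumpVelocity, body.velocity.z);
        }
    }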

To enhance the experience, I utilized another Resight Engine feature that enables multiplayer. I added the SnappedObject component to Thing and tested it on another mobile device to ensure that one Thing can see and interact with another Thing, and that they are correctly positioned and synced in the world.
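
Again, I won't guess at SnappedObject's actual API. The pattern it implies, though, is the usual local-vs-remote ownership split: each device drives only its own Thing, while remote Things are positioned by the sync layer. A sketch with a hypothetical ownership flag:

    using UnityEngine;

    // Sketch of the ownership split in a shared AR session.
    public class ThingController : MonoBehaviour
    {
        // Stand-in for whatever the networking layer exposes (e.g. whether
        // this Thing's SnappedObject belongs to the local device).
        public bool isLocallyOwned;

        [SerializeField] float speed = 1.5f;
        public Vector2 joystickInput;

        void Update()
        {
            // Remote Things are moved by the sync layer, not by this script.
            if (!isLocallyOwned) return;

            Vector3 move = new Vector3(joystickInput.x, 0f, joystickInput.y);
            transform.position += move * speed * Time.deltaTime;
        }
    }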

Now, multiple users can control their own Thing, walk around the world with them, and meet others. EveryThing just works.

Shay Segal, Creative Technologist
