Arman Jangmiri showcased a magical-looking experiment with Epic's Unreal Engine and OSC.
Without a doubt, one of the most magical-looking use cases for game development and digital art software is combining it with real-life devices, blurring the line between the real and virtual realms.
A prominent figure specializing in such endeavors is Chinese 3D Designer 五天晴, who, back in 2022, managed to light a 3D scene in Blender using an iPhone and even move objects in Blender simply by tilting the phone. The feat was made possible by an app based on the Open Sound Control (OSC) protocol, which used the phone's gyroscope to detect motion. That motion data was then linked to objects in Blender through a dedicated Blender OSC add-on, allowing the 3D models to be manipulated by moving the phone.
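To illustrate how such a setup hangs together, here is a minimal conceptual sketch, not 五天晴's actual add-on, of receiving phone rotation data over OSC inside Blender with the python-osc package and applying it to an object. The "/gyroscope" address, the port, and the object name are assumptions and would depend on how the sending app is configured:

```python
# Conceptual sketch only: drive a Blender object's rotation from OSC messages sent by a
# phone. Assumes python-osc is available to Blender's bundled Python and that the phone
# app sends three floats (in degrees) to a "/gyroscope" address -- both are assumptions.
import math
import threading

import bpy
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import ThreadingOSCUDPServer

latest = [0.0, 0.0, 0.0]  # most recent reading from the phone

def on_gyro(address, x, y, z):
    # Runs on the OSC server thread: just record the values.
    latest[0], latest[1], latest[2] = x, y, z

def apply_rotation():
    # Runs on Blender's main thread via a timer and actually moves the object.
    obj = bpy.data.objects.get("Cube")  # hypothetical object name
    if obj is not None:
        obj.rotation_euler = (math.radians(latest[0]),
                              math.radians(latest[1]),
                              math.radians(latest[2]))
    return 0.05  # call again in 50 ms

dispatcher = Dispatcher()
dispatcher.map("/gyroscope", on_gyro)

server = ThreadingOSCUDPServer(("0.0.0.0", 9000), dispatcher)
threading.Thread(target=server.serve_forever, daemon=True).start()
bpy.app.timers.register(apply_rotation)
```

Since Blender's data should only be modified from the main thread, the OSC server runs in the background while a timer periodically applies the latest reading.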
Another innovator, Arman Jangmiri, a Creative Technologist at QOO Studio, has recently joined the ranks of these OSC magicians. Inspired by 五天晴's aforementioned lighting experiment, Arman showcased an incredible setup that lets him control the lighting of an Unreal Engine-made environment by flicking a real-life light switch, so that the lights change not only in the real room but also in the 3D scene.
According to Arman, the setup relies on the light sensor of an Android phone, with the readings transmitted to Unreal via OSC; as a result, the in-game lights can mimic the illumination of the physical room in real time.
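To make that data flow concrete, here is a hedged Python sketch of the receiving side using the python-osc package, showing how an ambient-light reading arriving over OSC could be normalized into a light intensity. Keep in mind that Arman parses the data in Unreal Engine Blueprints rather than in Python, and that the "/light" address, the port, and the 500-lux ceiling are assumptions:

```python
# Minimal PC-side OSC receiver: turn a phone's ambient-light reading into a 0..1 value
# that could drive a light's intensity in a 3D scene. Address, port, and the lux ceiling
# are assumptions -- adjust them to match the sending app's settings.
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

def on_light(address, lux):
    intensity = min(max(lux / 500.0, 0.0), 1.0)  # treat 500 lux as "fully lit"
    print(f"{address}: {lux:.1f} lux -> intensity {intensity:.2f}")

dispatcher = Dispatcher()
dispatcher.map("/light", on_light)

# Listen on the port the phone app sends to (assumed to be 9000 here).
BlockingOSCUDPServer(("0.0.0.0", 9000), dispatcher).serve_forever()
```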
Furthermore, the creator also leveraged OSC and the phone's accelerometer and gyroscope sensors to control radial force field physics in Unreal Engine in real-time:
The author also recommended a couple of insightful tutorials that will help you learn more about the OSC+Unreal Engine combination:
On top of that, the creator shared an exclusive short breakdown with 80 Level, providing more details on the demonstrated setups and revealing the app that was used to make them:
"Because of my background in Electronic Music Production and live DJing/VJing performances, I was really a fan of the OSC protocol, which is the go-to standard for controlling Live Music Visuals, allowing countless software and devices on the same network to even wirelessly communicate with each other.
In these two demonstrations, I used the Sensors2OSC app on Android to send the data from my phone's various sensors via OSC to my PC. Then I used Unreal Engine's OSC plugin to create an OSC server, listen for the data received from the phone, and parse it however I like, all in Blueprints.
In the video with the emissive ball and radial physics, I used the accelerometer data from my phone to control the movement of the emissive ball (which also carries an attractive force field) and activated physics on the balls around it to create this cool physics simulation, all done in Blueprints via the OSC messages received from the phone.
In the second video, I simply used the light sensor data from my phone to match the lights in the Unreal Engine scene to the real-life lighting of my room."
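The accelerometer-driven demo follows the same receiving pattern. Below is a small, hedged Python stand-in for the Blueprint OSC server: it parses three accelerometer floats per message and smooths them into an offset of the kind that could drive the emissive ball and its radial force field. The "/accelerometer" address, the port, and the scaling factor are assumptions rather than details from Arman's setup:

```python
# Hedged stand-in for the Blueprint OSC server from the physics demo: parse three
# accelerometer floats per OSC message and smooth them into a position offset.
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

SCALE = 50.0                # world units per m/s^2, purely illustrative
smoothed = [0.0, 0.0, 0.0]

def on_accel(address, x, y, z):
    # Exponential smoothing keeps raw sensor jitter from shaking the force field.
    for i, v in enumerate((x, y, z)):
        smoothed[i] = 0.8 * smoothed[i] + 0.2 * v
    offset = [v * SCALE for v in smoothed]
    print(f"force-field offset: ({offset[0]:.1f}, {offset[1]:.1f}, {offset[2]:.1f})")

dispatcher = Dispatcher()
dispatcher.map("/accelerometer", on_accel)
BlockingOSCUDPServer(("0.0.0.0", 9000), dispatcher).serve_forever()
```

In Blueprints, the equivalent of the smoothing step would be interpolating the received value over a few frames before feeding it to the force field.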