Vasant Verma showed how Particle Instance noise can be controlled with real-life hand gestures.
Interactive Artist and Motion Designer Vasant Verma has recently showcased an intriguing experimental setup that allows him to control particles in a digital space using only hand gestures.
Developed to explore how Particle Instance noise can be manipulated through hand movements, the setup pairs a standard webcam with Google's ML-based MediaPipe hand-tracking plug-in for Derivative's TouchDesigner, a node-based visual programming language for real-time interactive multimedia content. According to Verma, the project's main goal was to pinpoint specific hand positions for precise control over the noise parameters.
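Verma hasn't shared the inner workings of his network, but the general idea is easy to prototype. The sketch below is a minimal, hypothetical illustration rather than his actual setup: it uses the standalone MediaPipe Hands solution in Python (instead of the TouchDesigner plug-in) to read hand landmarks from a webcam, derives a simple thumb-to-index pinch gesture, and remaps it to a normalized value that could drive a noise amplitude parameter, for example sent on to a particle system over OSC. The gesture choice, the scaling constant, and the parameter it controls are all assumptions made for the example.

```python
# Hypothetical sketch: webcam hand tracking -> a noise-amplitude control value.
# Assumes the standalone "mediapipe" and "opencv-python" packages; this is not
# Verma's TouchDesigner network, only an illustration of the general approach.
import math

import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands


def pinch_amount(hand_landmarks) -> float:
    """Normalized thumb-to-index fingertip distance (0 = pinched, ~1 = spread)."""
    thumb = hand_landmarks.landmark[mp_hands.HandLandmark.THUMB_TIP]
    index = hand_landmarks.landmark[mp_hands.HandLandmark.INDEX_FINGER_TIP]
    dist = math.dist((thumb.x, thumb.y), (index.x, index.y))
    # 0.25 is an arbitrary scaling assumption for a typically framed hand.
    return min(dist / 0.25, 1.0)


def main() -> None:
    cap = cv2.VideoCapture(0)  # default webcam
    with mp_hands.Hands(max_num_hands=1, min_detection_confidence=0.5) as hands:
        while cap.isOpened():
            ok, frame = cap.read()
            if not ok:
                break
            # MediaPipe expects RGB; OpenCV captures BGR.
            results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            if results.multi_hand_landmarks:
                # Map the pinch gesture to a hypothetical noise amplitude.
                noise_amp = pinch_amount(results.multi_hand_landmarks[0])
                print(f"noise amplitude: {noise_amp:.3f}")
                # From here the value could be forwarded to a particle system,
                # e.g. via OSC to a Noise operator inside TouchDesigner.
            if cv2.waitKey(1) & 0xFF == 27:  # Esc to quit
                break
    cap.release()


if __name__ == "__main__":
    main()
```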
You can learn more about the project and check out Vasant's earlier works by clicking this link.
Earlier, Steven Mark Kübler also showcased a similar setup that combines an Arduino-based controller with TouchDesigner, letting its creator manipulate digital particles through actions as simple as blowing air into a sensor, moving a hand, or submerging a sensor in water.
Don't forget to join our 80 Level Talent platform and our Telegram channel, follow us on Instagram, Twitter, and LinkedIn, where we share breakdowns, the latest news, awesome artworks, and more.