An AI-Powered AR Mirror With Gesture Recognition

The system was designed to improve the effectiveness of fitness training.

A team of researchers from the University of Brescia in Italy has presented a new AI-powered application designed to improve the effectiveness of fitness training both at home and in the gym. Using a deep learning algorithm trained to recognize human gestures, the team has developed a computer vision system for a smart mirror that shows the user the joints and muscles involved in an exercise, making it possible to monitor the training and distribute the load more evenly.

According to the developers, the system is based on the skeletonization algorithm MediaPipe and runs on an embedded NVIDIA Jetson Nano device equipped with two fisheye cameras. The software was evaluated on the biceps curl exercise: the elbow angle was measured both by MediaPipe and by the BTS motion capture system, and the resulting values were compared to determine angle uncertainty, residual errors, and intra-subject and inter-subject repeatability.
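MediaPipe's pose estimation outputs a set of body landmarks per video frame, so a joint angle such as the elbow angle can be derived geometrically from the shoulder, elbow, and wrist points. The sketch below illustrates the idea with plain 2D coordinates; it is not the team's actual code, and the landmark names mentioned in the docstring are only an assumed pairing with MediaPipe Pose output.

```python
import math

def elbow_angle(shoulder, elbow, wrist):
    """Return the elbow flexion angle in degrees from three (x, y) points.

    The points are assumed to be pose landmarks such as those produced by
    MediaPipe Pose (e.g. LEFT_SHOULDER, LEFT_ELBOW, LEFT_WRIST); any
    consistent 2D coordinates will work.
    """
    # Vectors pointing from the elbow toward the shoulder and the wrist.
    v1 = (shoulder[0] - elbow[0], shoulder[1] - elbow[1])
    v2 = (wrist[0] - elbow[0], wrist[1] - elbow[1])
    # Angle between the two vectors via the dot product.
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    n1 = math.hypot(v1[0], v1[1])
    n2 = math.hypot(v2[0], v2[1])
    cos_a = max(-1.0, min(1.0, dot / (n1 * n2)))
    return math.degrees(math.acos(cos_a))

# Fully extended arm: shoulder, elbow, and wrist collinear -> 180 degrees.
print(round(elbow_angle((0, 0), (0, 1), (0, 2))))  # 180
# Right-angle flexion -> 90 degrees.
print(round(elbow_angle((0, 0), (0, 1), (1, 1))))  # 90
```

Comparing this per-frame angle against a reference measurement (as the team did with the BTS motion capture system) is what yields the uncertainty and repeatability figures described above.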

"A vision system, like the one we developed, can extract information from images by means of an AI algorithm. Our most recent paper demonstrates the accuracy of our system in measuring arm movements in simple fitness exercises, such as biceps curls," said Bernando Lanza, one of the researchers, to TechXplore. "For this project, we collaborated with AB-Horizon, our commercial partner. In addition to designing the gym machinery, our partner will integrate the vision system with their prototype. Their experience in the fitness industry allows us to develop our software using athletic principles and a personal trainer from the company also guides us through the testing process."
