A New Smart Carpet Can Capture 3D Poses Without Cameras
The device uses over nine thousand sensors to capture a person's movements and poses.
Have a look at a new tactile sensing carpet from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL), which can estimate human poses without using cameras. The project is intended to advance self-powered personalized healthcare, smart homes, and gaming.
The carpet is said to be low-cost and scalable, featuring over nine thousand sensors spanning a thirty-six-by-two-foot area. The sensors convert the pressure from physical contact between a person’s feet, limbs, or torso and the carpet into electrical signals, which serve as the raw data.
"The model takes the pose extracted from the visual data as the ground truth, uses the tactile data as input, and finally outputs the 3-D human pose," wrote the team.
A deep neural network then maps the tactile data to a pose; the carpet is said to predict a person’s pose with an error margin of less than ten centimeters.
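The pipeline described above, tactile pressure maps as input and vision-extracted poses as training targets, can be sketched in miniature. Note that the grid size, keypoint count, synthetic data, and the simple linear model below are all illustrative stand-ins, not the paper's actual sensor layout or network architecture:

```python
import numpy as np

# Hypothetical dimensions chosen purely for illustration; the real carpet
# has thousands of sensors, and the paper uses a deep network, not a
# linear map.
GRID_H, GRID_W = 16, 8   # toy tactile grid
N_KEYPOINTS = 21         # toy skeleton with 21 joints

rng = np.random.default_rng(0)

# Synthetic "tactile frames" and vision-derived "ground truth" 3D keypoints.
X = rng.random((256, GRID_H * GRID_W))   # flattened pressure maps
Y = rng.random((256, N_KEYPOINTS * 3))   # (x, y, z) per keypoint

# A single linear map trained by gradient descent on MSE loss stands in
# for the learned tactile-to-pose model.
W = np.zeros((GRID_H * GRID_W, N_KEYPOINTS * 3))
lr = 0.01
for _ in range(200):
    pred = X @ W
    grad = X.T @ (pred - Y) / len(X)     # gradient of MSE w.r.t. W
    W -= lr * grad

# Predicted 3D poses, one (21, 3) keypoint array per tactile frame.
poses = (X @ W).reshape(-1, N_KEYPOINTS, 3)
print(poses.shape)  # (256, 21, 3)
```

The key idea this illustrates is supervision transfer: a camera-based pose estimator provides the labels during training, so at inference time only the pressure data is needed.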
"You may envision using the carpet for workout purposes. Based solely on tactile information, it can recognize the activity, count the number of reps, and calculate the amount of burned calories," said Yunzhu Li, a co-author on the paper.
You can find a technical paper here.