
Meta, Formerly Facebook, Teaches Robots to Perceive Through Touch

Check out Meta AI's latest achievements in tactile sensing.

Meta AI, formerly Facebook AI, has published a massive blog post detailing the team's latest achievements in teaching robots to perceive, understand, and interact through touch. According to the team, tactile sensing, an emerging field in robotics that aims to understand and replicate human-level touch in the physical world, is important for making robots more effective at interacting with the world around us.

Meta AI has presented three main technological achievements:

  • DIGIT: Easy-to-build, reliable, low-cost, compact, high-resolution tactile sensor designed for robotic in-hand manipulation.
  • ReSkin: An open-source touch-sensing “skin” that has a low form factor and can help robots and other machines learn high-frequency tactile sensing over larger surfaces.
  • TACTO: A simulator for high-resolution vision-based tactile sensors to enable a faster experimentation platform and to support ML research even in the absence of hardware.
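To give a rough sense of what a vision-based tactile sensor like DIGIT measures, here is a toy sketch that renders the "tactile image" of a rigid sphere pressed into a flat elastomer pad. This is purely illustrative and is not the TACTO API; all names and parameters are hypothetical.

```python
import numpy as np

def simulate_contact_image(radius=3.0, press_depth=0.5, size=64, fov=10.0):
    """Toy tactile 'image' (hypothetical, not the TACTO API): per-pixel
    indentation depth of a rigid sphere pressed into a flat elastomer pad.

    radius      -- sphere radius (mm)
    press_depth -- how far the sphere's lowest point sinks into the pad (mm)
    size        -- output image resolution (size x size pixels)
    fov         -- physical side length of the sensing area (mm)
    """
    xs = np.linspace(-fov / 2, fov / 2, size)
    x, y = np.meshgrid(xs, xs)
    r2 = x**2 + y**2

    # Sphere center sits (radius - press_depth) above the pad surface,
    # so its lowest point reaches -press_depth below the surface.
    center_height = radius - press_depth

    # Height of the sphere's lower surface above the pad, where defined.
    sphere_surface = center_height - np.sqrt(np.clip(radius**2 - r2, 0.0, None))

    # Indentation = how far the sphere dips below the pad (clamped at zero).
    indentation = np.clip(-sphere_surface, 0.0, None)
    indentation[r2 > radius**2] = 0.0
    return indentation

img = simulate_contact_image()
print(img.shape)  # (64, 64): a depth map a camera-based sensor would image
```

In a real sensor such as DIGIT, a camera images the deformation of an illuminated gel rather than computing it analytically; a simulator like TACTO plays the role of producing such images without physical hardware.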

"Improvements in touch sensing can help us advance AI and will enable researchers to build robots with enhanced functionalities and capabilities," reads the post."It can also unlock possibilities in AR/VR, as well as lead to innovations in industrial, medical, and agricultural robotics. We’re working toward a future where every single robot may come equipped with touch-sensing capabilities."

You can read the entire post here. Also, don't forget to join our new Reddit page and our new Telegram channel, and follow us on Instagram and Twitter, where we are sharing breakdowns, the latest news, awesome artworks, and more.
