
Meta Presented Photorealistic Clothing for Avatars

The model provides dynamic shadowing effects and realistic deformations.

Meta presented Dressing Avatars, a method for photorealistic appearance of physically simulated clothing. The generated garments exhibit both realistic clothing dynamics and photorealistic appearance learned from real-world data.

Meta's neural clothing appearance model operates on top of geometry: at training time the researchers use high-fidelity tracked geometry, while at animation time they drive the model with physically simulated geometry. The research introduces a physically inspired appearance network that can generate photorealistic appearance with view-dependent and dynamic shadowing effects, even for body-clothing configurations unseen during training.
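At a high level, such a network maps posed garment geometry plus a viewing direction to shaded color, so the same module can be fed either tracked capture or a cloth simulation. Below is a minimal sketch in PyTorch; the class name, feature sizes, and heads are assumptions for illustration, not the paper's actual architecture.

```python
# Hypothetical sketch of a view-conditioned clothing appearance network.
# Names, dimensions, and structure are assumptions, not Meta's implementation.
import torch
import torch.nn as nn


class ClothingAppearanceNet(nn.Module):
    """Maps per-vertex garment geometry + view direction to shaded color."""

    def __init__(self, geom_dim: int = 6, hidden: int = 128):
        super().__init__()
        # Encodes local geometry (e.g. per-vertex position + normal).
        self.geom_encoder = nn.Sequential(
            nn.Linear(geom_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
        )
        # View-dependent branch: geometry features + view direction -> RGB.
        self.color_head = nn.Sequential(
            nn.Linear(hidden + 3, hidden), nn.ReLU(),
            nn.Linear(hidden, 3), nn.Sigmoid(),
        )
        # View-independent shadowing factor in [0, 1] (ambient-occlusion-like).
        self.shadow_head = nn.Sequential(nn.Linear(hidden, 1), nn.Sigmoid())

    def forward(self, geom: torch.Tensor, view_dir: torch.Tensor) -> torch.Tensor:
        # geom: (V, 6) positions + normals; view_dir: (3,) unit camera direction.
        feats = self.geom_encoder(geom)                    # (V, hidden)
        view = view_dir.expand(geom.shape[0], -1)          # (V, 3)
        rgb = self.color_head(torch.cat([feats, view], dim=-1))
        shadow = self.shadow_head(feats)
        return rgb * shadow                                # shaded per-vertex color


# Example: 1,000 garment vertices viewed from one camera direction.
net = ClothingAppearanceNet()
geometry = torch.randn(1000, 6)       # stand-in for tracked or simulated mesh
view = torch.tensor([0.0, 0.0, 1.0])
shaded = net(geometry, view)          # (1000, 3) colors in [0, 1]
```

In this sketch, the geometry input is the only thing that changes between training (tracked capture) and animation (cloth simulation), which mirrors the property the article highlights.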

The method demonstrates diverse animation results across different subjects and types of clothing, producing rich dynamics and realistic deformations even for loose garments. The researchers state that their model allows clothing to be reused with avatars of different people while remaining fully animatable, enabling photorealistic avatars with novel clothing.

Learn more about the technology here.
