The model provides dynamic shadowing effects and realistic deformations.
Meta presented Dressing Avatars, a method for photorealistic appearance of physically simulated clothing. The resulting garments exhibit both realistic clothing dynamics and photorealistic appearance learned from real-world capture data.
Meta's neural clothing appearance model operates on top of geometry: at training time it is driven by high-fidelity tracked geometry, and at animation time by physically simulated geometry. The research introduces a physically inspired appearance network that generates photorealistic appearance with view-dependent and dynamic shadowing effects, even for unseen body-clothing configurations.
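The post does not include code, but the core idea of a view-conditioned appearance network operating on top of geometry can be sketched roughly. The following minimal NumPy toy (all names, feature dimensions, and weights are hypothetical, not from the paper) maps per-vertex geometry features plus a viewing direction to RGB colors, which is the basic shape of such a model:

```python
import numpy as np

rng = np.random.default_rng(0)

def appearance_mlp(geom_feats, view_dir, w1, w2):
    """Toy view-conditioned appearance function (hypothetical stand-in
    for the paper's appearance network): per-vertex geometry features
    concatenated with the view direction are mapped to RGB colors."""
    # Broadcast the single view direction to every vertex and concatenate.
    v = np.broadcast_to(view_dir, (geom_feats.shape[0], 3))
    x = np.concatenate([geom_feats, v], axis=1)
    h = np.maximum(x @ w1, 0.0)             # ReLU hidden layer
    rgb = 1.0 / (1.0 + np.exp(-(h @ w2)))   # sigmoid keeps colors in [0, 1]
    return rgb

# Toy example: 100 mesh vertices, each with an 8-D geometry feature.
feats = rng.normal(size=(100, 8))
view = np.array([0.0, 0.0, 1.0])            # camera looking down +z
w1 = rng.normal(size=(11, 16)) * 0.1        # 8 geometry dims + 3 view dims
w2 = rng.normal(size=(16, 3)) * 0.1
colors = appearance_mlp(feats, view, w1, w2)
print(colors.shape)                         # (100, 3)
```

Because the view direction is an input, the predicted color can change as the camera moves, which is what "view-dependent" appearance means here; the real model is of course far larger and is trained on captured imagery.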
The method demonstrates diverse animation results across subjects and clothing types, producing rich dynamics and realistic deformations even for loose garments. The researchers state that their model allows clothing to be transferred between avatars of different people while remaining fully animatable, enabling photorealistic avatars with novel clothing.
Learn more about the technology here.