Karen X. Cheng explained how she made clothes change in a video using DALL-E.
One of the practical applications of text-to-image AI is fashion design: we've seen it in the Wes Anderson-style collection made with Midjourney and in Paul Trillo's fashion show. In fact, it was the latter that inspired director Karen X. Cheng to make her video of fashion in motion.
Cheng presented her AI-made outfit-changing project and explained how the smooth effect of shifting clothes was achieved.
She first used DALL-E to generate clothes by erasing parts of her existing outfit and inpainting over them. However, when she erased the entire outfit, the results didn't look as good, so she kept some parts of the original so that DALL-E could better match the color and lighting.
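The erase-and-inpaint step comes down to building a mask: DALL-E's image-edit endpoint repaints only the transparent pixels of the mask and preserves everything else. The sketch below is a minimal illustration of that idea, assuming NumPy and made-up coordinates for the "erased" torso region; it is not Cheng's actual pipeline.

```python
import numpy as np

# RGBA mask for inpainting: the edit endpoint regenerates only the
# transparent (alpha = 0) pixels and keeps the opaque ones as-is.
H, W = 1024, 1024
mask = np.full((H, W, 4), 255, dtype=np.uint8)  # fully opaque = keep everything

# Erase only the torso region (coordinates are hypothetical), leaving
# sleeves and collar intact so the model can match color and lighting.
mask[300:800, 350:680, 3] = 0  # alpha = 0 marks the area to repaint

kept = (mask[..., 3] == 255).mean()
print(f"{kept:.0%} of the frame is preserved")
```

Keeping most of the frame opaque is exactly the trick Cheng describes: the surviving pixels give the model a color and lighting reference to blend against.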
Karen says DALL-E doesn't give you consistency from frame to frame, as it is not designed for video, which you can see in her early experiment:
She wanted the same outfit to persist for several frames, so she used EbSynth, which brings pictures to life by animating them. If you want to learn more about how it works, check out our interview with the creator.
To make the transition smoother, Karen used DAIN (Depth-Aware Video Frame Interpolation), which also gave the video a slow-motion effect.
Making fashion with AI is fascinating, although some fear the need for artists and designers will fade the more we use it. However, that doesn't seem to be the case yet: as you can see, people still need to work skillfully with the available tools to achieve good results.
So check out Karen X. Cheng's Twitter to see more awesome experiments and don't forget to join our Reddit page and our Telegram channel, follow us on Instagram and Twitter, where we share breakdowns, the latest news, awesome artworks, and more.