Glaze: a Tool to Protect Artists from AI Theft

It makes changes to the artwork, invisible to the human eye.

While the artists-vs.-AI war rages on, those concerned about copyright are looking for ways to keep their artworks out of text-to-image models' training data. To help, researchers have presented Glaze, a tool that can hide art from AI's eyes.

According to the paper, it enables artists to apply "style cloaks" to their art before sharing it online. The cloaks "apply barely perceptible perturbations to images, and when used as training data, mislead generative models that try to mimic a specific artist."
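Glaze's actual cloaking is an optimization against a style-transfer model under a perceptual budget, which the paper describes in detail. As a rough, hypothetical illustration of the general idea of "barely perceptible perturbations," a perturbation bounded per pixel can be sketched like this (toy code, not Glaze's algorithm; the function name and epsilon value are assumptions for illustration):

```python
import numpy as np

def apply_bounded_perturbation(image, perturbation, epsilon=4.0):
    """Clip a perturbation to a small per-pixel budget and add it to the image.

    A tiny budget (epsilon, in 0-255 pixel units) keeps the change barely
    perceptible to a human while still shifting what a model "sees".
    This is a toy sketch, not Glaze's actual style-cloak optimization.
    """
    bounded = np.clip(perturbation, -epsilon, epsilon)
    cloaked = np.clip(image.astype(np.float32) + bounded, 0, 255)
    return cloaked.astype(np.uint8)

# Toy usage: random noise stands in for an optimized cloak.
rng = np.random.default_rng(0)
art = rng.integers(0, 256, size=(64, 64, 3)).astype(np.uint8)
noise = rng.normal(0, 8, size=art.shape).astype(np.float32)
cloaked = apply_bounded_perturbation(art, noise, epsilon=4.0)

# The per-pixel change never exceeds the budget.
assert np.abs(cloaked.astype(int) - art.astype(int)).max() <= 4
```

In the real tool, the perturbation is not random noise: it is computed so that a model trained on the cloaked images associates the artist's work with a different style, while the pixel-level change stays under a perceptibility limit.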

The creators worked with over 1,100 professional artists to survey their views on AI art and to evaluate the tool's efficacy, its usability, the tolerability of its perturbations, and its robustness across different scenarios.

The authors are working closely with Karla Ortiz, an illustrator and artist and one of the three artists who filed a lawsuit against Stability AI, Midjourney, and DeviantArt. With Glaze, she can upload her work and choose an art style different from her own. The tool then makes changes to her art that an AI would associate with that other style. The changes are invisible to the human eye, but a model training on the images will pick them up.

“We’re taking our consent back,” Ortiz said. AI tools “have data that doesn’t belong to them. That data is my artwork, that’s my life. It feels like my identity.”

Glaze is not yet available for download, but the creators plan to release it for free on macOS and Windows in the coming weeks.
