
Researchers have created a way to cloak artwork so that it can’t be used to train AI

Original art (left), plagiarized art created by an AI model (center), and the AI-generated image after cloaking the original art (right). (Images courtesy SAND Lab / Art by Karla Ortiz)

With the power of AI, it’s now possible to replicate distinctive art styles in minutes — an innovation that is leaving traditional artists in the lurch as their art is taken to train AI models that then steal job opportunities from them.

But what if you could stop AI models from being able to replicate your art style?

Researchers at the University of Chicago have created a tool they say does just that: a filter that, once applied to an image, prevents the image from being read and reproduced by AI tools that scrape art online.

Called “Glaze,” a beta version of the free tool launched for download last week.

AI art can be produced almost instantly, but only because AI draws on thousands of art pieces from across the internet that took human artists weeks or months to create.

The creators say Glaze will allow artists to protect their distinct art style from being absorbed into the pool of data that AI art tools draw on.

“Artists really need this tool; the emotional impact and financial impact of this technology on them is really quite real,” Ben Zhao, Neubauer Professor of computer science at the University of Chicago, said in a February press release. “We talked to teachers who were seeing students drop out of their class because they thought there was no hope for the industry, and professional artists who are seeing their style ripped off left and right.”

The project involved surveying more than 1,100 professional artists, according to the release. The tool was tested on 195 historical artists, as well as four currently working artists, before a focus group evaluated Glaze’s accuracy in disrupting AI imitation.

More than 90 per cent of artists surveyed said they were willing to use the tool when posting their art.

Glaze is the second project from the University of Chicago's SAND Lab that protects images posted online. In 2020, SAND Lab created a tool to shield personal photos so they couldn't be used to train facial recognition software. But when the researchers began applying the same concept to art, a few problems arose immediately.

Photos of human faces can be boiled down to a few distinct features, but art is far more complex: an artistic style is defined by many things, including brushstrokes, colour palettes, light and shadow, texture and positioning.

In order to confuse the AI tools and ensure they would not be able to read the artistic style and replicate it, researchers needed to isolate which parts of a piece of art were being highlighted as the key style indicators by AI art tools.

“We don’t need to change all the information in the picture to protect artists, we only need to change the style features,” Shawn Shan, a UChicago computer sciences graduate student who co-authored the study, said in the press release. “So we had to devise a way where you basically separate out the stylistic features from the image from the object, and only try to disrupt the style feature using the cloak.”

To do this, researchers used a “fight fire with fire” approach. Glaze works by using AI to identify style features that change when an image is run through a filter to turn it into a new art style—such as cubism or watercolour—and then taking those features and adjusting them just enough to trick other AI tools.

They target the "Achilles' heel for AI models," which is "a phenomenon called adversarial examples: small tweaks in inputs that can produce massive differences in how AI models classify the input," according to the website.

Basically, Glaze changes these key elements on a piece of art ever so slightly, while leaving the original art almost identical to the naked eye, so other AI tools won’t be able to recognize, and thus replicate, the original art’s individual style.
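The idea described above can be illustrated with a toy sketch. This is not Glaze's actual algorithm; it is a minimal, hypothetical example of an adversarial perturbation, where a stand-in "style extractor" (here just a fixed random linear map) plays the role of the AI model's style features, and projected gradient descent nudges an "image" so its style features move toward a different target style while each pixel changes by at most a tiny budget:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-ins: a linear "style extractor", a flat 64-pixel
# "image", and the style vector of a different artist to mimic.
W = rng.standard_normal((16, 64))       # maps 64 pixels -> 16 style features
x = rng.uniform(0, 1, 64)               # original "image"
target_style = rng.standard_normal(16)  # a different artist's style features

eps = 0.05   # perturbation budget: keeps the cloak nearly invisible
lr = 0.001   # small step size so the descent converges
delta = np.zeros_like(x)

# Projected gradient descent: pull the style features toward the target,
# clipping the per-pixel change to stay within the invisibility budget.
for _ in range(500):
    residual = W @ (x + delta) - target_style
    grad = 2 * W.T @ residual            # gradient of squared style distance
    delta -= lr * grad
    delta = np.clip(delta, -eps, eps)    # enforce the budget

cloaked = np.clip(x + delta, 0, 1)       # keep pixels in valid range

orig_dist = np.linalg.norm(W @ x - target_style)
new_dist = np.linalg.norm(W @ cloaked - target_style)
print(f"style distance to target before cloaking: {orig_dist:.2f}")
print(f"style distance to target after cloaking:  {new_dist:.2f}")
print(f"max per-pixel change: {np.abs(cloaked - x).max():.3f}")
```

The cloaked image is nearly identical to the original (no pixel moves by more than `eps`), yet its style features sit measurably closer to the target style, which is the kind of mismatch that misleads a model into reading the wrong style. Real systems operate on deep neural-network features rather than a linear map, but the constrained-optimization structure is the same.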

“We’re letting the model teach us which portions of an image pertain the most to style, and then we’re using that information to come back to attack the model and mislead it into recognizing a different style from what the art actually uses,” Zhao said.

If an AI tool built to replicate art styles tries to imitate a piece with Glaze applied, it will read that artist's piece as having a different style, such as Vincent van Gogh's, and will produce an imitation in that style instead.

Although many AI art tools have already had the chance to learn from thousands of uncloaked images online, introducing more cloaked images using Glaze will chip away at their ability to imitate, researchers say.

To use the tool, artists download Glaze to their computer and run it on the images they want to cloak from AI. They can also adjust how heavily Glaze modifies an image: low settings are nearly invisible but offer less protection from AI, while higher settings may be more noticeable but offer much stronger protection.

“A majority of the artists we talked to had already taken actions against these models,” Shan said. “They started to take down their art or to only upload low resolution images, and these measures are bad for their career because that’s how they get jobs. With Glaze, the more you perturb the image, the better the protection. And when we asked artists what they were comfortable with, quite a few chose the highest level. They’re willing to tolerate large perturbations because of the devastating consequences if their styles are stolen.” 
