Researchers have created a way to cloak artwork so that it can’t be used to train AI

With the power of AI, it’s now possible to replicate distinctive art styles in minutes, an innovation that is leaving traditional artists in the lurch as their work is scraped to train AI models that then steal job opportunities from them.
But what if you could stop AI models from being able to replicate your art style?
Researchers at the University of Chicago have built a tool they say will do just that: a filter that, once applied to an image, prevents AI tools that scrape art online from reading and reproducing it.
A beta version of the free tool, called “Glaze,” launched for download last week.
AI art can be produced almost instantaneously, but only because AI models draw on thousands of pieces of art from across the internet that took human artists weeks or months to create.
The creators say Glaze will allow artists to protect their distinct art style from being absorbed into the pool of data that AI art tools draw on.
“Artists really need this tool; the emotional impact and financial impact of this technology on them is really quite real,” Ben Zhao, Neubauer Professor of computer science at the University of Chicago, said in a February press release. “We talked to teachers who were seeing students drop out of their class because they thought there was no hope for the industry, and professional artists who are seeing their style ripped off left and right.”
The project involved surveying more than 1,100 professional artists, according to the release. The tool was tested on 195 historical artists, as well as four currently working artists, before a focus group evaluated Glaze’s accuracy in disrupting AI imitation.
More than 90 per cent of artists surveyed said they were willing to use the tool when posting their art.
Glaze is the second project from the University of Chicago’s SAND Lab aimed at protecting images posted online. In 2020, the lab created a tool to shield personal photos so that they couldn’t be used to train facial recognition software. But when the researchers began applying the same concept to art, a few problems arose immediately.
Photos of human faces can be boiled down to a few distinct features, but art is far more complex, with an artistic style defined by numerous elements, including brushstrokes, colour palettes, light and shadow, as well as texture and positioning.
To confuse AI tools and ensure they cannot read and replicate an artistic style, the researchers first needed to isolate which parts of a piece of art those tools highlight as the key style indicators.
“We don’t need to change all the information in the picture to protect artists, we only need to change the style features,” Shawn Shan, a UChicago computer science graduate student who co-authored the study, said in the press release. “So we had to devise a way where you basically separate out the stylistic features from the image from the object, and only try to disrupt the style feature using the cloak.”
To do this, researchers used a “fight fire with fire” approach. Glaze itself uses AI: it runs an image through a filter that converts it to a new art style, such as cubism or watercolour, identifies which style features change, and then adjusts those features in the original just enough to trick other AI tools.
The researchers target the “Achilles’ heel for AI models”: what their website describes as “a phenomenon called adversarial examples, small tweaks in inputs that can produce massive differences in how AI models classify the input.”
Basically, Glaze changes these key elements of a piece of art ever so slightly, leaving the original almost identical to the naked eye, so that other AI tools can’t recognize, and thus can’t replicate, the artwork’s individual style.
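To make the idea concrete, here is a minimal sketch of an adversarial perturbation using the well-known fast gradient sign method. This is a generic illustration of the phenomenon the article describes, not Glaze’s actual algorithm; the classifier, label and epsilon value are placeholder assumptions.

```python
# Minimal sketch of an adversarial example (fast gradient sign method).
# Generic illustration only; not Glaze's algorithm. `model`, `label`
# and `epsilon` are placeholder assumptions.
import torch
import torch.nn.functional as F

def adversarial_tweak(model, image, label, epsilon=0.01):
    """Nudge each pixel by at most `epsilon` in the direction that most
    increases the model's loss: a tiny change to the input that can
    produce a massive change in how the model classifies it."""
    image = image.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(image), label)
    loss.backward()
    # Bounded tweak: sign of the gradient, scaled by epsilon.
    tweaked = image + epsilon * image.grad.sign()
    return tweaked.clamp(0, 1).detach()
```

To the naked eye the tweaked image is indistinguishable from the original, yet the model’s classification of it can change entirely, which is exactly the mismatch Glaze exploits.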
“We’re letting the model teach us which portions of an image pertain the most to style, and then we’re using that information to come back to attack the model and mislead it into recognizing a different style from what the art actually uses,” Zhao said.
If an AI tool built to replicate art styles tries to imitate a piece with Glaze applied, it will read the artist’s work as having a different style, such as Vincent van Gogh’s, and will produce an imitation in that style instead.
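Based on that description, the cloaking step can be pictured roughly as the optimization below: learn a small, bounded perturbation that drags the image’s style representation toward a decoy style (here, a Van Gogh-like embedding) while leaving the pixels nearly untouched. Every name in this sketch (style_encoder, target_style_emb, budget) is an illustrative assumption, not Glaze’s published code.

```python
# Rough sketch of style cloaking under assumed names; not Glaze's real API.
import torch

def cloak(image, style_encoder, target_style_emb, budget=0.05, steps=200):
    """Learn a perturbation `delta`, bounded per-pixel by `budget`, that
    makes a style encoder read the cloaked image as having the decoy
    (target) style rather than the artist's own."""
    delta = torch.zeros_like(image, requires_grad=True)
    opt = torch.optim.Adam([delta], lr=0.01)
    for _ in range(steps):
        cloaked = (image + delta).clamp(0, 1)
        # Pull the cloaked image's style features toward the decoy style.
        loss = torch.norm(style_encoder(cloaked) - target_style_emb)
        opt.zero_grad()
        loss.backward()
        opt.step()
        # Keep the change imperceptible: clip delta to the budget.
        with torch.no_grad():
            delta.clamp_(-budget, budget)
    return (image + delta).clamp(0, 1).detach()
```

In this sketch, the budget bound plays the role of the user-facing intensity setting described below: a larger budget allows a more visible, but more protective, cloak.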
Although many AI art tools have already had the chance to learn from thousands of uncloaked images, researchers say that introducing more cloaked images through Glaze will chip away at the tools’ effectiveness at imitation.
To use it, artists can download Glaze onto their computer and run it on images they want to cloak from AI. They can also customize the intensity of the modifications Glaze introduces: low-intensity changes are almost invisible but offer less protection from AI, while larger changes may be more visible but offer much more protection.
“A majority of the artists we talked to had already taken actions against these models,” Shan said. “They started to take down their art or to only upload low resolution images, and these measures are bad for their career because that’s how they get jobs. With Glaze, the more you perturb the image, the better the protection. And when we asked artists what they were comfortable with, quite a few chose the highest level. They’re willing to tolerate large perturbations because of the devastating consequences if their styles are stolen.”