Researchers have created a way to cloak artwork so that it can’t be used to train AI

With the power of AI, it’s now possible to replicate a distinctive art style in minutes — an innovation that is leaving traditional artists in the lurch, as their work is used without consent to train AI models that then compete with them for job opportunities.
But what if you could stop AI models from being able to replicate your art style?
Researchers at the University of Chicago have built a tool they say will do just that: a filter that, once applied to an image, prevents that image from being read and reproduced by AI tools that scrape art online.
A beta version of the free tool, called “Glaze,” launched for download last week.
AI art can be produced almost instantly, but only because AI models draw on thousands of art pieces from across the internet — pieces that took human artists weeks or months to create.
The creators say Glaze will allow artists to protect their distinct art style from being absorbed into the pool of data that AI art tools draw on.
“Artists really need this tool; the emotional impact and financial impact of this technology on them is really quite real,” Ben Zhao, Neubauer Professor of computer science at the University of Chicago, said in a February press release. “We talked to teachers who were seeing students drop out of their class because they thought there was no hope for the industry, and professional artists who are seeing their style ripped off left and right.”
The project involved surveying more than 1,100 professional artists, according to the release. The tool was tested on 195 historical artists, as well as four currently working artists, before a focus group evaluated Glaze’s accuracy in disrupting AI imitation.
More than 90 per cent of artists surveyed said they were willing to use the tool when posting their art.
Glaze is the second image-protection project from the University of Chicago’s SAND Lab. In 2020, the lab created a tool that shields personal photos so they can’t be used to train facial recognition software. But when the team began applying the same concept to art, several problems arose immediately.
Photos of human faces can be boiled down to a few distinct features, but art is much more complex: an artistic style is defined by many elements, including brushstrokes, colour palette, light and shadow, texture and composition.
To confuse AI tools so they could not read and replicate an artistic style, the researchers first needed to isolate which parts of an artwork those tools treat as the key style indicators.
“We don’t need to change all the information in the picture to protect artists, we only need to change the style features,” Shawn Shan, a UChicago computer science graduate student who co-authored the study, said in the press release. “So we had to devise a way where you basically separate out the stylistic features from the image from the object, and only try to disrupt the style feature using the cloak.”
To do this, the researchers took a “fight fire with fire” approach. Glaze uses AI to identify which style features change when an image is run through a filter that converts it to a new art style, such as cubism or watercolour, and then adjusts those features just enough to trick other AI tools.
They target what the project’s website calls the “Achilles’ heel for AI models”: adversarial examples, or “small tweaks in inputs that can produce massive differences in how AI models classify the input.”
Essentially, Glaze alters these key elements of an artwork ever so slightly, leaving the original almost identical to the naked eye, so that other AI tools cannot recognize, and thus cannot replicate, its individual style.
“We’re letting the model teach us which portions of an image pertain the most to style, and then we’re using that information to come back to attack the model and mislead it into recognizing a different style from what the art actually uses,” Zhao said.
If an AI tool built to replicate art styles tries to imitate a piece protected by Glaze, it will read the artist’s work as having a different style — Vincent van Gogh’s, for example — and will produce an imitation in that style instead.
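The adversarial-example idea behind this can be illustrated with a toy sketch. The code below is purely hypothetical — a tiny linear “style classifier” standing in for a real style-recognition network, not Glaze’s actual model — but it shows the core phenomenon: a perturbation far smaller than the pixel values themselves is enough to push an image across a model’s decision boundary so it is read as a different style.

```python
import random

random.seed(0)

# Toy "style classifier": a linear score over flattened pixel values.
# score > 0 -> "Style A"; otherwise "Style B".
# (A hypothetical stand-in for a real style-recognition network.)
DIM = 64
w = [random.gauss(0, 1) for _ in range(DIM)]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def classify(image):
    return "Style A" if dot(image, w) > 0 else "Style B"

# An "artwork": random pixels, projected so the classifier reads it
# as Style A with a score of exactly 0.5.
image = [random.gauss(0, 1) for _ in range(DIM)]
shift = (dot(image, w) - 0.5) / dot(w, w)
image = [x - shift * wi for x, wi in zip(image, w)]

# Adversarial "cloak": nudge each pixel a tiny amount against the
# sign of its weight -- the direction that lowers the style score
# fastest. epsilon is chosen just large enough to cross the decision
# boundary, and is tiny compared to the pixel values themselves.
epsilon = 1.5 * dot(image, w) / sum(abs(wi) for wi in w)
cloaked = [x - epsilon * (1 if wi > 0 else -1) for x, wi in zip(image, w)]

print(classify(image))    # the original style
print(classify(cloaked))  # the cloaked image reads as a different style
print(epsilon)            # the per-pixel change: a tiny fraction of a pixel value
```

Glaze applies the same principle with far more sophistication, targeting only the features a model uses to recognize style rather than nudging every pixel uniformly.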
Although many AI art tools have already learned from thousands of uncloaked images online, the researchers say that adding more Glaze-cloaked images will gradually erode those tools’ ability to imitate the artists behind them.
To use the tool, artists download Glaze to their computers and run it on the images they want to cloak from AI. They can also adjust the intensity of the modifications: at low settings the changes are nearly invisible but offer less protection, while stronger settings are more visible but offer much more.
“A majority of the artists we talked to had already taken actions against these models,” Shan said. “They started to take down their art or to only upload low resolution images, and these measures are bad for their career because that’s how they get jobs. With Glaze, the more you perturb the image, the better the protection. And when we asked artists what they were comfortable with, quite a few chose the highest level. They’re willing to tolerate large perturbations because of the devastating consequences if their styles are stolen.”