A group of artists is suing the creators of the image generators Stable Diffusion and Midjourney, The Verge reports, for using their art to train AI that they say is stealing their jobs.
Stable Diffusion makes it trivially easy to knock off a particular artist's style. With a simple prompt, users can generate any number of images that closely resemble the signature visual language of just about any prominent artist.
Unsurprisingly, the rise of these image generators is really irking many artists. From their point of view, the creators of these algorithms used the artists' own work to build a system that's now threatening their livelihoods, and they're getting nothing out of the arrangement.
As such, three artists are now filing a class-action lawsuit claiming the machine learning systems infringe the rights of "millions of artists," a suit that could set sweeping precedent in the nascent field.
"As I learned more about how the deeply exploitative AI media models practices I realized there was no legal precedent to set this right," Karla Ortiz, one of the three artists, tweeted. "Let’s change that."
"Today, we’re taking another step toward making AI fair and ethical for everyone," Matthew Butterick, a writer and lawyer, wrote in a blog post announcing the suit.
It's not the only lawsuit being leveled against companies like Stability AI. Stock image and editorial media provider Getty Images announced this week that it's also suing the startup for infringing "copyright in content owned or represented by Getty Images," according to a statement.
Getty Images claims that the company "unlawfully copied and processed millions of images protected by copyright and the associated metadata owned or represented by Getty Images" without having a license "and to the detriment of the content creators."
Stability AI, which launched its public version of Stable Diffusion in August, trained its algorithms on LAION-5B, a massive dataset of images and accompanying text scraped from the internet by the Germany-based non-profit LAION.
Using tools that can scan these datasets to see whether a given artist's work was scraped, painter Erin Hanson found that LAION-5B included thousands of images of her art, CNN reported back in October.
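For a sense of how such a check can work, here's a minimal, hypothetical sketch: LAION distributes its index as metadata files pairing image URLs with their captions, so scanning those captions for an artist's name is roughly the idea. The file name, column names, and search term below are assumptions for illustration, not the code of any actual lookup tool.

```python
# Illustrative sketch only: search a LAION-style metadata shard for an artist's name.
# The file "laion_shard.parquet" and the "TEXT"/"URL" columns are assumptions based
# on the format LAION publishes; this is not the code of any real lookup service.
import pandas as pd

metadata = pd.read_parquet("laion_shard.parquet")  # hypothetical local metadata shard

# Find every caption that mentions the artist, case-insensitively.
matches = metadata[metadata["TEXT"].str.contains("Erin Hanson", case=False, na=False)]

print(f"{len(matches)} captions mention the artist")
print(matches["URL"].head())  # URLs of the corresponding scraped images
```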
In other words, there does appear to be compelling evidence that Stable Diffusion is generating art based on specific artists' actual work.
"That one with the purple flowers and the sunset," Hanson told CNN while looking at the broadcaster's efforts to generate art based on her work, "definitely looks like one of my paintings, you know?"
But successfully arguing that Stable Diffusion is in fact infringing copyright may be more difficult than Butterick and Getty Images might hope. For one, creators of AI art tools argue that they are protected under US fair use law.
Then there is the fact that LAION claims it never stored copyrighted images or texts.
"LAION datasets are simply indexes to the internet, i.e. lists of URLs to the original images together with the ALT texts found linked to those images," reads the non-profit's website. "While we downloaded and calculated CLIP embeddings of the pictures to compute similarity scores between pictures and texts, we subsequently discarded all the photos."
While the outcome of these lawsuits is anything but certain, they are indicative of a groundswell of action against companies like Stability AI. Artists everywhere are furious that their work is being used to train AI algorithms without their permission or fair compensation.
But that could soon change. According to CNN's report, both LAION and Stability AI are working with artists to find new ways to remunerate them for the use of their work.
Stability AI also announced in December that it would allow artists to have their art removed from training datasets for an upcoming release dubbed Stable Diffusion 3.0.
But whether that will be enough to assuage artists going forward remains to be seen. After all, the damage has already been done.