Stability.AI released the model into the wild for free and allows anyone to use it for commercial or noncommercial purposes, although Tom Mason, the chief technology officer of Stability.AI, says Stable Diffusion's license agreement explicitly bans people from using the model or its derivatives in a way that breaks any laws or regulations.

Some artists may have been harmed in the process

Other artists besides Rutkowski have been surprised by the apparent popularity of their work in text-to-image generators, and some are now fighting back.

Karla Ortiz, an illustrator based in San Francisco who found her work in Stable Diffusion's data set, has been raising awareness about the issues around AI art and copyright.

Artists say they risk losing income as people start using AI-generated images based on copyrighted material for commercial purposes. But it's also a lot more personal, Ortiz says, arguing that because art is so closely linked to a person, it could raise data protection and privacy problems.

"There is a coalition growing within artist industries to figure out how to tackle or mitigate this," says Ortiz. The group is in its early days of mobilization, which could involve pushing for new policies or regulation.

Mason encourages any artists who don't want their works in the data set to contact LAION, which is an independent entity from the startup.

Berlin-based artists Holly Herndon and Mat Dryhurst are working on tools to help artists opt out of being in training data sets. They launched a site called Have I Been Trained, which lets artists search to see whether their work is among the 5.8 billion images in the data set that was used to train Stable Diffusion and Midjourney. LAION did not immediately respond to a request for comment.