An AI-generated image of a person leaving a building, thus opting out of the vertical blinds convention.

Artists will be able to remove their work from the training dataset of an upcoming Stable Diffusion release. Spawning, an artist advocacy group, said on its website that Stability AI would honor the opt-out requests, though the details of how the plan will be implemented remain unclear.

Stable Diffusion, an artificial intelligence image synthesis model, gained its ability to generate images by "learning" from a large dataset of images scraped from the internet without consulting any rights holders. Because it can produce images comparable to human-made artwork in unlimited quantities, the model has fueled an ethical debate among artists since its public launch.

To understand how the Stable Diffusion 3 opt-out system works, we created an account on Have I Been Trained. After the site's search engine found matches in the Large-scale Artificial Intelligence Open Network (LAION) image database, we selected "Opt out this image" from a pop-up menu.

We were able to see the list of images we had marked for opt-out. At no point did the site attempt to verify our identity or check whether we owned the images we opted out.

A screenshot of "opting out" images we do not own on the Have I Been Trained website. Images with flag icons have been "opted out."

To remove an image from training, it must already be in the LAION dataset and be discoverable on Have I Been Trained. There is no way to opt out large groups of images at once, nor duplicate copies of the same image.


The system raises questions that have echoed through the announcement threads on social media. Who would pay for the enormous effort required to legally verify ownership and control who opts images out? Would people trust those organizations with their personal information? And why should artists have to register at all, when Stability's CEO says that permission is not necessary to use their images?

A video from Spawning announcing the opt-out option.

Putting the onus on artists to register with a site that has a non-binding connection to LAION, and then hope that their requests are honored, has proven unpopular. In response to statements about consent in Spawning's announcement video, some people noted that the opt-out process does not fit the definition of consent in Europe's General Data Protection Regulation, which requires that consent be given freely rather than assumed by default. Many argue that the process should be opt-in only and that no artwork should be included in training without explicit permission.

Although the issue has not yet been tested in court, Stability AI appears to be operating within US and European law. Even so, the company is aware of the ethical debate, which has sparked a large protest against AI image generators among artists.

Whether there is a balance that can satisfy artists remains an open question. Stability CEO Emad Mostaque says he is open to suggestions, and that the team welcomes feedback: "We are happy to engage with all sides, and try to be as transparent as we can. All moving quickly."