Unstable Diffusion, a group trying to monetize AI-generated pornography, raised more than $56,000 on the crowdfunding platform Kickstarter. Kickstarter shut the campaign down as it rethought what kinds of projects it will allow. Because the campaign had not yet ended, the platform will return all money raised to backers; Unstable Diffusion won't see the $56,000, which had more than doubled its initial $25,000 goal.

"Over the last few days, we've engaged our Community Advisory Council and we've read your feedback to us via our team and social media," the platform said. One thing is clear, it continued: the people behind creative work are the most important part of any project, and creative work should thrive on the platform.

The platform is also rethinking its approach to hosting artificial intelligence projects.

Taylor said the company doesn't have all the answers: "We want this to be an ongoing discussion with all of you, because the decisions we make now might not be the ones we make in the future."

The platform is considering how projects interface with copyrighted material, especially when artists' work is used without their permission, and whether a project could put anyone at risk of harm.

In recent months, tools like OpenAI's ChatGPT and Stability AI's Stable Diffusion have met with mainstream success, bringing conversations about the ethics of artificial intelligence to the forefront of public debate. If the open-source Stable Diffusion can be used to instantly create artistic avatars that look like a professional artist's work, how does that affect those who work in that field?

Some artists pressured Kickstarter to drop the Unstable Diffusion project over concerns about how artificial intelligence can affect artists' careers.

Hey @Kickstarter so you’re just gonna let a AI project that’s main premise is generating (potentially non consensual) porn and main sales pitch is that folks can steal from Greg Rutkowski? Or are you gonna do something to protect creators and the public? https://t.co/26nTl4dTNM

— Karla Ortiz 🐀 (@kortizart) December 10, 2022

Shame on @Kickstarter for allowing the Unstable Diffusion crowdfund. You are enabling blatant theft and are funding a tool that can create abusive content such as nonconsensual pornography.

— Sarah Andersen (@SarahCAndersen) December 11, 2022

The fate of Greg Rutkowski's work is an example of what can happen. Rutkowski, a living illustrator who has crafted detailed, high-fantasy artwork for franchises like "Dungeons and Dragons," was one of Stable Diffusion's most popular search terms when the tool launched in September. Because his artwork had been used to train the algorithm, he became a vocal advocate on how art generators affect working artists.

According to the campaign, the new model would be trained on 75 million high-quality images, including 25 million artistic images and 25 million photographic images.

Have I Been Trained, a website developed by Spawning, allows artists to opt out of popular datasets, though there is not yet legal precedent compelling data scrapers to honor those opt-outs.