It didn't take long for the internet to put Stable Diffusion to work generating porn. Nude characters, mostly women, as well as non-consensual fake nude imagery of celebrities, were created with the open source artificial intelligence system by communities across the internet.

Newgrounds, a community that allows some forms of adult art, is one of the forums that came to fill the gap left by the shutdown of many of the subreddits dedicated to porn.

Unstable Diffusion is the largest of these communities, and it is building a business around AI-generated porn. The server currently takes in over $2,500 a month from several hundred donors, thanks in part to its Patreon.

Arman Chaudhry, a member of the Unstable Diffusion admin team, said the team has expanded rapidly in just two months and sees a chance to build tools that professional artists and businesses can use.

Some AI ethicists are as worried as Chaudhry is optimistic. The use of AI to create porn isn't new, but Unstable Diffusion's models are capable of generating higher-fidelity examples than most. The generated porn could have negative consequences for marginalized groups, including the artists and adult actors who make a living creating porn for customers.

An image that was removed from Unstable Diffusion's server. Image credit: Unstable Diffusion

The risks include setting unrealistic expectations for women's bodies and sexual behavior, violating women's privacy and copyrights, and putting women out of a job. There is also concern that AI-generated porn affects women disproportionately; a previous app that could undress people in photos, for example, only worked on women.

Humble beginnings

Unstable Diffusion got started around the same time the Stable Diffusion model was released, and it later migrated to Discord, where it now has roughly 50,000 members.

One of the admins of the Discord server wrote in an announcement post in August that the server exists to support people interested in generating NSFW content with the model. The only community currently working on this is 4chan, the admin added, so they hoped to provide a more reasonable community that could actually work with the wider AI community.

At first, Unstable Diffusion was mostly a place for sharing AI-generated porn and ways to circumvent the content filters of various image-generating apps. Soon, though, the server's admins began looking at ways to build their own AI systems for porn generation on top of existing open source tools.

Stable Diffusion lent itself to their efforts. The model isn't built to generate porn per se, but its license doesn't prohibit using it to create porn as long as the output doesn't violate laws or harm other people. Stability AI, the company behind Stable Diffusion, has adopted a laissez-faire approach to governance, placing the onus on the community to use the model responsibly.

Stability AI did not respond to a request for comment.

The Unstable Diffusion admins released a Discord bot powered by the stock Stable Diffusion model. At first, the nude figures it created had missing limbs and distorted genitalia.

Image credit: Unstable Diffusion

The problem was that Stable Diffusion hadn't been exposed to enough examples of porn to render it convincingly. The model learns from a dataset of billions of captioned images, picking up the associations between written concepts and pictures, like how the word "bird" can refer to parakeets as well as bald eagles. While many of those images come from copyrighted sources, companies such as Stability AI argue that their systems are covered by fair use, a precedent that will soon be tested in court.
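To make that text-to-image association concrete, here is a minimal sketch of generating an image from a prompt with an off-the-shelf Stable Diffusion checkpoint, using the open source Hugging Face diffusers library. The model name, prompt and file path are illustrative assumptions, not details from the article or from Unstable Diffusion's own setup.

```python
# Minimal text-to-image sketch with a public Stable Diffusion checkpoint.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",   # illustrative public checkpoint
    torch_dtype=torch.float16,
)
pipe = pipe.to("cuda")

# The text encoder turns the prompt into embeddings; the denoising U-Net then
# iteratively refines random latent noise toward an image that matches them.
image = pipe("a parakeet perched on a branch, detailed photo").images[0]
image.save("parakeet.png")
```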

Only a small fraction of Stable Diffusion's dataset is explicit material, so the model has little to go on when asked to produce it. The fix is more training data of the kind you want: if you wanted a furniture-generating AI, for instance, you would feed the model more pictures of couches and chairs.
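As a rough illustration of that kind of additional training (sticking with the furniture analogy rather than anything Unstable Diffusion has described), the standard recipe is to fine-tune the model's denoising network on a new set of captioned images. The sketch below uses the diffusers and transformers libraries; the dataloader, dataset and hyperparameters are hypothetical placeholders.

```python
# Simplified fine-tuning loop over a hypothetical captioned furniture dataset.
# Only the U-Net is trained; the VAE and text encoder stay frozen.
import torch
import torch.nn.functional as F
from diffusers import AutoencoderKL, DDPMScheduler, UNet2DConditionModel
from transformers import CLIPTextModel, CLIPTokenizer

model_id = "runwayml/stable-diffusion-v1-5"          # illustrative base checkpoint
tokenizer = CLIPTokenizer.from_pretrained(model_id, subfolder="tokenizer")
text_encoder = CLIPTextModel.from_pretrained(model_id, subfolder="text_encoder")
vae = AutoencoderKL.from_pretrained(model_id, subfolder="vae")
unet = UNet2DConditionModel.from_pretrained(model_id, subfolder="unet")
scheduler = DDPMScheduler.from_pretrained(model_id, subfolder="scheduler")

vae.requires_grad_(False)
text_encoder.requires_grad_(False)
optimizer = torch.optim.AdamW(unet.parameters(), lr=1e-5)

# furniture_dataloader is a stand-in: it should yield image tensors in [-1, 1]
# of shape (batch, 3, 512, 512) along with their text captions.
for images, captions in furniture_dataloader:
    # Compress images to latents and encode captions into text embeddings.
    latents = vae.encode(images).latent_dist.sample() * vae.config.scaling_factor
    tokens = tokenizer(list(captions), padding="max_length", truncation=True,
                       max_length=tokenizer.model_max_length, return_tensors="pt")
    text_embeds = text_encoder(tokens.input_ids)[0]

    # Noise the latents at a random timestep; train the U-Net to predict that noise.
    noise = torch.randn_like(latents)
    timesteps = torch.randint(0, scheduler.config.num_train_timesteps,
                              (latents.shape[0],), device=latents.device)
    noisy_latents = scheduler.add_noise(latents, noise, timesteps)
    pred = unet(noisy_latents, timesteps, encoder_hidden_states=text_embeds).sample

    loss = F.mse_loss(pred, noise)   # more couch/chair examples drive this down
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
```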

Chaudhry says there are techniques for repairing the distorted faces and arms in AI-generated nudes, and that the team is working through the challenges every AI system runs into, chief among them collecting a dataset that is diverse, high in image quality and richly captioned with text.
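The article doesn't say which repair technique the group uses. One common approach, shown here purely as an illustration, is inpainting: mask the distorted region (a face or hand, say) and regenerate only that patch with a dedicated inpainting model. The model name and file paths below are assumptions.

```python
# Inpainting sketch: re-generate only a masked region of an existing image.
import torch
from diffusers import StableDiffusionInpaintPipeline
from PIL import Image

pipe = StableDiffusionInpaintPipeline.from_pretrained(
    "runwayml/stable-diffusion-inpainting",  # illustrative public checkpoint
    torch_dtype=torch.float16,
).to("cuda")

init_image = Image.open("generated.png").convert("RGB")   # placeholder paths
mask_image = Image.open("face_mask.png").convert("RGB")   # white = region to redo

# Only the masked area is re-synthesised; the rest of the image is preserved.
fixed = pipe(
    prompt="a detailed, realistic human face",
    image=init_image,
    mask_image=mask_image,
).images[0]
fixed.save("fixed.png")
```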

Those custom models power a work-in-progress, not-yet-public web app that, the admins say, will eventually let people follow porn from specific users.

Growing community

The Unstable Diffusion server is organized around different art styles and sexual preferences. There are softcore and safe-for-work channels, as well as channels for hentai and furry artwork. Users in these channels can prompt the bot to create art that fits the theme and submit the results to a board if they are particularly pleased with them.

Unstable Diffusion's members have generated over four million images. The group also hosts competitions that challenge members to recreate specific images using the bot, and the results feed back into improving Unstable Diffusion's models.

Image credit: Unstable Diffusion

Unstable Diffusion aspires to be an ethical community for AI-generated porn, one that bars child sexual abuse material, deepfakes and excessive gore. Chaudhry says the server has a full-time moderation team and uses a filter to block images of the real people in its database.

He said that only fictional and law-abiding generations are allowed on the server, and that the team will revisit its moderation rules with partners to align them with the needs and commitments of professional tools.

But Unstable Diffusion's systems will become harder to monitor as they become more widely available.

Abhishek Gupta, the founder and principal researcher at the Montreal AI Ethics Institute, said that the group needs to think about how its safety controls might be subverted. Unstable Diffusion, he said, shows how a server can accumulate a lot of problematic content in a single place, demonstrating both the capability of AI systems to generate this type of content and their power to connect malicious users who can then sharpen one another's skills at generating it. Content moderation teams already deal with trauma as they review and remove offensive content, and systems like these will make their jobs worse.

Then there is the issue of the artists whose artwork was used to train the models. Many take issue with AI systems that mimic their styles without credit or compensation, as evidenced by the recent backlash against DeviantArt's DreamUp image generator, which was trained on art uploaded to DeviantArt without creators' knowledge.

Artists known for classical painting styles and fantasy landscapes have become some of the most commonly used prompts in Stable Diffusion, and some have decried the results as poor AI imitations of their work. They worry that AI-generated art mimicking their styles will crowd out their original work and hurt their income. Unstable Diffusion, for its part, grants users full ownership of the images they generate.

Gupta thinks artists who don't want their work associated with porn could be harmed if users discover that certain artists' names yield better results as prompts in Unstable Diffusion.

Image credit: Unstable Diffusion

Chaudhry says Unstable Diffusion is looking at ways to make its models more equitable toward the artistic community, but he did not outline specific steps, such as licensing artwork or letting artists exclude their work from training datasets.

Artist impact

Adult artists make a living drawing, painting and photographing suggestive works. What happens to them if anyone can create such images with an AI?

The threat isn't immediate. As adult art communities grapple with the implications of text-to-image generators, it may be difficult to find a platform for publishing AI-generated porn outside of Unstable Diffusion. Newgrounds, which hosts mature art behind a content filter, has decided to ban AI-generated art.

OnlyFans, one of the larger adult content hosts, has left open the possibility of AI-generated material on its platform: it will allow content that incorporates AI as long as the person featured is a verified OnlyFans creator.

Even where hosting is available, the quality of the work might not be up to par.

Milo Wissig, a trans painter who has experimented with AI, said the technology isn't very good yet. It seems to work best as a tool for an artist to work off of, he said, though a lot of people can't tell the difference.

It's easy to see where the AI falls flat. Bondage, in which the tying of ropes and knots is itself an art form, is especially hard for the AI to reproduce.

It would be hard to get an AI to make an image specific enough for people, Wissig said, and very difficult to get it to make the ropes work.

Biases baked into traditional erotica, where straight sex between white people is the norm, can be amplified by the source material behind these AI systems.

Many of the images are pulled from mainstream porn, Wissig said, and unless you specify otherwise, most of the people the machine comes up with are white.

AI-generated artwork. Image credit: Milo Wissig

Researchers have documented these biases across machine learning applications.

When it comes to porn, the consequences may not be as dire as in other domains, but there is still a particular horror in watching an AI flatten ordinary people into caricatures. DALL-E 2, for instance, has been criticized for disproportionately generating art in European styles.

Wissig tried using VQGAN to create images of sexy queer and trans people. He said he had to phrase his prompts carefully just to get faces on some of them.

There isn't much evidence that the AI knows how to represent genderqueer and trans people: almost all of the images it generated depicted women with penises.

Branching out

Unstable Diffusion isn't solely focused on in-house projects. A group founded by Chaudhry is funding other efforts to create porn-generating AI systems.

Chaudhry envisions Unstable Diffusion growing into an organization that supports broader AI content generation, sponsoring groups and providing tools and resources for teams to build their own systems. He says Unstable Diffusion will use a five-figure grant of cloud hardware and compute from a large cloud provider to expand its model-training infrastructure.

Chaudhry also says Unstable Diffusion will use the grant to launch a campaign and look for outside funding. The team plans to create its own models and combine them for specialized use cases, which it intends to spin out into new brands and products.

The group has a long way to go, and moderation is the most consequential challenge it faces. Adult content moderation has a long record of failures: Pornhub lost the support of major payment processors in 2020 after the site was found to be circulating child sexual abuse material.

Can Unstable Diffusion avoid the same pitfalls? It's unclear at this time. The group's path doesn't look easy, with at least one senator calling on companies to implement stricter content filters.