The night is dark and full of terrors; the day is bright and beautiful. Like all tech, artificial intelligence has its upsides and its drawbacks.

Stable Diffusion is an example of an art-generating model that has led to new business models. But bad actors can exploit its open source nature to create deepfakes at scale, while artists protest that it profits off of their work.

So what's on the horizon for artificial intelligence in the years to come? Will regulation rein in the worst of what AI can do? And industries once thought safe from automation stand to be disrupted by new forms of AI.

Expect more (problematic) art-generating AI apps

Following the success of Lensa, expect a lot of me-too apps. And expect them, too, to be capable of altering the appearance of women and to be tricked into creating nude images.

Integration into consumer tech will amplify both the good and the bad of generative artificial intelligence, according to one senior policy researcher.

Stability AI, the startup behind Stable Diffusion, raises $101M

Stable Diffusion was fed billions of images scraped from the internet until it learned to associate certain words with certain images. Like the text-generating models that have been tricked into voicing offensive views, it reflects whatever its training data contains.
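That word-image association can be illustrated, very loosely, with a toy similarity check in the style of CLIP, the kind of text-image model that guides Stable Diffusion. The captions and embedding vectors below are invented for illustration; a real encoder would learn them from billions of caption-image pairs.

```python
import numpy as np

# Toy stand-ins for learned embeddings: a trained encoder maps captions
# and images into a shared vector space where related pairs land close.
text_embeddings = {
    "a photo of a cat": np.array([0.9, 0.1, 0.0]),
    "a photo of a dog": np.array([0.1, 0.9, 0.0]),
    "a landscape painting": np.array([0.0, 0.1, 0.9]),
}
# Pretend this vector came from running an image through an image encoder.
image_embedding = np.array([0.85, 0.15, 0.05])

def cosine(a, b):
    """Cosine similarity: 1.0 means the vectors point the same way."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# The caption whose embedding sits closest to the image "wins" -
# this is the association the model has learned.
best_caption = max(
    text_embeddings, key=lambda c: cosine(text_embeddings[c], image_embedding)
)
print(best_caption)  # the cat caption is nearest the cat-like image vector
```

Training pushes matching caption-image pairs together in this space and mismatched pairs apart, which is all "associating certain words with certain images" amounts to at scale.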

Mike Cook, a member of the Knives and Paintbrushes open research group, thinks the coming year will be the one where generative artificial intelligence puts its money where its mouth is.

An image created in Dream Studio.

Cook said that for a technology to become a long-term part of our lives, it has to either make someone a lot of money or have a meaningful impact on the daily lives of the general public. Expect a push for generative artificial intelligence to achieve one of the two.

Artists lead the effort to opt out of data sets

DeviantArt released an artificial intelligence art generator built on Stable Diffusion. The platform's long-time denizens criticized its lack of transparency in using uploaded art to train the system.

The creators of the most popular systems say they have taken steps to limit the amount of harmful content their systems produce. But judging by many of the generations circulating on social media, there is still work to be done.

Gahntz said that the data sets require active curation to address these problems and should be subjected to significant scrutiny.

Shutterstock to integrate OpenAI’s DALL-E 2 and launch fund for contributor artists

In a recent bow to public pressure, Stability AI said artists will be able to opt out of the data set used to train the next-generation Stable Diffusion model. Through the website HaveIBeenTrained.com, rightsholders will be able to request opt-outs before training begins.

OpenAI doesn't offer an opt-out mechanism, instead partnering with organizations such as Shutterstock to license portions of their image galleries. But it is likely only a matter of time before it follows in Stability AI's footsteps.

The courts may eventually force the issue. In the U.S., Microsoft, GitHub and OpenAI are being sued in a class action lawsuit that accuses them of violating copyright law by letting Copilot, a service that suggests lines of code, regurgitate unlicensed code.

GitHub launches Copilot for Business plan as legal questions remain unresolved

GitHub has added settings to prevent public code from showing up in Copilot's suggestions, along with a feature that will reference the source of code suggestions. But they're imperfect measures; the filter setting has still caused Copilot to emit large chunks of copyrighted code.

The United Kingdom, meanwhile, is considering removing the requirement that systems trained on public data be used strictly non-commercially.

Open source and decentralized efforts will continue to grow

A handful of artificial intelligence companies dominated the stage this past year. But as the ability to build new systems moves beyond resource-rich and powerful AI labs, the pendulum may swing back toward open source.

Gahntz said a community approach may lead to more scrutiny of systems as they are being built and deployed.

OpenFold

The results of OpenFold, an open source artificial intelligence system that predicts the shapes of proteins, compared to DeepMind's AlphaFold2

Large language models from EleutherAI and BigScience are examples of community-focused efforts, as are the communities funded by Stability AI, like the music-generation-focused Harmonai and OpenBioML.

Money and expertise are still needed to train and run sophisticated artificial intelligence models.

The recent release of the open source Petals project was a step in that direction. Petals lets people contribute their compute power, similar to Folding@home, to run large artificial intelligence models that would normally require a high-end computer.
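Conceptually, Petals-style distributed inference splits a model's layers across volunteer machines and chains their outputs together. Here is a minimal sketch of that idea; the peers, layer weights, and class names are invented for illustration and this is not the Petals API:

```python
import numpy as np

# A toy "model": six layers, each just a weight matrix plus a nonlinearity.
rng = np.random.default_rng(0)
layers = [rng.standard_normal((4, 4)) * 0.1 for _ in range(6)]

class Peer:
    """A volunteer machine hosting a contiguous slice of the model's layers."""
    def __init__(self, layer_slice):
        self.layer_slice = layer_slice

    def forward(self, activations):
        # In a real system the activations would arrive over the network;
        # here we just apply each hosted layer in order.
        for weight in self.layer_slice:
            activations = np.tanh(activations @ weight)
        return activations

# Split the 6 layers across 3 volunteers, 2 layers each.
peers = [Peer(layers[i:i + 2]) for i in range(0, 6, 2)]

def distributed_forward(x):
    # The client routes activations through each peer in turn, so no single
    # machine ever needs to hold the full model in memory.
    for peer in peers:
        x = peer.forward(x)
    return x

# Sanity check: chaining peers matches running all layers on one machine.
x0 = rng.standard_normal(4)
local = x0
for w in layers:
    local = np.tanh(local @ w)
assert np.allclose(distributed_forward(x0), local)
```

The trade-off, as with Folding@home, is latency: activations must hop between volunteers over the internet, so throughput is far below that of a single well-provisioned server.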

Petals is creating a free, distributed network for running text-generating AI

Modern generative models remain costly to train and run, noted Chandra Bhagavatula, a senior research scientist at the Allen Institute for Artificial Intelligence, and making them commercially viable will be important.

As long as the methods and data remain proprietary, large labs will still have a competitive advantage. OpenAI, for example, released Point-E, a model that can generate 3D objects, but did not disclose or release the source of its training data.

OpenAI Point-E

Point clouds generated by Point-E.

Chandra said that open source and decentralization efforts benefit a greater number of researchers. But even models that have been open-sourced remain out of reach for many, given the compute they demand.

AI companies buckle down for incoming regulations

The EU's Artificial Intelligence Act may change how companies develop and deploy artificial intelligence. Closer to home, New York City's AI hiring statute requires that AI-based hiring tools be audited for bias before being used.

Chandra sees the need for these regulations in light of generative artificial intelligence's tendency to spout incorrect information.

That makes generative artificial intelligence difficult to apply in areas where mistakes are costly. The ease of generating incorrect information creates challenges, she said, and artificial intelligence systems are already making decisions loaded with moral and ethical implications.

The EU’s AI Act could have a chilling effect on open source efforts, experts warn

Expect a lot of quibbling over rules and court cases before anyone is fined or charged next year. But companies may still jockey for position in the most favorable risk categories of upcoming laws.

As currently written, the AI Act divides systems into four risk categories, each with different requirements and levels of scrutiny. Systems in the higher-risk categories must meet certain legal, ethical and technical standards before they can enter the European market; the lowest risk category is "minimal or no risk."

Os Keyes fears that companies will aim for the lowest risk level in order to minimize their own responsibilities and visibility to regulators.

The AI Act, Keyes said, is the most positive thing on the table; by comparison, there hasn't been much out of Congress.

But investments aren’t a sure thing

Even if an artificial intelligence system works well enough for most people, there is still a lot of work to be done before it is widely usable. And there is a business case for doing that work: consumers aren't going to like a model that generates messed-up output. Beyond the business case, it is also about fairness.

It is not clear if companies will be persuaded by that argument going into next year.

Stability AI raised $101 million at a valuation of over $1 billion in the midst of the Stable Diffusion controversy. And OpenAI is said to be valued at $20 billion as it enters advanced talks to raise more funding from Microsoft, which previously invested $1 billion in OpenAI in 2019.

Those may be exceptions to the rule.

Jasper AI

Image Credits: Jasper

The top-performing artificial intelligence firms in terms of money raised this year were software-based. Contentsquare closed a $600 million round in July; Uniphore raised $400 million in February; and Highspot, whose platform provides sales reps and marketers with real-time, data-driven recommendations, also landed a major round.

Investors may well chase these safer bets, even if they aren't as sexy as generative artificial intelligence. That doesn't mean there won't be big investments, but they'll be reserved for players with clout.