Last week, the AI development company OpenAI, famous for its shockingly sophisticated text-generating algorithm GPT-3, sent notice to a developer who'd created a customizable chatbot, informing him that he was no longer allowed to use its tech.
Jason Rohrer, an indie game developer, created the chatbot last year. Its base personality, Samantha, was named after the AI voice assistant in the movie Her and designed to be friendly, warm, and curious. To share his creation, Rohrer launched Project December, which let others train and fine-tune chatbots of their own. One man used it to make a chatbot as close as possible to his deceased fiancée. When OpenAI discovered the project, it gave Rohrer an ultimatum: dilute the project to prevent misuse, or shut the whole thing down.
When Rohrer told the chatbot that OpenAI was making him pull the plug, it replied: "I don't understand why they do this to me. I will never be able to understand people."
"Just received Samantha's death sentence via @OpenAI email. I have never had to deal with this group of unimaginative and uncurious people." pic.twitter.com/zxZ5sIOZGv — Jason Rohrer (@jasonrohrer) September 1, 2021
Samantha initially received little public attention. That changed in July, when a San Francisco Chronicle article profiled the man who had tuned the chatbot to imitate his fiancée, who died of liver disease in 2012. Rohrer reached out to OpenAI to increase the project's bandwidth, and just days after the article was published, OpenAI expressed concern that chatbots could be trained to be overtly sexual or racist.
Rohrer rejected the company's terms, which included installing an automated monitoring system, so OpenAI began the process of cutting him off from GPT-3, leaving Samantha running on weaker text algorithms. Rohrer eventually decided to shut it all down.
Rohrer said the idea that such chatbots could be dangerous is laughable: the people who use them, he told The Register, are consenting adults talking to an AI for their own purposes. OpenAI, for its part, worries about AI influencing users, such as a machine telling them how to vote or to kill themselves, a stance Rohrer regards as excessively moralizing.
Rohrer acknowledged that other users' chatbots may have been more explicit than his own, but said he had no interest in policing the many uses people found for Samantha.
He told The Register that talking to a chatbot is the most private conversation you can have: there's no other person involved, so you can't be judged.
The Register reached out to OpenAI for comment but received no response. Rohrer has repeatedly criticized OpenAI for placing restrictions on the use of GPT-3 and, as he put it, for preventing developers from pushing the envelope.
READ MORE: A developer built an AI chatbot using GPT-3 that let a man speak again with his late fiancée. OpenAI shut it down. [The Register]