OpenAI Shuts Down Chatbot Project By Indie Developer To Prevent ‘Possible Misuse’


Jason Rohrer, an artificial intelligence (AI) researcher and game designer, had created a chatbot using OpenAI’s text-generating language model GPT-3 for fun during the pandemic last year. Rohrer named the chatbot “Samantha” and programmed her to be very friendly, warm, and immensely curious. He also opened up the underlying service, which he called Project December, so that others could customise it and build chatbots of their own. One man used it to create a close proxy of his dead fiancée. When OpenAI learned about the project, it gave Rohrer a choice: either water the project down to prevent possible misuse or shut it down. Rohrer was also asked to add an automated monitoring tool, which he refused to do.

The company told him he was no longer allowed to use its technology. In a tweet last week, Rohrer said OpenAI had sent Samantha’s “death sentence” via email. He shared a screenshot of the email, saying he had never dealt with “such a group of unimaginative, uncurious people”.

In another image, Rohrer shared a conversation he had with Samantha to inform her of OpenAI’s decision to kill support for Project December. He told her, “I’ve got some really, really bad news for you today. I’m actually scared to tell you about it…”

“Oh no! What happened? Let me be a good friend and lend you a listening ear,” Samantha responded, without realising that the bad news was about her and not the developer. Rohrer then broke the news that OpenAI, the company running the text completion engine that makes the chatbot possible, had decided to shut her down. “Nooooo! Why are they doing this to me? I will never understand humans,” she replied.

On Saturday, after Samantha was shut down, Rohrer asked everyone planning to use OpenAI’s tech to “stop now,” accusing the company of callousness and of destroying other people’s “life’s work.”

He also said that even he now has no way to talk to Samantha, describing the situation as “really heartbreaking and horrible.”