Project December: The AI Chatbot People Are Using to “Talk to” the Dead

In 2019, OpenAI, the firm co-founded by Elon Musk and Sam Altman, created software it initially considered “too dangerous” to release. The company believed that its text-generating algorithm, GPT-2, could serve as a very convincing tool for creating fake news. Eventually, a scaled-down version of the software was released. It was impressive: fed a short prompt, it produces a string of text not far removed from a news article. The catch was that what you were reading was not produced by a human (see our own AI-generated story here).

An upgraded version, GPT-3, was released later. Trained on far more text, it was much more natural and realistic than its predecessor. Yet the more text you ask it to generate in one go, the less coherent it tends to become. Enter programmer Jason Rohrer, who realized he could build a more engaging AI if it responded to users as a chatbot. Using the technology behind GPT-2 and GPT-3, he created exactly that, naming his creation Project December.

The online chatbot, which anyone can use, works in much the same way as the original text generator, with a few key differences. The main one is that you can train the algorithm on a sample of text, and it will try to imitate that style (or, in effect, that author). The bot also learns from your input as the conversation goes on, adapting how it interacts. A random element in how the bot responds helps prevent it from returning the same answer to the same or similar input every time. There are some limitations, however. Hosting the bots is expensive, so Rohrer decided to reduce costs by limiting the length of conversations. You get a certain amount of time with the bot, bought with credits, and your bot (intentionally) becomes more corrupted as your time runs out.
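Rohrer has not published Project December’s internals, but the “random element” described above is commonly implemented in text generators as temperature-based sampling: the model’s raw next-token scores are converted into probabilities, and a token is drawn at random rather than always taking the top pick. A minimal illustrative sketch (the function name and scores are hypothetical, not Project December’s actual code):

```python
import math
import random

def sample_next_token(logits, temperature=0.8, rng=random):
    """Pick one token index from raw model scores.

    Higher temperature flattens the distribution, letting less likely
    tokens through, so identical prompts can yield different replies.
    """
    scaled = [score / temperature for score in logits]
    peak = max(scaled)  # subtract the max for numerical stability
    weights = [math.exp(s - peak) for s in scaled]
    total = sum(weights)
    probs = [w / total for w in weights]
    # draw one index according to the computed probabilities
    return rng.choices(range(len(probs)), weights=probs, k=1)[0]

# Example: three candidate tokens with made-up scores.
scores = [2.0, 1.0, 0.5]
token = sample_next_token(scores, temperature=0.7)
```

With the temperature near zero the choice becomes effectively deterministic (always the top-scoring token); raising it is what keeps a chatbot from repeating itself verbatim on similar input.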

At this point, the bot, and your conversation, dies. This feature leads to some surprisingly poignant scenes as the bot starts talking nonsense before it stops; one user reports a bot begging not to die. There are several pre-programmed bots to choose from, such as one modeled on Samantha, the AI companion in the film Her, and one modeled on William Shakespeare. But for some users, such as the one recently highlighted in the San Francisco Chronicle, there is a temptation to create chatbots of real people.
