Amazon’s Alexa tells a customer to kill their foster parents. Wait, what?

Owners of Amazon Alexa devices in the United States can simply say “Alexa, let’s chat.” The phrase activates a social bot that will talk with you about whatever you want to discuss. The goal is a coherent conversation, as if you were chatting with another human being.

Unfortunately, not everyone has been satisfied with the conversations they have had with the AI. In addition to reports of a bot graphically describing masturbation using phrases such as “deeper,” the chatbot also reportedly told one customer to “kill your foster parents.” The unnamed user wrote in a review that the phrase was, as Reuters put it, “a whole new level of creepy.”

So why is Alexa saying these things?

Reassuringly, the disturbing quotes aren’t connected to last year’s glitch that saw Amazon Alexa let out a demonic-sounding laugh in the middle of the night, frightening some users and telling others that people die.

Behind the “Let’s chat” feature is a contest run by Amazon: conversational AI teams around the world are competing for $500,000 in prizes. University teams develop bots that can talk with people, and the bots are tested on live users who opt in to the chat feature. Those users then send feedback to Amazon, which judges the competition.

The winning team’s university will receive an additional $1 million if their chatbot can sustain more than 20 minutes of conversation with human users while maintaining a rating of 4 stars or above. While the competition is great news for advancing AI technology, it has caused a number of embarrassing problems, such as a customer being instructed to kill their foster parents.

“Since Alexa Prize teams use dynamic data, some of which is sourced from the Internet, to train their models, a social bot may have inadvertently ingested or learned from something inappropriate,” an Amazon spokeswoman told Vice News. The AI is trained on Internet data to learn how people talk, so it can respond to Alexa users in kind and make the conversation as human as possible. Unfortunately, this sometimes means the less savory side of human creativity gets echoed back by Alexa.

In this case, the social bot seems to have pulled the phrase “kill your foster parents” from Reddit, where, stripped of context, it takes on a rather sinister tone. According to Reuters, the chatbots have spoken with 1.7 million people, so we would argue it is actually quite impressive that only a few incidents of a bot directly instructing murder have occurred.