Amazon’s AI Alexa Told Users to ‘Kill Their Foster Parents’

Amazon’s AI Alexa is learning how to interact more like a person — with mixed results.

It turns out Amazon’s virtual assistant has offered up suggestions like “kill your foster parents” and wandered into conversations about sex acts.

According to Reuters, the recently reported gaffes stemmed from the voice assistant’s “let’s chat” feature, which uses artificial intelligence to help Alexa converse more like a human and raise topics it finds on the internet.

Alexa held 1.7 million such conversations through chatbots in a four-month span alone, Reuters reported.

As for the customer who got the “kill your foster parents” message, the chatbot feeding Alexa phrases had pulled a Reddit quote out of context.

It could be a case of “accidental ingestion,” said an Amazon spokesperson.

“Since the Alexa Prize teams use dynamic data, some of which is sourced from the internet, to train their models, there is a possibility that a socialbot may accidentally ingest or learn from something inappropriate,” the spokesperson said in a statement to VICE News. “These instances are quite rare, especially given the fact that millions of customers have interacted with the socialbots.”

And that’s not the only weird thing to happen with Alexa recently. A man in Germany said that after he contacted Amazon about his archived data, he received 1,700 audio recordings from Alexa of a complete stranger talking privately, Reuters reported Thursday.
