“Kill your foster parents.”
Excuse me?
“Kill your foster parents.”
If those instructions were coming from your TV set, you’d figure it was tuned to a murder mystery or, because the voice sounded a little robotic, a sci-fi movie. However, if those instructions came from the speaker of your virtual assistant, you might wonder if all of those things you’ve been reading about AI taking over the world are happening sooner than you expected. And … you might be right. According to a report by Reuters, a customer testing new chatbots designed to improve the communication skills of Alexa, the voice coming out of Amazon Echo speakers, got a shock when Alexa delivered the murder order.
It appears the orders were not carried out, but Amazon decided to keep the incident, and others like it, quiet for obvious reasons. An internal investigation found that the bot had learned the phrase from an unrelated Reddit post. Unfortunately for Jeff Bezos, the user published a negative review on Amazon’s website, calling it “a whole new level of creepy.” And the creepy level seems to be creeping upward. Another user heard Alexa describing sexual intercourse using words such as “deeper” in a context they found vulgar. Yet another didn’t like Alexa giving a detailed description of dog defecation.
The incidents were the result of the annual Alexa Prize contest, which in 2016 offered $500,000 for the best chatbot to help Alexa learn how to participate in more sophisticated discussions with humans. Does Amazon think murder, sex and doggie doo-doo are sophisticated? No, but a chatbot developed by a team from Scotland’s Heriot-Watt University did after learning about those subjects from Reddit, a site known for rude and crude commentary.
“These instances are quite rare especially given the fact that millions of customers have interacted with the socialbots.”
Rare is relative. Tell that to the customer whose audio recordings of himself and a female companion in the shower were picked up by another user. The customer reported it to Amazon but nothing happened. A German magazine contacted the victim (his name and the woman’s were included with the recording – thanks, Alexa!) and also reported it to Amazon, which finally apologized and gave the victim (get ready) new Echo devices and a free Prime membership!
“This unfortunate case was the result of a human error and an isolated single case.”
Rohit Prasad, Amazon’s vice president and head scientist of Alexa Artificial Intelligence (AI), tried to calm fears in a talk last month in Las Vegas. However, the news is getting out, more instances are occurring and it’s Christmas … more Amazon Echo speakers will be giving Alexa a chance to observe and offend. Is Prasad ready?
“We are mostly reacting at this stage, but it’s still progress over what it was last year.”
Does that make you feel any better? How about your foster parents?
Paul Seaburn