Amazon's Alexa sent 1,700 recordings to the wrong person

Amazon sent recordings made by its Alexa virtual assistant to the wrong user—a man who reportedly had never used the product.

German technology magazine c't reported that a user had asked Amazon to send him all of the data the company had stored on him. The man, who wasn't identified and who had never used one of Amazon's voice-activated assistants, received recordings made inside a stranger's home.

According to Reuters, the customer told Amazon about his access to the other person's recordings, but received no reply. The files were then deleted from the link Amazon had provided, but he had already downloaded them onto his computer, the wire service reports.

Amazon on Thursday confirmed what it called "an unfortunate case that resulted from a human error," saying it was an "isolated incident." In a statement, the company said it "resolved the issue with the two customers involved and have taken steps to further improve our processes. We were also in touch on a precautionary basis with the relevant regulatory authorities."

The man was able to request his data under Europe's data protection law, the General Data Protection Regulation, or GDPR. It's not clear whether the data mix-up is considered a breach under that law.

Not the first flub

It's at least the second time that an Echo device, Amazon's Alexa-powered assistant, has sent customers' private recordings outside their home. In May, a Portland, Oregon, family found out that their Echo had mistakenly recorded some of their conversations and sent the recordings to an acquaintance.

In that case, Amazon said the assistant was responding to background conversation that the device interpreted as voice commands.

"Echo woke up due to a word in background conversation sounding like 'Alexa,'" Amazon said in a statement at the time. "Then, the subsequent conversation was heard as a 'send message' request. At which point, Alexa said out loud 'To whom?' At which point, the background conversation was interpreted as a name in the customer's contact list. Alexa then asked out loud, '[contact name], right?' Alexa then interpreted background conversation as 'right'. As unlikely as this string of events is, we are evaluating options to make this case even less likely."

-- The Associated Press contributed to this report.
