Alexa, no! Amazon explains how an Echo shared a couple's private conversation

They’re always listening. They’re on the internet. But what happens when digital assistants like Alexa go rogue? Could they share our private conversations without our consent?

Skeptics were quick to say ‘we told you so’ as news rocketed through the connected world that an Amazon Echo had recorded a couple’s private conversation and sent it to an acquaintance without their knowledge.

Now, Amazon says it knows what happened: As the woman, identified only as Danielle, chatted away with her husband, the device’s virtual assistant, Alexa, mistakenly heard a series of requests and commands to send the recording as a voice message to one of the husband’s employees.

“Echo woke up due to a word in background conversation sounding like ‘Alexa,’” Amazon said in a statement. “Then, the subsequent conversation was heard as a ‘send message’ request. At which point, Alexa said out loud ‘To whom?’ At which point, the background conversation was interpreted as a name in the customer’s contact list. Alexa then asked out loud, ‘[contact name], right?’ Alexa then interpreted background conversation as ‘right.’ As unlikely as this string of events is, we are evaluating options to make this case even less likely.”
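Amazon has not published the logic behind that flow, but the sequence it describes behaves like a simple confirmation pipeline in which each step is gated by a fallible speech-recognition match. Below is a minimal, purely illustrative sketch of such a pipeline in Python; every name, threshold and matching rule is an assumption made for this example and has no connection to Amazon’s actual software.

```python
# Hypothetical sketch of the wake -> intent -> contact -> confirm flow
# Amazon describes. All names, thresholds and matching rules here are
# invented for illustration; none of this reflects Amazon's code.

from difflib import SequenceMatcher

WAKE_WORD = "alexa"
MATCH_THRESHOLD = 0.6  # assumed: how loosely speech is matched


def sounds_like(heard: str, target: str) -> bool:
    """Crude fuzzy match standing in for real speech recognition."""
    ratio = SequenceMatcher(None, heard.lower(), target.lower()).ratio()
    return ratio >= MATCH_THRESHOLD


def handle_audio(snippets, contacts):
    """Walk the confirmation pipeline over successive audio snippets.

    Returns the contact a message would be sent to, or None. Ordinary
    background chatter can clear every gate if it happens to resemble
    the expected phrase at each step.
    """
    audio = iter(snippets)

    # Step 1: a similar-sounding word misfires the wake word.
    if not sounds_like(next(audio, ""), WAKE_WORD):
        return None

    # Step 2: subsequent speech is heard as a "send message" request.
    if not sounds_like(next(audio, ""), "send message"):
        return None

    # Step 3: "To whom?" -- the next snippet is matched against contacts.
    heard_name = next(audio, "")
    matches = [c for c in contacts if sounds_like(heard_name, c)]
    if not matches:
        return None
    contact = matches[0]

    # Step 4: "[contact name], right?" -- a "right"-like sound confirms.
    return contact if sounds_like(next(audio, ""), "right") else None


# Four snippets of ordinary conversation, each coincidentally clearing a gate.
overheard = ["alexis", "send a message", "john", "right"]
print(handle_audio(overheard, contacts=["John", "Maria"]))  # -> John
```

The point of the sketch is structural: each confirmation narrows the odds, but none of the gates is impossible for background speech to clear.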

In a follow-up interview, though, Danielle told KIRO7 that the Echo that shared her conversation was right next to her at the time with the volume set to seven out of 10. It never requested her permission to send the audio, she said.

The family had several Echoes in their home, using them to control the heat, lights and security system. But two weeks ago, Danielle’s husband received a call from the employee in Seattle, who reported receiving audio of their conversation.

“At first, my husband was like, ‘No, you didn’t,’” Danielle told KIRO7. “And he’s like, ‘You sat there talking about hardwood floors.’ And we said, ‘Oh gosh, you really did!’”

The family disconnected the devices and contacted Amazon, prompting an investigation. Now, Danielle is asking for a refund.

“I’m never plugging that device in again,” she told KIRO7. “I can’t trust it.”

An Amazon help page explains that Echo owners concerned about what the device might be recording can review, listen to, and delete stored audio and other interactions from the settings menu.

News of the error was met with a mix of alarm and humor on social media.

Amazon’s main home assistant devices — the Echo, Echo Plus and Echo Dot — are each equipped with seven microphones and noise-canceling technology. Amazon and Google are the leading sellers of such devices.

This is not the first report of an Echo mishearing commands, with unusual results. Amazon offered a similar explanation in March after several users reported hearing Alexa laugh at random times.

The assistant, the company said, had “in rare circumstances” mistakenly heard “Alexa, laugh.” As a result, Amazon changed the phrase for that command to “Alexa, can you laugh?” and had the device verbally acknowledge such requests.

This month, researchers at the University of California, Berkeley, said in a published paper that they had demonstrated that the technology could be exploited, too.

The researchers said that they were able to hide commands in recordings of music or spoken text that went unnoticed by humans but were understood by personal assistants such as Apple’s Siri, Google’s Assistant and Amazon’s Alexa.
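The paper’s actual attacks work by optimizing perturbations against real speech-to-text models, which is well beyond a short example. As a loose, self-contained illustration of the underlying idea — a change to a waveform small enough to go unnoticed that still steers a model’s output — here is a toy Python sketch against a stand-in linear scorer. The model, the loudness budget and every parameter are invented for this example and bear no relation to the systems attacked in the paper.

```python
# Toy illustration of the adversarial-audio idea: find a tiny
# perturbation delta, bounded so it stays quiet, that pushes a model's
# score toward the attacker's target when added to a waveform. The
# "model" here is a made-up linear scorer, NOT a real speech
# recognizer; the paper attacked actual speech-to-text systems with
# far more sophisticated methods.

import numpy as np

rng = np.random.default_rng(0)

N = 16_000                        # one second of 16 kHz "audio"
waveform = 0.1 * rng.standard_normal(N)

# Stand-in model: higher score means "command heard". A real attack
# would backpropagate through a neural speech-to-text model instead.
w = rng.standard_normal(N)


def score(x: np.ndarray) -> float:
    return float(w @ x)


EPSILON = 0.002   # loudness budget: tiny relative to the signal
STEP = 1e-4       # gradient-ascent step size

delta = np.zeros(N)
for _ in range(100):
    # The gradient of the linear score w.r.t. the input is just w;
    # ascend it, then project back into the L-infinity ball so the
    # perturbation never exceeds the loudness budget.
    delta = np.clip(delta + STEP * w, -EPSILON, EPSILON)

print(f"clean score:     {score(waveform):+.2f}")
print(f"perturbed score: {score(waveform + delta):+.2f}")
print(f"max |delta|:     {np.abs(delta).max():.4f}  (budget {EPSILON})")
```

A real attack replaces the stand-in scorer with a speech recognizer and the hand-written gradient with backpropagation, but the constraint is the same: the perturbation must stay under a budget that keeps it imperceptible to human listeners.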

This article originally appeared in The New York Times.

NIRAJ CHOKSHI © 2018 The New York Times
