How secure are your "smart home" speakers?

Apple unveils Siri-controlled speaker at annual conference

Smart speakers are pretty good listeners, too. Which makes you wonder if your conversations are staying at home.

Apple raised the question when it talked up the security and privacy aspects of its newly unveiled HomePod, a Siri-powered speaker that takes aim at the Amazon Echo and Google Home.

It's an intriguing point to consider at a time when millions of consumers have purchased a smart speaker for their home. While Google and Amazon spend a lot of time talking about the intelligence of their respective assistants and the convenience they offer, there's little mention of security or privacy.

These voice assistants for the home all generally work the same way: They start recording only after you activate them with a wake word, and the audio is then sent to the company's servers, which process it and return a response.
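In rough outline, the flow looks something like the sketch below. The function names and URL are hypothetical placeholders, not any vendor's actual code or API; the point is simply that nothing is supposed to leave the speaker until the wake word is heard.

```python
# Illustrative sketch only, not any vendor's real code. The wake-word check,
# audio handling and URL are hypothetical stand-ins for on-device components.
import requests

ASSISTANT_API = "https://assistant.example.com/v1/query"  # placeholder endpoint
WAKE_WORD = b"hey speaker"                                 # placeholder trigger

def detect_wake_word(audio_chunk: bytes) -> bool:
    # Real devices run a small on-device model; this stand-in just looks
    # for the trigger phrase in the raw bytes.
    return WAKE_WORD in audio_chunk

def handle_utterance(audio_chunk: bytes):
    if not detect_wake_word(audio_chunk):   # nothing leaves the speaker before this
        return None
    # Only after activation is the clip sent, over HTTPS, to the cloud service.
    resp = requests.post(ASSISTANT_API, data=audio_chunk,
                         headers={"Content-Type": "audio/wav"})
    return resp.json().get("spoken_response")  # the server's answer
```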

Recorded audio is already a topic of debate. In Amazon's battle over Echo data in a murder case, the company argued that the First Amendment protected voice commands but eventually handed over the recordings.

Here's where the three major smart speakers stand on protecting a person's privacy, both from the government and from hackers.

Encryption

Apple's HomePod, Google Home and Amazon Echo all encrypt the voice recordings sent to their respective servers. But how thoroughly they keep that data secret varies.

Smart home speakers, from left: Apple HomePod, Amazon Dot and Echo, and Google Home. Reuters / CNET / Google

At Apple's Worldwide Developers Conference earlier this week, Phil Schiller, the head of Apple's marketing, said the HomePod's data would be encrypted, but he did not go into detail. A person familiar with HomePod's development said it would have the same level of encryption as Siri and HomeKit.

In its iOS security guide from March, Apple noted that Siri communications with its servers occur over HTTPS, which encrypts data as it travels between an iPhone and Apple's servers.

Data for the Google Home is encrypted in transit and at rest, meaning it's protected as it heads to Google's servers and encrypted again where it's stored.

On the Amazon Echo, conversations with Alexa are likewise encrypted in transit from your device to Amazon's cloud servers and at rest, where they are "securely stored," a spokeswoman said in an email.

This mostly means that your data is unlikely to be stolen or spied on as it's being sent to Apple's, Google's or Amazon's servers. But when it comes to protecting people from government requests, that's a different story.
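As a concrete illustration of the difference between those two protections, here is a minimal sketch, assuming Python's third-party requests and cryptography packages and a placeholder upload URL. It is not how any of these companies actually handle recordings; it only shows what "in transit" and "at rest" mean in practice.

```python
# Minimal sketch of "in transit" vs. "at rest" encryption; the URL is a
# placeholder and the key handling is greatly simplified.
import requests
from cryptography.fernet import Fernet

recording = b"...audio bytes..."   # stand-in for a captured voice clip

# In transit: HTTPS (TLS) protects the clip on its way to the server.
requests.post("https://voice.example.com/upload", data=recording)

# At rest: the server encrypts the stored copy, so a stolen disk or
# leaked database dump is unreadable without the key.
key = Fernet.generate_key()                   # held in a key-management system
stored_blob = Fernet(key).encrypt(recording)
original = Fernet(key).decrypt(stored_blob)   # recoverable only with the key
```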

ID, please?

Amazon was able to provide data for a murder trial because all the recordings, even though they're encrypted, are linked to individuals.

"The recordings are securely stored in the [Amazon Web Services] cloud and tied to your account to allow the service to be personalized for each user," an Amazon spokeswoman said in an email.


Similarly, Google Home collects data from your apps, your search and location history, and your voice commands, which are all tied to your Google account.

Each Google Home requires an account tied to it, but it's possible to create a dummy account that wouldn't have all your personal information. That's different from the Echo, which requires an Amazon account that holds your credit card information and shipping address.

If a government agency requests voice assistant data from Google or Amazon, the companies can point to the accounts tied to a specific user.

It's a different situation with the HomePod. The data sent from Apple's speaker is anonymized, meaning there's no name or Apple ID attached to your commands. It works just like Siri, with random identifiers used only within the device.

So if the government requests Siri data on a specific user, Apple would not be able to pick that info out of millions of random numbers. That's useful, considering Apple gets slammed with thousands of national security requests every year.
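The idea behind those random identifiers can be sketched in a few lines. This is not Apple's implementation, just an illustration of why account-free IDs are hard to subpoena: recordings keyed to a random value can't be looked up by name or account.

```python
# Sketch of anonymized identifiers, not Apple's actual implementation.
import uuid

device_id = uuid.uuid4()   # random ID generated on the device

request_payload = {
    "id": str(device_id),                 # no name, email or Apple ID attached
    "audio": "placeholder audio clip",    # stand-in for the recorded command
}
# On the server, recordings are keyed only by IDs like this one, so there is
# no index that maps a specific person's account to their stored commands.
```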

Amazon and Google both have policies for dealing with demands for data on the Echo and the Home. Amazon won't release data unless there's a "valid and binding legal demand," while Google fights to narrow down more than 45,550 requests a year. And for both companies, the recordings are saved until you decide to delete them manually.

For Siri, voice recordings are saved for six months on Apple's voice recognition servers so the system can better understand a user. After that, they're deleted automatically, and a separate copy without any identifiers is kept to help improve Siri for up to two years.

With anonymized IDs, Apple has a much more compelling argument for not handing over data from its speakers: It can't find it.

In the game of hide and seek with your voice data, the advantage -- for now -- goes to Apple.

This article originally appeared on CNET.
