
Who’s Listening? When Voice Assistants Invade Your Privacy

A couple stands near voice assistants that may have been accidentally triggered.

Just this January, Apple featured a billboard at the Consumer Electronics Show in Las Vegas. “What happens on your iPhone stays on your iPhone,” the billboard stated. The recent news that Apple had contractors listening to Siri conversations for quality control, and that those contractors sometimes heard deeply personal conversations, undercuts that claim. Unsettling as this news is, it raises something important that we all need to know and talk about: when we use voice assistants like Siri, whatever we say no longer belongs to us. Our words travel off the device, into the cloud and outside of our control.

Here’s a rundown of what happened with Apple: while much of Siri’s processing occurs on device (a good explanation of how it works here), a small percentage of Siri requests—Apple states less than 0.2%—was being sent to Apple contractors for “grading”: checking how accurately the assistant recognized people’s voices, whether it could help with the query, and whether it responded appropriately. A whistleblower working for Apple raised concerns about the number of times the assistant was triggered accidentally (something as simple as saying “Syria” can trigger Siri—example here), which resulted in contractors hearing extremely personal conversations, including business deals and exchanges between doctors and patients. While the data contractors handled was supposedly anonymized, the whistleblower noted that the speaker would be easy to identify because of the nature of these personal conversations.

Apple’s response to the situation has been to apologize, introduce a default opt-out for Siri recordings, and guarantee that no contractors will have access to any (opted-in) recordings. But Apple isn’t an isolated case. Whenever we use any kind of voice-to-text, we lose control of our data. Case in point: Amazon and Google also employ staff to listen to recordings from their digital voice assistants.

It’s easy to assume that talking to a voice assistant is an innocuous act. After all, when we have a conversation, whether in person or over the phone, we don’t expect our voices to be recorded and sent anywhere for analysis. But talking to a voice assistant, or even just talking while one is nearby, isn’t the same as a regular conversation. Our voices become data, and that data gets sent off the device and beyond our control.

Choosing which tools you use to communicate needs to be a conscious decision. The safe default assumption is that a tool is collecting data on you for the company’s benefit. When you want to communicate, especially about sensitive topics like healthcare or business deals, the right tool really matters. At Vaporstream, we understand this and provide you with a tool that protects your conversations at all times. Learn what makes us secure here.

Contributor: Avi Elkoni


*** This is a Security Bloggers Network syndicated blog from Vaporstream authored by Avi Elkoni. Read the original post at: https://www.vaporstream.com/blog/voice-assistants-and-privacy/
