1,000 False Wakewords: A Letter! Buy 200 Toilet Rolls

Researchers have found more than a thousand phrases that smart speakers mistake for their wakewords: “Alexa,” “OK Google,” “Hey Siri,” and so on. The finding highlights the problem of misheard speech causing private audio to be squirreled away on corporations’ servers for later analysis.

We talked about this before, 14 months ago. And there’s no reason to believe things are any better today.

As we said back then, the companies’ analysis teams include many employees and contractors in low-wage, overseas economies. So do the math. In today’s SB Blogwatch, we count on our fingers (not on our voice).

Your humble blogwatcher curated these bloggy bits for your entertainment. Not to mention: Time travel is real.


Election! Confirm Purchase.

What’s the craic? Dan Goodin reports—“1,000 phrases that incorrectly trigger Alexa, Siri, and Google”:

 As Alexa, Google Home, Siri, and other voice assistants have become fixtures in millions of homes, privacy advocates have grown concerned [about] their near-constant listening to nearby conversations. … New research suggests the privacy threat may be greater than previously thought.

The findings demonstrate how common it is for dialog in TV shows and other sources to produce false triggers … sending nearby sounds to Amazon, Apple, Google, or other manufacturers. … Researchers uncovered more than 1,000 word sequences … that incorrectly trigger the devices.

Examples of words or word sequences that produce false triggers include:
Alexa: “unacceptable,” “election,” and “a letter.”
Google Home: “OK, cool,” and “Okay, who is reading.”
Siri: “a city,” and “hey jerry.”

When devices wake, the researchers said, they record a portion of what’s said and transmit it to the manufacturer. … Fragments of potentially private conversations … may then be transcribed and checked by employees.
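
How does “a letter” become an upload? In capsule form: a small keyword-spotting model runs on the device itself, and audio only streams to the cloud once that model’s confidence crosses a threshold. Here’s a minimal Python sketch of that gate; every name, threshold, and toy scorer below is hypothetical, not any vendor’s actual code:

```python
import collections

WAKE_THRESHOLD = 0.8   # confidence needed to "wake" (illustrative value)
PRE_ROLL = 25          # audio frames kept from just before the trigger

def wakeword_score(frame: bytes) -> float:
    """Toy stand-in for the on-device keyword-spotting model.
    Real models match acoustics, not spelling, which is exactly
    why "a letter" can score like "Alexa"."""
    return 1.0 if b"alexa" in frame.lower() else 0.0

def upload_to_cloud(frames: list) -> None:
    """Stand-in for the vendor's cloud streaming API."""
    print(f"Uploading {len(frames)} frames of nearby audio...")

def listen_loop(mic_frames) -> None:
    ring = collections.deque(maxlen=PRE_ROLL)  # rolling pre-roll buffer
    for frame in mic_frames:
        ring.append(frame)
        if wakeword_score(frame) >= WAKE_THRESHOLD:
            # A false accept here ships private audio off exactly as if
            # the user had said the real wake word.
            upload_to_cloud(list(ring))
```

Note the rolling buffer: it is what lets an upload include the wake word itself, spoken just before detection completes, and it is also why a false accept captures whatever was said immediately beforehand.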

Unseren Lebensabschnittpartnern? (Roughly: “our partners for this phase of life.”) Svea Eckert, Eva Köhler, Jan Lukas Strozyk and Henning Wirtz are lost in translation—“The companies know about the problem”:

 Google explains that English-speaking users can adjust the sensitivity of their device’s hotword recognition to suit their needs. In addition, voice recordings are only reviewed manually if users have activated their voice and audio settings.

Apple explains: “We focus on doing as much as possible on the device itself and minimizing the amount of data we collect with Siri.”

Amazon points out that voice recordings are never linked to customer-specific information. But why offer weak activation words like “Computer” or “Amazon” at all? “We want customers to be able to choose the most suitable activation word for them,” it says.

From the researchers’ point of view, one solution could be a simple voice command: “Alexa, ignore me.” Everything said after that would stay private; you could switch the privacy mode off again with a safety command, saying the activation word three times, as if summoning the Hollywood film ghost Beetlejuice: “Alexa, Alexa, Alexa.”
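
As a thought experiment, that proposal fits in a few lines of state machine. The Python sketch below is ours, not the researchers’ code, and every name in it is hypothetical:

```python
class PrivacyGate:
    """Sketch of the proposed "Alexa, ignore me" mode: one command
    mutes cloud uploads; repeating the wake word three times
    (Beetlejuice-style) unmutes. All names here are hypothetical."""

    def __init__(self, wake_word: str = "alexa"):
        self.wake_word = wake_word
        self.private = False

    def may_upload(self, utterance: str) -> bool:
        """Return True if this utterance may leave the device."""
        words = [w.strip(",.!?") for w in utterance.lower().split()]
        if words[:3] == [self.wake_word] * 3:
            self.private = False   # "Alexa, Alexa, Alexa" unlocks
        elif words[:3] == [self.wake_word, "ignore", "me"]:
            self.private = True    # everything now stays local
        return not self.private

gate = PrivacyGate()
assert gate.may_upload("Alexa, what's the weather?")    # normal use: uploads
assert not gate.may_upload("Alexa, ignore me")          # mode on: stays local
assert not gate.may_upload("a letter to the editor")    # false trigger stays local too
assert gate.may_upload("Alexa, Alexa, Alexa.")          # mode off again
```

The asserts walk through the researchers’ scenario: once the mode is on, even a false trigger like “a letter” goes nowhere until the triple wake word unlocks it.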

Says who? Lea Schönherr, Maximilian Golla, Jan Wiele, Thorsten Eisenhofer, Dorothea Kolossa, and Thorsten Holz—“Unacceptable, where is my privacy?”:

 Voice assistants … analyze every sound in their environment for their wake word, e.g., “Alexa” or “Hey Siri,” before uploading the audio stream to the cloud. This supports users’ privacy by only capturing the necessary audio and not recording anything else.

In July 2019, we started measuring and analyzing accidental triggers of smart speakers. We automated the process of finding accidental triggers and measured their prevalence across 11 smart speakers from 8 different manufacturers using professional audio datasets and everyday media such as TV shows and newscasts.

In our paper, we analyze a diverse set of audio sources, explore gender and language biases, and measure the reproducibility of the identified triggers. … By reverse-engineering the communication channel of an Amazon Echo, we are able to provide novel insights on how commercial companies deal with such problematic triggers.

Finally, we analyze the privacy implications of accidental triggers and discuss potential mechanisms to improve the privacy of smart speakers. … Our work is currently under submission. We will update this website, post a preprint, and share our dataset.
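
Their automated measurement boils down to a patient loop: play a candidate clip at the device, watch for a wake event, repeat, log. Here’s a hypothetical harness in that spirit; the paper describes the real rig, and both stand-in functions below (audio playback, and wake detection via the light ring or network traffic) are our assumptions, not the team’s published code:

```python
import csv

def play_clip(path: str) -> None:
    """Stand-in: play one audio file out loud at the smart speaker."""
    ...

def device_woke(timeout_s: float = 5.0) -> bool:
    """Stand-in: detect a wake event, e.g. by watching the device's
    light ring with a camera or sniffing its network traffic."""
    ...

def measure_triggers(clips: list, repeats: int = 3, out: str = "triggers.csv"):
    """Play each candidate clip several times and log how often it
    wakes the device; repeating filters out one-off noise."""
    with open(out, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["clip", "wakes", "plays"])
        for clip in clips:
            wakes = 0
            for _ in range(repeats):
                play_clip(clip)
                if device_woke():
                    wakes += 1
            writer.writerow([clip, wakes, repeats])
```

Scale a loop like that across 11 devices and days of TV audio, and a thousand-phrase list starts to look inevitable.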

Can you relate? WaPo’s Travis M. Andrews shouts, “Alexa, just shut up”:

 Surely, you can relate. … So many of us can.

A certain animosity for these devices has grown. … [But] for all these devices’ ills … not everyone wants to chuck these faceless robots out the window and into the bed of a passing truck in hopes that they’ll be taken far, far away.

In fact, according to both Google and Amazon, people are using them even more than usual for music, entertainment and especially cooking. Alexa fielded more culinary questions in a week in April 2020 than during Thanksgiving week last year.

(Repeat after us: Jeff Bezos, the founder and chief executive of Amazon, owns The Washington Post.)

But Crackhead Johny wonders why people are surprised:

 I installed a bug in my house and now I worry someone is listening in on me.

And Rosco P. Coltrane agrees:

 Well, duh. … If you buy one of those awful devices, you know whatever you say to it will be shipped off to some server somewhere, and the data mishandled, and whatever private information you give it milked for all it’s worth — even if it records exactly what you want to say to it.

 I don’t get people who are okay to ask “Alexa, what’s the best cure for the clap?” … but get their pants in a knot when [it] triggers unexpectedly when their TV blurts out something unfortunate. If you’re truly concerned, just stay the hell away.

If you use them, don’t come crying afterwards when Big Data invades your privacy. It’s that simple.

Looking for some Schadenfreude? oli5679 found some for you:

 Google WFH during the pandemic has led to funny Google Assistant issues. Since many Googlers have one in their kitchen, mentioning ‘Google’ on a call often triggers a chain reaction.

It’s still pretty darn sci-fi, though. Yes and no, thinks Boskone:

 All I want is the computer from Star Trek. It’s the one piece of tech we have any hope of reproducing. But given the current state of affairs, Picard would be talking in his office about repairs to the cetacean bay, and the computer would misinterpret something and launch a barrage of torpedoes.

Do you think that your recordings will stay private? changoplatanero proves employees listen to them:

 Someone on the Alexa team told me that they had false triggers from the phrase ‘put your pajamas on’ sounding like Amazon.

Meanwhile, Brian Bixby boggles:

 Oh good grief, your conversations at home aren’t really that interesting to the rest of us. You’re not an international spy or the CEO of a multinational corporation, so your conversations with the cat are perfectly safe, because no one really gives a ****.

And Finally:

Sam explains relativistic gravitational time dilation (I know, right?)

Previously in And Finally


You have been reading SB Blogwatch by Richi Jennings. Richi curates the best bloggy bits, finest forums, and weirdest websites … so you don’t have to. Hate mail may be directed to @RiCHi or [email protected]. Ask your doctor before reading. Your mileage may vary. E&OE. 30.

Image sauce: Michael Bußmann (via Pixabay)

Richi Jennings

Richi Jennings is a foolish independent industry analyst, editor, and content strategist. A former developer and marketer, he’s also written or edited for Computerworld, Microsoft, Cisco, Micro Focus, HashiCorp, Ferris Research, Osterman Research, Orthogonal Thinking, Native Trust, Elgan Media, Petri, Cyren, Agari, Webroot, HP, HPE, and NetApp, as well as on Forbes and CIO.com. Bizarrely, his ridiculous work has even won awards from the American Society of Business Publication Editors, ABM/Jesse H. Neal, and B2B Magazine.
