Whistleblower Says Apple Built Secret Dossier on You, via Siri

An ex-Apple contractor has doubled down on his warning that Apple is misusing audio recordings from the Siri voice assistant. He’s goading European regulators into taking action.

“What happens on your iPhone, stays on your iPhone.” Even if these allegations are only partly true (and Apple has already admitted some of them), Tim Cook’s slogan is misleading, at best.

The whistleblower claims recordings were linked to other PII, including behavioral information, “in huge data sets, ready to be exploited by Apple.” In today’s SB Blogwatch, we speak truth—or otherwise—to power.

Your humble blogwatcher curated these bloggy bits for your entertainment. Not to mention: 21×TDc.


GDPR Toothless?

What’s the craic? Laura Kayali reports—“Apple whistleblower calls for European privacy probe”:

 A former Apple contractor called on EU privacy watchdogs … to investigate the firm’s “past and present” use of Siri recordings nine months after reports emerged of the firm listening in without users’ knowledge. … Despite [Apple] publicly acknowledging that users might not have been fully aware … at the time, no investigation was opened by the privacy regulators in charge.

Thomas Le Bonniec, who from May to July 2019 worked on a Siri transcription project in Cork, Ireland … chose May 20 to send his letter because the date is days ahead of the European flagship privacy reform’s second anniversary, he told [me. He] hopes his public letter will pressure regulators to act and says he’s ready to testify in future probes.

“We have legislation at EU level that is supposed to be good for something. Up until now, we’re under the impression that there was no repercussion,” … he said, adding that he was breaching his nondisclosure agreement to help regulators investigate.

Apple resumed its own Siri program in October with changes: Recordings are no longer stored by default, unless users agree to do it, and only Apple employees, instead of contractors, have access to the samples. The company declined to disclose how many users had opted in. … A spokesperson for the Irish privacy regulator said, “We’re still engaged with Apple on a number of fronts, we’re still getting answers to questions.”

So far, so mealy-mouthed. Kieren McCarthy cuts to the chase—“Hey Siri, are you still recording people’s conversations?”:

 Apple may still be recording and transcribing conversations captured by Siri on its phones, despite promising to put an end to the practice … claims a former Apple contractor. … “Nothing has been done to verify if Apple actually stopped. … Sources already confirmed to me that Apple has not,” he said.

How did Apple justify what would appear to be a transparently illegal act carried out daily on millions of people? It didn’t.

What actually happened nine months ago? Alex Hern reminds us—“Apple whistleblower goes public over ‘lack of action’”:

 Le Bonniec, 25 … quit in the summer of 2019 due to ethical concerns with the work. “They do operate on a moral and legal grey area … and they have been doing this for years on a massive scale. They should be called out in every possible way.”

Following the revelations … Apple promised sweeping changes to its “grading” program, which involved thousands of contractors listening to recordings made, both accidentally and deliberately, using Siri. The company apologised, brought the work in-house, and promised that it would only grade recordings from users who had explicitly opted in. … But, Le Bonniec argues, the company never really faced the consequences for its years-long programme.

“I listened to hundreds of recordings every day, from various Apple devices. … These recordings were often taken outside of any … intention from the user to activate [Siri]. These processings were made without users being aware of it, and were gathered into datasets. … The recordings were not limited to the users of Apple devices, but also involved relatives, children, friends, colleagues, and … recorded everything: names, addresses, messages, searches, arguments, background noises, films, and conversations. I heard people talking about their cancer, referring to dead relatives, religion, sexuality, pornography, politics, school, relationships, or drugs. … Apple has not been subject to any kind of investigation to the best of my knowledge.”

Yikes. WiscoNative summarizes:

 People want Apple punished because they told users that Siri protected their privacy, but then had third-party contractors listening to Siri recordings.

Apple … shouldn’t be able to just say “Oops, you caught us. We’ll change that now,” and be done with it. When there is no punishment, companies see that there’s no real downside to misleading users, and it will happen more.

Apple … should suffer consequences for misleading users.

But “it’s actually not even that bad,” says TomWinTejas:

 One of the reasons Siri sucks compared to Google Assistant or Alexa is that Siri actually has no knowledge of who you are. Your Siri experience is tied to a unique identifier, but one that is not tied to your personal information.

If you have location services enabled your location is shared along with the unique identifier … but it’s optional and the intent is to improve experience by understanding requests in relation to your location. … The way it was initially reported made it seem as if the contractors knew that Sally Jones at 456 Main St in Springfield had told her husband to stop leaving the toilet seat up because they were listening at all times and had so much personal information attached with each recording.

Wait. Pause. It gets worse: Most commentators seem to have missed the really damning part of Le Bonniec’s open letter:

 Furthermore, other workers were employed on another project (called “Development data”). In the context of this project, words were tagged in the recordings to be linked to users’ data, such as their phone contacts, locations, or music.

In other words, staff assigned to the project had access to personal user information, and used it to be able to link it to Siri commands. This means that users’ playlists, contact details, notes, calendars, photos, maps, etc. were gathered in huge data sets, ready to be exploited by Apple for other projects.

Apple issued a statement on 28th August 2019 … tacitly acknowledging that they were using illegal recordings.

Double-yikes. doublelayer isn’t half-surprised: [You’re fired—Ed.]

 Unfortunately, it’s not just Apple doing this. Amazon and Google were both caught keeping databases of this stuff and they’re almost certainly still doing it. Microsoft probably doesn’t have a database because who uses Cortana?

People are going to have to learn that data is stored and analyzed and monetized and published and leaked and they should probably care. So far, they don’t seem to have figured that out.

Yeah, but … oh never mind. Instead, Anonymous Brave Guy drips with irony:

 On the bright side, now that our elected representatives and lawyers and doctors and accountants and senior managers and leading researchers are doing lots of work from home and communicating remotely, what could possibly go wrong with embedding listening devices with network access and Internet connections in those homes?

It’s not as if these people deal with any sensitive or confidential information, and we can surely count on the moral integrity of the companies that were all doing this stuff until they were caught and publicly shamed for it to protect us against abuse.

Meanwhile, Pat Riot doesn’t believe the whistleblower:

 Siri still sucks so much at recognition that I just can’t believe they have anyone actually working to improve it.

And Finally:

21 covers of Teenage Dirtbag (plus two versions by the original singer/songwriter, Brendan B. Brown)

Hat tip: Thom Dunn



You have been reading SB Blogwatch by Richi Jennings. Richi curates the best bloggy bits, finest forums, and weirdest websites … so you don’t have to. Hate mail may be directed to @RiCHi or sbbw@richi.uk. Ask your doctor before reading. Your mileage may vary. E&OE.

Image sauce: ValueWalk (cc:by-sa)


Richi Jennings

Richi is a foolish independent industry analyst, editor, writer, and fan of the Oxford comma. He’s previously written or edited for Computerworld, Petri, Microsoft, HP, Cyren, Webroot, Micro Focus, Osterman Research, Ferris Research, NetApp on Forbes and CIO.com. His work has won awards from the American Society of Business Publication Editors, ABM/Jesse H. Neal, and B2B Magazine.
