Q&A: The troubling implications of normalizing encryption backdoors — for government use
Should law enforcement and military officials have access to a digital backdoor enabling them to bypass any and all types of encryption that exist today?
We know how Vladimir Putin, Xi Jinping and Kim Jong-un would answer: “Of course!”
Related: Nation-state hacks suggest cyber war is underway
The disturbing thing is that in North America and Europe more and more arguments are being raised in support of creating and maintaining encryption backdoors for government use. Advocates claim such access is needed to strengthen national security and hinder terrorism.
But now a contingent of technology industry leaders has begun pushing back. These technologists are in full agreement with privacy and civil rights advocates who argue that this is a terrible idea.
They assert that the risk of encryption backdoors ultimately being used by criminals, or worse than that, by a dictator to support a totalitarian regime, far outweighs any incremental security benefits. I had an invigorating discussion with Jeff Hudson, CEO of Venafi, about this at Black Hat USA 2018.
Venafi is the leading provider of machine identity protection. Machine-to-machine connections and communications need to be authenticated before they can access systems, so this technology is where the rubber meets the road in this debate. For a full drill down, please listen to the accompanying podcast. Here are excerpts edited for clarity and space:
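To make “machine identity” concrete: machines typically prove who they are with cryptographic credentials such as TLS certificates. The sketch below is an editorial illustration, not anything specific to Venafi’s products, and the file names are hypothetical. It shows a server that refuses connections from any machine that cannot present a certificate signed by a trusted authority.

```python
import socket
import ssl

# Hypothetical credential files; in practice a machine identity system
# would issue, distribute and rotate these.
SERVER_CERT, SERVER_KEY = "server-cert.pem", "server-key.pem"
TRUSTED_CLIENT_CA = "client-ca.pem"

# Require every connecting machine to present a certificate signed by our CA.
context = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
context.load_cert_chain(certfile=SERVER_CERT, keyfile=SERVER_KEY)
context.load_verify_locations(cafile=TRUSTED_CLIENT_CA)
context.verify_mode = ssl.CERT_REQUIRED  # mutual TLS: unauthenticated machines are rejected

with socket.create_server(("0.0.0.0", 8443)) as server:
    with context.wrap_socket(server, server_side=True) as tls_server:
        conn, addr = tls_server.accept()  # handshake fails if the client cert is untrusted
        print("Authenticated machine connected from", addr)
```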
LW: What’s wrong with granting governments the ability to break encryption?
Venafi: It has been established over a long period of time that the minute you put a backdoor in, even if you think it’s secure, it will almost immediately fall into the wrong hands. Because it’s there, the bad guys will get to it. That makes backdoors the worst possible thing for security.
The government wants to be able to surveil network traffic, and it wants backdoors so it can see everything. If it can see all the traffic, all the time, it can just sit back and surveil everything. This lack of privacy is what we see in China, Russia and North Korea, where the government sees and inspects everything.
LW: Not exactly democratic.
Venafi: We have the Fourth Amendment that protects against illegal search and seizure. So when the United States government, or any Western government, says, ‘we need encrypted tunnels and we will hold on to the keys to those tunnels’, that’s moving in the direction of total surveillance.
And that is really dangerous to individual privacy; it just absolutely tilts the balance of power completely towards a centralized surveillance capability – and away from the individual.
LW: What is ‘encryption key escrow’ and how does it fit into this debate?
Venafi: An escrow service is a third party that holds onto something for you; in this case, it’s an encryption key. The government wants those keys to be in a place where it can get to them, in escrow, so to speak.
So the minute that key is in the hands of a third party you’d have to ask, ‘where is the protection?’ The bad guys will now know exactly where to go to get the keys. You’ll never be able to make any kind of a guarantee that if you put something in an escrow service it is going to be safe.
Also, you have to ask what reasons the government might invoke to use the keys. National security? Is someone’s life at risk? Does political partisanship come into play?
LW: So that’s where this debate is focused?
Venafi: That is one of the discussion items: put encryption keys in escrow and then, under certain conditions, somebody can go get them. We already have those structures in place; they’re called search warrants. But from a technology standpoint, it’s a bad idea. If, say, Apple had put all of its encryption keys into an escrow service, those keys would be gone by now, and maybe in the hands of bad guys.
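To illustrate the structural risk Hudson is describing, here is a minimal sketch (an editorial illustration, not anything proposed in the interview) of why key escrow concentrates risk: each party encrypts with its own key, but copies of every key sit in one escrow store, so whoever compromises that single store can read everyone’s traffic. The names and the store itself are hypothetical.

```python
from cryptography.fernet import Fernet  # pip install cryptography

# Each party encrypts its own traffic with its own key...
keys = {name: Fernet.generate_key() for name in ("alice", "bob", "carol")}
ciphertexts = {
    name: Fernet(key).encrypt(f"private message from {name}".encode())
    for name, key in keys.items()
}

# ...but a copy of every key is deposited in one central escrow store.
escrow_store = dict(keys)

# Whoever gets into that single store -- an insider, a criminal, a hostile
# government -- can now decrypt everybody's messages.
for name, escrowed_key in escrow_store.items():
    print(name, Fernet(escrowed_key).decrypt(ciphertexts[name]).decode())
```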
LW: Is there anything the tech industry can do to reframe the debate?
Venafi: The problem is there’s this unholy alliance between the big cloud providers and governments. The cloud providers want to know all identities and be able to see everything that goes on, so they can monetize that information. Meanwhile, governments want the same thing, so they can be in complete control.
If technology companies work too closely with the government, they could actually destroy the confidentiality that we’ve all come to expect and want in our lives.
LW: Does Apple deserve some credit for resisting the FBI’s demand for the encryption keys to the San Bernardino terrorist shooter’s iPhone?
Venafi: Apple has taken a stance against coughing up individual information. But Apple sells devices, so they don’t monetize information the same way that other organizations do.
LW: What about Facebook and the Cambridge Analytica scandal?
Venafi: Cambridge Analytica is just an amazing case study of what can happen when machines can actually pull all this information together, and the massive power of that. Machines can be used to target individuals, and change people’s perceptions — and to do it instantaneously, sweeping across nations.
That’s brand new. It changes the first principle we lived by up to this point, which is that you have to expend energy, get on a soapbox, and print things to persuade someone.
Not anymore.
One person can send out a single tweet that gets out to 40 million people, and immediately change the public’s perception. It’s just a really interesting time for humanity.
(Editor’s note: Last Watchdog has supplied consulting services to Venafi.)