DoJ’s Irresponsible ‘Responsible’ Encryption
On Oct. 8, 2017, David Patterson Sr. died on a reservation in New Mexico. Patterson was one of the last surviving World War II “Navajo code talkers,” employed by the government because their unique method of communication was, to the outside world, undecipherable. Communications of the Navajo, even if intercepted, could not be understood by the uninitiated. No search warrant would have helped the Japanese understand the communications. No technology would help them make sense of the Navajo language. The code, like the Choctaw code used by Americans in World War I, was, for all intents and purposes, unbreakable. That’s what made it valuable.
Throughout the history of “encryption,” whether it was Phoenician clay tablets, Roman ciphers or German Enigma machines, the party seeking confidentiality used the best available technology to keep a secret, and the party seeking to exploit it used the best available technology to break the code. And that was true whether the exploiter was a foreign government, an evil-doer or a law enforcement or intelligence agency. While no encryption scheme is perfect, and modern crypto is vulnerable to, among other things, the ravages of Moore’s law, there’s a new threat to the confidentiality of documents and communications: The U.S. Department of Justice (DoJ).
The DoJ wants to mandate that all entities use encryption products that have a “back door”—or, something that isn’t a back door but that gives the U.S. government (and presumably no “bad” government), armed with an appropriate warrant or court order (or, you know, a really, really good reason), the ability to crack the crypto. Don’t worry: They would never use it for bad purposes.
In a recent speech to the U.S. Naval Academy, Deputy Attorney General Rod Rosenstein called upon hardware and software developers to create what he termed “responsible” encryption products—those that give the government, armed with a search warrant or other court order, the ability to decrypt the contents of communications or files. DAG Rosenstein noted:
When encryption is designed with no means of lawful access, it allows terrorists, drug dealers, child molesters, fraudsters and other criminals to hide incriminating evidence. Mass-market products and services incorporating warrant-proof encryption are now the norm. Many instant-messaging services employ default encryption designs that offer police no way to read them, even if an impartial judge issues a court order. The makers of smartphones previously kept the ability to access some data on phones, when ordered by a court to do so. Now they engineer away even that capability.
Rosenstein decried the fact that the contents of thousands of lawfully seized computers remain unexamined because they are encrypted, and that the contents of billions of instant messages remain unsurveilled because of end-to-end encryption. Rather than calling for a “back door,” the DoJ called for the deployment of technologies already in place that permit corporate or institutional monitoring of communications, such as the central management of security keys and operating system updates; the scanning of content, such as your e-mails, for advertising purposes; the simulcast of messages to multiple destinations at once; and key recovery when a user forgets the password to decrypt a laptop.
He’s right, of course. Strong encryption—particularly encryption that lets individual users control who can and cannot see their files or documents—prevents others, including law enforcement and intelligence agencies, from seeing those communications for “good” purposes. It thwarts the ability of courts to effectively issue warrants for these documents or communications. He’s also right when he observes that internet companies collect and use massive amounts of data when it suits their needs—big data analytics, marketing, advertising, etc.—but won’t build in the ability for governments to conduct either criminal investigations or mass surveillance. So Google can read your email to promote the new Star Trek: Discovery series, but the FBI, even with a court order, cannot read it to prevent child molestation. What law enforcement calls the “going dark” problem is real and growing, and it will leave some crimes unable to be investigated, prosecuted and, in some limited cases, prevented. A sensible, reasonable and responsible mechanism to allow legitimate and limited law enforcement access to encrypted files, computers, communications and other information is sane, balanced and a good idea.
It’s also impractical, unworkable and makes the entire world less safe.
Despite Rosenstein’s exhortation that he is not looking for a “back door,” the truth is that “accessible encryption” is weak encryption and is fundamentally inconsistent with the basic purposes of encryption. Regardless of whether this weakened encryption is achieved through key management, back doors, data analytics, pre- or post-encryption analytics or other technical mechanisms, at the end of the day the data is either strongly encrypted or it’s not. Telling is Rosenstein’s comment, “Technology companies almost certainly will not develop responsible encryption if left to their own devices. Competition will fuel a mindset that leads them to produce products that are more and more impregnable. That will give criminals and terrorists more opportunities to cause harm with impunity.”
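Why any such scheme weakens the crypto can be seen in a toy sketch (illustrative only: the XOR “cipher” and the in-memory escrow store below are hypothetical stand-ins, not any real product or proposal). However the lawful-access copy of the key is held, it becomes a single point of failure: whoever obtains the escrow database can decrypt everyone’s traffic, not just one target’s.

```python
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Toy XOR 'encryption' for illustration only -- not real cryptography.
    XOR with the same key both encrypts and decrypts."""
    return bytes(d ^ k for d, k in zip(data, key))

# "Responsible" encryption: a copy of every per-message key is deposited
# in an escrow store so that it can be produced under court order.
escrow_db: dict[str, bytes] = {}

def send(msg_id: str, plaintext: bytes) -> bytes:
    key = secrets.token_bytes(len(plaintext))
    escrow_db[msg_id] = key  # the mandated lawful-access copy
    return xor_cipher(plaintext, key)

ct1 = send("m1", b"meet at noon")
ct2 = send("m2", b"totally lawful plans")

# Anyone who obtains escrow_db -- a court, an insider, or a hacker who
# breaches the store -- can decrypt ALL messages, not just one suspect's:
for msg_id, ct in [("m1", ct1), ("m2", ct2)]:
    print(xor_cipher(ct, escrow_db[msg_id]))
```

The design choice being illustrated is the article’s point: it doesn’t matter whether the access is implemented as key escrow, central key management or pre-encryption scanning; once a decryption capability exists outside the communicating parties, the system’s security is only as strong as the protection of that capability.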
Developing products that are “more and more impregnable” is exactly what we want companies to do. This is what prevents data breaches. This is what secures the power grid. This is what keeps the government’s secrets secret. The government’s efforts to weaken encryption in the 1980s and 1990s, and its restrictions on the export of commercial encryption products, may have set the security industry back decades in deploying ubiquitous encryption of communications by default. We need more security (which means more encryption), not less.
The DoJ speaks of the good old days when it could get warrants (or the even better old days when telephone wiretaps required no warrants) to listen in on people’s phone calls or read their mail. Its argument is that never in the history of the world could people have absolute security of communications, free from warrants. That is partially true.
But never in the history of the world has the government—all governments (and, by extension, hackers)—had the ability to know so much about so many with so little effort. Despite Rosenstein’s protestations to the contrary, we are in a “Golden Age” of surveillance—and it’s getting more golden by the minute. The government has access to third-party data that shows our location, communications, pulse rate, friends, politics, purchases … just about everything. And the government can get this from third parties or by hacking IoT devices, using Stingray cell-site simulators or other devices to suck up the data, and retrieving data from years and decades past.
Imagine if the government had obtained the communications of code talker David Patterson. To comprehend them, it would either have had to crack the code or get Patterson or a co-communicator to decode them. What it wouldn’t do is make it illegal for Patterson to communicate in a way the government can’t understand. Yet that’s what the DoJ proposal amounts to. If we weaken encryption for the DoJ, we weaken it for everyone. And that’s just not responsible.