U.S. AG Wants Legal Back Door to All Facebook Postings (and Everything Else)

When it comes to encryption, we are having another Groundhog Day. The U.S., UK and Australian governments are pressuring Facebook to abandon its plans to implement “end-to-end” encryption, which would enhance the privacy and security of the social media platform. Such encryption would ensure—or at least help to ensure—that users themselves maintained control over the privacy of data stored on the company’s networks and devices; that a breach of Facebook’s servers would not necessarily expose that information; and that both the privacy and the security and reliability of the information would be better protected. Facebook users would have a much greater ability to control how their data is used, who has access to it, with whom it is shared and how it is protected. All of this is part of Facebook’s “Privacy First” program, implemented at least in part because of efforts by the U.S. Federal Trade Commission (FTC) to punish Facebook and impose billions of dollars in fines on the social media giant for not protecting users’ data. As a result, Facebook wants to empower users to encrypt and secure their own data.

And that’s what concerns these Western democracies.

On Oct. 4, U.S. Attorney General Bill Barr and Acting Secretary of Homeland Security Kevin McAleenan, together with UK Home Secretary Priti Patel and Australian Minister for Home Affairs Peter Dutton, sent a letter to Facebook CEO Mark Zuckerberg formally requesting “that Facebook does not proceed with its plan to implement end-to-end encryption across its messaging services …” In particular, these government officials want to make sure that, in implementing its “Privacy-Focused Vision for Social Networking,” as mandated by its FTC settlement, Facebook provides a “back door” for law enforcement and intelligence agencies to snoop on any and all communications stored on or processed through Facebook (including its facial recognition data, data analytics and private messages); in other words, that it does not implement end-to-end encryption, at least not the way that system is designed to work.

The regulators explain that “[c]ompanies [like Facebook] should not deliberately design their systems to preclude any form of access to content …” This is because, as the letter indicates, “law enforcement rel[ies] on obtaining the content of communications, under appropriate legal authorization, to save lives, enable criminals to be brought to justice, and exonerate the innocent.” As a result, the regulators ask that Facebook essentially embed some kind of “back door” into the design of its network, in a way that would enable both Facebook and law enforcement to access all consumer content “in a readable and useable format”; allow the governments to direct how this is done (consulting with them in a way that “genuinely influences” the design); and not implement any security and privacy measures until the governments are satisfied that those measures meet their need for access to everything.

The letter comports with the U.S. Department of Justice’s (DoJ) long-stated concern about the so-called “going dark” problem—that is, that government agencies, equipped with warrants or other legal authority (including warrantless intelligence intercepts), not be thwarted in their efforts to obtain full access to data because that data is encrypted by the user. This, the argument goes, would thwart governments’ efforts to prevent child pornography and abuse, stop terrorist attacks or prevent a disaster of biblical proportions—fire and brimstone coming down from the skies, rivers and seas boiling and 40 years of darkness. Earthquakes, volcanoes, the dead rising from the grave, human sacrifice, dogs and cats living together – mass hysteria.

Now the government definitely does not want Facebook to build in a “back door.” A back door would be bad. A back door would pretty much defeat all of the security and privacy controls that Facebook would be implementing—at least if someone obtained the back door key. I mean, try to imagine all life as you know it stopping instantaneously and every molecule in your body exploding at the speed of light—total protonic reversal. Right. That’s bad. Okay. All right. Important safety tip. Thanks, Egon.

So the governments don’t want a back door. They just want the ability to decrypt anything and everything on the system, to see everything in plain text, to analyze everything that passes through the system, to have access to data analytics, to see communications in real time and to do anything with that data. But definitely not a back door. More like, say, a hall pass.

The problem is, the technology to permit law enforcement to access networks, devices, data and communications—and only to permit law enforcement to do so, and only to do so when accompanied by a search warrant, and only when that warrant is proper, and only when the warrant has been signed by a “neutral and detached magistrate,” and only when the warrant specifies precisely what is to be seized and examined, and only permits those things to be seized and examined and nothing else, and only when the data subject has been appropriately informed of the existence of the warrant and provided an opportunity to challenge the warrant, and only when the warrant is executed within the territorial jurisdiction of the court with the authority to issue it, and only to permit access to those files specifically mentioned in the warrant, and only when the warrant is supported by probable cause under oath or affirmation—does not exist.

Moreover, the Facebook data is frankly not Facebook’s data. It’s the data of the users. One of the big issues when it comes to any internet-accessible user-generated content is: Who “owns” the data? Currently, the government can compel any “custodian” of information to produce records, or alternatively can search any place it has either the jurisdiction or authority to search, irrespective of ownership. This is what is called the third-party doctrine. While a government agency can compel me to produce my records (and I have certain rights with respect to those records, including some rights against self-incrimination), it gets a bit dicey when the government seeks to compel someone else to provide their records about me. Historically, my phone records, my bank records, my accounting records, my credit records and my spending records could be compelled from a third party because, even though they are about my activities, they are the records of the phone company, the bank, the accounting firm or whomever about me. The compulsory process—whether a subpoena, a search warrant, an NSL or the like—is directed at them, for their records.

For social media posts, cloud storage, direct messages and the like, however, not so much. So if I have a cache of child porn in my home, the cops could (and would) get a search warrant, kick in my door, take the documents and related information (and me). I would then be able to challenge the scope of the warrant, the probable cause, the execution of the warrant, etc., because I would know that the search occurred. This would likely be true whether I owned or rented my flat.

If I stored that same cache of records in a U-Haul storage facility, the warrant would be directed at U-Haul and I might (or might not) be notified of the search, but, if notified, I would have the opportunity to object.

Now take a company that stores its business records on a cloud server. Storage as a service. As part of its good practice, it stores these documents in encrypted form. While the government can compel the storage facility to “produce” these records (or give the government access to the storage), the customer holds the key to decrypt the records. It’s this that the DoJ wants to change. It wants third-party intermediaries—social media, telecom and other communications media, cloud providers and other third parties—to develop, deploy, implement and mandate technologies that give the third parties full access to the contents of whatever they have the ability to “touch,” so that the third-party intermediary can both monitor what its customers are doing and give these governments full access (with a warrant) to everything the customers are doing.
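
To make the storage-as-a-service point concrete, here is a minimal sketch in Python (using the widely available third-party "cryptography" package) of client-side encryption: the customer generates and keeps the key, and the provider only ever holds ciphertext. The upload_to_cloud() function is a hypothetical placeholder, not any real provider's API.

```python
from cryptography.fernet import Fernet


def upload_to_cloud(blob: bytes) -> None:
    # Placeholder for a real storage API call; the provider receives only
    # ciphertext and has no way to read it.
    print(f"provider stores {len(blob)} opaque bytes")


# The key is generated and held by the customer; it is never sent to the provider.
customer_key = Fernet.generate_key()
box = Fernet(customer_key)

record = b"Q3 ledger: accounts payable ..."
ciphertext = box.encrypt(record)
upload_to_cloud(ciphertext)

# If the provider is compelled to "produce" the stored data, it can hand over
# only ciphertext; decryption still requires the customer's key.
assert box.decrypt(ciphertext) == record
```

Under this arrangement, a warrant served on the provider yields encrypted blobs; getting readable content still means going to the customer who holds the key, which is precisely the friction the letter asks Facebook to design away.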

Even if you trust the government (well, all of the governments) to only use this power for good and not evil, there are two problems. First, it won’t work. Second, it might.

While there are multiparty, multifactor encryption key-exchange mechanisms, they all rely on the fact that the “master key” (which can be a single key, a multipart key or a one-time-use key) is securely generated and protected. One key to rule them all. You are essentially building a weakness in both security and authentication into a mathematical system that, if properly designed, is designed not to have one. It’s like coming up with a biometric passport system that is both 100% accurate and capable of being spoofed (so that Jason Bourne can sneak into the country). It’s unlikely that such a system could be developed, and, if developed, that the system would be secure—or even securable. With hundreds of thousands of law enforcement and intelligence agencies around the world, and millions of people employed by such agencies, it’s inconceivable that every use of the back door key would be appropriate.
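
A toy sketch, in pure Python, of what “one key to rule them all” means in practice: an escrowed master key split into two shares with a simple 2-of-2 XOR split. This is an illustration of the structural weakness, not a real escrow proposal or production key management; the share names are hypothetical.

```python
import secrets


def split_key(master_key: bytes) -> tuple[bytes, bytes]:
    """Split a key into two shares; both are required to recover it."""
    share_a = secrets.token_bytes(len(master_key))
    share_b = bytes(a ^ b for a, b in zip(share_a, master_key))
    return share_a, share_b


def recombine(share_a: bytes, share_b: bytes) -> bytes:
    """XOR the shares back together to recover the master key."""
    return bytes(a ^ b for a, b in zip(share_a, share_b))


master_key = secrets.token_bytes(32)  # the "one key to rule them all"
law_enforcement_share, provider_share = split_key(master_key)

# Neither share alone reveals the key ...
assert law_enforcement_share != master_key and provider_share != master_key

# ... but whoever gathers both shares, legitimately or not, holds the master key
# and, with it, access to everything protected under that escrow scheme.
assert recombine(law_enforcement_share, provider_share) == master_key
```

However the sharing scheme is dressed up, the recombined key is a single point of catastrophic failure, which is why every use of it, by every agency and every employee, has to be appropriate for the scheme to stay safe.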

You want my Facebook postings? Get them from me. Haul my butt into court and compel me to produce them. Force me to decrypt them and produce them. We have had this battle before. We had the ability, at least in theory, to build encryption of all communications—email, VoIP, text, messaging, etc.—into protocols such as POP and SMTP by default. All communications would be secure. But that would mean that the government couldn’t do surveillance. And so the government(s) did not want that to be the default. As a result, we have plaintext communications, with all of the privacy and security problems inherent therein. And, of course, bad guys (and many good guys) simply add a layer of encryption over the transport layer anyway. So do we want security and privacy by default with a limit on government access, or do we want government access by default with little security and privacy? Which is better for society overall?
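
Adding your own layer over an untrusted transport is trivial today. A minimal sketch in Python (using the third-party PyNaCl library) of encrypting a message to a recipient’s public key before it ever touches the wire; send_over_plaintext_transport() is a hypothetical stand-in for SMTP, a chat API or any other delivery mechanism.

```python
from nacl.public import PrivateKey, SealedBox


def send_over_plaintext_transport(payload: bytes) -> bytes:
    # Anyone watching the wire, or the server relaying it, sees only this blob.
    print(f"transport carries {len(payload)} opaque bytes")
    return payload


# The recipient generates a keypair and publishes the public key.
recipient_key = PrivateKey.generate()

# The sender encrypts the message body to that public key before handing it to
# the transport, so the transport's own (lack of) encryption no longer matters.
message = b"meet at noon"
ciphertext = SealedBox(recipient_key.public_key).encrypt(message)
delivered = send_over_plaintext_transport(ciphertext)

# Only the holder of the private key can read it.
assert SealedBox(recipient_key).decrypt(delivered) == message
```

Which is the point: keeping the defaults weak does not stop determined bad actors, who will layer encryption on themselves; it mostly leaves everyone else exposed.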

I’m all for the government getting access to records of terrorists and pedophiles. I’m all for preventing crime. But making all communications insecure because you want some of them is likely not the best strategy for overall security.

End-to-end encryption, properly implemented and deployed by default with little need for user input, is currently the best overall security—particularly if it also includes encryption of stored data and not just of connections. If there were a “magic bullet” that would search the mind of the person who sought to access the data and files and determine whether they were of pure heart and mind, then I would be all for it. Until then, more universal end-to-end encryption is good for users, good for Facebook, good for overall security, good for privacy and good for each of the countries that sent the letter.

Mark Rasch

Mark Rasch is a lawyer and computer security and privacy expert in Bethesda, Maryland, where he helps develop strategy and messaging for the Information Security team. Rasch’s career spans more than 35 years of corporate and government cybersecurity, computer privacy, regulatory compliance, computer forensics and incident response. He is trained as a lawyer and was the Chief Security Evangelist for Verizon Enterprise Solutions (VES). He is a recognized author of numerous security- and privacy-related articles. Prior to joining Verizon, he taught courses in cybersecurity, law, policy and technology at various colleges and universities, including the University of Maryland, George Mason University, Georgetown University and the American University School of Law, and was active with the American Bar Association’s Privacy and Cybersecurity Committees and the Computers, Freedom and Privacy Conference. Rasch has worked as cyberlaw editor for SecurityCurrent.com, as Chief Privacy Officer for SAIC, and as Director or Managing Director at various information security consulting companies, including CSC, FTI Consulting, Solutionary, Predictive Systems and Global Integrity Corp. Earlier in his career, Rasch was with the U.S. Department of Justice, where he led the department’s efforts to investigate and prosecute cyber and high-technology crime, starting the computer crime unit within the Criminal Division’s Fraud Section, efforts which eventually led to the creation of the Computer Crime and Intellectual Property Section of the Criminal Division. He was responsible for various high-profile computer crime prosecutions, including those of Kevin Mitnick, Kevin Poulsen and Robert Tappan Morris. He has been a frequent commentator in the media on issues related to information security, appearing on BBC, CBC, Fox News, CNN, NBC News and ABC News, and in The New York Times, The Wall Street Journal and many other outlets.
