
The Five Eyes Statement on Encryption: Things Are Seldom What They Seem

Susan Landau
Wednesday, September 26, 2018, 5:05 PM


Earlier this September, law enforcement officials from the Five Eyes intelligence alliance—made up of Australia, Canada, New Zealand, the United Kingdom, and the United States—met in Australia and issued a Statement of Principles on Access to Evidence and Encryption. The statement is strongly worded, concluding with a warning that if industry does not make it easier for governments with lawful access to content to acquire decrypted versions, the nations “may pursue technological, enforcement, legislative or other measures to achieve lawful access solutions.” Though the statement has garnered much public attention, there are a number of curiosities about it, and I believe there is less here than meets the eye.

There's a bit of a backstory to the statement, which was issued by the Five Eyes law enforcement ministers just as the Australian government was about to put forth a bill requiring companies to circumvent encryption protections in order to provide law enforcement and intelligence agencies with lawful access to encrypted devices. U.S. law enforcement has pressed hard for such access, and the Australian bill may well be a stalking horse for the rest of the Five Eyes. Australia is the perfect candidate: because the country lacks a comprehensive set of human rights protections, it does not face the requirement to balance privacy and civil liberties that the U.K. and the U.S. do. And if the Australian bill moves forward, U.S. law enforcement may point to it to push for similar legislation at home.

But the bill is not only highly invasive of privacy—it ignores technical realities as well as what is really needed for security. While the statement is an effort to show support from the Five Eyes for the legislation, this support is missing a crucial player: the intelligence agencies.

The statement says that, “The Governments of the United States, the United Kingdom, Canada, Australia and New Zealand are committed to personal rights and privacy, and support the role of encryption in protecting those rights” and acknowledges the importance of encryption in protecting personal, business, and government information. But then the statement goes on to say that “privacy is not absolute.” The ministers assert three principles for guiding encryption policy: “Mutual responsibility,” “Rule of law and due process are paramount,” and “Freedom of choice for lawful access.”

Under the heading “Mutual responsibility,” the statement declares that “diminished access to the content of lawfully obtained data is ... a mutual responsibility for all stakeholders,” and “Providers of information and communications technology and services—carriers, device manufacturers or over-the-top service providers—are subject to the law, which can include requirements to assist authorities to lawfully access data.”

It is not clear what the statement means by “requirements to assist authorities to lawfully access data.” Encryption systems protect data using a function that relies on a secret: the key. These systems are designed to prevent access without knowledge of this key, and the most secure systems are those in which only the participants in a communication, or the owners and users of a device, can decrypt the data. It is one thing for a company to help deliver unencrypted content if the company itself has access to the encryption key. It is quite another if the manufacturer has no access to the encryption key, as some manufacturers do not.
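To make the distinction concrete, here is a minimal sketch of symmetric end-to-end encryption in Python, written against the widely used cryptography package. The names and message are purely illustrative, not drawn from any vendor's actual design; the point is simply that a provider that never held the key has only ciphertext to produce in response to a warrant.

```python
# Minimal sketch (illustrative only): end-to-end symmetric encryption in
# which the key is known only to the communicating endpoints.
# Requires the third-party "cryptography" package (pip install cryptography).
from cryptography.fernet import Fernet, InvalidToken

# Generated and shared only between the endpoints; the service never sees it.
endpoint_key = Fernet.generate_key()

# The sender encrypts; the provider relays and stores only this ciphertext.
ciphertext = Fernet(endpoint_key).encrypt(b"meet at noon")

# The recipient, holding the same key, recovers the plaintext.
assert Fernet(endpoint_key).decrypt(ciphertext) == b"meet at noon"

# A provider without endpoint_key cannot comply by decrypting: any other key
# fails authentication, and decryption raises an error.
try:
    Fernet(Fernet.generate_key()).decrypt(ciphertext)
except InvalidToken:
    print("Without the endpoints' key, the ciphertext stays opaque.")
```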

So does “requirements to assist authorities to lawfully access data” imply that if providers of information and communications technologies and services have access to decrypted content, they must deliver it when served with a proper warrant? If so, this is already the case—so why did the authors of the statement feel the need to reiterate it? Alternatively, does the statement mean that providers must develop methods to undo security protections in order to provide content when the government has the right to access that content—the same issue that was at the heart of the Apple-FBI case? Or does it mean that providers must design their equipment so that the content can be accessed under proper legal authority? Only the first of these possibilities—requiring the delivery of decrypted content if providers have the key—does not raise serious security risks.

The second principle in the statement—“Rule of law and due process are paramount”—commits to the rule of law, due process, and oversight by independent authorities or judicial review. But hidden inside the discussion is a rather peculiar statement: “Access to information, subject to this principle, is critical to the ability of governments to protect our citizens by investigating threats and prosecuting crimes.”

Critical to whom? And under what circumstances? The way the principle is argued points to cryptography’s role in impeding investigations. But cryptography first and foremost protects people and their data, providing security, preventing crime and thus enhancing public safety. That side of the equation is not addressed.

The third principle, “Freedom of choice for lawful access solutions,” would appear to be a carrot for industry: rather than dictating a particular lawful access solution, it seems to permit industry to design one of its own choosing. But this principle is just a restatement of the first under another guise.

To explain this, we must look back to the 1990s, when the U.S. government made its first explicit attempt to develop a scheme in which the government would have lawful access to encrypted data. The Clinton administration proposed the Escrowed Encryption Standard—more commonly known as “Clipper”—for encrypting digital telephone communication. In this scheme, encryption keys for telephone communications would be split and held by two U.S. government agencies.
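As a toy illustration of that split-key arrangement (a sketch of the general idea only, not the actual Clipper/Skipjack mechanism), a key can be divided into two shares such that neither escrow agent learns anything from its share alone, while the two shares together reconstruct the key:

```python
# Toy split-key escrow sketch (illustrative; not the actual Clipper design):
# the device key is split into two XOR shares, one per escrow agency.
import secrets

def split_key(key: bytes) -> tuple[bytes, bytes]:
    """Split key into two shares; either share alone looks uniformly random."""
    share1 = secrets.token_bytes(len(key))
    share2 = bytes(a ^ b for a, b in zip(key, share1))
    return share1, share2

def recombine(share1: bytes, share2: bytes) -> bytes:
    """XOR-ing both shares back together recovers the original key."""
    return bytes(a ^ b for a, b in zip(share1, share2))

device_key = secrets.token_bytes(16)
agency_a_share, agency_b_share = split_key(device_key)

# Only an authority that obtains both shares can reconstruct the key.
assert recombine(agency_a_share, agency_b_share) == device_key
```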

The effort failed. As it turned out, there were numerous problems with the idea. Other nations were not interested in “secure” communications to which the U.S. government held the keys—and neither were Americans. Very few Clipper-equipped devices were sold. Matt Blaze, then a researcher at AT&T, demonstrated flaws in the design, showing how to use the system to encrypt without providing decryption keys to the government.

The lesson that the law-enforcement agencies took from the Clipper episode was that specifying a particular lawful access solution was a poor strategy. Doing so enabled the computer security community to do what it does best: find security vulnerabilities in the technique. Because lawful access solutions inherently disrupt the security provided by encryption, any proposal for a lawful-access solution will carry serious security risks. And so, in the current Going Dark effort, law enforcement has sought to dodge that discussion by shunning specifics of how a solution would work. FBI and Justice Department officials instead argue that Silicon Valley is full of very smart people who could undoubtedly solve the problem of lawful access if they put their minds to it.

The Five Eyes statement takes a similar tack. The reference to “[f]reedom of choice for lawful access solutions” must be understood in this context: it is part of a strategy to press providers of information and communications technologies and services to enable lawful access whenever government has legal authority to access content, and to threaten a legislative “fix” if technical solutions are not found.

The problem is that building lawful access into encryption systems undermines security. I have made this point many times in these pages, so I will keep my discussion here brief. Testifying to Congress in 2016, my colleague Matt Blaze noted that,

The design and implementation of even the simplest encryption systems is an extraordinarily difficult and fragile process. Very small changes frequently introduce fatal security flaws. Ordinary (end-to-end, non-escrowed) encryption systems have conceptually rather simple requirements and yet, because there is no general theory for designing them, we still often discover exploitable flaws in fielded systems.

Nor is the problem limited to situations in which the key is escrowed. Blaze noted,

Aside from cryptographic weaknesses, there are significant operational security issues. Third-party access, by its nature, makes encrypted data less secure because the third party itself creates a new target for attack.

The problems raised by lawful access are well understood by the intelligence agencies, which exploit weaknesses in the computer systems of other nations. And because these agencies use such techniques, they well understand the vulnerabilities that lawful access solutions would introduce—and how easily law-enforcement targets could get around these “solutions.” A leaked 2015 National Security Council draft paper, for example, noted that, “Overall, the benefits to privacy, civil liberties, and cybersecurity gained from encryption outweigh the broader risks that would have been created by weakening encryption.”

While law enforcement describes the problems posed by encryption as “going dark,” the NSA view is not as dire as that of the FBI. In 2016, then-NSA Deputy Director Rick Ledgett said that the world was growing “dimmer,” but not “dark.” (Disclosure: I served on the National Academies committee to which Ledgett spoke on this issue.) It is not surprising that NSA might have better tools for managing targets' increased use of encryption. But even beyond NSA, many members of the national security establishment—and even former criminal investigators—have publicly disagreed with law enforcement's position on encryption. Public supporters of strong encryption without front, back, or side doors include former Secretary of Homeland Security Michael Chertoff and former CIA and NSA Director Michael Hayden, along with former NSA Director Mike McConnell; former CIA Directors David Petraeus and R. James Woolsey; and Richard Clarke, who served as national coordinator for security, infrastructure protection, and counterterrorism in the George W. Bush White House.

Nor is this view limited to the U.S. The former director-general of the U.K. intelligence agency MI5 said much the same: “I'm not personally one of those who believes we should weaken encryption, because I think there is a parallel issue which is cybersecurity more broadly.”

However, some of the Five Eyes intelligence agencies have issued separate statements: GCHQ Director Jeremy Fleming said on Sept. 7 that “the UK Government strongly supports encryption” but added that, “[T]here has to be close co-operation and agreement with technology companies. We're confident these solutions exist … They should be limited in scope and scalability, supported by modern legislation, and with strong oversight to maintain public confidence.”

Nevertheless, as readers of Lawfare well know, U.S. law enforcement has been pressing for lawful access requirements for quite some time. So has the Australian government, which, as I noted above, was preparing to introduce legislation requiring lawful access just as the statement was released. In this context, it is clear that the statement is a political effort by law enforcement to gain investigative capabilities, both within Australia (to the extent that the statement aims to shore up support for the Australian legislation) and across the Five Eyes (to the extent that the Australian legislation may presage similar legislation in the other four nations). But while the statement was signed by law enforcement officials, it was not signed by the intelligence and defense ministries of the Five Eyes. In other words, it was not signed by those ministries that have shown themselves to be the most aware of the dangers of backdoors and the importance of encryption.

Early in the statement, the ministers write, “Privacy is not absolute”—and this is absolutely correct. But privacy is not at issue here. The encryption debate is not about privacy versus security but rather about the efficiency of law enforcement investigations versus personal, business, and national security. In other words, it is a debate over security versus security. The wide public availability of strong encryption must be understood as critically necessary for security.

The Australian legislation, like the broader push for backdoors across the Five Eyes, reflects the deeply felt desires of law enforcement. But this demand for access to content ignores technical and security realities. The most important aspect of the statement is the fact that the defense and intelligence agencies didn't sign—and why they might not have.


Susan Landau is Professor of Cyber Security and Policy in Computer Science at Tufts University. Previously, as Bridge Professor of Cyber Security and Policy at The Fletcher School and the School of Engineering, Department of Computer Science, Landau established an innovative MS degree in Cybersecurity and Public Policy joint between the schools. She has been a senior staff privacy analyst at Google, a distinguished engineer at Sun Microsystems, and a faculty member at Worcester Polytechnic Institute, the University of Massachusetts Amherst, and Wesleyan University. She has served on various boards at the National Academies of Sciences, Engineering, and Medicine and for several government agencies. She is the author or co-author of four books and numerous research papers. She has received the USENIX Lifetime Achievement Award, shared with Steven Bellovin and Matt Blaze, and the American Mathematical Society's Bertrand Russell Prize.
