Keys Under Doormats: Mandating Insecurity

Susan Landau
Tuesday, July 7, 2015, 9:00 AM

Two decades ago US law enforcement sought laws requiring communication providers to be able to decrypt communications when served with a court order. The proposed technology to accomplish this was escrowed encryption — keys stored by the government — and the mechanism was the now-infamous Clipper chip.

In 1997 a group of cryptographers and security experts — including our own Bruce Schneier — warned that such escrowed encryption created serious security risks and that it was infeasible in an international setting; after all, which nation would hold the keys? It would be fun to claim the computer scientists were prescient; a more sober assessment is that they were realistic. Two years later the US government agreed, ending its efforts on escrowed encryption. I have heard intelligence officials remark that that mistaken effort is a partial cause of our current poor state of computer security. Certainly the attempt to force escrowed encryption was no help in securing communication or computer systems.

It seems that FBI Director Comey and UK Prime Minister Cameron have not learned the lessons of the past. Both are pressing hard for laws requiring "exceptional access" mechanisms: some form of technology that would enable government access to content even when that content is encrypted. Yesterday Director Comey again wrote about his concerns, explaining the dangers that would ensue if devices and communications were locked and law enforcement had no ability to get at the data, even in an emergency. That's not exactly the case. But whether or not law enforcement can tackle encrypted systems — and there is evidence that they can in many cases — is not the issue I want to discuss today.

I'm concerned with how exceptional access might actually work. You've heard how important such access is, but you haven't been told how it would be implemented. There's good reason for the silence: as soon as an actual proposal for getting at the plaintext of encrypted data — whether in motion or at rest — is presented, the problems with the "solution" are exposed.

Maybe exceptional access breaks forward secrecy, a technique employed by Google, Microsoft, and others that prevents the exposure of a key from enabling the decryption of all previously encrypted material. Or maybe, to enable unlocking mobile devices whenever a police officer needs it, an assumption is being made about authentication at scale; in fact, remotely unlocking devices would require that vendors be able to authenticate any officer anytime, anywhere, a security risk of enormous proportions.
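
To see why forward secrecy is at odds with escrowed keys, consider a minimal sketch of how such systems typically derive session keys (this uses Python's cryptography package; the function name and labels are hypothetical, not any vendor's actual code): each session gets freshly generated ephemeral keys that are thrown away once the session key is derived, so there is no long-lived key an escrow agent could hold that would unlock past traffic.

    # Minimal sketch: forward secrecy via ephemeral Diffie-Hellman (illustrative only).
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
    from cryptography.hazmat.primitives.kdf.hkdf import HKDF

    def new_session_key() -> bytes:
        # Each party generates a fresh key pair used for this session only.
        client_ephemeral = X25519PrivateKey.generate()
        server_ephemeral = X25519PrivateKey.generate()

        # Both sides compute the same shared secret from the exchange.
        shared = client_ephemeral.exchange(server_ephemeral.public_key())

        # Derive the symmetric session key, then let the ephemeral private keys
        # go out of scope. Compromising a long-term (or escrowed) key later
        # cannot recreate this secret, so past sessions stay protected.
        return HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                    info=b"session key").derive(shared)

An exceptional-access mandate that required retaining something capable of decrypting old sessions would, in effect, forbid discarding those ephemeral secrets, undoing the very property the design exists to provide.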

The problem is that once one gets into the nitty-gritty of how exceptional access might actually work, the idea looks more like magical thinking than a realistic solution to a complex technical problem. Exceptional access is being pushed at a time when the real cybersecurity issue is securing our systems, all the time, everywhere. The contradiction between what Director Comey and Prime Minister Cameron are pressing for and what's actually needed is an issue I've written about extensively (e.g., this post), so I won't emphasize it here.

Instead I offer a reading assignment. Today the 1997 group, joined by several others (including me), releases a report that examines the exceptional access demand. It finds the proposal sorely lacking from a security vantage point. Our most important finding is the one I've hinted at: no legislation requiring exceptional access should be considered unless the particulars of the proposal — the technical particulars — are presented. Otherwise we would be mandating insecurity when what we need is just the opposite.

Keys Under Doormats: Mandating Insecurity Report


Susan Landau is Bridge Professor in The Fletcher School and Tufts School of Engineering, Department of Computer Science, Tufts University, and is founding director of Tufts' MS program in Cybersecurity and Public Policy. Landau has testified before Congress and briefed U.S. and European policymakers on encryption, surveillance, and cybersecurity issues.
