
Thoughts on Encryption and Going Dark, Part II: The Debate on the Merits

Benjamin Wittes
Sunday, July 12, 2015, 2:00 PM
Deputy Attorney General Sally Yates and FBI Director James Comey testify this week before the Senate Judiciary Committee

Published by The Lawfare Institute
in Cooperation With
Brookings

On Thursday, I described the surprisingly warm reception FBI Director James Comey got in the Senate this week with his warning that the FBI was "going dark" because of end-to-end encryption. In this post, I want to take on the merits of the renewed encryption debate, which seem to me complicated and multi-faceted and not all pushing in the same direction.

Let me start by breaking the encryption debate into two distinct sets of questions: One is the conceptual question of whether a world of end-to-end strong encryption is an attractive idea. The other is whether—assuming it is not an attractive idea and that one wants to ensure that authorities retain the ability to intercept decrypted signal—an extraordinary access scheme is technically possible without eroding other essential security and privacy objectives. These questions often get mashed together, both because tech companies are keen to market themselves as the defenders of their users' privacy interests and because of the libertarian ethos of the tech community more generally. But the questions are not the same, and it's worth considering them separately.

Consider the conceptual question first. Would it be a good idea to have a world-wide communications infrastructure that is, as Bruce Schneier has aptly put it, secure from all attackers? That is, if we could snap our fingers and make all device-to-device communications perfectly secure against interception by the Chinese, by hackers, and by the FSB, but also by the FBI even when wielding lawful process, would that be desirable? Or, in the alternative, do we want to create an internet as secure as possible from everyone except government investigators exercising their legal authorities, with the understanding that other countries may do the same?

Conceptually speaking, I am with Comey on this question—and the matter does not seem to me an especially close call. The belief in principle in creating a giant world-wide network on which surveillance is technically impossible is really an argument for the creation of the world's largest ungoverned space. I understand why techno-anarchists find this idea so appealing. I can't imagine for a moment, however, why anyone else would.

Consider the comparable argument in physical space: the creation of a city in which authorities are entirely dependent on citizen reporting of bad conduct but have no direct visibility onto what happens on the streets and no ability to execute searches (even with warrants) or to patrol parks or street corners. Would you want to live in that city? The idea that ungoverned spaces really suck is not controversial when you're talking about Yemen or Somalia. I see nothing more attractive about the creation of a worldwide architecture in which it is technically impossible to intercept and read ISIS communications with followers or to follow child predators into chatrooms where they go after kids.

The trouble is that this conceptual position does not answer the entirety of the policy question before us. The reason is that the case against preserving some form of law enforcement access to decrypted signal is not only a conceptual embrace of the technological obsolescence of surveillance. It is also a series of arguments about the costs—including the security costs—of maintaining the capacity to decrypt captured signal.

Consider the report issued this past week by a group of computer security experts (including Lawfare contributing editors Bruce Schneier and Susan Landau), entitled "Keys Under Doormats: Mandating Insecurity By Requiring Government Access to All Data and Communications." The report does not make an in-principle or conceptual argument against extraordinary access. It argues, rather, that the effort to build such a system risks eroding cybersecurity in ways far more important than the problems it would solve. The authors, to summarize, make three arguments in support of the broad claim that any exceptional access system would "pose . . . grave security risks [and] imperil innovation." What are those "grave security risks"?

  • "[P]roviding exceptional access to communications would force a U-turn from the best practices now being deployed to make the Internet more secure. These practices include forward secrecy—where decryption keys are deleted immediately after use, so that stealing the encryption key used by a communications server would not compromise earlier or later communications. A related technique, authenticated encryption, uses the same temporary key to guarantee confidentiality and to verify that the message has not been forged or tampered with."
  • "[B]uilding in exceptional access would substantially increase system complexity" and "complexity is the enemy of security." Adding code to systems increases that system's attack surface, and a certain number of additional vulnerabilities come with every marginal increase in system complexity. So by requiring a potentially complicated new system to be developed and implemented, we'd be effectively guaranteeing more vulnerabilities for malicious actors to hit.
  • "[E]xceptional access would create concentrated targets that could attract bad actors." If we require tech companies to retain some means of accessing user communications, those keys have to stored somewhere, and that storage then becomes an unusually high-stakes target for malicious attack. Their theft then compromises, as did the OPM hack, large numbers of users.

The strong implication of the report is that these issues are not resolvable, though the report never quite says that. But at a minimum, the authors raise a series of important questions about whether such a system would, in practice, produce a generally insecure internet—rather than a generally secure one that retains the technical capacity to make exceptions in order to comply with the law.

There is some reason, in my view, to suspect that the picture may not be quite as stark as the computer scientists make it seem. After all, the big tech companies increase the complexity of their software products all the time, and they generally regard the increased attack surface that results as a mitigable problem. Similarly, there are lots of high-value intelligence targets that we have to secure, and there would be big security implications if we could not do so successfully. And when it really counts, that task is not hopeless. Google and Apple and Facebook are not without tools in the cybersecurity department.

The real question, in my view, is whether a system of the sort Comey imagines could be built in a fashion in which the security gain it would provide would exceed the heightened security risks the extraordinary access would involve. As Herb Lin puts it in his excellent, and admirably brief, Senate testimony the other day, this is ultimately a question without an answer in the absence of a lot of new research. "One side says [the] access [Comey is seeking] inevitably weakens the security of a system and will eventually be compromised by a bad guy; the other side says it doesn't weaken security and won't be compromised. Neither side can prove its case, and we see a theological clash of absolutes." Only when someone actually does the research and development and actually tries to produce a system that meets Comey's criteria are we going to find out whether it's doable or not.

And therein lies the rub, and the real meat of the policy problem, in my view: Who's going to do this research? Who's going to make the sustained investment in trying to imagine a system that secures communications except from government, and then only when government has a warrant to intercept those communications?

The assumption of the computer scientists in their report is that the burden of that research lies with the government. "Absent a concrete technical proposal," they write, "and without answers to the questions raised in this report, legislators should reject out of hand any proposal to return to the failed cryptography control policy of the 1990s." Indeed, their most central recommendation is that the burden of development is on Comey. "Our strong recommendation is that anyone proposing regulations should first present concrete technical requirements, which industry, academics, and the public can analyze for technical weaknesses and for hidden costs."

In his testimony, Herb supports this call, though he acknowledges that it is not the inevitable route:

the government has not yet provided any specifics, arguing that private vendors should do it. At the same time, the vendors won't do it, because [their] customers aren't demanding such features. Indeed, many customers would see such features as a reason to avoid a given vendor. Without specifics, there will be no progress. I believe the government is afraid that any specific proposal will be subject to enormous criticism—and that's true—but the government is the party that wants . . . access, and rather than running away from such criticism, it should embrace any resulting criticism as an opportunity to improve upon its initial designs.

Herb might also have mentioned that lots of people in the academic tech community who would be natural candidates to help develop such an access system are much more interested in developing encryption systems to keep the feds out than to—under any circumstances—let them in. The tech community has spent a lot more time and energy arguing against the plausibility and desirability of implementing what Comey is seeking than it has spent trying to develop systems that deliver it while mitigating the risks such a system might pose. For both industry and the tech community more broadly, this is government's problem, not theirs.

Yet reviving the Clipper Chip model—in which government develops a fully-formed system and then puts it out publicly for the community to shoot down—is clearly not what Comey has in mind. He is talking in very different language: the language of performance requirements. He wants to leave the development task to Silicon Valley to figure out how to implement government's requirements. He wants to describe what he needs—decrypted signal when he has a warrant—and leave the companies to figure out how to deliver it while still providing secure communications in other circumstances to their customers.

The advantage to this approach is that it potentially lets a thousand flowers bloom. Each company might do it differently. They would compete to provide the most security consistent with the performance standard. They could learn from each other. And government would not be in the position of developing and promoting specific algorithms. It wouldn't even need to know how the task was being done.

The disadvantage, of course, is that if the critics are correct and the ultimate task is not accomplishable, the performance requirement would effectively mandate the building of a series of exploitable vulnerabilities. A legal requirement to, as the title of the computer scientists' report suggests, hide keys under doormats is a requirement of insecurity, after all, and one that would arrive just as we need a global effort to improve computer security.

One can, in short, summarize the problems before us—if one accepts conceptually Comey's policy objective but also accepts that there are serious, perhaps prohibitive, technical barriers to achieving it without eroding other cybersecurity goods—as follows: Which actor or actors should we expect to conduct the necessary research and development to try to create an operable system, and how should we incentivize that research? Should we assume the task is achievable until proven impossible, or should we assume it is out of reach until someone comes forward with a developed and promising strategy for achieving it?

Here are five possible strategies based on different assumptions as to the answers to these questions:

If we begin—as the computer scientists do—with a posture of great skepticism as to the plausibility of any scheme and we place the burden of persuasion on Comey, law enforcement, and the intelligence community to demonstrate the viability of any system, the obvious course is government-sponsored research. What we need here is not a Clipper Chip-type initiative, in which the government would develop and produce a complete system, but a set of intellectual and technical answers to the challenges the technologists have posed. The goal here should be an elaborated concept paper laying out how a secure extraordinary access system would work in sufficient detail that it can be evaluated, critiqued, and vetted; think of the Bitcoin paper here as a model. Only after a period of public vetting, discussion, and refinement would the process turn to the question of what sorts of companies we might ask to implement such a system and by what legal means we might do so.

Conversely, if we begin with the assumption that the challenges are resolvable and that Silicon Valley is the appropriate engine of innovation and expertise to resolve them, a regulatory mandate makes a lot of sense. The theory is that companies have every incentive for market reasons to protect consumer privacy, but no incentives at all to figure out how to provide law enforcement access in the context of doing so. If you simply require the latter as a matter of law, they will devote resources to the question of how to do so while still providing consumer security. And while the problems are hard, they will prove manageable once the tech giants decide to work them hard—rather than protesting their impossibility.

Another, perhaps softer, possibility is to rely on the possibility of civil liability to incentivize companies to focus on these issues. At the Senate Judiciary Committee hearing this past week, the always interesting Senator Sheldon Whitehouse posed a question to Deputy Attorney General Sally Yates about which I've been thinking as well: "A girl goes missing. A neighbor reports that they saw her being taken into a van out in front of the house. The police are called. They come to the home. The parents are frantic. The girl's phone is still at home." The phone, however, is encrypted:

WHITEHOUSE: It strikes me that one of the balances that we have in these circumstances where a company may wish to privatize value by saying, "Gosh, we're secure now. We got a really good product. You're going to love it." That's to their benefit. But for the family of the girl that disappeared in the van, that's a pretty big cost. And when we see corporations privatizing value and socializing cost so that other people have to bear the cost, one of the ways that we get back to that and try to put some balance into it, is through the civil courts, through a liability system.

If you're a polluter and you're dumping poisonous waste into the water rather than treating it properly, somebody downstream can bring an action and can get damages for the harm that they sustain, can get an order telling you to knock it off. I'd be interested in whether or not the Department of Justice has done any analysis as to what role the civil-liability system might be playing now to support these companies in drawing the correct balance, or if they've immunized themselves from the cost entirely and are enjoying the benefits. I think in terms of our determination as to what, if anything, we should do, knowing where the Department of Justice believes the civil liability system leaves us might be a helpful piece of information. So I don't know if you've undertaken that, but if you have, I'd appreciate it if you'd share that with us, and if you'd consider doing it, I think that might be helpful to us.

YATES: We would be glad to look at that. It's not something that we have done any kind of detailed analysis. We've been working hard on trying to figure out what the solution on the front end might be so that we're not in a situation where there could potentially be corporate liability or the inability to be able to access the device.


WHITEHOUSE: But in terms of just looking at this situation, does it not appear that it looks like a situation where value is being privatized and costs are being socialized onto the rest of us?


YATES: That's certainly one way to look at it. And perhaps the companies have done greater analysis on that than we have. But it's certainly something we can look at.

I'm not sure what that lawsuit looks like under current law. I, like the Justice Department, have not done the analysis, and I would be very interested in hearing from anyone who has. Whitehouse, however, seems to me to be onto something here. Might the victim of a domestic ISIS attack committed by someone who communicated and plotted using communications architecture specifically designed to be immune, and specifically marketed as immune, from law enforcement surveillance have a claim against the provider who offered that service even after the director of the FBI began specifically warning that ISIS was using such infrastructure to plan attacks? To the extent such companies have no liability in such circumstances, is that the distribution of risk that we as a society want? And might the possibility of civil liability, either under current law or under some hypothetical change to current law, incentivize the development of secure systems that are nonetheless subject to surveillance under limited circumstances?

Still another approach is to let other governments do the dirty work. The computer scientists' report cites the possibility of other sovereigns adopting their own extraordinary access regimes as a reason for the U.S. to go slow:

Building in exceptional access would be risky enough even if only one law enforcement agency in the world had it. But this is not only a US issue. The UK government promises legislation this fall to compel communications service providers, including US-based corporations, to grant access to UK law enforcement agencies, and other countries would certainly follow suit. China has already intimated that it may require exceptional access. If a British-based developer deploys a messaging application used by citizens of China, must it provide exceptional access to Chinese law enforcement? Which countries have sufficient respect for the rule of law to participate in an international exceptional access framework? How would such determinations be made? How would timely approvals be given for the millions of new products with communications capabilities? And how would this new surveillance ecosystem be funded and supervised? The US and UK governments have fought long and hard to keep the governance of the Internet open, in the face of demands from authoritarian countries that it be brought under state control. Does not the push for exceptional access represent a breathtaking policy reversal?

I am certain that the computer scientists are correct that foreign governments will move in this direction, but I think they are misreading the consequences of this. China and Britain will do this irrespective of what the United States does, and that fact may well create a potential opportunity for the U.S. After all, if China and Britain are going to force U.S. companies to think through the problem of how to provide extraordinary access without compromising general security, perhaps the need to do business in those countries will provide much of the incentive to think through the hard problems of how to do it. Perhaps countries far less solicitous than ours of the plight of technology companies or the privacy interests of their users will force the research that Comey can only hypothesize. Will Apple then take the view that it can offer phones to users in China that can be decrypted for Chinese authorities when they require it, but that it's technically impossible to do the same in the United States?

There's a final, non-legal factor that may push companies to work this problem as energetically as they are now moving toward end-to-end encryption: politics. We are at a very particular moment in the cryptography debate, a moment in which law enforcement sees a major problem as having arrived but the tech companies see that problem as part of the solution to the problems the Snowden revelations created for them. That is, we have an end-to-end encryption issue, in significant part, because companies are trying to assure customers worldwide that they have their backs privacy-wise and are not simply tools of NSA. I think those politics are likely to change. If Comey is right and we start seeing law enforcement and intelligence agencies go blind in investigating and preventing horrible crimes and significant threats, the pressure on the companies is going to shift. And it may shift fast and hard. Whereas the companies now feel intense pressure to assure customers that their data is safe from NSA, the kidnapped kid with the encrypted iPhone is going to generate a very different sort of political response. In extraordinary circumstances, extraordinary access may well seem reasonable. And people will wonder why it doesn't exist.

Which of these approaches is the right way to go? I would pursue several of them simultaneously. At least for now, I would hold off on any kind of regulatory mandate, there being just too much doubt at this stage concerning what's doable. I would, however, take a hard look at the role that civil liability might play. I think the government, if it's serious about creating an extraordinary access scheme, needs to generate some public research establishing proof of concept. We should watch very carefully how the companies respond to the mandates they will receive from governments that will approach this problem in a less nuanced fashion than ours will. And Comey should keep up the political pressure. The combination of these forces may well produce a more workable approach to the problem than anyone can currently envision.


Benjamin Wittes is editor in chief of Lawfare and a Senior Fellow in Governance Studies at the Brookings Institution. He is the author of several books.
