
How Congress Can De-Escalate the Second Crypto War: Fund Research and Broker a Crypto Armistice

Alan Z. Rozenshtein, Mayank Varia, Charles Wright
Tuesday, June 5, 2018, 9:00 AM


There’s been a flurry of encryption news over the past few months. In February, the National Academies released a report that discussed early-stage research into the design of secure cryptographic systems that would nevertheless allow the government access under certain circumstances (what we’ll call third-party-access systems). Steven Levy, who wrote the definitive history of the first crypto war, published a long Wired article providing more detail on one such third-party-access system currently being developed by Ray Ozzie, former chief technical officer at Microsoft. The security-research community has reacted negatively, pointing out flaws in Ozzie’s proposal and generally downplaying the significance of the research described in the National Academies report, emphasizing, as Susan Landau did in April on Lawfare, that the research is preliminary and “more akin to sketches than a system architecture.” Speaking through a trade group, the major technology companies also criticized “new proposals to engineer vulnerabilities into devices and services” as a way to provide law-enforcement access to encrypted data.

Meanwhile, there are signs that multiple branches of the government are re-seizing themselves of the encryption issue. The Justice Department and the FBI are reaching out to security researchers who are working on designing third-party access systems, several members of the Senate Judiciary Committee are reportedly working on legislation that would regulate encryption in light of law enforcement’s need for access to encrypted data, and, on the other side of the debate, a bipartisan group in the House of Representatives has proposed a sweeping bill that would ban the government from forcing any company to redesign its systems to facilitate surveillance (more on that below).

Against this backdrop, it’s worth considering what useful, short-term steps the government, and in particular Congress, could take to advance the important and difficult debate over law-enforcement access to encrypted data. (Although our proposals could be at least partially carried out by the executive branch, we, like many other commentators on this issue, think Congress has a special role to play, since it has ultimate control over the government’s spending and regulatory authority. Thus, for example, Landau has previously argued that Congress should increase funding to improve law enforcement’s technical capabilities.)

Our proposals are organized around what we see as the two main impediments to a healthy policy debate on the issue: (1) a lack of precise specifications as to the problem to be solved and how to solve it, and (2) the toxic relationship between the technology community and the government’s law enforcement and foreign intelligence agencies.

How Congress can help generate knowledge

At the core of the encryption debate lie several factual questions: To what extent does encryption stymie government investigations? What level of access does the government want technology companies to provide, and across what platforms and systems? Most importantly, to what extent would providing such access necessarily degrade information security? These are, of course, not the only relevant questions, and some important additional questions are about values, not facts: Should we make it easier for the government to engage in surveillance? Assuming that government access will necessarily degrade information security by some amount, how much is too much? But we can’t make meaningful progress on these questions until we get our facts straight.

Congress could do much to help generate the answers we need. It could use a combination of carrots—increased funding—and sticks—legislative mandates—to require federal, state, and local agencies to keep detailed statistics on situations in which encryption impeded government investigations. In the wake of reports that the FBI seriously overestimated the number of encrypted devices it could not access, legislative mandates for more accurate reporting are in order. Congress could also require agencies to keep data on how the government responded to encryption (for example, by dropping the investigation, using different investigative techniques, or defeating the encryption, whether through its own lawful hacking or the purchase of third-party tools), and it could require agencies to specify and prioritize what capabilities they need.

Congress could also directly fund research into whether secure third-party-access systems are possible. Perhaps stung by the failure of the government-developed Clipper Chip during the 1990s, the government has studiously avoided putting forward any concrete proposals of its own, instead arguing that only the technology community is capable of the relevant research. FBI Director Christopher Wray captured this hands-off approach by citing Silicon Valley’s record of innovation as evidence that it is up to the challenge of building secure third-party-access systems:

We have the brightest minds doing and creating fantastic things. If we can develop driverless cars ... ; if we can establish entire computer-generated virtual worlds ... , surely we should be able to design devices that both provide data security and permit lawful access with a court order.

But this sells the government’s role short. From the beginning of the internet—which, after all, started as a Defense Department project—to the present day, the government has played a key role in technological innovation, and there’s no reason why this situation should be any different. Indeed, to use Wray’s own analogy, the government-sponsored DARPA Grand Challenges during the 2000s stimulated much of the foundational work in autonomous vehicles.

There are several avenues by which Congress could appropriate funds to generate knowledge about access to encrypted data. Basic science research funding agencies could administer new challenges in this space in order to develop viable options and to understand fundamental limitations. Exchange programs could send technologists from the private sector and the academy into the government to help think through the technological and policy issues around third-party access. Eventually, cryptographic standards organizations such as the National Institute of Standards and Technology (NIST, part of the Commerce Department) could hold a competition to evaluate the security of third-party-access systems in the same vein as their broadly supported Advanced Encryption Standard and Cryptographic Hash Algorithm competitions.

None of this will result in viable third-party-access systems overnight. It’s important to recognize that, whether with driverless cars or realistic virtual reality (Director Wray’s examples of Silicon Valley innovation), it took years—sometimes decades—to develop the technologies, which even now remain works in progress. Secure third-party-access systems may similarly be years away from viability. That doesn’t mean we shouldn’t invest in the necessary research and development, just that we should have a realistic timetable in mind.

Some would argue that additional research isn’t worth the money, because secure third-party access was exhaustively analyzed and found to be impossible back in the 1990s, during the debates over the Clipper Chip. Many have criticized Ozzie’s proposal on such grounds, arguing that it is merely a warmed-over version of key escrow, the approach that was rejected during the first crypto war.

Ozzie’s proposal may yet prove to have significant advantages over earlier attempts at key escrow. But even if it doesn’t, not all approaches rely on key escrow (and the implementation challenges that it would entail, which are often the greatest vulnerability of any third-party-access system). Two of us have recently published a paper outlining a proof-of-work-based approach that would place most of the burden involved in recovering message contents on the government, rather than on providers. The scheme provides several desirable features that key escrow does not: As a technical matter, it restricts use to targeted (not mass) surveillance, it requires no communication from users or providers to the government, and it is compatible with existing cryptographic best practices such as forward secrecy and the use of hardware modules. This work is only one step in a long research path, and we welcome future enhancements that add even more desirable technological features and that pair the technology with transparent legal constraints and oversight processes.
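
To make the intuition behind such proof-of-work approaches concrete, the sketch below (in Python) shows how a computational puzzle can place the recovery cost on whoever wants to read a message. This is an illustration only, not the construction from our paper: the work factor, the toy stream cipher, and the plaintext check are hypothetical stand-ins, and a real design would layer the puzzle on top of ordinary end-to-end encryption rather than replace it. The key idea is that each message key retains only a small amount of unknown entropy, so recovering any single message requires a substantial, per-message brute-force computation, and there is no stored key that anyone could simply hand over.

```python
# Illustrative sketch only -- not the construction from our paper.
# Each message key has only PUZZLE_BITS of unknown entropy, so recovering a
# single message costs roughly 2**PUZZLE_BITS hash evaluations: tolerable for
# a targeted investigation, prohibitive at mass-surveillance scale.

import os
import hashlib
from typing import Tuple

# Toy difficulty for demonstration; a deployable design would use a work
# factor large enough that each recovery takes serious time and money.
PUZZLE_BITS = 18


def _keystream(key: bytes, length: int) -> bytes:
    """Expand a key into a keystream by hashing a counter (demo-only cipher)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]


def _xor(data: bytes, keystream: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(data, keystream))


def encrypt(plaintext: bytes) -> Tuple[bytes, bytes]:
    """Encrypt under a 'weakened' key: the salt is public, the puzzle secret is not.

    No key is escrowed anywhere; the key simply has only PUZZLE_BITS of
    unknown entropy, which is what makes later recovery possible but costly.
    """
    salt = os.urandom(16)
    puzzle_secret = int.from_bytes(os.urandom(4), "big") % (1 << PUZZLE_BITS)
    key = hashlib.sha256(salt + puzzle_secret.to_bytes(4, "big")).digest()
    return salt, _xor(plaintext, _keystream(key, len(plaintext)))


def recover(salt: bytes, ciphertext: bytes) -> bytes:
    """Brute-force the puzzle secret: roughly 2**PUZZLE_BITS hashes per message."""
    for guess in range(1 << PUZZLE_BITS):
        key = hashlib.sha256(salt + guess.to_bytes(4, "big")).digest()
        candidate = _xor(ciphertext, _keystream(key, len(ciphertext)))
        if candidate.startswith(b"MSG:"):  # a real scheme would use an authenticated check
            return candidate
    raise ValueError("recovery failed")


if __name__ == "__main__":
    salt, ct = encrypt(b"MSG: hello, world")
    # Costly, per-message recovery that needs no help from the user or the provider.
    print(recover(salt, ct).decode())
```

Because every message demands its own expensive recovery, and because neither users nor providers transmit anything to the government in advance, the cost structure of even this toy version favors targeted access over mass decryption; a deployable system would set the work factor high enough to make indiscriminate use economically prohibitive.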

The need for Congress to focus on generating knowledge and supporting researchers carries with it an important corollary: It would be dangerously premature for Congress to limit encryption, as the ill-fated “Compliance with Court Order Act of 2016” (also known as Burr-Feinstein, after its sponsors, Sens. Richard Burr and Dianne Feinstein) would have done. Such bans—whether in the form of stand-alone bills like Burr-Feinstein or updates to the Communications Assistance for Law Enforcement Act (CALEA) that would apply the law to companies like Apple and remove the current exemption for end-to-end encryption—are politically unrealistic. They are also, from a policy perspective, irresponsible so long as neither the technology industry, the security-research community, nor the government has a workable solution that could be securely designed and implemented at scale, as is currently the case.

A crypto armistice could help bridge the divide between the government and the technology community

Anyone who has followed the debate over encryption will have noticed that, more than with any other tech-policy issue, trust between the various sides is fundamentally broken. Each side frequently believes the other is acting in bad faith. Many in the technology community believe the government is overstating the extent to which encryption is actually a problem for investigations and willfully ignoring the security risks of third-party access. (See, for example, the FBI miscount discussed above.) On the other side, many in law enforcement believe that technology companies are hiding behind inflated claims about the impossibility of secure third-party access to advance their own ideological and business interests. Both sides have a point, but the bigger problem is that the poisonous relationship between the technology community and the government has made cooperation far more difficult than it should be, and too difficult for long-term solutions to emerge. So the question is: What can Congress do to improve the relationship?

The support for research we advocate above would certainly help. If security researchers try to develop secure third-party-access systems—even if they ultimately decide that such systems are infeasible—they are likely to develop an appreciation for the government’s legitimate need for some degree of access to encrypted data. Conversely, if law enforcement feels that the technology community is trying in good faith to solve the problem, it will be more inclined to believe researchers when they say that a particular solution won’t work. And to the extent that government-funded research brings individuals from the technology community and the government to work together—whether through research grants, innovation challenges, or exchange programs—those personal relationships will help break down the us-versus-them group identities that make cooperation that much more difficult.

But no amount of collaboration will do the trick if the technology community believes that the government is poised to force insecure or unvetted “backdoors” into encrypted products and services. The best example of how attempts to force a resolution can backfire is the government’s 2016 effort to use a court order to compel Apple to modify the operating system of the iPhone of one of the San Bernardino terrorists. Not only did the government fail to bring Apple to heel, but it put the technology community on the defensive. More generally, the litigation illustrated why complex technological and policy decisions should not be made in the courthouse: The adversarial nature of litigation encourages each side to take maximalist positions and to demonize the other rhetorically, rather than to work together. Ultimately, the government will be better served by recognizing that only when there is at least a partial consensus among technologists that secure third-party access is possible will any sort of top-down mandate be feasible. Until then, the government may be best off supporting an armistice that takes design mandates off the table.

What would a legislative armistice look like in practice? A recently introduced bipartisan bill in the House of Representatives provides one model. The “Secure Data Act of 2018” would prohibit any regulations or court orders (other than those already permitted under CALEA) that would require a company “to design or alter the security functions in its product or service to allow the surveillance of any user of such product or service, or to allow the physical search of such product” by the government. By removing the specter of design mandates, the bill might actually make it more likely that research on secure third-party access will go forward (whether or not that is the intention of the bill’s sponsors).

To be clear, we are not supporting the bill in its current form. It unnecessarily goes far beyond encryption, prohibiting design modifications that may pose no security threat. It is permanent, when a time-limited bill (for example, one with a sunset provision after some number of years) might better ensure that the issue remains on Congress’s radar. And it misses an opportunity to fund and otherwise support the research that we’ve advocated above. But it is nevertheless a useful thought experiment as to how a pause in hostilities between the government and the technology sector may be in everybody’s interest—and in particular why the government may want to support such an armistice.


Alan Z. Rozenshtein is an Associate Professor of Law at the University of Minnesota Law School, a senior editor at Lawfare, and a term member of the Council on Foreign Relations. Previously, he served as an Attorney Advisor with the Office of Law and Policy in the National Security Division of the U.S. Department of Justice and a Special Assistant United States Attorney in the U.S. Attorney's Office for the District of Maryland.
Mayank Varia is a research associate professor of computer science at Boston University and the co-director of BU's Center for Reliable Information Systems & Cyber Security. He holds a bachelor's degree from Duke University and a PhD from MIT.
Charles Wright is an assistant professor in the Computer Science Department at Portland State University, specializing in security and privacy.
