
The East West Institute's New Report on Encryption: A Review

Herb Lin
Friday, February 23, 2018, 7:00 AM

On Feb. 15, 2018, the East West Institute (EWI) released its report “Encryption Policy in Democratic Regimes: Finding Convergent Paths and Balanced Solutions,” which the group presented as offering “normative recommendations on encryption policy to ensure strong cybersecurity while enabling lawful law enforcement access to the plain text of encrypted information in limited circumstances.” In addition, the report offered two different policy regimes that would “enable legally authorized law enforcement access to the plain text, or unencrypted form, of encrypted data in limited cases,” under which “access would occur within a clear legal framework embedded with human rights safeguards, while mitigating the risk that third parties could gain unauthorized access to encrypted data and communications.”

There is much to commend about the report. It provides a clear explanation of many issues that are relevant—indeed, central—to the debate over encryption policy. It explicitly articulates tradeoffs. It describes a process to help formulate policy regimes based on three factors: techniques that law enforcement might use to obtain plain text; limitations that might be placed on these techniques; and specifics of the particular information technology environment in which these techniques might be applied. It cautions that the benefits and costs of inaction need to be part of any analysis. And perhaps most importantly, the EWI uses this framework to propose two policy regimes that support a clearly stated and EWI-preferred policy outcome—enabling legally authorized law enforcement access to the plain text of encrypted data. One regime involves compelled provider assistance and lawful hacking to gain access, and another involves compelled provider assistance and design mandates.

  • A policy involving compelled provider assistance requires IT service providers or product vendors to help law enforcement authorities decrypt information stored in or passing through their products, services or devices. The scope and nature of the assistance that may be compelled may be clear or unclear, depending on the specific rules that accompany the policy.
  • A policy involving lawful hacking provides law enforcement authorities with the capability of exploiting vulnerabilities in systems and devices, whether remote or local, or using social engineering to circumvent security protections.
  • A policy involving design mandates requires service providers and device manufacturers to design, build and deploy only products and services with the capability to accommodate future lawful access requests.

These aspects of the report led me to believe that it should be one of the required readings for anyone engaged in the debate over encryption policy.

Yet the report left a number of important things unsaid.

To understand the policy stance of the report, it is helpful to distinguish between “hard” measures and “soft” measures. “Hard” measures are actions that impose legally enforceable requirements for certain visible and operational behaviors by parties with various equities in the debate—law enforcement, consumers, vendors, and so on. And a key element of enforcement is the ability to impose penalties for noncompliance—such as fines or jail terms. In contrast, “soft” measures are actions that require equity-holding parties to adhere to various processes but do not mandate any particular outcome.

Most of the hard measures described in the report’s proposed policy regimes are directed towards vendors and technologists. Both regimes provide for compelled provider assistance—and compellence by law enforcement necessarily entails hard measures directed toward the parties being compelled. Design mandates make even further use of hard measures regarding the design of products and services.

But with few exceptions, the report proposes only soft measures for law enforcement. For example, it states that:

  • Governments should use compelled provider assistance as a fundamental approach to facilitate law enforcement access, but only with clear rules as to where and to what extent compelled provider assistance is applicable under the legal framework. Requests for compelled provider assistance must be targeted and limited to a particular case. Compelled assistance should be the preferred technique to facilitate lawful access to third-party encryption products, services and ephemeral communications (i.e., communications that vanish after the intended receiver has read the contents).
  • Governments must recognize lawful hacking as a tool for use only in extraordinary circumstances, particularly when used for remote or extraterritorial applications. Lawful hacking must be embedded in a strict legal framework with limitations on its use to the most serious cases (i.e., testing the application against the principles of proportionality, necessity and legality; assessing international and human rights implications), and be subject to comprehensive vulnerability management, independent judicial authorization and oversight, and public summary reporting to the legislature. Effective state-of-the-art safeguards to prevent loss or theft of lawful hacking tools and the vulnerabilities they utilize must be deployed.
  • Design mandates that require service providers and device manufacturers to retain capabilities to produce decrypted data must be limited to designated services and scope. Design mandates should be imposed through a public regulatory process and be subject to annual recertification and assessment of their implications on cybersecurity and human rights.

Astute readers will notice that law enforcement authorities are already supposed to be doing everything described above. One will search for a very long time before finding a law enforcement official stating for the record that his or her agency does not follow measures designed to find a proper balance between public safety and privacy or civil liberties.

In other words, the report presumes that processes to ensure a proper balance between public safety and privacy or civil liberties will not be abused, and, further, that processes to ensure that only properly authorized law enforcement officials gain access to plain text cannot be hacked or circumvented. In doing so, the report assumes away some of the most important concerns of those in privacy and technology communities. Its main recommendation to them is essentially: Trust the process and how it will operate in practice.

But this does not address the concerns of privacy advocates and many technologists, who begin from a position of skepticism about the government’s willingness to actually follow the processes it claims to be adhering to. These parties are concerned about possible malice, incompetence, indifference, and inertia on the part of key government officials. Furthermore, to the extent that trust depends on faith that government agencies will behave in practice in accordance with the letter and the spirit of the law, the attitudes and posture of the administration in office at any given time might well influence the degree of that trust.

Consider, for example, the report’s description of the above-quoted need for “clear rules as to where and to what extent compelled provider assistance is applicable under the legal framework.” The report describes the value of clear rules, but it does not propose any particular rule, clear or not. Certainly, everyone would agree that clear rules are necessary—but privacy advocates and law enforcement officials will have very different views of how narrowly or broadly drawn such rules should be. Nor would they ever agree on the definition of what counts as “reasonable” assistance. These vast differences in outlook and perspective are never mentioned in the report, and yet they underlie the entire debate over compelled provider assistance.

In the absence of trust, hard measures directed at law enforcement may offer a better way to address concerns raised by privacy advocates and technologists. One hard measure, for example, could include provisions to provide redress for citizens and organizations that may have been victimized by an improper compromise of the process to enable or authorize exceptional access—meaning the ability to gain access to unencrypted information under exceptional circumstances, such as the granting of a warrant. Perhaps such citizens and organizations could be given the right to sue the government officials responsible for decisions that led to the compromise. Perhaps the government officials could be made personally liable for monetary damages that could be awarded as the result of such lawsuits. Perhaps public advocacy organizations should have the right to conduct discovery on the operation of processes that enable exceptional access.

Another hard measure could limit the number of times that law enforcement could request exceptional access. The report mentions some possible limitations—only for certain kinds of crimes, only for certain kinds of devices or services, only after other means for gaining plain text have been exhausted, only when the request is narrowly tailored, only within a certain time window, only with procedures to discard extraneous data in place, and only within certain territorial boundaries. Of these, only the first two could be plausibly characterized as hard. Other cases hinge on soft definitional issues—what it means to “narrowly tailor” a request or to “exhaust” other methods, what constitutes “extraneous” data, and so on.

But why not implement hard limits on the number of times that exceptional access can be granted in a year (e.g., access would be allowable only if fewer than X requests have occurred in the last year)? Or why not charge law enforcement agencies a large fee, drawn from their discretionary budgets, for every request for exceptional access, so that agencies have to make choices about when to use the tool? Agencies place high value on their discretionary budgets, and paying exceptional-access fees out of them would force hard tradeoffs. The report itself nods in this direction when it proposes that law enforcement authorities reimburse providers for the cost of compelled assistance. Such measures would help to ensure that exceptional access would be granted only for important cases.
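To make the idea concrete, here is a minimal sketch in Python of what such a quota-and-fee gate might look like. It is purely illustrative: the report proposes no such mechanism, and the cap, the fee, and every name in the code (ExceptionalAccessGate, MAX_REQUESTS_PER_YEAR, FEE_PER_REQUEST) are hypothetical assumptions of mine, not anything drawn from the report or from existing law.

```python
from collections import deque
from datetime import datetime, timedelta

# Hypothetical values, invented purely for illustration.
MAX_REQUESTS_PER_YEAR = 500
FEE_PER_REQUEST = 250_000  # dollars, charged to the agency's discretionary budget


class ExceptionalAccessGate:
    """Enforces a hard annual cap and a per-request fee on exceptional access."""

    def __init__(self, discretionary_budget: int):
        self.discretionary_budget = discretionary_budget
        self.request_log: deque = deque()  # timestamps of previously granted requests

    def request_access(self, now: datetime) -> bool:
        # Make the cap a rolling one-year window by discarding old entries.
        cutoff = now - timedelta(days=365)
        while self.request_log and self.request_log[0] < cutoff:
            self.request_log.popleft()

        # Hard limit: refuse outright once the annual cap is reached.
        if len(self.request_log) >= MAX_REQUESTS_PER_YEAR:
            return False

        # Fee: each grant draws down the discretionary budget, forcing tradeoffs.
        if self.discretionary_budget < FEE_PER_REQUEST:
            return False
        self.discretionary_budget -= FEE_PER_REQUEST

        self.request_log.append(now)
        return True
```

The point of the sketch is that both constraints are mechanical and externally auditable; neither depends on anyone’s judgment about whether a request was “narrowly tailored.”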

Law enforcement authorities would argue forcefully that none of these hard measures is reasonable or feasible. Indeed, I can imagine difficulties with each of them myself, and I would not necessarily advocate any of the specific proposals described above. But the privacy community regards any concession facilitating exceptional access under any circumstances as unreasonable and infeasible for the purpose of protecting privacy—so without significant attention to hard measures that limit possible damage to privacy equities, the report becomes a roadmap for a one-sided outcome that favors law enforcement equities. Compromises require both sides to give up something. The report should have proposed a richer suite of hard measures directed towards protecting privacy equities and noted that even with such measures in place, law enforcement authorities would still be better off than they would be without any provisions at all for exceptional access.

Second, the report repeatedly refers to the desirability of “strong encryption” without ever recognizing that the term has two meanings in the debate over encryption policy. For law enforcement authorities, the term “strong encryption” only refers to techniques to create ciphertext from plain text such that the plain text cannot be recovered without the concurrence of the intended recipient of the ciphertext under all but exceptional circumstances. For the privacy community, the term “strong encryption” refers to techniques to create ciphertext from plain text such that the plain text cannot be recovered without the concurrence of the intended recipient of the ciphertext, who would normally use a decryption key that he or she, and he or she alone, controls.

In other words, the privacy community defines strong encryption as encryption with zero likelihood that the ciphertext it produces can be decrypted by anyone other than the intended recipient. The law enforcement community defines strong encryption as encryption that will allow plain text to be recovered 100 percent of the time by law enforcement authorities under exceptional circumstances, but zero percent of the time by anyone outside of those circumstances.
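The difference between these two definitions can be made concrete in a few lines of code. What follows is a minimal sketch, assuming the PyNaCl library; the “escrow” key and the dual-ciphertext arrangement are hypothetical illustrations of mine, not a mechanism the report (or any actual proposal) specifies.

```python
# A minimal sketch using PyNaCl (pip install pynacl). The escrow key below is
# a hypothetical stand-in for whatever mechanism a design mandate would use to
# give law enforcement exceptional access; no real proposal is modeled here.
from nacl.public import PrivateKey, SealedBox

recipient_key = PrivateKey.generate()
escrow_key = PrivateKey.generate()  # held by or for law enforcement

message = b"meet at noon"

# "Strong encryption" in the privacy community's sense: the ciphertext can be
# opened only with the recipient's private key, which the recipient alone holds.
e2e_ciphertext = SealedBox(recipient_key.public_key).encrypt(message)

# "Strong encryption" in the law enforcement sense: the same plaintext is also
# wrapped to an escrow key, so a second ciphertext and a second key now exist.
escrow_ciphertext = SealedBox(escrow_key.public_key).encrypt(message)

# The intended recipient decrypts exactly as before ...
assert SealedBox(recipient_key).decrypt(e2e_ciphertext) == message

# ... but anyone who obtains the escrow private key can also recover the
# plaintext. That second key, and everything protecting it, is attack surface
# that the first scheme simply does not have.
assert SealedBox(escrow_key).decrypt(escrow_ciphertext) == message
```

Nothing in the second scheme requires the escrow key to be misused; the objection is structural, in that the second key exists at all and must be protected indefinitely.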

These definitions are inconsistent with each other, but the report implicitly adopts the latter definition without noting that doing so is friendly to law enforcement equities. In a previous Lawfare post, I noted that “the degree of security available with an exceptional access requirement for encryption will be less than without such a requirement.” One can argue about how much less, but no technical person has ever disputed this point. At the same time, to the best of my knowledge, the law enforcement community has never acknowledged it either. By adopting the law enforcement community’s preferred definition without explicitly mentioning this point, the report, it seems to me, is hiding the ball: It never mentions that “strong encryption” as defined by law enforcement provides less security than “strong encryption” as defined by the privacy community. Making that tradeoff could be a reasonable policy choice, but hiding the tradeoff in the first place is not entirely fair.

Notably, the report does acknowledge the existence of multiple stakeholder groups concerned with cybersecurity, law enforcement and public safety, commerce, and privacy and human rights. As the report points out, all these groups have interests in the outcome of any debate over encryption. Nevertheless, to date, the terms of the debate have been set by privacy advocates and many technologists fearful of improper law enforcement access to encrypted information on one side, and, on the other, law enforcement authorities fearful that encryption will impede their access to information they need to prevent, investigate, and prosecute criminal activity. Even though law enforcement has an interest in using encryption to reduce cyber-enabled crime (a cybersecurity issue), and even though the protection of human rights and civil liberties often relies on strong law enforcement to enforce those rights, the largely incommensurate interests of these two communities have taken center stage. For this reason, I have accepted for purposes of this post the report’s somewhat oversimplified (but nevertheless true-at-its-core) framing of the debate.

I predict that the report will receive a relatively warm embrace from law enforcement authorities and a relatively cool response from privacy and civil liberties advocates as well as technologists. If so, it will fall short of the oft-mentioned standard that a balanced report is one that is equally disliked by both sides but for different reasons. Readers should review this important document for themselves and see if they agree with me on this prediction.


Dr. Herb Lin is senior research scholar for cyber policy and security at the Center for International Security and Cooperation and Hank J. Holland Fellow in Cyber Policy and Security at the Hoover Institution, both at Stanford University. His research interests relate broadly to policy-related dimensions of cybersecurity and cyberspace, and he is particularly interested in and knowledgeable about the use of offensive operations in cyberspace, especially as instruments of national policy. In addition to his positions at Stanford University, he is Chief Scientist, Emeritus for the Computer Science and Telecommunications Board, National Research Council (NRC) of the National Academies, where he served from 1990 through 2014 as study director of major projects on public policy and information technology, and Adjunct Senior Research Scholar and Senior Fellow in Cybersecurity (not in residence) at the Saltzman Institute for War and Peace Studies in the School for International and Public Affairs at Columbia University. Prior to his NRC service, he was a professional staff member and staff scientist for the House Armed Services Committee (1986-1990), where his portfolio included defense policy and arms control issues. He received his doctorate in physics from MIT.
