
What if Responsible Encryption Back-Doors Were Possible?

Josh Benaloh
Thursday, November 29, 2018, 3:36 PM


This is part of a series of essays from the Crypto 2018 Workshop on Encryption and Surveillance.

One of the fundamental constitutional precepts that the U.S. Supreme Court has recognized is the presumption of privacy. This presumption is manifested as limits on government intrusion into the private lives of American citizens. But these limits are not an absolute in American jurisprudence, nor are they present in all democracies. For instance, my conversation in a public place may be overheard, but there is nothing to stop me from taking actions and employing tools to enhance the privacy of my effects and communications. Absent extraordinary circumstances, I have a right to hide my artifacts and conceal my conversations, and I may also engage the assistance of a third party as an agent in doing so. Manufacturers of curtains and blinds may sell their products without building in features that make them transparent to law enforcement authorities; safes may be sold without retaining keys or combinations to provide exceptional access against the will of the purchaser; and encryption products may be sold that protect the privacy of data without restriction.

Under exceptional circumstances and with appropriate judicial review, law enforcement may be permitted to attempt to violate my privacy. But a search warrant is so-named because it grants a right to search—not a guarantee to find. Law enforcement authorities may also request and even compel my agent to provide information on any assistance rendered to me. But there is no prior restriction on the advice or tools that my agent may offer.

Let us now posit the existence of a responsible exceptional access technology, one that secures and protects the privacy of data with encryption, but also provides law enforcement authorities with access to that data. “Responsible” here describes a technology that achieves the desired effect of providing designated authorities with controlled access to data without creating undue risks of data being released to unauthorized parties. It should be noted that data breaches are all too frequent today and that complexity is regarded as the enemy of security. Thus, despite the dearth of proposals to provide responsible access and the expert analyses that enumerate reasons why it is likely unattainable, let us assume that such technology is possible. The next step is to consider the consequences of mandating its use. Even if we could build it, the question remains of whether we should build it.

In the current landscape, the security interests of technology vendors and their customers are generally aligned. Vendors act as their customers’ advocates. The relationship is, of course, imperfect. There are cases where vendors fail to adequately protect their customers and suffer consequences in the marketplace. Just as an attorney who provides poor counsel may not fare well, vendors who are careless with their customers’ data may not survive. Vendors have incentives to secure their customers’ data, and customers have incentives to purchase products and services from vendors who protect them well. Prices are certainly a consideration, and customers will not always pay a premium for better security, but all other things being equal, a rational consumer will select a vendor that provides better security.

Privacy and security are partners, but they are not interchangeable. An agent who is incented to protect my security may also have incentives to violate my privacy. However, when I seek to engage an agent to maintain the confidentiality of my data, an agent who does so steadfastly will be more valuable to me than one who protects my confidentiality only with caveats and conditions.

Introducing exceptional access technology alters the marketplace by increasing costs and reducing protections. It transforms the vendor from its role as an unqualified advocate to that of an equivocal actor who may or may not betray the confidence of its customers. The trust relationship is compromised, and vendors are prevented from serving as unambiguous and full-throated advocates of their customers and their interests.

If customers can choose between vendors offering products that are otherwise comparable, those that include provisions for law enforcement access will be at a competitive disadvantage. To be effective, therefore, all comparable products within a market (e.g., all mobile phones purchased or used within the U.S.) must be required to incorporate the technology.

A government could ban the sale of curtains and window shades and instead insist that those who want to block the view must purchase windows which can be made opaque electronically—with the stipulation that exceptional access features allow for the opacity to be overridden remotely. This is not impossible, but it would add significant costs, create a risk of windows becoming transparent at inopportune times (either due to malfunction or malicious attack), and establish a booming market for fabric stores to sell other materials that happen to be sized to perfectly fit windows.

The analogy to encryption is not far afield. The greatest difference may be that encryption technologies are virtual and are therefore easier to reproduce and transport. Ciphers that are beyond the ability of governments to break are described in detail in millions of textbooks that have been used to teach untold numbers of students around the globe.

The point here is that a customer who wants privacy can still utilize a device in which a law enforcement access technology has been embedded. A customer need only pre-encrypt sensitive data before using the device. The device can then be used precisely as intended, and a second layer of encryption will be applied. If a lawful exceptional access process is carried out, only the second encryption layer will be removed, revealing not the clear data but instead the pre-encrypted data produced by the customer.
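To make the layering concrete, here is a minimal sketch in Python of what such pre-encryption looks like. The scenario, the variable names, and the choice of the third-party cryptography package (its Fernet cipher) are illustrative assumptions rather than anything specified in this essay; any strong cipher under the customer's sole control would serve the same purpose.

```python
# Sketch: a customer pre-encrypts data before a device adds its own
# (escrowed) encryption layer. Exceptional access to the device layer
# reveals only the customer's ciphertext, not the plaintext.
# Requires the third-party package: pip install cryptography
from cryptography.fernet import Fernet

# Layer 1: the customer's own key, never shared with the vendor.
customer_key = Fernet.generate_key()
customer_cipher = Fernet(customer_key)

# Layer 2: the device/vendor key, assumed reachable via exceptional access.
device_key = Fernet.generate_key()
device_cipher = Fernet(device_key)

plaintext = b"sensitive customer data"

pre_encrypted = customer_cipher.encrypt(plaintext)        # done by the customer
stored_on_device = device_cipher.encrypt(pre_encrypted)   # done by the device

# A lawful exceptional access process strips only the device layer...
recovered_by_warrant = device_cipher.decrypt(stored_on_device)

# ...and yields the customer's ciphertext rather than the clear data.
assert recovered_by_warrant == pre_encrypted
assert recovered_by_warrant != plaintext
```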

The interesting question is the extent to which vendors will go to facilitate this alternative, and the likely answer is that many will go as far as legally permitted. Their customers will demand nothing less. Twenty years ago, U.S. regulators used export controls to thwart dissemination of encryption tools. Such tools were classified as munitions, and vendors were required to register as arms dealers to export them. This had a chilling effect on domestic distribution of encryption tools since U.S. vendors did not want to risk the legal jeopardy that might ensue should a single instance of a product be exported—whether inadvertently by vendors themselves or by third parties.

Americans could freely import and use products that included strong encryption, and U.S. vendors could not effectively compete with these imports. This allowed overseas vendors to be better advocates for U.S. customers than domestic vendors. In 2000, the export control regime was largely abandoned due to the harm it caused to U.S. vendors and the negative impact on data security. An exceptional access mandate today would sever the advocacy that vendors currently offer their customers and do substantial harm to both. The impact would be worse than it was in the pre-2000 era, when vendors were simply limited in the kinds of security they were able to offer—not required to provide explicit exceptional access.

As we have seen from numerous accounts, law enforcement authorities already have access today to unencrypted data. Keyloggers and other malware can be surreptitiously placed on the devices of targeted individuals, and tools exist to crack open locked mobile phones. These means of access can be resource intensive, but that is a desirable property. The plea to mandate exceptional access technology is an attempt to remove these resource constraints and enable simple, economical, push-button access. But whether or not they recognize it, what officials are seeking when they call for easier access is a mass-surveillance capability. This may not be their intent, but if it is easy and inexpensive to surveil one individual, then surveilling many is affordable and manageable, and the temptation will be great.

Americans should have an unfettered right to protect their own data, vendors should have the right to provide law-abiding citizens with tools and services to support their rights, and law enforcement authorities should have to expend resources when they are authorized to attempt to circumvent these protections. Make no mistake: Even if it could be built, “responsible” law enforcement access technology is not responsible at all.


Josh Benaloh is Senior Cryptographer at Microsoft Research and an elected director of the International Association for Cryptologic Research. He earned his S.B. degree from the Massachusetts Institute of Technology and M.S., M.Phil., and Ph.D. degrees from Yale University, where his 1987 doctoral dissertation, "Verifiable Secret-Ballot Elections," introduced the use of homomorphic encryption to enable end-to-end verifiable election technologies.
