
A Biometric Approach as a Partial Step Forward in the Encryption Debate

Herb Lin
Thursday, December 3, 2015, 3:22 AM

The contours of the present encryption debate are well known. Especially in the wake of the Paris shootings, law enforcement and national security (LE/NS) officials are worried that terrorist use of encryption will prevent them from gathering information that could help thwart a terrorist event in the making. To forestall this possibility, they are asking major technology companies such as Apple and Google to build into their products the capability of accessing unencrypted information so that such information can be given to government authorities after a valid legal process has been executed—a capability often dubbed “exceptional access.” Moreover, these officials are not arguing for any particular technological approach to building this capability, saying that it should be up to the technology companies to implement this capability in a manner that is tailored to the particular technologies they produce.

On the other side, technologists point out that any conceivable mechanism for exceptional access requires that some third party (i.e., a party other than the user) have access to the relevant decryption keys, and thus leaves open the possibility that the third party will be one that is *not* entitled to have them. Technology vendors further point out that their customers are increasingly protective of their privacy, and many want to maintain sole control over access to their information.

For purposes of this piece, I’m going to offer an architecture that would meet some of the government’s stated needs. This architecture has many flaws, both from a civil liberties perspective and a government perspective, and as stated, may not be practical. But I hope it can help to separate the technical debate from the values debate.

The overall architecture distinguishes between data in flight (communications) and data at rest (files). Consider the requirements for access to each separately.

Exceptional Access to Encrypted Communications

For encrypted communications, the recovery of plaintext is possible in one of three ways: (a) it can be obtained before the sender encrypts it, (b) it can be obtained after the receiver decrypts it (which the receiver must do to understand it), or (c) it can be copied while in transit and then decrypted by another party with access to the decryption key. Approach (c) is the one favored by law enforcement authorities. However, Approaches (a) and (b) are possible if there is software on the sending or receiving device that can copy the information before encryption or after decryption.
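
To make approach (a) concrete, here is a minimal sketch in Python of a hypothetical messaging client in which monitoring software, if present on the device, receives a copy of each outgoing message before encryption. Every name in it is invented, and the cipher is a toy stand-in; nothing here corresponds to any real product.

```python
import hashlib

# Sketch of approach (a): plaintext is copied on the sending device
# before encryption ever happens. Client, hook, and cipher are all
# hypothetical stand-ins.

def toy_encrypt(plaintext: bytes, key: bytes) -> bytes:
    """Toy stand-in for a real cipher: XOR with a hash-derived key stream."""
    stream = hashlib.sha256(key).digest()
    stream = (stream * (len(plaintext) // len(stream) + 1))[:len(plaintext)]
    return bytes(p ^ s for p, s in zip(plaintext, stream))

class MessagingClient:
    def __init__(self, key: bytes, intercept_hook=None):
        self.key = key
        # Monitoring software pushed to the device would register itself
        # here and receive every outgoing message in the clear.
        self.intercept_hook = intercept_hook

    def send(self, plaintext: bytes) -> bytes:
        if self.intercept_hook is not None:
            self.intercept_hook(plaintext)  # copy taken before encryption
        return toy_encrypt(plaintext, self.key)

captured = []
client = MessagingClient(b"shared secret", intercept_hook=captured.append)
client.send(b"meet at noon")
print(captured)  # [b'meet at noon'] -- obtained without weakening any cipher
```

Approach (b) is the mirror image, with the hook placed after decryption on the receiving device. Neither approach touches the strength of the encryption itself; both depend entirely on the ability to place software on an endpoint.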

How can such software be made to appear on the sending or receiving devices? Vendors build devices with the ability to receive over-the-air software updates, and it is in principle possible for particular software to be pushed to specific devices. Thus, if government authorities can technically identify the device that must be monitored, that device can be forced to install software that will copy and forward all communications that are sent or received through that device. Further, such installation can usually be done surreptitiously, just as LE/NS authorities require.

This approach to exceptional access for encrypted communications has much in common with (and is inspired by) an approach laid out in the “Lawful Hacking” paper by Bellovin, Blaze, Clark, and Landau.

Exceptional Access to Encrypted Files

For encrypted files, the recovery of plaintext requires knowledge of the key, which can be obtained in only two ways: (a) from the owner of the file, or (b) from another party or parties with whom the owner has shared the key prior to the point at which the recovery of plaintext is needed. Approach (b) is the approach apparently favored by law enforcement authorities, and the arguments against Approach (b) are well-known (cf., Keys Under Doormats).

Approach (a) is an alternative. If law enforcement authorities have the owner of the file in custody, he or she can be legally compelled to turn over the key. Some commentators suggest that under some circumstances, such an act may violate the file owner’s 5th Amendment rights regarding self-incrimination. Others argue that a judge could order the file owner to turn over the key or be held in contempt of court and thus imprisoned. But if the file owner is willing to accept whatever penalty the judicial system is willing to impose on him or her, no key will be turned over.

Implicit in this analysis is that the key is something like a password—something that the file owner knows. But authentication can also be based on something that the file owner *is*, such as a fingerprint or a full DNA sequence. Consider, then, the idea of a biometric identifier, such as a fingerprint, being used to generate encryption keys.
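
As a first approximation, the idea can be sketched in a few lines of Python. The template format, salt, and function name below are invented for illustration, and the sketch assumes (unrealistically, as a bullet below notes) that the scanner returns byte-identical output on every read.

```python
import hashlib

# Naive sketch: derive a 256-bit key directly from a (hypothetical)
# fingerprint template. Assumes the scanner yields byte-identical
# templates on every read -- an assumption revisited below.

def key_from_biometric(template: bytes, salt: bytes) -> bytes:
    # A slow key-derivation function raises the cost of brute-forcing
    # guessed templates.
    return hashlib.pbkdf2_hmac("sha256", template, salt, 200_000)

template = b"minutiae: (12, 40, 35) (77, 12, 120) (43, 88, 270)"  # invented
key = key_from_biometric(template, salt=b"per-device-salt")
print(key.hex())  # key usable for file encryption, reproducible from the finger
```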

From the standpoint of LE/NS authorities, this technical approach has many benefits, not the least of which is that taking biometrics from suspects is well established as a routine practice that does not raise 5th Amendment issues. Thus, in one common scenario offered by law enforcement and intelligence authorities in which they apprehend a terrorist suspect with an encrypted cell phone in hand, they can take his or her fingerprints as they always do, and those prints can be used to decrypt the phone.

Of course, what LE/NS authorities see as a feature, the privacy community would see as a bug. That is, because fingerprints are not protected by the 5th Amendment, they are inferior to passwords in this context. Further, technically inclined privacy advocates would quite properly raise at least two issues:

  • People leave fingerprints in the environment as they progress through their daily lives, and those wanting to compromise the protection that encryption offers will easily obtain fingerprint samples that can then be used on the encrypted device. Indeed, the prints may well be on the case of the device itself, thus eliminating the value of device encryption in the first place.
  • The generation of a cryptographic key from a fingerprint may be problematic. Biometric measurements are inherently imprecise: a second scan of a fingerprint may not produce a reading identical to the first. It may also be that a fingerprint can provide only the equivalent of a weak cryptographic key. How strong a key can be made from a fingerprint is an open technical question, and one subject to many judgments about operational tradeoffs (such as the hassle factor for the user). One way researchers have proposed to tolerate such imprecision is sketched after this list.
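
On the second point, the research literature on fuzzy extractors suggests one way to tolerate scan-to-scan noise: store non-secret helper data at enrollment that lets any close-enough later reading reproduce the same key. The toy sketch below uses a 5x repetition code, far weaker than real constructions, purely to show the shape of the idea; every value in it is invented.

```python
import secrets

# Toy fuzzy extractor: each key bit is protected by a block of REP
# bits, so it survives up to (REP - 1) // 2 bit flips in that block
# of the biometric reading. Real designs use much stronger codes.

REP = 5

def bits_from_bytes(data: bytes, n: int) -> list[int]:
    return [(data[i // 8] >> (i % 8)) & 1 for i in range(n)]

def enroll(reading: bytes, key_bits: int = 16):
    w = bits_from_bytes(reading, key_bits * REP)
    key = [secrets.randbelow(2) for _ in range(key_bits)]
    # Helper data: repetition-encoded key XORed with the enrollment
    # reading. It may be stored in the clear; it yields the key only
    # to someone who can produce a close-enough reading.
    helper = [key[i // REP] ^ w[i] for i in range(key_bits * REP)]
    return key, helper

def reproduce(reading: bytes, helper: list[int]) -> list[int]:
    w = bits_from_bytes(reading, len(helper))
    noisy = [h ^ b for h, b in zip(helper, w)]
    # Majority vote within each block corrects scan-to-scan noise.
    return [int(sum(noisy[i:i + REP]) > REP // 2)
            for i in range(0, len(noisy), REP)]

scan1 = secrets.token_bytes(10)            # enrollment scan (80 bits)
key, helper = enroll(scan1)
scan2 = bytearray(scan1)
scan2[0] ^= 0b00000001                     # second scan, one bit off
assert reproduce(bytes(scan2), helper) == key  # same key despite the noise
```

The recovered bits would then be fed through a hash to yield the actual encryption key. Whether real fingerprints carry enough entropy to survive this process with a strong key left over is precisely the open question raised above.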

Whether other biometrics are more suitable for the generation of strong cryptographic keys is an open technical question. (It may also be that a strong cryptographic key per se is not necessary. Under the designs used to protect many personal devices today, an 8-character access code is equivalent to a cryptographic key that no one would regard as strong; instead, the code is used to encrypt a much stronger cryptographic key, and few people complain about this “weakness” in current approaches to encrypting smart devices.)
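
That wrap-then-unwrap pattern is easy to show concretely. The sketch below uses only the Python standard library, with an XOR wrap standing in for the hardware-backed AES key wrapping a real device would use; the code, salt, and iteration count are illustrative.

```python
import hashlib
import secrets

# Key-wrapping pattern: a short access code never encrypts the data
# directly; it only unlocks a random 256-bit master key.

def kek_from_code(code: str, salt: bytes) -> bytes:
    # The heavy iteration count substitutes cost for the entropy the
    # short code lacks, slowing brute-force guessing.
    return hashlib.pbkdf2_hmac("sha256", code.encode(), salt, 1_000_000)

salt = secrets.token_bytes(16)
master_key = secrets.token_bytes(32)   # the strong key that encrypts the files

kek = kek_from_code("8charPIN", salt)
wrapped = bytes(a ^ b for a, b in zip(master_key, kek))  # stored on the device

# Unlock: re-derive the key-encryption key from the code and unwrap.
unwrapped = bytes(a ^ b for a, b in zip(wrapped, kek_from_code("8charPIN", salt)))
assert unwrapped == master_key
```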

Note also that this approach does not deal with some of the other scenarios posited by LE/NS authorities (e.g., when they have recovered an encrypted device but do not have access to fingerprints of its owner or when they want surreptitious access to the contents of an encrypted device). This approach also does not prevent an *application* from performing encryption that denies exceptional access. Users wishing to deny exceptional access could obtain such apps from a variety of sources, though of course many would simply fail to do so. Prohibiting users from obtaining such applications would be highly problematic in a globally connected world.

Discussion

By construction, this approach obviates all technical objections that are based on the idea of giving government authorities access to decryption keys. In this approach, exceptional access is based on accessing data in the clear (in the case of communications) and on the legally compelled production of a decryption key by the owner of an encrypted device (in the case of files).

On the other hand, this approach introduces other new technical objections, some of which may be insurmountable. It posits that strong encryption keys can be derived uniquely and reliably from biometrics that are inherently analog in nature, that biometric scanners can be added to devices relatively inexpensively, and that these biometrics will be difficult to access without proper legal foundation. Since these propositions are technical in nature, research may help to shed light on their validity.

I freely acknowledge that the costs of this approach may be too high given that it only addresses a portion of the LE/NS problem with encryption, and it also doesn’t accept as a given that LE/NS should not have exceptional access. These are both policy judgments about which this post is silent. But I hope that this approach at least suggests that the current entrenched positions described at the beginning of this post are not the only possible ways of thinking about the issue. To me, that realization would be progress in the encryption debate.


Dr. Herb Lin is senior research scholar for cyber policy and security at the Center for International Security and Cooperation and Hank J. Holland Fellow in Cyber Policy and Security at the Hoover Institution, both at Stanford University. His research interests relate broadly to policy-related dimensions of cybersecurity and cyberspace, and he is particularly interested in and knowledgeable about the use of offensive operations in cyberspace, especially as instruments of national policy. In addition to his positions at Stanford University, he is Chief Scientist, Emeritus for the Computer Science and Telecommunications Board, National Research Council (NRC) of the National Academies, where he served from 1990 through 2014 as study director of major projects on public policy and information technology, and Adjunct Senior Research Scholar and Senior Fellow in Cybersecurity (not in residence) at the Saltzman Institute for War and Peace Studies in the School for International and Public Affairs at Columbia University. Prior to his NRC service, he was a professional staff member and staff scientist for the House Armed Services Committee (1986-1990), where his portfolio included defense policy and arms control issues. He received his doctorate in physics from MIT.
