Surveillance & Privacy

Apple's Cloud Key Vault, Exceptional Access, and False Equivalences

Michael A. Specter
Wednesday, September 7, 2016, 11:06 AM



Author’s note: Despite appearing under my byline, this post actually represents the work of a larger group. The Keys Under Doormats group includes Harold Abelson, Ross Anderson, Steven M. Bellovin, Josh Benaloh, Matt Blaze, Whitfield Diffie, John Gilmore, Matthew Green, Susan Landau, Peter G. Neumann, Ronald L. Rivest, Jeffrey I. Schiller, Bruce Schneier, Michael A. Specter, and Daniel J. Weitzner, who jointly authored the report “Keys Under Doormats: Mandating Insecurity” last year. The following is a follow-up in light of recent events.

***

Apple recently made a technical announcement in a talk at Blackhat that has led to confusion regarding the efficacy and technical feasibility of exceptional access schemes. For instance, Matt Tait argues that Apple appears to have resolved the Going Dark issue by proving that they can handle user credential backups. But we should be careful to dispel arguments that such backups are equivalent to law enforcement exceptional access schemes.


In the Blackhat talk, Apple used the word “escrow” to describe their backup scheme, causing some to jump to incorrect conclusions about the efficacy of exceptional access schemes. The thinking goes that if Apple can securely store sensitive user data like passwords in such a way that only the user can decrypt them, then it should be able to extend that same functionality to law enforcement requests for access to devices. Specifically, Tait intimates that because Apple has proven that they can maintain high-value keys, this invalidates arguments against exceptional access:

For a while now, the most compelling technical arguments against “Going Dark” and any technical “front-door” designs have been that the technology community simply does not know how to securely store high-value encryption keys such that they can be used, but can’t be stolen or misused by hackers or malicious insiders.

Here Tait is making two claims:

  1. Opponents of exceptional access mechanisms have only one valid compelling technical argument for why backdoors are untenable—that it’s hard to keep private keys private.
  2. That Apple’s scheme somehow invalidates the above premise.

Let’s take a step back to understand what happened here. The problem started when Apple discussed a clever method they’d created, called the Cloud Key Vault (CKV), for backing up a user’s website and email account passwords to Apple iCloud. Apple’s engineers had a difficult technical challenge to overcome: to be user-friendly and function across all of a user’s devices (your iPhone, iPad, and laptop), these backups had to be protected by an encryption scheme that relied on the user’s (likely weak) iCloud password. In other words, if Apple made a misstep in their design, a malicious insider or other adversary could obtain these backups and try every possible passcode to get in.
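To see why a weak iCloud password alone cannot protect these backups, consider what an attacker who obtains the encrypted blob can do offline. The sketch below is purely illustrative (the Python function names, the 4-digit passcode space, and the PBKDF2 parameters are assumptions of mine, not Apple’s published design); it shows that once the backup and its salt are in hand, a short passcode falls to simple enumeration, and a slow key-derivation function only delays the inevitable.

```python
import hashlib

# Hypothetical: derive a backup-wrapping key from a user passcode.
# SHA-256 and 100,000 iterations are assumed parameters for this sketch.
def derive_key(passcode: str, salt: bytes) -> bytes:
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, 100_000)

# With the stolen salt and a stolen key-check value, a 4-digit passcode can
# simply be enumerated; rate limiting has to happen somewhere the attacker
# cannot bypass, which is the role the HSM plays below.
def brute_force(stolen_salt: bytes, stolen_key_check: bytes) -> str | None:
    for guess in range(10_000):
        passcode = f"{guess:04d}"
        if derive_key(passcode, stolen_salt) == stolen_key_check:
            return passcode
    return None
```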

The traditional approach to dealing with potentially weak passwords is to rate-limit how quickly they can be tried and to invalidate them—for instance, by locking an account—after too many unsuccessful attempts. Apple similarly needed a way to rate-limit and destroy these backups in the event of a brute force attack. For those interested in additional technical details of Apple’s scheme, both Matt Green and Steven Bellovin have helpful write-ups, and Apple’s Blackhat presentation slides, video, and iOS 9 Security Guide (particularly pages 46-47) are available as well. The gist is that Apple uses a hardened device explicitly made for storing high-value keys, known as a Hardware Security Module (HSM), to hold a sort of hardwired dead-man’s switch—10 incorrect tries and the keys to decrypting the user’s passwords are destroyed permanently. This is not unlike what the iPhone’s “Secure Enclave” already does, though in this case the machines are outside of the user’s physical control.
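For intuition, here is a minimal sketch of that dead-man’s-switch idea, written in Python with invented names; the real logic runs inside Apple’s HSMs under a different protocol, so treat this only as a model of the behavior described above: the wrapping key is derived from the user’s passcode, failed attempts are counted, and after ten wrong guesses the stored material is erased for good.

```python
import hashlib
import hmac
import os

class EscrowRecord:
    """Toy model of the CKV rate limit: ten wrong passcodes and the wrapped
    secret is erased permanently. Names and parameters are illustrative."""

    MAX_ATTEMPTS = 10

    def __init__(self, passcode: str, secret: bytes):
        assert len(secret) <= 32           # toy stream cipher: short secrets only
        self._salt = os.urandom(16)
        key = hashlib.pbkdf2_hmac("sha256", passcode.encode(), self._salt, 100_000)
        stream = hashlib.sha256(key + b"stream").digest()
        self._wrapped = bytes(a ^ b for a, b in zip(secret, stream))
        self._check = hmac.new(key, b"check", hashlib.sha256).digest()
        self._attempts = 0

    def try_recover(self, passcode: str) -> bytes | None:
        if self._wrapped is None:
            return None                    # the switch already fired; the backup is gone
        key = hashlib.pbkdf2_hmac("sha256", passcode.encode(), self._salt, 100_000)
        candidate = hmac.new(key, b"check", hashlib.sha256).digest()
        if not hmac.compare_digest(candidate, self._check):
            self._attempts += 1
            if self._attempts >= self.MAX_ATTEMPTS:
                self._wrapped = None       # erase the key material rather than keep rate-limiting
            return None
        stream = hashlib.sha256(key + b"stream").digest()
        return bytes(a ^ b for a, b in zip(self._wrapped, stream))
```

The important design choice is that last branch: the response to too many failures is destruction, not an appeal to a human operator at Apple, and that is what keeps Apple out of the loop.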

Tait’s argument conflates backup schemes with exceptional access schemes. In Apple’s scheme, there is no third party with access—all keys are stored either in an entirely inaccessible hardware device or in the mind of the user. There are private keys involved in initially programming the HSM, but Apple destroys them before they ever come into contact with user secrets. In other words, Apple doesn’t even trust itself with the decryption keys, and has gone out of its way to make it physically impossible for anyone who works for Apple to access them.

That’s why this system (assuming you trust Apple’s initial programming, their HSMs, etc.) could be very secure.

In other words, Apple’s design intentionally solved the problems that come from exceptional access schemes by removing itself from the equation, in the same way that the Secure Enclave does for iPhones. Rather than providing an exceptional access solution, Apple took the radical step of destroying those keys in order to achieve an acceptable level of protection. And it took these extreme measures for what is merely an optional backup system covering a subset of Apple users’ personal information.

The problem with Tait’s argument becomes clearer when you actually try to turn Apple’s Cloud Key Vault into an exceptional access mechanism. In that case, Apple would have to replace the HSM with one that accepts an additional message from Apple or the FBI—or an agency from any of the 100+ countries where Apple sells iPhones—saying “OK, decrypt,” as well as the user’s password. In order to do this securely, those messages would have to be cryptographically signed with a second set of keys, which would then have to be used as often as law enforcement access is required. In other words, any exceptional access scheme built from this system would require an additional set of keys whose sole purpose is to guarantee that the law enforcement access credentials are only used by authorized parties.
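To make the contrast concrete, here is a hypothetical sketch of what the vault would have to do once it also honored law enforcement requests. This is emphatically not Apple’s design: the class and parameter names are invented, and an HMAC under a shared authority_key stands in for verification of a real digital signature. The point is structural, namely that a long-lived authority key now lives inside the system and must be exercised for every single request.

```python
import hashlib
import hmac

class ExceptionalAccessVault:
    """Hypothetical exceptional-access variant, NOT Apple's system: the vault
    releases the secret to anyone presenting a request endorsed under a
    long-lived authority key."""

    def __init__(self, secret: bytes, authority_key: bytes):
        # The user-passcode path from the previous sketch is omitted for brevity.
        self._secret = secret
        # An HMAC key stands in for a signature-verification key.
        self._authority_key = authority_key

    def unlock_for_law_enforcement(self, request: bytes, tag: bytes) -> bytes | None:
        # Every lawful-access request, from every jurisdiction, must be endorsed
        # under the same long-lived key, so that key is fetched and used again
        # and again -- exactly the repeated exposure the essay warns about.
        expected = hmac.new(self._authority_key, request, hashlib.sha256).digest()
        if hmac.compare_digest(expected, tag):
            return self._secret
        return None
```

Nothing in Apple’s actual Cloud Key Vault contains anything like authority_key; adding one is precisely the change that reintroduces a repeatedly used, high-value private key.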

Managing access by a hundred-plus countries is impractical due to mutual mistrust, so Apple would be stuck keeping a second signing key (or a database of second signing keys) for signing these messages, and that key material would have to be accessed for each and every law enforcement request. This puts us back in the situation where Apple needs to protect another repeatedly used, high-value public key infrastructure: the same situation that has already resulted in the theft of Bitcoin wallets, the theft of RealTek’s code signing keys, and Certificate Authority failures, among many other disasters.

Repeated access to private keys drastically increases the probability of their theft, loss, or inappropriate use. Apple’s Cloud Key Vault involves no Apple-owned private key, and therefore does not demonstrate that a secure solution to this problem exists.

It is worth noting that the exceptional access schemes one can create from Apple’s CKV (like the one outlined above) inherently entail the precise issues we warned about in our previous essay on the danger signs for recognizing flawed exceptional access systems. Additionally, the Risks of Key Escrow and Keys Under Doormats papers describe further technical and nontechnical issues with exceptional access schemes that must be addressed. Among the nontechnical hurdles would be the requirement, for example, that Apple run a large legal office to confirm that requests for access from the government of Uzbekistan actually involved a device that was located in that country, and that the request was consistent with both US law and Uzbek law.

My colleagues and I do not argue that the technical community doesn’t know how to store high-value encryption keys—that’s the whole point of an HSM. Rather, we assert that, with today’s technology, it is impossible to hold keys safely while also allowing another party (e.g., law enforcement or Apple itself) to access them repeatedly without a high potential for catastrophic loss, and that any scheme facing fundamental sociotechnical challenges such as jurisdiction must be evaluated honestly before any technical implementation is considered.


Michael A. Specter is a PhD student in the Electrical Engineering and Computer Science department at the Massachusetts Institute of Technology, and a research assistant in MIT CSAIL's Internet Policy Research Initiative. Specter’s research focuses on systems security issues related to cryptography, privacy, economics, and vulnerability discovery. Prior to joining MIT as a graduate student, he spent six years at MIT's Lincoln Laboratory where his research focused on operating system security, malware analysis, reverse engineering, and vulnerability discovery.
