
The Encrypted Living Will

Nicholas Weaver
Wednesday, April 27, 2016, 10:09 AM

Security developers tussle between security and usability every day, but security, not weakness, has to be the starting point.


The father of a colleague of mine once suffered a stroke and was quite unexpectedly rendered incapacitated. His Windows laptop contained critical information and was protected by a password known only to him. Fortunately, because the laptop had no disk encryption, my colleague was able to reset the Windows password and access the needed information. In that case, the ability to access the information was a relief in an otherwise very stressful situation. But the solution comes with an unfortunate tradeoff: a thief could have opened the laptop just as easily as my colleague did. Because my colleague's father was a lawyer, the laptop contained sensitive client documents, the theft of which would have been catastrophic.

It is not possible to build a generic backdoor that allows a dutiful son access to a laptop without also opening up the risk of exploitation by anyone else, including a malicious actor. The techniques available to my colleague are likewise available to a thief. Apple is relatively up-front about these tradeoffs: the company clearly states that without a backup or the password, you will not be able to get into a device. Likewise, if the phone is physically destroyed without a backup, the data is lost forever. If we elect not to keep backups and a device gets ruined, we typically don't blame Apple for the data loss.

But, from a security standpoint, there is a genuine difference between a backdoor that works on any system and one that works only on a single system, or a set of systems, that I own and have explicitly enabled. I use two "master key" passwords, each a sequence of five randomly selected words, to protect my devices. These passwords unlock my computers, my backups, my phone, and my password manager. Without them, all my data is lost. What if something happens to me? There is certain information and work which I need to keep safe now but would want my family or law enforcement to access if I were dead or incapacitated. And so I wrote down a copy of my master keys, sealed it in an envelope, and locked it in a safe. I've designated a family member who knows where they are, so if I'm brutally murdered, or mangled in a car crash, or pick your catastrophe, there is a way to recover the data.
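For illustration, here is a minimal sketch of how such a five-word passphrase might be generated. The wordlist filename is an assumption for the example (any newline-delimited list, such as the EFF large wordlist, would do); this is not a description of my actual setup.

```python
import secrets

def generate_passphrase(wordlist_path="wordlist.txt", num_words=5):
    # Load a newline-delimited wordlist; the filename here is an
    # illustrative assumption, not a reference to a real file.
    with open(wordlist_path) as f:
        words = [line.strip() for line in f if line.strip()]
    # secrets.choice draws from the OS CSPRNG, which random.choice does not.
    return " ".join(secrets.choice(words) for _ in range(num_words))

if __name__ == "__main__":
    print(generate_passphrase())
```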

Those developing encrypted systems offer such options all the time. iPhones can back up to the cloud and do so by default; Apple's FileVault can back up the key to the cloud and does so by default; Windows 10 consumer disk encryption can back up the key to the cloud and does so by default (and, unfortunately, does so invisibly to the user). For all these systems, developers explicitly weaken security by default in order to protect naive users in exactly the scenarios Ben envisions here. Law enforcement or grieving relatives can access the user's data, but hopefully a thief cannot. Note that the attempt to secure data from bad actors while preserving backup access doesn't always succeed.
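To make the tradeoff concrete, here is a minimal sketch of the key-escrow pattern these defaults rely on: data is encrypted under a random data key, and that key is in turn wrapped under a recovery key that can be stored in the cloud. This is an illustration of the general pattern using Python's cryptography library, not any vendor's actual implementation.

```python
from cryptography.fernet import Fernet  # pip install cryptography

# Encrypt the user's data under a locally generated data key.
data_key = Fernet.generate_key()
ciphertext = Fernet(data_key).encrypt(b"sensitive client documents")

# Wrap the data key under a separate recovery key. Storing the wrapped
# key and the recovery key with a cloud service is what lets a vendor,
# a grieving relative, or law enforcement restore access later, and it
# is exactly the deliberate default weakening described above.
recovery_key = Fernet.generate_key()
wrapped_data_key = Fernet(recovery_key).encrypt(data_key)

# Recovery path: unwrap the data key, then decrypt the data.
restored_key = Fernet(recovery_key).decrypt(wrapped_data_key)
assert Fernet(restored_key).decrypt(ciphertext) == b"sensitive client documents"
```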

It is disingenuous to say those who advocate encryption don't understand these risks and tradeoffs: security developers tussle between security and usability every day. I can recover WhatsApp messages from a cloud backup, because WhatsApp has selected 'usability' over 'security' by enabling message backup in the cloud by default. I cannot recover Signal messages, because Signal has selected 'security' over 'usability' and never knowingly allows messages to leave the phone. This is true even though both apps use the same underlying encryption.

In an ideal world, computers would recognize the difference between a dutiful son and a thief. But constrained by technological realities, those who design systems face a choice: either keep everyone out or open the potential to let anyone in. Starting with an insecure system and then working to secure it has demonstrably failed whenever it has been attempted. Instead, the discipline requires starting with a secure system and then relaxing security as needed to balance usability.

The security designer's motto is effectively "We need to lock everyone out, but then the user can explicitly allow someone in if they desire."
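As a toy illustration of that motto, the sketch below shows a default-deny access check: everyone starts locked out, and only the owner can explicitly allow someone in. The class and names are hypothetical, invented for illustration, and not drawn from any real system.

```python
# Toy default-deny access control: everyone starts locked out, and only
# the owner can explicitly allow someone in. All names are hypothetical.
class SecureStore:
    def __init__(self, owner: str):
        self.owner = owner
        self.allowed = {owner}  # start with only the owner allowed

    def allow(self, actor: str, identity: str) -> None:
        # Only the owner may opt someone in (the sealed-envelope step).
        if actor != self.owner:
            raise PermissionError("only the owner can grant access")
        self.allowed.add(identity)

    def read(self, identity: str) -> str:
        # Default deny: no generic backdoor, no special cases.
        if identity not in self.allowed:
            raise PermissionError("access denied")
        return "decrypted data"

store = SecureStore(owner="alice")
store.allow("alice", "dutiful_son")  # explicit opt-in by the owner
print(store.read("dutiful_son"))     # permitted
# store.read("thief") would raise PermissionError: access denied
```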

Ben can have his encrypted living will; I already have mine.


Nicholas Weaver is a senior staff researcher focusing on computer security at the International Computer Science Institute in Berkeley, California, and Chief Mad Scientist/CEO/Janitor of Skerry Technologies, a developer of low-cost autonomous drones. All opinions are his own.
