
Not a Slippery Slope, but a Jump off the Cliff

Nicholas Weaver
Wednesday, February 17, 2016, 4:51 PM



When I first read the court order in the San Bernardino case, I thought it was reasonable, as it is both technically plausible and doesn't substantially impact user security for most people. Even if Apple's code escapes, it compromises security only for those who have a weak passcode on an older phone that is then captured by an adversary. As backdoors go, it's one that I can (*GASP*) actually live with!

The problem is that this is a direct invocation of Benjamin Wittes's world of government-mandated malicious updates. The request seems benign, but the precedent is catastrophic.

The request to Apple is accurately paraphrased as "Create malcode designed to subvert security protections, with additional forensic protections, customized for a particular target's phone, cryptographically sign that malcode so the target's phone accepts it as legitimate, and run that customized version through the update mechanism". (I speak of malcode in the technical sense of "code designed to subvert a security protection or compromise the device", not in intent.)
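To see why the signing step matters so much, here is a minimal sketch in Python using the cryptography library. Everything in it (the keys, the payloads, the check) is hypothetical and is not Apple's actual update protocol; the point is simply that a device's only test is whether a blob verifies against the vendor's key, so a compelled build verifies exactly like a legitimate release.

```python
# Illustrative only: not Apple's real signing scheme. The device's sole check
# is the vendor signature, so compelled "malcode" and a legitimate update are
# indistinguishable to the device once both are signed with the vendor key.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric import ed25519

vendor_signing_key = ed25519.Ed25519PrivateKey.generate()  # the key a court order conscripts
device_trust_anchor = vendor_signing_key.public_key()      # baked into the phone

def vendor_sign(update_blob: bytes) -> bytes:
    """Sign an update exactly as a normal release would be signed."""
    return vendor_signing_key.sign(update_blob)

def device_accepts(update_blob: bytes, signature: bytes) -> bool:
    """The only gate between the phone and new code."""
    try:
        device_trust_anchor.verify(signature, update_blob)
        return True
    except InvalidSignature:
        return False

legitimate = b"routine security fix for all users"
compelled = b"build that removes passcode retry limits for one target"

assert device_accepts(legitimate, vendor_sign(legitimate))  # accepted
assert device_accepts(compelled, vendor_sign(compelled))    # accepted just the same
```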

The same logic behind what the FBI seeks could just as easily apply to a mandate forcing Microsoft, Google, Apple, and others to push malicious code to a device through automatic updates when the device isn't yet in law enforcement's hands. So the precedent the FBI seeks doesn't represent just "create and install malcode for this device in law enforcement possession" but rather "create and install malcode for this device".

Let us assume that the FBI wins in court and gains this precedent. This does indeed solve the "going dark" problem, as the FBI can now go to Apple, Cisco, Microsoft, or Google with a warrant and say "push out an update to this target". Once the target's device starts running the FBI's update, encryption no longer matters, not even the much stronger security present in the latest Apple devices. So as long as the FBI identifies the target's devices before arrest, there is no problem with any encryption. But at what cost?
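As a rough, hypothetical sketch of what "push out an update to this target" could look like on the vendor's side (the device identifiers and payloads below are invented for illustration), the update service hands every device the normal release except the one named in the order:

```python
# Hypothetical targeted-update dispatch; device IDs and payloads are invented.
NORMAL_RELEASE = b"standard security update offered to every device"
COMPELLED_PAYLOADS = {
    "DEVICE-SERIAL-0001": b"special build compelled for the named target",
}

def select_update(device_id: str) -> bytes:
    """Return the update blob a given device will be offered (and trust, once signed)."""
    return COMPELLED_PAYLOADS.get(device_id, NORMAL_RELEASE)

# Every device queries the same service; only the named target receives the malcode.
assert select_update("DEVICE-SERIAL-9999") == NORMAL_RELEASE
assert select_update("DEVICE-SERIAL-0001") != NORMAL_RELEASE
```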

Currently, hacking a target has a substantial cost: it takes effort and resources. This is one reason why I don't worry (much) about the FBI's Network Investigative Technique (NIT) malcode: it can only be used against suitably high-value targets. But what happens in a world where "hacking" by law enforcement is as simple as filling out some paperwork?

Almost immediately, the NSA is going to secretly request the same authority through the Foreign Intelligence Surveillance Court, using a combination of Section 702 to justify targeting and the All Writs Act to mandate the necessary assistance. How many honestly believe the FISC wouldn't rule in the NSA's favor after the FBI succeeds in getting this authority?

The NSA's admittedly restrictive definition of a "foreign intelligence" target is not actually all that restrictive, due to the "diplomatic" catch-all, a now unfortunately public cataloging of targets, and a close association with GCHQ. So already foreign universities, energy companies, financial firms, computer system vendors, governments, and even high-net-worth individuals could not trust US technology products, as they would be susceptible to malicious updates demanded by the NSA.

Yet the problems don't end with the economic impact on US businesses. Every other foreign law enforcement and intelligence agency would demand the same access, pointing to the same precedent. At least for other countries, Silicon Valley may succeed in restricting these updates to targets in the country giving the order. This still means that US travelers overseas would face greatly increased risk: a US-based Lawfare reader could not safely install an OS update while touring France or Israel, as the DGSE or Unit 8200 could invoke the same authorities and precedents to attack what they would term a "lawful foreign intelligence target" under French or Israeli domestic law.

The situation grows worse when one considers the "Athens Affair" problem with law enforcement "exceptional access" mechanisms. What happens to US government systems if an adversary manages to surreptitiously gain access to Microsoft's "All Writs Lawful Update" mechanism in the same way that unknown attackers accessed Vodafone Greece's CALEA-style lawful intercept interface, or the way Chinese attackers hacked Google for surveillance purposes?

Yet the true disaster doesn't end with US interests placed at risk but extends to the general software ecosystem. Perhaps the greatest innovation in computer security in the past 15 years is the automatic update. It is automatic updates that protect the overall ecosystem, and anything that makes automatic updates untrustworthy would prove a boon to attackers.
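To make that dependence concrete, here is a minimal simulated auto-update client, again in Python with the cryptography library; the key, the payload, and the "install" step are all stand-ins rather than any real vendor's mechanism. Whatever verifies against the vendor key is installed silently, with no user in the loop, which is exactly why the channel itself has to stay trustworthy.

```python
# Simulated auto-update client: the key, payload, and "install" step are stand-ins.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric import ed25519

vendor_key = ed25519.Ed25519PrivateKey.generate()
trust_anchor = vendor_key.public_key()  # the single thing the client checks

installed = []  # stand-in for "code now running on the device"

def auto_update(update_blob: bytes, signature: bytes) -> None:
    """Install silently if, and only if, the signature verifies."""
    try:
        trust_anchor.verify(signature, update_blob)
    except InvalidSignature:
        return  # rejected without bothering the user
    installed.append(update_blob)  # no prompt, no review: the fix just flows out

patch = b"fix for an actively exploited vulnerability"
auto_update(patch, vendor_key.sign(patch))
assert installed == [patch]  # the ecosystem benefit depends on trusting this channel
```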

This case is very different from the other All Writs Act case Apple is fighting. In that one, I agree with Susan Hennessey that Apple seems to be deliberately obnoxious, mostly for the sake of public perception, although I'll argue that the Justice Department's refusal to fire up its own copy of EnCase Forensic is also troubling.

The San Bernardino case, however, is not a tiptoe down a slippery slope but a direct leap into a dangerous world, one that would compromise all our security under an incredibly ambitious reading of the law.


Nicholas Weaver is a senior staff researcher focusing on computer security at the International Computer Science Institute in Berkeley, California, and Chief Mad Scientist/CEO/Janitor of Skerry Technologies, a developer of low-cost autonomous drones. All opinions are his own.
