
Secure By Default is Security for Us

Nicholas Weaver
Thursday, December 10, 2015, 4:13 PM


Technologists don't oppose cryptographic backdoors based on some vague allegiance to civil liberties. We oppose them because we've been burned before. Backdoors fail, often quite badly. And the community is still cleaning up the mess from the first crypto war. It is hard enough to build secure systems without introducing deliberate weaknesses. No amount of technical smarts will fix this.

FBI Director Comey appears to be realizing this critical fact. But that understanding has simply resulted in a new tactic. Rather than calling for backdoors, Comey now takes aim at security by default. In recent testimony, Comey characterized the deployment of secure-by-default systems as a "business model" decision on the part of Apple and others. The implication is that the patriotic solution is to force users to opt in to security. But this would only make the job of securing the world against bad guys harder, without necessarily helping the FBI against its specific targets.

Making systems like the iPhone insecure by default invites trouble. We know many people simply won't know, or bother, to opt in. If law enforcement can unlock an iPhone without the user present, then foreign intelligence services and ordinary crooks can probably do the same. The FBI director, by the way, also leads the nation's counterintelligence efforts. Does he really want to gift foreign actors the ability to obtain all the information on a target's phone simply by stealing it? Apple's switch to device encryption over the past few years is likely aimed at preventing precisely this type of "informational mugging."

And federal agents are as susceptible to non-default security as anyone. Federal law enforcement uses P25 radios for secure tactical communication, but because encryption is not the default setting, Special Agent Johnny often still forgets to encrypt his communications. These stories, among many, many others, are why the security community focuses on creating simple and secure defaults. iMessage and Signal encrypt everything so that the user never screws up and forgets.

Besides, insecurity by default will not prevent careful bad guys from using secure systems. They can seek out encryption applications produced outside the US or use open-source systems. During the first crypto war, we saw US companies develop cryptographic code through foreign subsidiaries, which could then import it into the US freely or ship it directly to foreign countries, bypassing ITAR restrictions altogether. So even if Silicon Valley adopted a business model of leaving its customers vulnerable, the FBI would still face at least some bad guys who have gone dark.

But there are other partial alternatives for law enforcement beyond cryptographic backdoors.

During his testimony, Director Comey discussed a case in which encrypted content impacted a counterterrorism investigation. Comey reports that, on the morning of the attempted terrorist attack in Garland, Texas, one of the attackers sent 109 encrypted messages to a known terrorist. Because those communications are securely encrypted, the FBI may never learn their contents. But the same would be true of 109 brief phone calls unless the FBI already had a wiretap in place. And Comey does not appear to be calling for phone companies to record and retain every call just in case one might prove useful to a future investigation.

Furthermore, in Garland, the cryptography did not obscure what is perhaps the most critical information: on the day of the attack, one of the perpetrators sent 109 encrypted messages to a known terrorist. Here, it would seem, the metadata is the message. And most secure communications systems cannot hide metadata from those with lawful access. The systems that do conceal metadata are substantially more expensive to use, so much so that they are rarely employed, and the very fact of using one may itself arouse suspicion.

Personally, I preferred the days when the FBI simply alleged that smart people could solve the vulnerabilities of backdoors. At least that rhetoric was manifestly false; it was akin to the notion that if we just set our best minds to "math harder," we could divide by zero. The argument that security by default is a business decision—and thus something that can and should be changed—strikes me as far worse. By framing security as a business strategy, Comey attempts to impose a business cost on companies that value security. He is saying to the fearful masses that Apple and similar companies are not being patriotic, and that if you, Citizen, want the FBI to catch terrorists, maybe you should express that through their bottom line and not buy those products. Or maybe he is even trying to convince the companies themselves that this is the right thing to do. But what everyone needs to understand—and what Comey needs to understand—is that removing default security only makes the rest of us (even federal agents) more vulnerable without truly helping the FBI catch more bad guys.


Nicholas Weaver is a senior staff researcher focusing on computer security at the International Computer Science Institute in Berkeley, California, and Chief Mad Scientist/CEO/Janitor of Skerry Technologies, a developer of low cost autonomous drones. All opinions are his own.
