
A Response to a Modest Backdoor Proposal

Nicholas Weaver
Thursday, February 4, 2016, 4:12 PM



Ben Wittes makes a modest proposal to backdoor encryption by removing legal protections from service providers, but this modest proposal has some major problems. First, a backdoor mandate by any other name is still a backdoor mandate. Second, the mandate would only apply to default behavior, because those who want to evade surveillance can use foreign third-party services beyond the reach of US law enforcement. Third, CDA 230 immunity does not actually cover the services that law enforcement claims are the problem.

Call it "exceptional access," call it "complying with legal processes to wiretap", call it "technical assistance" or "golden keys”—a backdoor is still a backdoor. The policy community continues to adhere to the belief that creating backdoors for the US or UK is not problematic because companies are certain to add backdoors anyway, when other countries demand they do so. Technology companies believe the opposite—that refusing to offer backdoors in the US and UK is precisely what allows them to refuse elsewhere. This is not a question of "threading that needle," rather it presents a binary choice: Offer backdoors for nobody or offer them for everybody.

Thus far, the technologists have been right. Why was BlackBerry forced to fight Pakistan over bulk access to BlackBerry messages? Because BlackBerry includes a backdoor by design. The only reason Pakistan could ask for bulk data in the first place is that BlackBerry already offered exactly that function for "targeted" communications. In Brazil, the rapid banning and then unbanning of WhatsApp demonstrates that it is possible to resist local governmental pressure. But WhatsApp was only able to hold that hard line because its encryption was not designed to support any form of wiretapping.

Ben’s proposal is the mafioso response to Going Dark: "Nice communication service you got here; it would be a pity if it got buried in frivolous lawsuits." And these lawsuits would be frivolous. Ben acknowledges that, on his reading of "material support," Apple is unlikely to be liable for a jihobbyist using iMessage, on much the same logic that keeps GM from being liable for car bombs. This proposal, therefore, is explicitly designed to punish corporate behavior by opening the floodgates to unrelated causes of action. Should Congress use the plaintiffs’ bar as a substitute for hired goons?

There is another limitation to the proposal: it only affects the default services provided by major US companies and fails to address the great many third-party services located overseas, which are largely immune from US attempts to compel backdoors short of an outright ban on foreign software and services. As a general matter, default security, such as iMessage encryption, protects two groups: dumb criminals and the rest of us. That leaves the problem of the even moderately smart criminal. A bad guy with two brain cells will use services that are beyond the reach of US law enforcement by virtue of their geographic location.

Protonmail, for example, happily complies with court orders … issued by Swiss courts. Likewise, Telegram will provide US law enforcement all the information it can, but only through a mutual legal assistance treaty (MLAT). The fact that QQ only encrypts data in transit is irrelevant when the Chinese response to a US data request is typically "How about never? Is never good for you?" Efforts to raise the communications security bar for bad guys are inherently futile if they don’t account for this service arbitrage. Raising the bar a millimeter, by forcing criminals to use a different, non-US service, is not an effective defense against competent criminals, and it places the rest of us at risk.

Removing a US liability shield from these foreign services doesn’t offer any meaningful defense against this security arbitrage. These companies operate in ways that neither require nor benefit from CDA immunity. First, few actually publish user content. Second, even where CDA 230 might otherwise be employed as a defense, the practical limits on enforcing a US judgment abroad mean that many will treat a US civil claim with approximately the same respect they extend to US law enforcement demands: a one-finger salute.

Finally, companies like Apple do not often rely on CDA immunity anyway. CDA 230 protects publishers from civil liability arising from user content and is thus relevant primarily to systems like Twitter and Facebook that rebroadcast user-generated content. CDA 230 relieves services of the obligation to police their users; absent this protection, such services might not exist at all. Apple, for its part, offers only minor services involving publication, notably iPhoto publishing, and those could easily be replaced by a third-party provider.

The services that benefit from the CDA are not "dark" to legal process in the first place, since the CDA’s benefit is focused entirely on services that redistribute data. But even these services, which do not pose significant problems, host some encrypted content that the service provider cannot access. Should Google lose the CDA shield if a terrorist posts an encrypted message that Google is unable to decrypt for law enforcement?

If a company must certify that it can provide readable data in order to be entitled to immunity, then companies like Google face a difficult choice. Google would have to either ban all encrypted content, including all third-party services offering encryption, or independently verify that such third-party encryption is weak enough to crack. Thus the choice is between no encryption at all and a super backdoor, one so trivial that Google itself is able to unlock it. And presumably Google would have to ensure, on an ongoing basis, that all content could be decrypted on demand, so there is also a requirement to rattle the doorknob on every backdoor.

Semantic tricks to make a backdoor proposal more palatable by deeming it "voluntary" don’t make it any less of a backdoor. And they don’t make it any less of a bad idea.


Nicholas Weaver is a senior staff researcher focusing on computer security at the International Computer Science Institute in Berkeley, California, and Chief Mad Scientist/CEO/Janitor of Skerry Technologies, a developer of low-cost autonomous drones. All opinions are his own.
