Getting Encryption onto the Front Burner

Daniel Richman
Thursday, October 26, 2017, 7:00 AM

I’m happy to be wrong, but I don’t expect the Deputy Attorney General’s recent speech to spark productive engagement in the standoff over encryption. Federal, state and local authorities will keep highlighting their increasing inability to obtain critical data (in motion and at rest) by means of legal process and will try to demonstrate the critical public safety price they (meaning we) pay for “warrant-proof” platforms. Tech firms, for their part, will continue to focus on customer and shareholder value, which is both completely natural and consistent with widespread libertarian preferences in that sector. Recent calls for a congressional commission—usually a tactic for kicking a can down the road—appear to have receded. Apparently a decision has been made that this can will kick itself.

In the meantime, the range of data and communications inaccessible to law enforcement and other regulators without the cooperation of customers and users will increase by leaps and bounds. We are fast moving from a world in which both the platform provider and the customer could access data to one in which only the customer can. It’s not just that Apple moved from an iPhone operating system that allowed the company to retrieve data from the device to one for which only the customer has the key, or that end-to-end encrypted messaging platforms like WhatsApp have become common. It’s also the steady move to customer-controlled cloud encryption. Smartphone and user data that used to be backed up to the cloud—reducing, though not eliminating, enforcers’ need to access the device—will still be in the cloud, but with access controlled by the consumer. The corporate data previously stored on company-owned and -controlled servers, and thereafter moved to the cloud, will continue to reside there, but cloud providers will soon disable themselves from accessing it, even during computation.

We are thus moving to a world in which customers and users of all stripes become the exclusive gatekeepers of their own data and communications. For a range of customers and users, this state of affairs may not impede the many public safety and regulatory projects we rely on government to pursue. When presented with a warrant or other appropriate legal process, many firms and individuals will comply. Investigators, particularly in white collar investigations, frequently use subpoenas rather than search warrants, working through lawyers and firms and trusting that those on whom they serve process will comply. But when pursuing not just terrorism, violent crime, and child exploitation cases but also many white collar ones, law enforcers justifiably lack this trust, and regularly fear the obstruction that ensues when a data request tips off a target, or some combination of partial compliance and deletion.

In an effort to avoid refighting the Crypto Wars, and recognizing the value of innovation and the problem with top-down mandates, Deputy Attorney General Rod Rosenstein, like former FBI Director James Comey, took pains not to demand any particular solution, looking only for some key management technique or other arrangement that the government could require the provider to draw on. Such access already exists when a provider’s business model requires a product to have it—for, say, key recovery (for devices), content scanning, or updates—and the Justice Department wants it available even absent that business model.

Perhaps because neither this White House nor Congress is ready to act, the DAG didn’t quite call for legislation or regulation. Indeed, he quickly limited his proposal to “mass-market consumer devices and services that enable warrant-proof encryption by default” and was pretty vague about how even those platforms would be addressed. He doubtless hopes that this move will spark some sort of voluntary accommodation by industry—the specific consideration of public safety and other social costs.

If history is any guide, however, only actual legislative or regulatory proposals will spark constructive engagement. Signals from the Obama White House that the private sector took as meaning legislation was “off the table” helped ensure that nothing happened during the past administration. But one has only to look at the story of the Symphony messaging platform to see the effect of credible threats. Soon after several banks were prosecuted for a foreign exchange bid-rigging conspiracy carried out through chatroom conversations, a consortium of large financial institutions started its own platform, which touted end-to-end encryption and “guaranteed data deletion.” New York State’s Department of Financial Services used its considerable regulatory power to intervene, demanding that Symphony retain message data for seven years and that the individual banks store the decryption keys for those messages with independent custodians. The banks agreed, and Symphony flourishes, backed by a number of investors, including Google.
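
The custodial arrangement is easier to see in a sketch. The Python below is purely illustrative and reflects assumptions of my own (per-message keys, a Custodian class, a Platform class); it is not a description of Symphony’s actual architecture. It shows the basic division of labor: the platform archives only ciphertext for the retention period, while an independent custodian holds the decryption keys and releases them only in response to lawful process.

# Illustrative sketch of key escrow with an independent custodian (not Symphony's design).
from cryptography.fernet import Fernet  # third-party "cryptography" package

class Custodian:
    """Independent custodian: holds per-message keys, releases them only on lawful process."""
    def __init__(self):
        self._keys = {}

    def deposit(self, message_id, key):
        self._keys[message_id] = key

    def release(self, message_id, lawful_process):
        if not lawful_process:
            raise PermissionError("keys are released only in response to lawful process")
        return self._keys[message_id]

class Platform:
    """Messaging platform: archives ciphertext (say, for seven years) but keeps no keys."""
    def __init__(self):
        self._archive = {}

    def send(self, message_id, plaintext, custodian):
        key = Fernet.generate_key()                    # fresh key for each message
        self._archive[message_id] = Fernet(key).encrypt(plaintext)
        custodian.deposit(message_id, key)             # the platform does not retain the key

    def produce_ciphertext(self, message_id):
        return self._archive[message_id]

# A regulator with valid process must obtain the ciphertext from the platform
# AND the key from the custodian; neither party alone can read the message.
custodian, platform = Custodian(), Platform()
platform.send("msg-001", b"fx chatroom message", custodian)
key = custodian.release("msg-001", lawful_process=True)
print(Fernet(key).decrypt(platform.produce_ciphertext("msg-001")))

A real deployment would of course add authentication, auditing, and key-management hardening far beyond this toy example; the point of the split is simply that neither the platform nor the custodian alone can produce plaintext.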

As my colleague Steve Bellovin regularly reminds me, “crypto is hard,” and any effort to engineer access in the public interest might well add cybersecurity risks beyond those faced—and rarely eliminated—by firms trying to engineer systems for their own purposes. Yet I would expect firms to build and deliver only those products that their engineers tell them can be made securely and within the context of their own processes. And I question why we should normalize the risks firms take as they roll out products to serve their customers while problematizing those they would face if required to take social costs into account. Proofs of concept for addressing the engineering challenges are beginning to circulate and need to be carefully considered, even if they are not perceived as singular or complete solutions to these complex challenges.

The systems-design challenges are even greater because of the international dimensions of any voluntary or compelled regime of exceptional access. Regulating devices sold around the world would be hard enough on its own, as we need to consider the competitive disadvantage a U.S. mandate would place on firms. But at least access to stored data on a device pursuant to lawful process can be limited to the jurisdiction with control of that device; no comparably restrictive principle can easily organize which jurisdiction has access to data in motion or to data in a delocalized cloud. This is why, although the deputy attorney general lumped access to stored data on devices together with access to “in-flight” encrypted real-time communications and messaging, progress (legislative or otherwise) may require dealing with the two on separate tracks.

Yet the international dimensions of the challenge are actually a reason for engagement, not forbearance. Consider the current state of play: China’s encryption policy has been moving inexorably toward mandated government access. To be sure, we don’t look to China to set normative standards for balancing privacy and cybersecurity with public safety and other asserted public interests. But tech firms have been increasingly ready to accommodate China’s sovereign demands, and there is no reason to expect that forbearance by the U.S. will be matched by forbearance by the Chinese government, which has not been looking to the U.S. for guidance on cyberpolicy. And it’s not just China that is demanding full access to the data use and traffic of its citizens. Nor just Russia. In the United Kingdom, the Investigatory Powers Act allows the government to serve “technical capability notices” to obtain plaintext. Proposals for exceptional-access legislation seem to be moving forward in Canada, Australia and New Zealand. Germany and France have asked for EU legislation. To be sure, the impact of this legislative wave can’t yet be discerned, and the extent to which firms will actually be forced to engineer a way to access customer data, regardless of their current architecture, remains unclear. But the trend is there, and one or more of these nations is surely only an incident or two away from doing just that.

It’s a mistake to paint increasing governmental regulatory demands as simply the product of the “security state.” In liberal democracies, they are efforts to protect liberal values. Just as jurisdictions impose data-protection rules on data flows to advance the autonomy of citizens, so will they seek access to data as a means of denying impunity to individuals and entities that seek to exercise illegitimate power. Each country—liberal or illiberal—will have its own demands. It’s hard to predict the result of these possibly conflicting demands. And customer and shareholder pressure from one country may force a vendor to change its operations in another. But the problem cannot be wished away.

Nor can the problem be fairly denied by pointing to some “golden age of surveillance”—metadata, IoT and GPS data and the like—that the government can use instead of content. The deputy attorney general did a nice job of highlighting how content matters, both to bring people under suspicion and to clear them of it. Perhaps if non-content data comes to dominate the available evidence, juries will still regularly convict. But that’s hardly a future to embrace.

Then there are those who think governmental hacking can substitute for a regime of authorized access. It’s certainly true that, on both the intelligence and the criminal investigation sides, the government will work to identify and exploit vulnerabilities. But these methods cannot always be counted on. And those who would make them the primary recourse for federal, state and local authorities seeking to investigate criminal activity need to consider the implications of that approach.

A private market for hacking already exists. Do we want it to swell with the demands of federal, state and local authorities? Do we want a world in which an engineer (whether working for a company or on an open-source project) creates vulnerabilities that he can turn around and sell? We already face the risk of government hacking tools escaping. How much greater is the risk when the market expands? Moreover, if forced to rely on vulnerability exploitation, law enforcement cannot be expected to tolerate the disclosure of each tool—developed or bought—whenever it brings a prosecution using the tool’s fruits. Authorities would justifiably push for a law-enforcement-sensitive version of CIPA (the Classified Information Procedures Act), leaving defense lawyers to complain about their inability to fully litigate chain-of-custody and evidence-integrity issues.

Larger issues of the government’s relationships with others loom. We need more threat-information sharing between government and industry, not less. But a world in which intelligence and criminal agencies regularly need to rely on vulnerability exploitation only increases the cost of sharing. Even now, one can question calls to move the Vulnerabilities Equities Process out of the White House—which can internalize all government interests—to Homeland Security, whose cybersecurity mission aligns it more closely with companies. Forcing law enforcers of all stripes to rely on hacking—not just to access data from less popular platforms but also from the ones Americans most commonly use—would surely galvanize intelligence and law enforcement agencies even more strongly against that move.

Then there is the government’s relationship with its citizens. Even those troubled by the range of investigatory powers provided to law enforcement should prefer that those powers be clearly enumerated and understood, so that we have full transparency about how they are exercised. We should not backtrack on the long road from “black bag jobs” and informal data sharing to Title III surveillance warrants and other formalized investigative processes. Citizens need to know the rules and, to the greatest extent possible, be able to determine compliance with them. A world in which hacking and its accompanying opacity become the rule, not the exception, for obtaining encrypted data takes a large step away from that goal, at a time when we can least afford further erosion of trust in government processes at all levels.

Any legislative proposal relating to encryption made in the wake of a terrorist attack or other heinous crime will surely be condemned as the product of “moral panic.” But in the U.S., at least, moral panics are often simply the way long-overdue policies get enacted. We tend only to overreact or underreact, and rarely get it right. Let’s try to get it right before the panic.


Daniel Richman is the Paul J. Kellner Professor of Law at Columbia Law School.
