
Punching the Wrong Bag: The Deputy AG Enters the Crypto Wars

Susan Landau
Friday, October 27, 2017, 7:00 AM

Deputy Attorney General Rod Rosenstein gave a rather remarkable speech on encryption two weeks ago. Arguing that encryption's creation of "warrant-proof" spaces is irresponsible, the deputy attorney general suggested that Silicon Valley is putting profits over public safety. On the surface, the dots in Rosenstein's statements line up. On closer inspection, however, there are large gaps in his argument from start to finish. Because the speech may herald the next salvo in the “Crypto Wars,” it is worth examining in detail.

Matt Tait has already explained how the speech confuses different uses of encryption. Sometimes use of encryption is good for the public, he pointed out, such as https encryption, which protects communications to and from websites. Tait also noted that some battles against encryption are unlikely to succeed because encryption apps intended to protect communications are easy to develop and their use is difficult to regulate. So as Tait says, the current fight is really over device encryption, as per the Apple/FBI case—though one would be hard-pressed to know that from an initial review of Rosenstein's speech.

Rosenstein complained that Apple had not developed software to defeat the security protections of the San Bernardino shooter's locked iPhone. But the phone was eventually unlocked by a contractor (despite the FBI's claim that only Apple could open it), and no evidence of interest was found on it. Although he brought up the San Bernardino case, Rosenstein ignored these facts.

The deputy attorney general argued that by providing technology that could lock law enforcement out of phones, Apple was prioritizing privacy and profits over public safety. On the contrary: Apple devised its phone locks while seeking to solve a public safety issue. In the late 2000s, Chinese criminals were making and selling techniques that removed data from lost or stolen phones, essentially “hack-in-a-box” tools for sale. Criminals used the stolen personal data to commit fraud. Apple sought a way to protect some of its customers’ most sensitive data, such as email and contacts, on their phones. In 2011, Apple introduced an encryption scheme that combined the user's PIN with a key embedded in the phone's hardware. This innovation made it harder to steal email off an iPhone even when criminals had a lost or stolen phone in hand. With iOS 8, Apple went a step further, employing the user's PIN and the phone's hardware key to protect 90 to 95 percent of the phone's data. The result: It's much harder for anyone other than the legitimate user to access data from a locked iPhone. That's good for security. But the fact that Apple's locking mechanism provides security wasn't mentioned in Rosenstein's speech. It's possible that Rosenstein and the Department of Justice didn’t know the intent behind Apple's careful and thorough phone-locking scheme.
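
To make the idea concrete, here is a minimal sketch, in Python, of the kind of key derivation the paragraph describes: the user's PIN is entangled with a secret fused into the phone's hardware, so the resulting data-encryption key can be recomputed only on that physical device. The function name and parameters are illustrative assumptions, not Apple's actual design.

    import hashlib

    def derive_data_key(pin: str, hardware_uid: bytes, iterations: int = 100_000) -> bytes:
        # Entangle the user's PIN with a per-device secret burned into the
        # hardware. Deriving the key requires both; the PIN alone, or a copy
        # of the encrypted data pulled off the phone, is not enough.
        return hashlib.pbkdf2_hmac("sha256", pin.encode(), hardware_uid, iterations, dklen=32)

Because the hardware secret never leaves the device, an attacker who copies the encrypted storage off a stolen phone cannot brute-force the PIN offline; every guess has to run on the phone itself.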

In the decade since Apple introduced the iPhone, smartphones have become ever more powerful. That power has brought greater functionality, leading users to store ever more personal and proprietary data on their phones. That is reason enough for Apple and Google to pursue solutions that keep phones’ data inaccessible to anyone but their owner. For years the FBI has asked that encryption systems be designed with "exceptional access": a way for law enforcement, armed with a search warrant, to bypass the encryption. Security engineers say there is no safe way to do this; any system that provides a way in for law enforcement will inevitably be subverted by hackers. (See, for example, Keys Under Doormats for arguments why this is so.)

In his recent speech, Rosenstein argued that if Silicon Valley can securely provide updates to devices, it surely can securely provide updates that undo the security protections. This would enable law enforcement armed with a search warrant to open locked phones. This argument ignores the technical realities of what is possible to build securely—and what is not.

Security engineers can't ignore those realities. Let's look at updating phones. To ensure updates come from the phone's manufacturer, the software is "signed" with the company's private "signing key." The software on the device recognizes the update as coming from the company. Keeping signing keys secure prevents others from "signing" software and thus making their fake "update" appear genuine. Internet companies know how to keep those signing keys secret. This is necessary for the security of the update process.
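
As a rough illustration of what the paragraph describes, the sketch below uses Ed25519 signatures from the Python cryptography library to check that an update was signed with the vendor's private key before it is accepted. It is a simplified stand-in for a real update pipeline, not Apple's or Google's actual code; the names are assumptions.

    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PublicKey

    def is_genuine_update(update_image: bytes, signature: bytes, vendor_public_key: bytes) -> bool:
        # The device ships with the vendor's public key; only someone holding
        # the corresponding private signing key can produce a valid signature.
        public_key = Ed25519PublicKey.from_public_bytes(vendor_public_key)
        try:
            public_key.verify(signature, update_image)
            return True
        except InvalidSignature:
            return False

The entire scheme stands or falls on keeping the private signing key secret, which is exactly the point the paragraph makes.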

Currently, just a small group of trusted and highly vetted people prepare and validate Apple's update mechanisms. The number of law enforcement "updates" would put the security of the process in peril. To keep an "unlock update" from spreading to many phones—and consequently imperiling their security—each unlock update would have to be tied to a specific phone and created separately. Updates to iPhone operating systems occur only a few times a year. If "updates" were to be used to unlock phones, this updating mechanism would have to accommodate the "thousands of seized devices" in law enforcement's possession.
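
For a sense of what "tied to a specific phone" would mean in practice, here is a hedged sketch: the signature covers both the update image and the target device's unique identifier, so a signed unlock blob validates on one phone and no other. The names (device_ecid, sign_personalized_update) are hypothetical, not Apple's implementation.

    import hashlib
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    def sign_personalized_update(update_image: bytes, device_ecid: bytes,
                                 signing_key: Ed25519PrivateKey) -> bytes:
        # Include the target phone's unique identifier in the signed message.
        # A phone verifying the update rebuilds this message with its own
        # identifier, so the signature checks out only on the intended device.
        message = hashlib.sha256(update_image).digest() + device_ecid
        return signing_key.sign(message)

Every such blob would have to be created, signed and tracked individually, which is why thousands of unlock requests would strain a process built for a handful of releases a year.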

Now think about how law enforcement unlock updates might be done. The updating process, modified to be used multiple times a day rather than a few times a year, is susceptible to subversion. The process could be automated, but that presents a security risk. It’s prudent to have eyes on the process of undoing the protections of someone's phone. A more likely model is that multiple people—a lawyer (to examine the court order requiring the phone unlock) and an engineer—would work together to approve a "security-undo" update. Letting so many people access the server that authorizes updates introduces human risks to the system. Risks may arise from malfeasance or from sloppiness—or, most likely, a combination of both. Some will no doubt argue that these risks could be prevented by carefully vetting everyone who touches the updates. But that ignores the reality of how a system conducting multiple updates a day would work.

There are other threats as well. Using updates as a way to undo security protections means the evidence obtained will end up in court. The company providing the operating system (Apple for iPhones, Google for Android phones) would have to store the dangerous unlocking software, creating risk. Smart defense lawyers would want access to the software that undoes a phone's security protections. Inevitably, that code will leak—putting everyone's phones at risk.

The far graver risk, however, is to the update system itself. Frequent use of the update mechanism for unlocking could enable compromise of the update system, and such an outcome would be disastrous. Automatic patching and updating is perhaps the most important cybersecurity success of recent years, and a crucial part of that success is public trust in the process. Any compromise of the update system—especially one that stems from use of the update mechanism to perform an unlocking update—would inevitably cause people to shut off automatic updates. In part, that would come from the (largely unjustified) fear that the government was putting backdoors in everyone's systems. Actions that reduce the use of automatic updates would be highly counterproductive to cybersecurity. A policy that risks public abandonment of automatic updates does not make sense.

It's not that the Department of Justice is unaware of the security risks in using updates as a way to unlock phones. I testified about them a year and a half ago in a congressional hearing on the Apple/FBI case. Apple made the same argument in its court brief. The deputy attorney general’s speech seemed to willfully ignore this issue.

Making phones easy to open creates an additional security problem. In recent years, phones have become a secure and highly usable way to access online accounts. Popular apps such as Google Authenticator and Duo put software on phones that enables users to prove their identity to their company, bank or other high-value accounts. The use of smartphones as authenticators increases security, and it does so in ways that are easy for all users. But phones are secure authenticators for online accounts only if the data on the phones are also secure. That's why the efforts by Apple and Google are so important for security. Locked phones whose data can be accessed only by the owner are not a safety problem; they are good security design. The deputy attorney general seems to ignore the fact that exceptional access would undo phones’ ability to act as secure authenticators.
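
To see why the phone's integrity matters here, consider a minimal sketch of the time-based one-time-password scheme (RFC 6238) that authenticator apps of this kind implement. The shared secret lives on the phone; anyone who can read it off the device can mint valid login codes for the account. This is an illustrative implementation, not the code of any particular app.

    import base64, hashlib, hmac, struct, time

    def totp(secret_base32: str, interval: int = 30, digits: int = 6) -> str:
        # The base32 secret is stored on the phone when the account is enrolled.
        key = base64.b32decode(secret_base32, casefold=True)
        counter = int(time.time()) // interval           # 30-second time step
        digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
        offset = digest[-1] & 0x0F                       # dynamic truncation (RFC 4226)
        code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
        return str(code % 10 ** digits).zfill(digits)

A phone that can be unlocked only by its owner keeps that secret where it belongs.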

That brings me to Rosenstein's tone. I've been part of the Crypto Wars for more than two decades, and I found myself somewhat appalled by the rhetoric painting Silicon Valley as all about profit and government as all about public safety. I'm no fan of the way tech companies failed to catch Russian involvement in the 2016 election. They should have been paying more attention—and possibly even have had mechanisms in place to prevent or reduce such behavior. But on the issue of security, I believe there should be praise here, not criticism. Apple is concentrating on security—and thus on preventing crime.

Casting opponents as public enemies is an odd approach if one wants their cooperation in preventing and solving crimes. As Rosenstein knows, tech companies cooperate heavily with law enforcement. Data from July 1 to Dec. 31, 2016, corroborates Silicon Valley’s cooperation. During that period, Google received 74,000 requests from U.S. law enforcement for user and account data; the company produced data in 65 percent of the cases. Microsoft received 25,000 requests involving 45,000 accounts and responded to 68 percent of those; the company found no data in half of the remaining requests. Apple received U.S. government requests for information related to more than 20,000 phones and responded to 76 percent of the requests. Apple also received 81 emergency requests and supplied data in 88 percent of those. The companies are legally obliged to respond, but the data do not indicate that Silicon Valley is only in the business of selling products and making money. They're clearly cooperating with the U.S. government to fight crime.

I could put down my cudgel here, but two more issues about the speech are worth assessing: one about something that was said, and one about something that wasn't. Rosenstein made a comment regarding a two-person terrorist attack in Garland, Tex., in 2015. He stated, "On the morning of the attack, one of the terrorists exchanged 109 instant messages with an overseas terrorist. He used an app employing end-to-end encryption, so that law enforcement could not decode the messages." These were bad guys; “60 Minutes” reported that the men had "six guns, hundreds of rounds of ammunition, bulletproof and tactical vests." Killed by security guards at the scene, the attackers were unable to conduct the massacre they undoubtedly intended. But here is what's odd: There was an undercover FBI agent at the scene; he was in a car just behind the two men, and he fled as the attack started.

Now I am not suggesting that the FBI had advance notice of the attack and failed to prevent it—I don't believe that. But the FBI's lack of disclosure on what the bureau knew prevents analysis of where the failure to prevent the attack occurred. A speech that decries encryption's use but presents incomplete information about investigations does not shed light on the encryption debate. Quite the contrary.

That brings me to the aspect that hasn't been a large part of the encryption debates. I have already discussed why protecting phones is crucial, but the Russian threat vastly raises the stakes. In the immediate aftermath of the election, many turned to more secure communications tools. Obama, Clinton and Trump staff moved to using Signal, which provides end-to-end encryption. The Senate sergeant at arms subsequently approved the use of Signal by Senate staff. (When someone uses Signal, the communication is ephemeral, gone when the speakers end their call.) But it's not just political people who should be using secure communications and locked devices. Many more Americans should.

In its January report on Russian interference in the election, the Office of the Director of National Intelligence confirmed that there had been Russian efforts against civil society organizations "viewed as likely to shape future" U.S. policies: think tanks, research institutes and the like. Such organizations are the social glue between the public and legislators; they're crucial to healthy democracies. As I discuss in a recent Foreign Policy piece, when civil society is disrupted, democracy is disrupted.

Civil society organizations are not well placed to protect themselves against nation-state attacks. Many are small and on threadbare budgets; they largely lack the infrastructure and expertise to develop their own technical protections. These organizations need to adopt far better information security practices: moving to two-factor authentication; turning to Signal, WhatsApp and other end-to-end encryption systems for protecting communications (making many communications ephemeral), etc. They need more encryption, not less, and they need exactly what Silicon Valley is increasingly providing: communications and devices that are secure by default.

Yes, the loss of content in investigations (or at least the difficulty of collecting it) and the problems of opening locked phones do make law enforcement's job more difficult. But the FBI has been fighting against widespread use of encryption by the public for 25 years. It's time for the crime-fighting agency to vastly increase its capabilities in this area, improve information sharing with state and local police, and much more. In achieving this, it could take a page from the National Security Agency. The late 1990s were a dark period in NSA's history; the intelligence agency was said to be "going deaf." Facing enormous increases in the volume, velocity and variety of communications, the NSA was overwhelmed. One example: It missed the first round of India's 1998 nuclear tests. But as any reader of the Snowden disclosures knows, the agency recovered, changing its methods of collection and increasing its capabilities. Mike McConnell, who directed the NSA in the 1990s, has perspective on this; in 2015 he said, "From that time until now, NSA has had better 'sigint' [signals intelligence] than any time in history ... Technology will advance, and you can't stop it."

By contrast, the FBI has not adapted to the digital revolution. Instead, the bureau and the Justice Department continue to pursue policies that would restrict the security protections on communications and devices so that law enforcement can conduct investigations the way it did before the arrival of modern communications technologies. But technology moves forward, and so must law enforcement's investigative techniques. No one knows how communications technologies will evolve in the years to come. But it’s clear that decreasing the security protections in consumer devices and applications will inevitably make it easier for criminals, nation-state operatives and other bad guys to break in. That's not the way to improve U.S. security. The Department of Justice should know this by now.


Susan Landau is Bridge Professor in The Fletcher School and the School of Engineering, Department of Computer Science, at Tufts University, and is founding director of Tufts' MS program in Cybersecurity and Public Policy. Landau has testified before Congress and briefed U.S. and European policymakers on encryption, surveillance, and cybersecurity issues.