
The FBI and I Agree

Susan Landau
Wednesday, March 15, 2017, 4:54 PM


Last year when I testified before the House Judiciary Committee on the Apple/FBI case, I argued that what the FBI was asking for would create a serious security risk for Apple. It seems the FBI agrees. In a case over releasing information about the unlocking of a phone, the FBI filed a brief arguing that the security risks mean it should not be required to release information about the company that unlocked the phone.

Let me back up a little and remind you of the facts. After the terrorist shootings in San Bernardino, the FBI took possession of a phone belonging to the San Bernardino Health Department and issued to Syed Rizwan Farook, one of the terrorists. The phone, an Apple iPhone running iOS 9.0, was locked. Apple's security protections meant that each time a user guessed an incorrect PIN, there was an increasing delay before another attempt could be made to unlock the phone. And after ten incorrect tries, the operating system would erase the phone's data.
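To make the mechanism concrete, here is a minimal sketch of those two protections. The delay schedule, attempt limit, and function names are illustrative assumptions; Apple's actual implementation is enforced at a much lower level, and its full timing schedule is not public.

```python
import time

# Illustrative delay schedule (seconds), indexed by prior failed attempts;
# the real iOS schedule differs and is enforced in hardware/firmware.
DELAY_SCHEDULE = [0, 0, 0, 0, 60, 300, 900, 3600, 3600, 3600]
MAX_ATTEMPTS = 10  # the tenth failure triggers erasure

def try_unlock(stored_pin: str, failed_attempts: int, guess: str) -> str:
    """Toy model of the two protections: escalating delay, then wipe."""
    if failed_attempts >= MAX_ATTEMPTS:
        return "data erased"                      # wipe after ten failures
    time.sleep(DELAY_SCHEDULE[failed_attempts])   # escalating delay
    return "unlocked" if guess == stored_pin else "wrong PIN"
```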

The FBI wanted Apple to issue an update, particularized to Farook's phone, that would undo these protections. Then law-enforcement agents could brute-force PIN guesses until they were able to unlock the device. The FBI argued that Apple could do the update securely. But during last year's hearing, FBI Director James Comey and Manhattan DA Cy Vance revealed they had a number of phones they wanted unlocked (the number is now in the many hundreds). And therein lies the rub.
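Some back-of-the-envelope arithmetic shows why removing those protections matters. Assuming a four-digit PIN and a guessing rate of roughly a dozen tries per second once the delay and wipe logic are disabled (the rate is an assumption for illustration), exhausting the entire PIN space takes under a quarter of an hour:

```python
# Assumptions: 4-digit PIN space; ~12.5 tries/second with protections off.
PIN_SPACE = 10_000
TRIES_PER_SECOND = 12.5

worst_case_seconds = PIN_SPACE / TRIES_PER_SECOND
print(f"worst case: {worst_case_seconds / 60:.0f} minutes")  # ~13 minutes
```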

Apple would not be doing a single update to Farook's iPhone. It would be employing these updates over and over again, particularized to each phone for which law enforcement had a search warrant. It would be very hard to protect the software, which, after all, could not be written, used once, and then destroyed. And the updates themselves, instead of being done a few times a year, would be used frequently, putting the process itself at risk. As I said during the hearing:

[R]outinizing the signing process will make it too easy to subvert Apple's process and download malware onto customers' devices. My concern is not that the FBI will download rogue software updates onto unsuspecting customers; there is a rigorous wiretap warrant process to prevent government wiretaps from being abused. Rather, I am concerned that routinization will make it too easy for a sophisticated enemy, whether organized crime or a nation attempting an Advanced Persistent Threat attack, to mislead the Apple signing process.

A process that is used rarely—as is now the case in signing updates—is a process that can be carefully scrutinized each time it occurs; the chance for malfeasance is low. But make things routine, and instead of several senior people being involved in the signing process, a web form is used, and a low-level employee is placed in charge of code signing. Scrutiny diminishes. No one pays a great deal of attention, and it becomes easy for rogue requests to be slipped into the queue.

Or worse. The software would be a prime target of hackers and foreign governments, who would like nothing better than to acquire software enabling them to hack iPhones. What the FBI was asking for was, from a security vantage point, too dangerous for Apple to build.
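The reason the signing mechanism, and not any single update, is the crown jewel is simple: a device installs anything whose signature verifies against the vendor's key. The sketch below, written with the Python cryptography package and illustrative names, shows that trust relationship; it is a simplification for exposition, not Apple's actual update protocol:

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Vendor side: a single private key signs every update that ships.
vendor_key = Ed25519PrivateKey.generate()
update_blob = b"firmware image bytes"  # stand-in for a real update
signature = vendor_key.sign(update_blob)

# Device side: it trusts anything the key signed -- including a rogue
# update slipped into a routinized signing queue.
public_key = vendor_key.public_key()
try:
    public_key.verify(signature, update_blob)
    print("update accepted")  # the device installs whatever was signed
except InvalidSignature:
    print("update rejected")
```

An attacker who subverts the signing step never has to break a single phone; every device does the installing for them.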

Apparently, the FBI sees the same threat—but to the vendor, the company that unlocked the phone for the bureau. In a brief arguing that the government should not have to release the company's name, the FBI states:

[R]evealing the vendor's identity immediately exposes the vendor to attacks and infiltration by hostile entities wishing to exploit the technology they provided to the FBI. While FBI networks are protected by sophisticated cyber-security measures and FBI facilities are protected by armed guards and/or sophisticated physical security measures, the vendor likely does not have the same resources to devote to its own security. With this in mind, the FBI would be taking a substantial risk to the continued viability of this technology by acknowledging the vendor ... It is reasonable to conclude that [the vendor] would not be able to thwart the same types of attacks and infiltration attempts the FBI is currently able to defend against; thus, revealing the vendor's identity may provide hostile enemies with a softer target for attack and infiltration ...

Asking Apple to develop software to undo the iPhone's security carries a much greater risk than this, for it is not only the particular update software that would be under threat. The more serious risk would be to Apple's update authorization mechanism. And so, the FBI has just made a very strong argument as to why Apple should not be compelled to break its own security.


Susan Landau is Bridge Professor in The Fletcher School and Tufts School of Engineering, Department of Computer Science, Tufts University, and is founding director of the Tufts MS program in Cybersecurity and Public Policy. Landau has testified before Congress and briefed U.S. and European policymakers on encryption, surveillance, and cybersecurity issues.
