Million Dollar Vulnerabilities and an FBI for the Twenty-First Century
On February 16, US Magistrate Judge Sheri Pym, responding to an FBI request, ordered Apple to provide software to bypass the company's technical protections; this would unlock the work phone of Syed Farook, one of the two San Bernardino terrorists. Apple appealed the order. On February 29, US Magistrate Judge James Orenstein rejected an FBI request that Apple provide software to unlock New York drug dealer Fin Jeng's iPhone; the government appealed.
These two cases are now moot. The FBI apparently paid a lot—though probably not quite as much as $1.3 million—for a hack that bypassed the San Bernardino iPhone's encryption system. And someone supplied law enforcement with the passcode for the New York drug dealer's phone, obviating any need for hacking into the device.
Readers of Lawfare may be familiar with the national-security arguments I have made for broadly securing communications and devices. Given the cyber risks to critical infrastructure and the US economy, former DHS Secretary Michael Chertoff and many others, including a number of former senior NSA and DoD officials, have said that our long-term national security depends upon securing communications and devices. But how can law-enforcement investigations proceed if communications and devices are encrypted? Though the appeals in the San Bernardino and New York iPhone cases are moot, the larger issue of Going Dark is not.
The FBI is going dark, but the cause is not encryption; it is the Bureau's approach to investigations involving encryption and other anonymizing tools. Consider the FBI's 2017 budget request. It asks for an increase of $38.3 million and 0 positions for "challenges related to encryption, mobility, anonymization, and more"; current services stand at "39 positions (11 agents) and $31 million." These numbers explain the FBI's problem. Despite six years of publicly pressing for laws to control encryption's deployment, the FBI's staffing is at a remarkably low level, one that fits the threat profile of quite a few years ago, not the present. By contrast, the 2017 request for additional physical surveillance capabilities is for $8.2 million and 36 positions (18 agents); this request is on top of the current 1,770 positions (549 agents) and $297.8 million budget.
(The 2017 FBI budget request also includes a separate cyber component with 1,753 positions (897 agents) along with a current budget of $541.4 million, and a 2017 request of $85.1 million and 0 positions. While the cyber component interacts with the Going Dark program and small amounts of funds are fungible, the cyber effort does not substitute for the missing Going Dark capabilities.)
Over the last two decades we have moved virtually everything to digital media, networked it, and now, finally—far too late from a security point of view—started to encrypt it. In the midst of this ongoing debate, it is instructive to look at a report on one US government agency's response to the cyber challenge:
First we need to address the challenges of technological change and how to meet them effectively and efficiently ... We caution this is a marathon, not a sprint... (page 2).
We focus more on our "tradecraft" than on our customers, partners, and stakeholders ... Our workforce is not prepared for the future ... (page 3).
Align the budget and the workforce with the corporate strategy. You must get systems development under control ... and ensure that the entire workforce is marching to the beat of the business plan. (page 4).
As you may have guessed, these quotes are not from an FBI document. They are from a 1999 study for the NSA on adapting to a cyber future. According to a 1999 article by Seymour Hersh, "the senior military and civilian bureaucrats who work[ed] at the agency's headquarters, in suburban Fort Meade, Maryland, [had] failed to prepare fully for [the then current] high-volume flow of E-mail and fibre-optic transmissions—even as nations throughout Europe, Asia, and the Third World [had] begun exchanging diplomatic and national-security messages encrypted in unbreakable digital code." No one would accuse the NSA of being in a similar state today (doubters need only check the Snowden archives).
Though the FBI has some excellent people conducting cyber investigations, the FBI budget numbers show that its workforce is not prepared—not for the investigations of today, and not for those of the future. Nearly every serious crime today involves a cyber component. The Bureau needs to adapt its investigative capabilities to the world that has evolved over the last two decades.
The point is that regardless of what security changes Apple and other companies make to their hardware, there are 865 encryption products available worldwide today. Some will be easy to break into, some hard. The hardest to break into will be those, like Apple's, with an integrated hardware security system. But as the San Bernardino case demonstrated, breaking in is possible even in those cases.
As Steve Bellovin, Matt Blaze, Sandy Clark, and I argued several years ago, law enforcement will increasingly need to turn to exploiting vulnerabilities already present in devices in order to conduct electronic surveillance. Though more complicated and expensive than investigating unencrypted communications and unsecured devices, such lawful hacking must become part of investigators' toolkits. Nonetheless, a million dollars for a vulnerability is a high price to pay for a single investigation. It is easy to imagine quickly breaking the bank if high-priced vulnerabilities are purchased frequently.
But whether it is determining how many agents to assign to tracking a criminal's movements or how much effort to expend taking down a botnet, resource allocation has always been an issue in investigations. Law enforcement will need to learn when it is worth spending a couple of hundred thousand—or even a million—dollars to develop or purchase a tool, and when alternative investigative techniques provide sufficient information for the investigation. Judging by the San Bernardino and New York cases, the FBI does not appear to have had sufficient expertise to make those calls properly.
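To make that tradeoff concrete, here is a minimal back-of-the-envelope sketch, using entirely hypothetical figures for tool price, expected reuse, and agent costs—none of which come from the cases above—of the kind of comparison an agency would need to make between purchasing an exploit and pursuing alternative techniques.

```python
# Hypothetical cost comparison: buy an exploit vs. rely on alternative
# investigative techniques. All figures are illustrative assumptions,
# not data from the San Bernardino or New York cases.

def exploit_cost_per_case(purchase_price: float, expected_cases: int) -> float:
    """Amortize a one-time exploit purchase over the cases it can serve
    before the underlying vulnerability is patched or otherwise loses value."""
    return purchase_price / max(expected_cases, 1)

def alternative_cost_per_case(agent_hours: float, loaded_hourly_rate: float) -> float:
    """Estimate the cost of traditional techniques (physical surveillance,
    informants, metadata analysis) in agent time for a single case."""
    return agent_hours * loaded_hourly_rate

if __name__ == "__main__":
    # Assumed numbers, chosen only to show the shape of the calculation.
    exploit = exploit_cost_per_case(purchase_price=1_000_000, expected_cases=12)
    alternative = alternative_cost_per_case(agent_hours=1_500, loaded_hourly_rate=100.0)

    print(f"Exploit, amortized per case:      ${exploit:,.0f}")
    print(f"Alternative techniques, per case: ${alternative:,.0f}")
    print("Buy the tool" if exploit < alternative else "Use alternative techniques")
```

The real decision is, of course, far messier—exploits lose value the moment they are patched, and alternative techniques yield different kinds of evidence—but even this crude arithmetic shows why in-house expertise to estimate those inputs matters.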
Nearly two decades ago, the NSA developed "NSA II," redesigning itself to handle twenty-first-century signals intelligence. The FBI should have embarked then on a parallel effort to handle twenty-first-century criminal investigations. The Bureau did not, and it has no time to waste in doing so now. Rather than trying to fight the inevitable development and deployment of secure communications tools, the FBI should focus on how best to ramp up for twenty-first-century investigations, and on how to manage investigations over the next several years as it develops those capabilities. Congress also has an important role: crafting policies that enable state and local law enforcement, who are overwhelmed by the shift to secure communications tools, to take advantage of federal investigative expertise.
This is not just the best way to keep us secure while preventing law enforcement from Going Dark; it is the only way.