
Be Careful What You Wish For: Device Hacking and the Law

Benjamin Wittes
Wednesday, January 6, 2016, 3:14 PM

I’ve been thinking about lawful device hacking of late—that is, government hacking of devices as a way around the “going dark” problem. Many civil libertarians and cryptographers seem actively to prefer government hacking of devices to the alternative of any kind of extraordinary access regime in which service providers would, by one means or another, maintain the capacity to decrypt the signals they carry.

An important 2014 paper by Steven Bellovin, Matt Blaze, Sandy Clark, and Susan Landau entitled “Lawful Hacking: Using Existing Vulnerabilities for Wiretapping on the Internet” described the matter as follows: “Instead of building wiretapping capabilities into communications infrastructure and applications, government wiretappers can behave like the bad guys. That is, they can exploit the rich supply of security vulnerabilities already existing in virtually every operating system and application to obtain access to communications of the targets of wiretap orders.” The authors conclude: “there needs to be a way for law enforcement to execute authorized wiretaps. The solution is remarkably simple. . . . The exploitation of existing vulnerabilities to accomplish legally authorized wiretapping creates uncomfortable issues. Yet we believe the technique is preferable for conducting wiretaps against targets when compared to other possible methods of wiretapping, like deliberately building vulnerabilities into the network or device, which would result in less security.”

The idea of lawful hacking as the civil liberties-friendly solution to the “going dark” problem has been gaining traction. Herb Lin recently described a variant of it on Lawfare:

Vendors build devices with the ability to receive over-the-air software updates, and it is in principle possible for particular software to be pushed to specific devices. Thus, if government authorities can technically identify the device that must be monitored, that device can be forced to install software that will copy and forward all communications that are sent or received through that device. Further, such installation can usually be done surreptitiously, just as . . . authorities require.

This approach to exceptional access for encrypted communications has much in common with (and is inspired by) an approach laid out in the “Lawful Hacking” paper by Bellovin, Blaze, Clark, and Landau.

The more I think about both the “Lawful Hacking” paper and Herb’s version of the idea, the more I think civil libertarians and cryptographers should be careful what they wish for. The law, it turns out, interacts with this idea in some curious ways, ways I doubt the advocates of lawful hacking as an alternative to encryption regulation have fully considered.

The following analysis is tentative. I welcome pushback if people think it’s in error. But I suspect that this approach, by having the government bypass encryption systems rather than requiring decryption, will actually deprive companies of one of the strongest legal protections now granted them in this area and will instead place them in an arena in which the government can, legally speaking, effectively dragoon them into helping investigators hack consumer devices.

Let’s unpack this.

Both FISA and Title III have what are called “technical assistance” provisions, which require companies to assist government agents in effectuating lawful wiretaps. FISA’s provision states that the FISA court “shall direct”:

upon the request of the applicant, a specified communication or other common carrier, landlord, custodian, or other specified person . . . [to] furnish the applicant forthwith all information, facilities, or technical assistance necessary to accomplish the electronic surveillance in such a manner as will protect its secrecy and produce a minimum of interference with the services that such carrier . . . is providing that target of electronic surveillance.

The Wiretap Act’s provision is substantially similar.

The language here is capacious. If it weren’t for other provisions of federal law, it might even address the “going dark” problem. Technical assistance, after all, could conceivably include technical assistance in decrypting signals.

But at least with respect to cryptography, other provisions of law intervene. As part of CALEA, Congress specifically wrote into law that “A telecommunications carrier shall not be responsible for decrypting, or ensuring the government’s ability to decrypt, any communication encrypted by a subscriber or customer, unless the encryption was provided by the carrier and the carrier possesses the information necessary to decrypt the communication.” In other words, the technical assistance provisions can theoretically be used to make carriers decrypt communications they have retained the capability to decrypt, but they cannot be used to force carriers to retain that capability.

But now let’s return to lawful hacking or to Herb’s version of it: pushing a government-written piece of malware out to a device to facilitate surveillance of that device.

Imagine the government wanted to spy on my phone and had a warrant to do so. Imagine that it approached AT&T (“a specified communication or other common carrier”) or Apple (covered conveniently by the “other specified person” language) with a court order directing the company to give technical assistance in pushing a piece of surveillance malware to my phone.

Note that this is not a request to decrypt anything, so it does not fall under the CALEA encryption exemption. Note also that the information this malware might steal could be far in excess of merely the communications on my phone. It could, in theory at least, monitor all activity on my device.

Would my providers be required to assist the government in installing it on my device?

Read the language of the statute closely. On its face, it’s really broad. It lets the government demand of any “communication or other common carrier” or “other specified person” that it provide “all information, facilities, or technical assistance necessary to accomplish the electronic surveillance.” On the surface of things, I don’t see why this would not include technical assistance in device hacking—or in pushing malware to a service provider’s customers.

Would a court actually require a provider to give the government this sort of technical assistance? I’m honestly not sure. The technical assistance language in FISA is, to my knowledge anyway, largely uninterpreted on this point.

But back in 1977, in United States v. New York Telephone Co., the Supreme Court addressed a closely analogous question in connection with a company’s refusal to help investigators install a pen register device. Here’s what the Court said in agreeing that the company could be compelled to provide the help:

we do not think that the Company was a third party so far removed from the underlying controversy that its assistance could not be permissibly compelled. A United States District Court found that there was probable cause to believe that the Company’s facilities were being employed to facilitate a criminal enterprise on a continuing basis. For the Company, with this knowledge, to refuse to supply the meager assistance required by the FBI in its efforts to put an end to this venture threatened obstruction of an investigation which would determine whether the Company’s facilities were being lawfully used. Moreover, it can hardly be contended that the Company, a highly regulated public utility with a duty to serve the public, had a substantial interest in not providing assistance. Certainly the use of pen registers is by no means offensive to it. The Company concedes that it regularly employs such devices without court order for the purposes of checking billing operations, detecting fraud, and preventing violations of law. It also agreed to supply the FBI with all the information required to install its own pen registers. Nor was the District Court’s order in any way burdensome. The order provided that the Company be fully reimbursed at prevailing rates, and compliance with it required minimal effort on the part of the Company and no disruption to its operations.

Finally, we note, as the Court of Appeals recognized, that without the Company’s assistance there is no conceivable way in which the surveillance authorized by the District Court could have been successfully accomplished.

In other words, the question would boil down to whether a request for technical assistance is unduly burdensome for companies that push things to users’ phones all the time. A court could go either way on this. It could reason that the assistance is technically doable, that companies push software to phones regularly, that the government will compensate the company for its help, and that the government has no other path to effectuating a valid court order.

Alternatively, a court could conclude that a non-utility with no public service obligation and no history of surreptitiously pushing malware to users’ phones has a valid trust relationship with its users that this sort of activity would grossly burden.

My point, for present purposes, is that when civil libertarians and cryptographers talk about lawful hacking, what that may mean in practice is the government’s commandeering companies into compromising their users’ devices.

The scope of this sort of compelled technical assistance is being litigated right now by Apple, albeit in the context of decrypting data at rest. The question in that case is whether Apple must help the government decrypt a seized phone in a context in which it potentially has the capacity to do so. Apple’s arguments that this request is unduly burdensome are instructive. The company argues that compliance with the order would divert engineering hours from the pursuit of the company’s other goals, and that Apple engineers might be required to testify about what they did. But then the company goes on:

public sensitivity to issues regarding digital privacy and security is at an unprecedented level. This is true not only with respect to illegal hacking by criminals but also in the area of government access—both disclosed and covert. Apple has taken a leadership role in the protection of its customers’ personal data against any form of improper access. Forcing Apple to extract data in this case, absent clear legal authority to do so, could threaten the trust between Apple and its customers and substantially tarnish the Apple brand. This reputational harm could have a longer term economic impact beyond the mere cost of performing the single extraction at issue (emphasis added).

In other words, the company’s principal argument that this sort of technical assistance is unduly burdensome is that being perceived as getting in bed with the government is bad for Apple’s brand. I don’t think this argument is going to fly in this context. (Apple is also arguing that CALEA preempts other technical assistance provisions with respect to data at rest, but that’s an argument that, for the reasons I explain above, has much less salience outside the realm of decryption.)

This case will be extremely important to watch in connection with how much work technical assistance provisions end up doing for the government. And as I say, there are at least some reasons to think a court might view technical assistance in hacking a device as different from, and more burdensome than, the assistance at issue in prior cases. Apple is not a public utility, after all, and it does not routinely push software to unconsenting users. If Apple agrees to push malware to its users, that doesn’t just damage the trust relationship in some abstract sense. Rather, it incentivizes Apple’s customers not to update their software. This implicates not only harms to Apple but also public policy harms a court might consider.

Device security is, after all, somewhat analogous to herd immunity in vaccinations. The more secure individual devices are, the more secure the entire community will be, even if one or two devices fall through the cracks. So there’s an argument that forcing Apple to push malware would damage the broader security ecosystem. Moreover, if a court were to brand Apple’s updates as an instrument of legal process, the taint, and the security consequences, could extend beyond the Apple brand and become a systemic problem of distrust in service providers generally. In short, the argument that technical assistance in lawful device hacking is qualitatively different from prior cases may well have legs.

Then again, the statute’s plain language being what it is, it may not have legs. Indeed, the question will likely end up turning on how easy it would be, as a technical matter, for a device maker, a carrier, or a company like Apple to unobtrusively send malware to its users.

It may also turn on who writes the malware and exactly what the program does. A request that a company merely push to a device a program the government itself has written may be far less burdensome than a request for help in creating the malware itself. Similarly, consider a provider that pushes its own software to users’ phones for fraud detection or other business purposes of its own. Imagine for a moment that this software can also be used to facilitate surveillance. Under such circumstances, the argument that asking the company to use it that way poses an undue burden looks weaker.

All of this is intended merely to raise a simple question: Is a regime in which companies may have to do these things better or worse from a civil liberties perspective than a regime under which they have to help with decryption?


Benjamin Wittes is editor in chief of Lawfare and a Senior Fellow in Governance Studies at the Brookings Institution. He is the author of several books.
