
My Reply to Empty Wheelerman II

Benjamin Wittes
Saturday, July 9, 2011, 10:28 AM


In her post the other day, Marcy Wheeler hit on a theme I have been thinking a great deal about recently--indeed, that I have been writing a book about. So I know this is supposed to be a debate, but in what follows--I confess--I want to elaborate on, not argue with, a point that underlies Wheeler's post. And I want to show the thematic links between it and things I have been writing about for the past year. Wheeler, to put it mildly, comes to this discussion from a different place politically than I do. So her post is full of assumptions and rhetoric that I don't share. But cut through all that for a minute, and focus on her core point:
I believe that drones are a tool that presents a heightened threat to the concept of sovereignty, for better or worse. . . . One thing I think is stunning about our drone war is the degree to which it impacts issues of sovereignty almost everywhere we use it. . . . I’m not entirely certain whether chipping away at sovereignty is a good thing–will it allow oppressed people to band together to fight the global elite, or a terrible thing–will it allow weaponized elites to turn average people back into serfs in exchange for the security the nation-state used to offer (though of course I’ve repeatedly suggested we’re headed for the latter condition). But our elected representatives are wittingly and unwittingly pursuing policies that accelerate the process.
I agree emphatically with Wheeler's focus on sovereignty here--although for reasons somewhat different from the ones she offers. Indeed, I think Wheeler doesn't go quite far enough. For it isn't just sovereignty at issue in the long run; it is governance itself. Robotics is one of several technological platforms that we can expect to greatly enhance the power of individuals and small groups relative to states. The more advanced of these technological areas are networked computers and biotechnology, but robotics is not all that far behind--a point Ken Anderson alludes to in a post over at the Volokh Conspiracy. Right now, the United States is using robotics, as Wheeler points out, in situations that raise issues for other countries' sovereignty and governance, and it has a dominant technological advantage in the field. But that's not going to continue. Eventually, other countries--and other groups, and other individuals--will use robotics in a fashion that has implications for American sovereignty and, more generally, for the ability of governments to protect security. I have written about this problem before, in a Brookings paper I wrote last year and in this volume (see pp. 438-439) published by the Kauffman Foundation, where I described this class of technologies as follows:
Technologies of mass empowerment—of which biotechnology and globally networked computers are the paradigmatic examples—have certain common characteristics that bear emphasis. First, they are widely disseminated technologies that depend on readily available training and materials. Unlike nuclear technologies, they did not develop principally in classified settings at government-run labs with the government controlling access to the key materials. Rather, they developed in public, in open dialogue, with nonmilitary purposes in mind. Scientists did not discover the double helix or sequence the human genome in order to figure out how to design viruses to kill people. Nor did engineers build the Internet so that terrorists or foreign governments could seize control of the Hoover Dam—or even so that our intelligence agencies could seize control of some other country's dams. Yet in the cyber arena, attacks have grown up alongside the platform. And in the biotech arena, a public literature now exists to teach bad guys how to do horrific things—and the materials, unlike highly enriched uranium, are neither scarce nor expensive.
Second, the destructive technologies are virtually inseparable from the socially beneficial innovations that give rise to them. In the wrong hands, research on how to use genetics to cure and prevent disease can be used to cause and spread disease. A paper on how to shield computers against viruses necessarily involves analysis of viruses that one can use to write stronger ones. Defensive research in this space will potentially empower the bad guys too.
Third, the use of these technologies blurs the distinction between foreign and domestic threats and, indeed, makes attribution of any attack extremely difficult. As every student in a biological laboratory and every individual on his home computer becomes a possible threat to national security, traditional techniques of surveillance, deterrence, and nonproliferation become increasingly ill suited to detecting and preventing terrorist activity. Large numbers of cyberattacks already take place with attribution impossible or long delayed. In the case of the anthrax attacks in the wake of September 11, attribution took seven years and remains contested to this day. Indeed, often in these cases, a targeted entity will not be able to determine whether its attacker is another state, a political group, a criminal group, or a lone gunman.
Right now, it may seem outlandish to imagine that drone technology--or other robotics technologies--will develop in this fashion. But don't kid yourself. The whole history of computing is a history of delivering more and more power in smaller and smaller packages. Robotics will not be different. If you think it will, take a good look at this website devoted to drones as an individual hobby. And while individuals won't weaponize robots in the form of Predators and Reapers (as in this cartoon), they surely will figure out how to use robotics to, say, conduct bombings or assassinate presidents. Why send a suicide bomber when you can send a machine? Let me go a step further than Wheeler. In a different paper--still not published--I also describe the development of this class of technologies as a challenge to the Westphalian system. But it's not just sovereignty that is at issue. As I describe in the Kauffman paper, the development of technologies of mass empowerment seems to me to present a profound challenge to the governance of security itself:
The lack of promising [policy] options gives rise to what I suspect will be the most profound impact of this class of technologies on our law, one that touches the very structural arrangements of power in American life—and the lives of most other states as well. That is, it stands to bring about a substantial erosion of the government's monopoly on security policy, putting in diffuse and private hands, for the first time, responsibility for protecting the nation. There are people who would write that sentence with joy in their hearts. I am not one of them. My views on executive capacity—notwithstanding the excesses of the Bush administration—are unapologetically Hamiltonian. The Constitutional assumption that the political branches, particularly the executive branch, are both responsible for national security and have the tools necessary to fulfill that responsibility is a comforting one, the destabilization of which I find scary. “Power to the people!” is a slogan that has always rung to me of gridlock at best, mob rule at worst.
The Constitution contains very few textual exceptions to the notion that national security is a federal responsibility. One, the Second Amendment, embodies the Framers' reverence for state militias, both as a means of fending off native attacks and as a means of preventing federal encroachments on state prerogatives. The other, the Letters of Marque Clause of Article I, contemplates a limited role for the private sector in military engagements—under Congressional supervision. Both involve institutions that have long since lapsed into disuse. The broader and more lasting presumptions in the document were that Congress would make the rules of security and that the president would lead the armed forces and the larger executive apparatus in a military or other crisis. I'm not sure how these presumptions hold in the face of rapid development of these technologies.
This point is perhaps most vivid in the cyber arena, where huge amounts of traffic into and out of the United States—including government traffic—now take place over privately owned lines and the government quite literally does not control the channels through which attacks can occur. But it's also true in the biotechnology sphere. Because the revolution has taken place largely in private, not government, hands, the government employs only a fraction of the capable individuals. And the capacity to respond to or prevent an attack is therefore as diffuse as the capacity to launch one.
All of which is simply to say "amen" to Wheeler's insistence that drones have implications for sovereignty--and for governance more generally. The implications of drones in and of themselves are not enormous. But as an example of a much larger class of technologies that will, over time, level things between states and, more fundamentally, between states and non-states, the implications are huge. I even agree with Wheeler that this trend ought to be the subject of wider and more serious public debate.

Benjamin Wittes is editor in chief of Lawfare and a Senior Fellow in Governance Studies at the Brookings Institution. He is the author of several books.
