An Opportunity to Change the Conversation on Autonomous Weapon Systems

Rebecca Crootof, Frauke Renz
Thursday, June 15, 2017, 4:01 PM

Published by The Lawfare Institute
in Cooperation With
Brookings

Those working to ban “killer robots” were clearly distraught when the Chair of the Convention on Certain Conventional Weapons (CCW) recently announced that the August 2017 meeting—and, by extension, the inaugural meeting of the Group of Governmental Experts on Lethal Autonomous Weapon Systems—was canceled due to insufficient funding. Disappointed but undeterred, ban advocates proposed that states “move swiftly . . . to negotiate a new international treaty prohibiting fully autonomous weapons.”

But this seeming setback towards a treaty ban may actually be an opportunity. For far too long, the international conversation on autonomous weapon systems has been mired in the to-ban-or-not-to-ban debate. Yet the parties to that static faceoff have surprisingly similar aims: to evaluate and proactively address risks associated with increasing autonomy in weapon systems, to preserve the law of armed conflict's humanitarian protections, and to minimize human suffering and death. Instead of myopically pursuing an unlikely treaty ban, the international community should seize this chance to explore alternative approaches to addressing the regulatory challenges posed by autonomous weapon systems.

A treaty ban on autonomous weapon systems has always been a moonshot. First, there is no accepted definition of what constitutes an autonomous weapon system—and, by extension, no consensus on whether we're discussing banning weaponry already in widespread use or some hypothetical future technology. Second, as one of us has detailed in a study of weapons bans, autonomous weapon systems share few characteristics with weapons that have been successfully banned in the past. (This study also highlights why the unusually successful ban on permanently blinding lasers, often cited by advocates for a ban on autonomous weapon systems, is actually poor precedent: the two technologies share few relevant traits.)

However, the conversations that occur in the process of working towards a regulatory treaty can be of far more utility than a finished product. Consider recent history regarding the regulation of Private Military and Security Companies (PMSC). Of the several efforts to address legal concerns particular to states’ increased reliance on PMSC, including the Montreux Document and the International Code of Conduct, the non-binding recommendations drafted by an open-ended intergovernmental working group (IGWG) have been of particular use. In 2010, the UN Human Rights Council established the IGWG with a mandate to “consider the possibility of elaborating an international regulatory framework, including, inter alia, the option of elaborating a legally binding instrument on the regulation, monitoring and oversight of the activities of private military and security companies.” Over the past seven years, the IGWG has brought together government representatives, academics, and civil society and human rights activists to shed light on pressing legal questions and to hold frank debates on how best to address them. Based on these convenings, the IGWG has drafted reports that have been foundational for states constructing their own national regulatory frameworks. While the IGWG has not made much progress towards “elaborating a legally binding instrument,” there is now little consensus that any such instrument would be of more use than the current, conversational approach.

Rather than focusing on negotiating a treaty ban, it would be far more productive to set up a working group to explore the perennial legal issues associated with autonomous weapon systems. The CCW's Group of Governmental Experts could have been a space for IGWG-like discussions; should the CCW manage to raise the needed monies, it might still be. After all, while a few states have explicitly called for a ban on autonomous weapon systems, the vast majority of participants in the CCW process have been more interested in discussing how this new technology should be regulated. Alternatively, states could set up a more inclusive open-ended working group to discuss what is fundamentally new about autonomous weapon systems and develop governance guidelines. In short, an entity charged with identifying appropriate regulations for autonomous weapon systems would likely garner wider participation and, perhaps counterintuitively, would likely be more impactful than one dedicated to producing a treaty ban.

To be sure, working groups are often criticized for producing only non-binding recommendations. When compared with a treaty—considered by many to be the gold standard of international law—mere guidance seems weak and ineffectual. But, given the unlikelihood of a treaty ban on autonomous weapon systems, the probable near-term options are non-binding guidance or nothing. And while treaties are valued in part because they are stable, relatively inflexible written documents, that very stability renders them ill-suited to regulating new and evolving technologies. Tech-specific treaties risk early obsolescence and displacement by subsequent state action in response to new developments, especially when they are drafted with incomplete information about the technology at issue. Non-binding guidance, in contrast, is flexible and responsive. Recall also that time-tested non-binding guidance may eventually morph into customary international law or serve as a foundation for a more considered regulatory treaty.

Moreover, non-binding recommendations can prove surprisingly influential in practice. For example, the European Commission has asked member states to develop action plans to implement the non-binding Guiding Principles on Business & Human Rights in their domestic law. Of course, for its recommendations to be respected and useful, a working group should ideally be comprised of diverse stakeholders—including government, academic, industry, and civil society representatives—and represent the full spectrum of viewpoints—ranging from those favoring a wait-and-see approach to those advocating for proactive regulation to those arguing for an outright ban.

If we as an international community do not seize this opportunity to change the conversation and engage with the full range and depth of legal challenges raised by autonomous weapon systems, we risk remaining stuck in the simplified, binary ban debate—while the development and deployment of autonomous weapon systems continues, unabated and unregulated.


Rebecca Crootof is an Assistant Professor of Law at the University of Richmond School of Law. Dr. Crootof's primary areas of research include technology law, international law, and torts; her written work explores questions stemming from the iterative relationship between law and technology, often in light of social changes sparked by increasingly autonomous systems, artificial intelligence, cyberspace, robotics, and the Internet of Things. Work available at www.crootof.com.
Frauke Renz is a PhD Candidate writing on state accountability and new trends of warfare, such as private military and security companies as well as autonomous weapon systems. She is a Non-Resident Pacific Forum CSIS Fellow and published on geopolitics and law of the sea. Previously, she was Visiting Researcher at the Georgetown University Law Center as well as the East-West Center.