Human Rights Watch Campaign on Killer Robots---and Tom Malinowski's Response

Benjamin Wittes
Friday, February 22, 2013, 5:53 PM
A few weeks ago, Tom Malinowski of Human Rights Watch had a thoughtful and serious---if sometimes playful---exchange with Matt, Ken, and me over fully autonomous weapons systems. But there seems to be another, less serious, side of Human Rights Watch's advocacy on the subject, a grass-roots campaign that is quite extreme in its rhetoric and tone. The other day, a Brookings colleague forwarded me an email she had received from Human Rights Watch. The subject line is, "Sign the Petition: Stop Killer Robots." It reads:
Dear Friend,

At this very moment, military researchers---including some in the United States---are laying the groundwork for an unthinkable type of weapon: killer robots.

These robots are intended to select targets and fire without direct human oversight. They would not be governed by human reasoning or be able to feel compassion. A killer robot might not be able to tell that a child running into the street with a toy gun is not a threat; it may not comprehend a mother's plea for reason as she defends her child's life.

Sign Human Rights Watch's petition to President Obama asking him to stop development of such weapons immediately.

On February 28th we will deliver the letter to the White House. Therefore, I urge you to sign this petition and share this email with your friends, family and coworkers. We cannot stand idly by.

Tell President Obama that human decision-making and responsibility must always be part of warfare and that taking humans out of battle could risk more lives than it saves.

If we don't stop this program before it is finished, there will be no going back.

Don't wait until it's too late.

Thank you,
The petition itself reads:
Mr. President:

I urge you to take every possible action to block the development and creation of fully autonomous weapons, or "killer robots." The use of these weapons would not be consistent with international humanitarian law and could increase the risks of death or injury to civilians during armed conflict.

We urgently need a pre-emptive prohibition on their development and use. I look to you to champion this important cause.
I asked Tom whether this was the serious conversation he had meant to start on the subject. He responded:
The serious conversation is the one that we've been having. As you know, I feel strongly that it would be a bad idea to allow the development of fully autonomous lethal weapons. I also acknowledge that the legal, moral and technological issues at stake are highly complex. HRW is not against all forms of autonomy, and reasonable people can disagree about how best to ensure that the worst case scenarios do not come to pass.

That said, large organizations must sometimes simplify and amplify their messages when making fundraising appeals or launching petition drives or campaigns to build grass-roots support for a cause. I've signed petitions in my life that consist of slogans followed by exclamation points; I've never signed one that reads like one of my Lawfare posts! Yet serious efforts to affect public policy often must be accompanied by this kind of activism, which tells policy makers that the public would be horrified if they allowed a feared outcome to happen. You sometimes have to establish the red lines (Save Darfur! Don't balance the budget on the backs of our seniors! Ban killer robots!) before getting down to a more thoughtful discussion about how the goal can and should be met.

The language in this particular appeal is clearly over-simplified. We are far from having "killer robots" deployed on the battlefield and interacting with mothers of children, and some phrases (like "taking humans out of battle") are obviously imprecise. As readers of my posts know, this is not the kind of language I would be most comfortable using. That said, it does reflect our sense of a possible worst case scenario and our view that machines are unlikely ever to be able to make the highly complex, and, we believe, inherently human, judgments required of ethical soldiers. It also asks people to endorse our bottom line and that of many other experts who have wrestled with this issue: that there should always be a human being in the loop when making decisions about lethal force.

If you disagree, you could always go to the White House petition site, and start one that says "Tell Tom Malinowski to demand more nuance in HRW petition drives!"

Benjamin Wittes is editor in chief of Lawfare and a Senior Fellow in Governance Studies at the Brookings Institution. He is the author of several books.