Defense Science Board on Autonomous Systems

Jack Goldsmith
Wednesday, September 12, 2012, 6:29 AM

The Defense Science Board recently issued a new study on The Role of Autonomy in DoD Systems.  Spencer Ackerman has a good story summarizing and explaining its conclusions:
The Pentagon’s science advisers want military robots to operate with far greater autonomy than they do today. Only one problem: There’s a cloud of distrust and misunderstanding hovering over the robots that the Pentagon already has.

That’s an unexpected conclusion in a July study from the Defense Science Board, recently acquired by Steve Aftergood of the Federation of American Scientists. The Board wondered what’s inhibiting the development of autonomous military vehicles and other systems. It found that the humans who have to interact with robots in high-stakes situations often labor under the misimpression that autonomy means the machine can do a human’s job, rather than help a human do her job more efficiently. And some simply don’t have faith that the robots work as directed.

There’s a “lack of trust among operators that a given unmanned system will operate as intended,” the Board found. One major reason: “Most [Defense Department] deployments of unmanned systems were motivated by the pressing needs of conflict, so systems were rushed to theater with inadequate support, resources, training and concepts of operation.” War may spur innovation, but it’s not always the best place to beta-test.

And there’s a deeper, conceptual problem behind the frustration. “Treating autonomy as a widget or ‘black box’ supports an ‘us versus the computer’ attitude among commanders rather than the more appropriate understanding that there are no fully autonomous systems just as there are no fully autonomous soldiers, sailors, airmen or Marines,” the Board found. . . .

Jack Goldsmith is the Learned Hand Professor at Harvard Law School, co-founder of Lawfare, and a Non-Resident Senior Fellow at the American Enterprise Institute. Before coming to Harvard, Professor Goldsmith served as Assistant Attorney General, Office of Legal Counsel from 2003-2004, and Special Counsel to the Department of Defense from 2002-2003.
