
Nov. 15 2012 7:27 AM

Who Will Be Accountable for Military Technology?

As drones, robots, and even enhanced soldiers take the battlefield, questions of responsibility get more complicated.

A crew member of the USS Vincennes checks a guided missile launcher in 2002. The technologically advanced Vincennes shot down an Iranian passenger jet in 1988.

Photo by Gabriel Mistral/Getty Images.

Also in Slate: Brad Allenby and Carolyn Mattick explain why new technologies mean we need to rewrite the “rules of war,” and Fritz Allhoff examines the paradox of nonlethal weapons.

The “Global Campaign to Stop Killer Robots” kicked off in New York on Oct. 21. Nobel Peace Prize laureate Jody Williams urged the nations of the world to act against lethal autonomous robots, declaring them “beyond the pale.” Williams is not alone; on CNN earlier in October, Peter Bergen, the author of several best-selling books about Osama bin Laden, also argued for a convention regulating lethal robots. The International Committee for Robot Arms Control, a group of academic experts on robot technologies and international security, is on board as well. The pressure on the robots is mounting.

Underlying the debate about “killer robots” is the concern that machines are not, and cannot be, legally accountable for their actions. As professor Oren Gross of the University of Miami School of Law told this year’s inaugural “We Robot” conference on robots and the law in April, domestic and international law are ill suited to dealing with robots that commit war crimes.


As technology advances, we face a very real danger that it will become increasingly difficult to hold those who wage war on our behalf accountable for what they do. Artificial intelligence is not the only technology that poses such accountability problems; others, such as biological enhancement, raise problems of a different sort.

Machines capable of entirely replacing humans are not yet on the market, but robotic systems able to use lethal force without a human in the loop already exist. The U.S. Navy’s Aegis Combat System, which can autonomously track enemy aircraft and guide weapons onto them, is one example. But if a robotic system goes “rogue” and commits what for a human would be a crime, there would not be much point in arresting the machine. Our gut instinct is that somebody should be held accountable, but it is difficult to see who. When the USS Vincennes shot down an Iranian airliner in 1988, killing 290 civilians, there were real people whose behavior one could investigate. If the Aegis system on the Vincennes had made the decision to shoot all by itself, assigning responsibility would have been much harder. When a robot decides, clear lines of responsibility are absent.

For obvious reasons, human beings do not like this. An experiment conducted by the Human Interaction With Nature and Technological Systems Lab at the University of Washington had a robot, named Robovie, lie to students and cheat them out of a $20 reward. Sixty percent of victims could not help feeling that Robovie was morally responsible for deceiving them. Commenting on the future use of robots in war, the HINTS experimenters noted in their final report that a military robot will probably be perceived by most “as partly, in some way, morally accountable for the harm it causes. This psychology will have to be factored into ongoing philosophical debate about robot ethics, jurisprudence, and the Laws of Armed Conflict.” Quite how this could be done is unclear.

An assumption of human free will is fundamental for any system of legal accountability. Unfortunately, the more the cognitive sciences develop, the more they suggest that our moral reasoning lies largely outside of our control—and can even be manipulated.

An example is transcranial magnetic stimulation. Experiments with TMS reveal that you can alter somebody’s moral reasoning using a powerful magnet. Unscrupulous military leaders could artificially distort their subordinates’ morality for the worse by attaching a TMS unit to their helmets. Yet if a soldier committed war crimes because somebody else had turned off his morals, it is hard to see how we could hold him responsible for his actions.
