Who Will Be Accountable for Military Technology?

Nov. 15 2012 7:27 AM

As drones, robots, and even enhanced soldiers take the battlefield, questions of responsibility get more complicated.

A crew member of the USS Vincennes checks a guided missile launcher in 2002. The technologically advanced Vincennes shot down an Iranian passenger jet in 1988.

Photo by Gabriel Mistral/Getty Images.

Also in Slate: Brad Allenby and Carolyn Mattick explain why new technologies mean we need to rewrite the “rules of war,” and Fritz Allhoff examines the paradox of nonlethal weapons.

The “Global Campaign To Stop Killer Robots” kicked off in New York on Oct. 21. Nobel Peace Prize laureate Jody Williams urged the nations of the world to act against lethal autonomous robots, declaring them “beyond the pale.” Williams is not alone; on CNN earlier in October, Peter Bergen, the author of several best-selling books about Osama bin Laden, also argued for a convention regulating lethal robots. The International Committee for Robot Arms Control, a group of academic experts on robot technologies and international security, is on board as well. The pressure on the robots is mounting.

Underlying the debate about “killer robots” is the concern that machines are not, and cannot be, legally accountable for their actions. As law professor Oren Gross told this year’s inaugural “We Robot” conference on robots and the law, held at the University of Miami School of Law in April, domestic and international law are not well suited to dealing with robots that commit war crimes.


As technology advances, we face a very real danger that it will become increasingly difficult to hold those who wage war on our behalf accountable for what they do. Artificial intelligence is not the only technology that poses such accountability problems. Others, such as biological enhancement, pose them too, though of a different sort.

Machines entirely capable of replacing humans are not yet on the market, but robotic systems capable of using lethal force without a human in the loop already exist. The U.S. Navy’s Aegis Combat System, which can autonomously track enemy aircraft and guide weapons onto them, is one example. But if a robotic system goes “rogue” and commits what would, for a human, be a crime, there would not be much point in arresting the machine. Our gut instinct is that somebody should be held accountable, but it is difficult to see who. When the USS Vincennes shot down an Iranian airliner in 1988, killing 290 civilians, there were real people whose behavior one could investigate. Had the Aegis system on the Vincennes made the decision to shoot all by itself, assigning blame would have been much harder. When a robot decides, clear lines of responsibility are absent.

For obvious reasons, human beings do not like this. In an experiment conducted by the Human Interaction With Nature and Technological Systems (HINTS) Lab at the University of Washington, a robot named Robovie lied to students and cheated them out of a $20 reward. Sixty percent of the victims could not help feeling that Robovie was morally responsible for deceiving them. Commenting on the future use of robots in war, the HINTS experimenters noted in their final report that a military robot will probably be perceived by most “as partly, in some way, morally accountable for the harm it causes. This psychology will have to be factored into ongoing philosophical debate about robot ethics, jurisprudence, and the Laws of Armed Conflict.” Quite how this could be done is unclear.

An assumption of human free will is fundamental to any system of legal accountability. Unfortunately, the more the cognitive sciences develop, the more they suggest that our moral reasoning lies largely outside our control, and can even be manipulated.

An example is transcranial magnetic stimulation (TMS). Experiments with TMS show that a powerful magnet applied to the skull can alter a person’s moral reasoning. Unscrupulous military leaders could artificially distort their subordinates’ morality for the worse by attaching a TMS unit to their helmets. Yet if a soldier committed war crimes because somebody else had switched off his morals, it is hard to see how we could hold him responsible for his actions.
