Who Will Be Accountable for Military Technology?

What's to come?
Nov. 15 2012 7:27 AM

As drones, robots, and even enhanced soldiers take the battlefield, questions of responsibility get more complicated.

A crew member of the USS Vincennes checks a guided missile launcher in 2002. The technologically advanced Vincennes shot down an Iranian passenger jet in 1988.

Photo by Gabriel Mistral/Getty Images.

Also in Slate: Brad Allenby and Carolyn Mattick explain why new technologies mean we need to rewrite the “rules of war,” and Fritz Allhoff examines the paradox of nonlethal weapons.

The “Global Campaign To Stop Killer Robots” kicked off in New York on Oct. 21. Nobel Peace Prize Laureate Jody Williams urged the nations of the world to act against lethal autonomous robots, declaring them “beyond the pale.” Williams is not alone; on CNN earlier in October, Peter Bergen, the author of several best-selling books about Osama Bin Laden, also argued for a convention regulating lethal robots. The International Committee for Robot Arms Control, a group of academic experts on robot technologies and international security, is on board as well. The pressure on the robots is mounting.

Underlying the debate about “killer robots” is the concern that machines are not, and cannot be, legally accountable for their actions. As Professor Oren Gross of the University of Miami School of Law told this year’s inaugural “We Robot” conference on robots and the law in April, domestic and international law are ill suited to dealing with robots that commit war crimes.


As technology advances, we face a very real danger that it will become increasingly difficult to hold those who wage war on our behalf accountable for what they do. Artificial intelligence is not the only technology that poses such accountability problems. Others, such as bio-enhancement, do too, though of a different sort.

Machines entirely capable of replacing humans are not yet on the market, but robotic systems capable of using lethal force without a human in the loop already exist. The U.S. Navy’s Aegis Combat System, which can autonomously track enemy aircraft and guide weapons onto them, is an example. But if a robot system were to go “rogue” and commit what for a human would be a crime, there would not be much point in arresting the machine. Our gut instinct is that somebody should be held accountable, but it is difficult to see who. When the USS Vincennes shot down an Iranian airliner in 1988, killing 290 civilians, there were real people whose behavior one could investigate. If the Aegis system on the Vincennes had made the decision to shoot all by itself, assigning blame would have been much harder. When a robot decides, clear lines of responsibility are absent.

For obvious reasons, human beings do not like this. In an experiment conducted by the Human Interaction With Nature and Technological Systems Lab at the University of Washington, a robot named Robovie lied to students and cheated them out of a $20 reward. Sixty percent of the victims could not help feeling that Robovie was morally responsible for deceiving them. Commenting on the future use of robots in war, the HINTS experimenters noted in their final report that a military robot will probably be perceived by most “as partly, in some way, morally accountable for the harm it causes. This psychology will have to be factored into ongoing philosophical debate about robot ethics, jurisprudence, and the Laws of Armed Conflict.” Quite how this could be done is unclear.

An assumption of human free will is fundamental for any system of legal accountability. Unfortunately, the more the cognitive sciences develop, the more they suggest that our moral reasoning lies largely outside of our control—and can even be manipulated.

An example is transcranial magnetic stimulation. Experiments with TMS reveal that you can alter somebody’s moral reasoning using a powerful magnet. Unscrupulous military leaders could artificially distort their subordinates’ morality for the worse by attaching a TMS unit to their helmets. Yet if a soldier committed war crimes because somebody else had turned off his morals, it is hard to see how we could hold him responsible for his actions.
