War Stories

We, Robot

Is it dangerous to let drones fight our wars for us?

A couple of weeks ago, I was on the phone with a reporter doing an interview about complaints by Sen. John Cornyn, R-Texas, that DHS hadn’t yet sent any border-patrol Predator drones to his state. In the midst of the interview, I snuck a peek at my computer to see what was happening with my two most important assets: my fantasy sports team and my investments. While, fortunately, everything was OK for the “Raging Pundits” franchise, the same couldn’t be said for Wall Street. Major components of the stock market had fallen by as much as 60 percent in a few minutes, with stocks like Accenture and Boston Beer plunging all the way down to just 1 cent before they jumped back to $40 a few minutes later.

Think about this: A politician was angry that the federal government hadn’t sent any robots to patrol his state’s borders. At the same time, our financial system was spinning off in a way that we humans not only didn’t control but didn’t even understand, because the artificial intelligence programs on which the major financial houses have come to rely acted in unexpected ways. As I sat there trying to piece it all together, it felt like I, Robot (the Isaac Asimov novel, not the crummy Will Smith movie) had come true. The book ends with a world in which robots are no longer seen as remarkable and AI runs an economy that has become too complex for mere human stock traders.

When the U.S. military went into Iraq in 2003, it had only a handful of robotic planes, commonly called “drones” but more accurately known as “unmanned aerial systems.” Today, we have more than 7,000 of these systems in the air, ranging from 48-foot-long Predators to micro-aerial vehicles that a single soldier can carry in a backpack. The invasion force used zero “unmanned ground vehicles,” but now we have more than 12,000, such as the lawnmower-size Packbot and Talon, which help find and defuse deadly roadside bombs. The Packbot is made by the same company that makes the Roomba robot vacuum cleaner. It is called iRobot.

While these real-world robots often seem straight out of science fiction, they are merely the first generation—the equivalent of the Model T Ford or the Wright brothers’ Flyer. Both the Air Force and Army have recently issued roadmaps that plan an ever-greater use of more advanced, autonomous, and deadly robots. While it’s easy to wave our hands in confusion at such a massive, quick change, much as those hapless traders on Wall Street did last week, we’ve actually been through something like this before. Many scientists compare our “unmanned” systems of today to the “horseless” carriages of a century ago. Automobiles became important not merely because most of us now have garages instead of stables but because they caused huge ripple effects. Horseless carriages didn’t just help mechanize industry and warfare; the technology also reshaped our cities through the creation of suburbia, gave massive economic power to Middle East nomads who lived above previously worthless oil deposits, and heated up our planet.

Like automobiles, gunpowder, the printing press, or the atomic bomb, the field of robotics is similarly revolutionary. The capabilities of the technology are enormous, but what really matters are the ripples it will send out into wider society. We can’t yet know what the ripple effects of robotics will be, but if we want to do a better job of shaping them, we need to identify exactly what questions we are going to have to answer.

Where is the (unmanned) military headed?

The U.S. military has gone from barely using robotics to using thousands of robots in a bureaucratic blink of an eye. But as one Air Force captain put it to me, the problem is that “It’s not ‘Let’s think this better’; it’s only ‘Give me more.’ ”

The Pentagon will need to avoid its usual tendency of buying overpriced, over-engineered, unwieldy systems that have gold-plated processors made in congressional committee chairmen’s districts.

We also need to have a vigorous debate about how best to use robots—we need what is known in the military as “doctrine.” Having the right doctrine can be the difference between winning and losing wars, between committing America to the 21st-century version of either the Maginot Line or the Blitzkrieg. This is not just a matter of tactics in the field but also of personnel and organizational issues. How can we better support the men and women operating this new technology, who may not be in the physical war zone but are experiencing an entirely new type of combat stress? How do we ensure their future career prospects so that the prevailing status quo culture inside a service does not stymie change? And we need to rethink the roles of warriors and civilians in this strange, new technologic space. Is it proper that, at present, 75 percent of the maintenance and weapons loading of systems like the Predator is outsourced to private contractors, including to controversial firms like Blackwater/Xe?

What are the perceptions of robots in war?

As of May 3, American unmanned systems had carried out 131 known airstrikes into Pakistan, well over triple the number carried out with manned bombers in the opening round of the Kosovo War just a decade ago. By the old standards, this would be viewed as a war.

But why do we not view it as such? Is it because it is being run by the CIA, not by the U.S. military? This has certainly minimized public debate, but it is the 21st-century equivalent of the equally not-so-covert fleet of repainted B-26 bombers the CIA sent to the Bay of Pigs invasion. We have ended up in a very odd situation: The only true air war that the United States is fighting right now is commanded not by an Air Force general but by a former congressman from California. This also means that not only are civilians handling weapons of war but also that civilian officials and lawyers, rather than military officers, are wrestling with complex issues of war—operational concept and strategy, rules of engagement, and the like—that they do not have the background or mandate to manage.

Perceptions also matter on the receiving end. Approximately 7,000 miles away, our “efficient” and “costless” unmanned strikes are described as being, as one Middle East newspaper editor put it, “cruel and cowardly.” Drone has become a colloquial word in Urdu, while Pakistani rockers sing about America not fighting with honor.

Contrary to the media reports on both sides of this divide, the men and women operating our robotic weapons systems make painstaking efforts to act with precision, and the collateral damage they’ve caused pales compared with any previous war in history. That doesn’t change the fact that those very same efforts provoke anger on the other side of the globe. The fear here is that if we don’t figure out how to master this narrative, America may well paint itself into the same corner that Israel did in Gaza, where it got very good at targeted strikes of Hamas leaders but also good at unintentionally inducing 12-year-old Palestinian boys to want to join Hamas.

Can the laws keep up?

Unmanned systems are not used just in war. DHS is flying them for border security. A range of local police departments are seeking to get into the game as well (Miami-Dade recently got authorization to use them). But for every organization that deploys robots to fight crime, hunt for forest fires, find Haitian earthquake survivors, or help shut down oil spills (to give recent examples of real-world use), there are others with something less noble in mind. Criminals in Taiwan have used them to rob apartment buildings, while the civilian vigilante “border militias” in Arizona have used drones to carry out their own (arguably illegal) patrols.

As this proliferation continues, we’re going to have to figure out all sorts of issues that one doesn’t normally think of with reference to robots, such as licensing and training. And, as one federal district court judge put it to me, robots’ constant, and sometimes unauthorized, gathering and storage of information means that the legal questions they raise in such areas as probable cause and privacy will likely reach the Supreme Court. Does the Second Amendment cover my right to bear (robotic) arms? It sounds like a joke, but where should the line be drawn, and why? A bar owner in Atlanta has already started the push to test this, building “Bum Bot,” a robot armed with an infrared camera, spotlight, loudspeaker, and aluminum water cannon that he used to scare away homeless people and drug dealers from a parking lot near his business.

The challenge in much of this is not that robotics remove humans from the decision-making but that they move that human role geographically and chronologically. Decisions now made hundreds of feet away in the case of Bum Bot or thousands of miles away in the case of Predators, or even years ago in the case of the designers of such systems, may have great relevance to a machine’s actions (or inactions). An automated anti-aircraft cannon in South Africa, for example, had a “software glitch” and accidentally killed nine soldiers in a training exercise. How to investigate and adjudicate this real-world version of the famous scene from Robocop is not simple.

While technological advancement accelerates at an exponential pace, our institutions are struggling to keep up. For example, the prevailing laws of war, the Geneva Conventions, were written in a year in which people listened to Al Jolson on 78rpm records and the average house cost $7,400. Is it too much to ask them to regulate a 21st-century technology like an MQ-9 Reaper that is being used to target a modern-day insurgent who is intentionally violating those laws by hiding out in a civilian house?

Will America go the way of Commodore computers?

Robotics is a growing industry, with billions of dollars in sales already and exponential growth curves expected in the future. And with its growing role in war, it’s also crucial to national security. Yet unlike Korea, or even Thailand, the United States does not have a national robotics strategy. How will America compete with the 43 other countries building, buying, and using military robotics, including allies such as the United Kingdom and Germany, as well as rivals such as Russia, China, and Iran? Can we stay ahead, or will we fall behind like so many other historic first-movers in technological revolutions?

The health of the American manufacturing economy and the state of science and mathematics education in our schools give another cause for worry. The United States graduates fewer students with a degree in IT or engineering than it did in 1986 (but not to worry, we have had a more-than-500-percent rise in “parks, recreation, leisure, and fitness studies”).

What does the “open source” revolution hold for us?

Robots are not like aircraft carriers or nuclear bombs; much of the technology is off-the-shelf and even do-it-yourself. Hitler’s Luftwaffe may not have been able to fly across the Atlantic during World War II, but a 77-year-old blind man has already done so with his own homemade drone. This proliferating technology will thus inevitably pass into the wrong hands, allowing small groups and even individuals to wield great power. Indeed, Hezbollah flew four such weapons in its war with Israel, while al-Qaida reportedly explored using drones to attack a G-8 summit.

As the 9-11 Commission warned, the 2001 tragedy was caused in part by a “failure of imagination.” We may well have to wrestle with something similar in the realm of robotics. We need to develop a military and homeland-security strategy that considers not only how we will best use such sophisticated technology but also how others will use it against us. That means widening the threat scenarios our agencies plan and train for. It also means new legal regimes to determine who should have access to such dangerous technologies—lest our best new weapon come back to bite us.

It is easy to discount all this as mere science fiction. Indeed, the idea of wrestling with laws for robots seems more apt for a science-fiction convention like Comic-Con than a Slate-sponsored conference in Washington, D.C. But remember past science-fiction fantasies: Jules Verne’s submarine, A.A. Milne’s “military aeroplane,” or H.G. Wells’ “land ironclads” (what Winston Churchill renamed the “tank”) and “atomic bombs.” What was very recently imaginary becomes all too real, all too quickly.



And thus, it is crucial for serious people to engage upon the serious issues that are playing out before us. And it is not like the stock market: With real-world robots, we can’t just cancel the trades and act like nothing happened.



The article is being published in conjunction with “Warring Futures: How Biotech and Robotics Are Transforming Today’s Military—and How That Will Change the Rest of Us,” a May 24 conference in Washington, D.C., sponsored by Slate, the New America Foundation, and Arizona State University. You can sign up to attend the event here. Read an article by Fred Kaplan about how the nature of war limits the use of technology and by Brad Allenby about why it’s futile to resist new military technology.


