What are the perceptions of robots in war?
As of May 3, American unmanned systems had carried out 131 known airstrikes in Pakistan, well over triple the number the United States carried out with manned bombers in the opening round of the Kosovo War just a decade ago. By the old standards, this would be viewed as a war.
But why do we not view it as such? Is it because it is being run by the CIA, not by the U.S. military? This has certainly minimized public debate, but it is the 21st-century equivalent of the equally not-so-covert fleet of repainted B-26 bombers the CIA sent to the Bay of Pigs invasion. We have ended up in a very odd situation: the only true air war the United States is fighting right now is commanded not by an Air Force general but by a former congressman from California. This also means that not only are civilians handling weapons of war, but civilian officials and lawyers, rather than military officers, are wrestling with complex issues of war, such as operational concepts, strategy, and rules of engagement, that they have neither the background nor the mandate to manage.
Perceptions also matter on the receiving end. Approximately 7,000 miles away, our "efficient" and "costless" unmanned strikes are described as being, as one Middle East newspaper editor put it, "cruel and cowardly." Drone has become a colloquial word in Urdu, while Pakistani rockers sing about America not fighting with honor.
Contrary to the media reports on both sides of this divide, the men and women operating our robotic weapons systems make painstaking efforts to act with precision, and the collateral damage they've caused pales in comparison with that of any previous war in history. That doesn't change the fact that those very same strikes provoke anger on the other side of the globe. The fear here is that if we don't figure out how to master this narrative, America may well paint itself into the same corner that Israel did in Gaza, where it got very good at targeted strikes on Hamas leaders but also good at unintentionally inducing 12-year-old Palestinian boys to want to join Hamas.
Can the laws keep up?
Unmanned systems are not used just in war. DHS is flying them for border security. A range of local police departments are seeking to get into the game as well (Miami-Dade recently got authorization to use them). But for every organization that deploys robots to fight crime, hunt for forest fires, find Haitian earthquake survivors, or help shut down oil spills (to give recent examples of real-world use), there are others with something less noble in mind. Criminals in Taiwan have used them to rob apartment buildings, while the civilian vigilante "border militias" in Arizona have used drones to carry out their own (arguably illegal) patrols.
As this proliferation continues, we're going to have to figure out all sorts of issues that one doesn't normally think of with reference to robots, such as licensing and training. And, as one federal district court judge put it to me, robots' constant, and sometimes unauthorized, gathering and storage of information means that the legal questions they raise in such areas as probable cause and privacy will likely reach the Supreme Court. Does the Second Amendment cover my right to bear (robotic) arms? It sounds like a joke, but where do we draw the line, and why? A bar owner in Atlanta has already started to test this, building "Bum Bot," a robot armed with an infrared camera, spotlight, loudspeaker, and aluminum water cannon that he used to scare away homeless people and drug dealers from a parking lot near his business.
The challenge in much of this is not that robotics remove humans from decision-making but that they move the human role geographically and chronologically. Decisions made hundreds of feet away in the case of Bum Bot, thousands of miles away in the case of Predators, or even years ago in the case of the designers of such systems may have great relevance to a machine's actions (or inactions). An automated anti-aircraft cannon in South Africa, for example, had a "software glitch" and accidentally killed nine soldiers in a training exercise. How to investigate and adjudicate this real-world version of the famous scene from Robocop is not simple.
While technological advancement accelerates at an exponential pace, our institutions are struggling to keep up. For example, the prevailing laws of war, the Geneva Conventions, were written in a year when people listened to Al Jolson on 78 rpm records and the average house cost $7,400. Is it too much to ask them to regulate a 21st-century technology like an MQ-9 Reaper that is being used to target a modern-day insurgent who is intentionally violating those laws by hiding out in a civilian house?