The Golden Age of Privacy Is Over, but Drones Aren’t to Blame

April 30, 2013, 9:45 AM

Deputy Troy Sella of the new technology department of the Los Angeles Sheriff's Department (LASD) prepares the SkySeer unmanned aerial vehicle (UAV) drone for launch during a June 2006 demonstration flight in Redlands, California. The LASD plans to purchase SkySeer drones to carry out surveillance and rescue operations.

Photo by Robyn Beck/AFP/Getty Images

This article arises from Future Tense, a partnership of Slate, the New America Foundation, and Arizona State University. On Tuesday, May 7, Future Tense will host an event in Washington, D.C., on the use of drones in the United States. For more information and to RSVP, visit the New America Foundation’s website.

Look! In the sky! Black helicopters! No … it’s drones! The difference, of course, is that the black helicopters beloved of conspiracy theorists do not exist, while drones will soon be coming to an airspace near you. The Federal Aviation Administration, in fact, is under congressional mandate to establish the rules for operating drones in domestic airspace, and they’re already being used as part of U.S. border security technology systems. Meanwhile, smaller drones such as quadrotors, many already fitted out with cameras, are widely available on Amazon and elsewhere.

Domestic deployment of drones is causing substantial concern, especially regarding the implications for privacy. The American Civil Liberties Union, for example, fears that “routine aerial surveillance would profoundly change the character of public life in America” and demands rules and regulations so that we can avoid “a ‘surveillance society’ in which our every move is monitored, tracked, recorded, and scrutinized by the government.” Glenn Greenwald warns about the arrival of the “Surveillance State,” and somewhat dramatically suggests drones will “psychologically terrorize the population.” Some animal rights groups allege that drones used to monitor hunters have been shot down; other people fear weaponized drones. As of March 2013, legislation had been proposed in 35 states to regulate drones, driven in large part by fears about loss of privacy.


It is useful to remember a few basic things. First, drone—along with the rough synonyms unmanned aerial vehicle (UAV) and unmanned aircraft system (UAS)—is a pretty sloppy term. Things called drones come in many different sizes and shapes, from airplane-size Predators used by governments for combat missions, to quadrotors used by hobbyists in legions of innovative ways, to rather cute robotic hummingbirds that are not yet, but undoubtedly will be, used by political parties, news and propaganda organizations, and divorce lawyers for all sorts of spying. But the reality is that drones are neither weapons nor surveillance devices: They are a platform. Put a camera on them, and you can do surveillance (as environmentalists have already done); put a ricin stinger or an exploding device on them, and they are a weapon; put light-emitting diodes on them, and they are entertainment; put sensors on them, and they can help emergency responders in complex disaster situations; put instruments on them, and you can significantly improve scientific research. Small drones are already being used to survey sensitive ecosystems without disturbing them, for example.

The question is not whether new technologies need vetting. Of course they do. And regulatory issues, from airspace safety to constitutional rights to misuse by private parties, do need to be engaged. The interesting question is: Why do drones seem to be drawing such disproportionate, indeed sometimes hysterical, attention? After all, massive data accumulation by private firms such as Facebook, Apple, Microsoft, and Google, not to mention the financial firms that hold your credit card records, has had a far more corrosive effect on real privacy than drones. And if one is worried about a "surveillance society," smartphone videos and the advent of universal recording of reality—such as Google Glass—are a more obvious place to start. So why the blowback on drones?

Part of the reason, of course, is that drones are already associated in the public mind with (sometimes contentious) violent military activities. This has probably been reinforced by the fact that the most well-publicized domestic drone deployment has been on the Mexican-U.S. border, which is associated in many people’s minds with drugs, national defense, and conflict. Say drone and people think Predator, not hobbyist quadrotor. The technology category has thus for many people already been framed by these pre-existing uses and media coverage.

More subtly, however, drones in public discourse appear to have become a symbol of an inchoate fear of a future in which privacy is nonexistent. This is not an unreasonable fear, but it is somewhat anachronistic, because in reality that future is already here—not because of drones, but because of pervasive video recording, RFID systems, massive data collection systems, and other, less obtrusive technologies. Fear of drones, then, becomes a reactionary spasm against the present, rather than a reasoned response to the marginal challenge that drones actually represent to what little privacy remains. Given such a dynamic, drones can never overcome the fear, because nothing that can actually be done to regulate the technology can ever call back the golden age of privacy.

There are significant policy implications as drones morph from a technology requiring assessment and proper regulation to a symbol of an impossible quest to regain a largely mythic past. To begin with, neither the technologists who work on drones nor the law enforcement and emergency response agencies that can significantly enhance public safety by using such technologies understand that, at least when it comes to privacy and public perception, they are not dealing with an engineering or rational public policy issue, but with a deeply symbolic framing that is mostly implicit and hidden. The kinds of instrumental design and regulatory responses that, for example, can help drones operate safely in public airspace do not work when powerful cultural currents are at play. If I am introducing a drone, and the discussion revolves around the technology itself, it is a rational, relatively objective, and bounded discussion. If, however, a technology that I want to introduce becomes a symbol of something larger, such as an inchoate fear of loss of privacy, I have a much more difficult task. This is especially true because it is not likely that most people engaging in the discussion will themselves realize that such a shift has occurred. They honestly believe that it is the drone as drone, not as symbol, about which they are concerned.

This offers a profound lesson for those who traffic in technology, from the Department of Defense, to industrialists, to emergency response and public agencies, to hobbyists, to the public itself: Beware technologies that are likely to morph into cultural symbols. An Active Denial System—a truck-mounted, non-lethal microwave crowd-control weapon—may be very useful in a complex conflict. But it also might be perceived as a ray weapon deployed by advanced societies against poor people, a symbol with 100 years of science fiction working against it, and a potential public relations disaster in a counterinsurgency environment. And you won't be able to use it. A drone is quite a useful device—indeed, a lifesaving technology—in many cases, but if it becomes a symbol of lost privacy and cultural anti-technology angst, you won't be able to use it.

The time to manage the technology to avoid such responses is in its early stages, but unfortunately the individuals and the institutions that are involved in those stages are usually neither trained nor particularly adept at perceiving the possibility of such implications. Ethical and cultural assessments of technologies need to be part of the technology development process from the beginning, not called in at the last moment when the battles are, if not lost, far more contentious than might have been necessary.