In approximately 90 cities throughout the United States, ShotSpotter sensors—gunfire detection technology—time-stamp and triangulate the location of shots fired in an effort to help law enforcement crack down on gun violence. In many underserved communities, residents are unlikely to call 911 to report gunshots, so the technology is meant to make police aware of gun use in their jurisdictions that would otherwise go unreported.
Across the country, law enforcement agencies are already deploying powerful technologies such as ShotSpotter, surveillance camera systems, police body-worn cameras, and predictive algorithms that attempt to anticipate where crime will happen. Proponents argue that these technologies have the potential to increase accountability and improve the ways police engage with the community.
On Nov. 30, Future Tense—a partnership of Slate, New America, and Arizona State University—convened experts for a live event in Washington, D.C., to discuss the potential for technology to prevent crime and improve the relationship between law enforcement agencies and the communities they serve.
Ralph Clark, president and CEO of ShotSpotter, says that his company’s product gives police the ability to proactively protect the public by responding to unreported gun violence. Yet in many communities, the problem isn’t necessarily whether police are responding to crime—it’s how they respond. Kami Chavis, director of the criminal justice program at Wake Forest University, says, “Police departments indeed have an obligation to use technology and anything they can to keep communities safe. At the same time, though, we have this very delicate balance of protecting the civil liberties of people who live in those communities as well.”
Critics argue that high-tech law enforcement tools, when left unchecked, can threaten privacy and perpetuate harmful biases. Predictive policing, for example, relies on historical crime data to anticipate where crime is most likely to take place or who is most likely to commit it. But that data paints an incomplete picture. Logan Koepke, an analyst at Upturn, points out that patterns of biased enforcement will be translated into the algorithms that predict future crime. The result is increased police presence and surveillance in already over-policed and over-surveilled communities.
Even when we know the problems with a law enforcement technology, it isn’t easy to find solutions. Lauren Kirchner, senior reporting fellow at ProPublica, warns, “The speed at which technology is advancing and the infrastructure of surveillance are building up so fast that oversight and regulation can’t possibly keep up. Even public knowledge can’t keep up.”
In many cases, the people being surveilled by these technologies aren’t aware of it or don’t understand its implications for their future. For example, body-worn cameras, which many called for after the shooting and protests in Ferguson, Missouri, have the potential to alter police-community relations for the better. But some companies are working to integrate facial recognition software into their bodycams—and without strong regulation, that could make them just another tool to catalog, track, and monitor individuals. It’s easy to think there’s nothing to worry about if you’re not doing anything criminal, but Jennifer Lynch, staff attorney for the Electronic Frontier Foundation, points out that many social movements, such as the movement for LGBT rights, were once seen as unacceptable. Imagine how things might be different had those communities been under constant surveillance by people who considered their behavior unlawful.
That’s why it’s so important for members of communities to work with law enforcement to create fair structures and systems as departments adopt and deploy these technologies. And for that to happen, we need better police-community trust. Samuel Sinyangwe, co-founder of WeTheProtesters, reminds us, “It is impossible to have police-community trust in a black community when the majority of black youth, either themselves personally or somebody that they know personally, has experienced police violence.” He suggests that a good start to addressing this issue is increased accountability and transparency from law enforcement. And technology itself can be part of the solution: As Philadelphia City Councilman David Oh put it, “The technology is not a problem—the technology is a promise of transparency and a uniformed protocol.”
There are a few ways to fulfill that promise—and chief among them is bringing together citizens, law enforcement, and the companies that create these technologies to identify areas of improvement and understanding. Denice Ross, co-founder of the White House Police Data Initiative, believes “data transparency can shift the dialogue from one of confrontation to collaboration.” Comprehensive data documenting police conduct is necessary for police reform. With it, communities can hold police accountable for their actions. Without it, citizens are limited in their ability to review the behavior of their police, widening the trust gap between citizens and law enforcement.
Data and technology have the potential to do a lot of good—but unless there is an open dialogue with members of the community, they could also perpetuate critical problems within our justice system. NYPD Deputy Commissioner Tracie Keesee says that we must be careful about how much we rely on technology to mediate human interaction. Or as she put it: “Trust is a human component, not something technology can build for you.”