That may not sound like much. But these days, with national crime rates at historic lows, reductions come on the margins. "The low-hanging fruit has already been taken," says Brantingham. Predictive policing done right also allocates resources more efficiently, which can't hurt at a time when tight budgets are cutting some departments in half.
Skepticism is warranted. For example, how is all this different from CompStat, the computer statistics program pioneered by Bill Bratton when he was police commissioner in New York City? The big difference is that CompStat is more retrospective than prospective. It collects crime numbers from previous weeks or months and uses them to evaluate a police department's efficiency. Cops can then use those trends to inform future patrolling, but they are still working from old data rather than current data. Predictive policing collects data in real time and uses it to map probable hotspots in the near future.
But isn't a lot of this stuff intuitive? If a crime occurs on a particular block of Compton, can't the LAPD just keep a closer eye on that area in the days after the crime? Sure, says Brantingham, but intuition can take a police officer only so far. In a city as large and complex as Los Angeles, it's hard to do predictive policing by gut alone. Statistical models may simply confirm police intuition 85 or 90 percent of the time. "It's in the remaining 10 or 15 percent where police intuition may not be quite as accurate," says Brantingham. Malinowski calls the data "another tool in the toolbox."
Data-driven law enforcement shows that the criminal mind is not the dark, complex, and ultimately unknowable thing of Hollywood films. Instead, it's depressingly typical—driven by supply, demand, cost, and opportunity. "We have this perception that criminals are a breed apart, psychologically and behaviorally," says Brantingham. "That's not the case."
It's a common saying that when someone gets killed, he becomes a statistic. So does the killer.