The Dismal Science

Errors in Judgment

Were hundreds of criminals given the wrong sentences because lawyers messed up a basic work sheet?

In early 2005, Emily Owens was halfway through her Ph.D. thesis in economics at the University of Maryland. Her topic: the deterrence effect of long prison sentences. She had just received data from the Maryland State Commission on Criminal Sentencing Policy on tens of thousands of cases that had appeared in the state's courts in the preceding years, cases she hoped would help her close out her dissertation. But as she started working through the numbers, she came across thousands of inconsistencies and errors in the sentencing recommendations provided to judges.* The errors ultimately translated into extra months and years of prison time for unlucky convicts and light sentences for lucky ones. What might have been a run-of-the-mill economic analysis of crime and punishment turned into a shocking account of human error.

In addition to the usual information on defendants and their crimes, Owens’ data set included sentence recommendations provided to judges to guide their punishment decisions. The sentencing guidelines—based on a work sheet that graded the severity of a convict’s crime and his risk to society—were meant to make the administration of justice a little less arbitrary: Similar cases should lead to similar penalties.

To get acquainted with her data, Owens programmed the work sheet and its scoring system into her computer, fed in the case data, and expected to see her program spit out the set of sentences that had been provided to judges presiding over these cases. Most of the time, the two sets of numbers were the same. Yet no amount of checking and rechecking could account for a dismaying number of inconsistencies: In a little over 10 percent of cases, she just couldn’t reconcile her figures with those of the commission.
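Her check is easy to picture: recompute each case's recommendation from the work sheet inputs and flag every case where the result disagrees with what's on file. Here's a minimal sketch in Python, with a made-up scoring rule and made-up cases standing in for the real Maryland work sheet grid and court records:

```python
# A toy version of the reconciliation Owens ran: recompute each case's
# recommendation from the work sheet inputs and flag disagreements.
# The scoring rule and the cases below are hypothetical stand-ins.

def recommended_months(offense_score, offender_score):
    """Stand-in for the work sheet's scoring grid."""
    return offense_score * 6 + offender_score * 2

cases = [
    # (case_id, offense_score, offender_score, recommendation on file)
    ("MD-0001", 4, 3, 30),  # recomputes to 30: consistent
    ("MD-0002", 7, 1, 50),  # recomputes to 44: someone botched the math
]

mismatches = [
    (case_id, on_file, recommended_months(off, ofd))
    for case_id, off, ofd, on_file in cases
    if recommended_months(off, ofd) != on_file
]

for case_id, on_file, recomputed in mismatches:
    print(f"{case_id}: {on_file} months on file, work sheet math says {recomputed}")
```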

After reviewing the original work sheets and consulting with the Sentencing Policy Commission’s director, Owens concluded that neither her math nor her data were to blame. A system designed to make justice more predictable was producing errors in one out of every 10 trials.

Errors in recommended sentences don’t necessarily translate into actual punishments. Maryland judges preside over thousands of cases per year, which might well lead them to develop “judicial intuition,” a sense of when a recommended prison term isn’t quite right. And since the state gives judges nearly complete discretion over sentencing, they’re free to ignore the recommendations entirely.

Owens—now a professor at Cornell—worked with co-authors Shawn Bushway and Anne Piehl, who had independently noticed the errors in the Maryland sentencing data. They compared pairs of trials in which the defendants' characteristics and crimes should have added up to identical sentence recommendations but, because of mistakes in the work sheet calculations, one ended up with a higher or lower suggested sentence than the other. Since the crimes and the criminals were otherwise quite similar, any systematic difference in the punishments judges handed out could be attributed to work sheet errors.
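The design is easy to sketch: group cases that should have scored identically, then see how actual sentences move with the size of the error. A toy version, with invented scores and sentences:

```python
# Sketch of the identification strategy: within groups of cases that
# should have received identical recommendations, relate differences in
# actual sentences to the work sheet errors. All data here are invented.
from collections import defaultdict
from statistics import mean

# (offense_score, offender_score, error in months, actual sentence in months)
cases = [
    (4, 3, 0, 30.0), (4, 3, 6, 31.0), (4, 3, -6, 27.0),
    (7, 1, 0, 44.0), (7, 1, 12, 46.0),
]

groups = defaultdict(list)
for offense, offender, error, sentence in cases:
    groups[(offense, offender)].append((error, sentence))

for key, members in groups.items():
    baseline = mean(s for e, s in members if e == 0)  # correctly scored cases
    for error, sentence in members:
        if error == 0:
            continue
        days_per_month = (sentence - baseline) * 30 / error
        print(f"group {key}: {error:+d} month error -> "
              f"{days_per_month:+.0f} days of sentence per month of error")
```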

The authors found that recommendations mattered: For each month of incarceration recommended above the correct sentence length, the actual jail term went up by about four days. If the error worked in the other direction, the effect was much bigger: 13 fewer days in prison for each month that the recommended sentence was off. The average error in a recommended sentence was nearly two years, meaning the mistakes translated into roughly two extra months of prison time for convicts whose recommendations overshot the correct values and nearly a year off for convicts whose recommendations undershot them.
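Those totals are just the per-month effects scaled by the size of the error. A back-of-the-envelope version, taking "nearly two years" as about 22 months (an assumption; the average may well differ by direction of error):

```python
# Scaling the study's per-month effects by an assumed average error.
avg_error_months = 22  # rough reading of "nearly two years"

extra_days = 4 * avg_error_months   # ~88 extra days when recommendations overshot
days_saved = 13 * avg_error_months  # ~286 fewer days (nearly a year) when they undershot

print(extra_days, days_saved)
```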

(Owens and her co-authors also looked at how the impact of errors differed by the crime committed and the demographics of the offender. Errors had no greater an effect, for example, on black convicts than on white ones. So justice may have been random, but at least it was blind.)

The story doesn't end with the judges. In Maryland, there's one more party that gets to weigh in on prison terms. Parole boards have a great deal of discretion in making early-release decisions, which gives them the power to undo the work sheet mistakes that make it past Maryland's judges, and they haven't been shy about exercising that authority. When the study's authors looked at time actually spent in jail, they found that each month of recommended sentence above the correct value resulted in only two additional days behind bars. So parole boards cut the extra time from work sheet mistakes in half, from four days to two. Then again, this is more or less the same proportion by which they cut all prison sentences, whether or not there's been an error in the sentence recommendation.

For prisoners who benefited from erroneously short recommendations, parole boards proved much more effective at taking corrective measures. The 13 days knocked off a prisoner's sentence for each month of error shrank to only a day and a half of early release once the parole board did its work. In other words, parole boards were very good at reversing errors that would have shortened prison time, and much less diligent about undoing undeserved extra time.
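Putting the study's before-and-after figures side by side, per month of work sheet error (the numbers are the ones reported above):

```python
# Per-month-of-error effects on the judge's sentence vs. time actually
# served after parole review, as reported in the study.
effects = {
    "recommendation too high": (4.0, 2.0),  # extra days: sentenced, served
    "recommendation too low": (13.0, 1.5),  # days saved: sentenced, served
}

for direction, (sentenced, served) in effects.items():
    undone = 1 - served / sentenced
    print(f"{direction}: {sentenced:g} days/month at sentencing, "
          f"{served:g} days/month served -> parole undoes {undone:.0%}")
```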

With the stakes so high—months and years of freedom gained or lost—how could Maryland's Sentencing Policy Commission have been so sloppy? For academic research—a matter trivial by comparison—it's common to have data entered independently by at least two typists, whose output is then cross-checked for accuracy. Yet it turns out that complacent bureaucrats weren't to blame for the sentencing mistakes. The work sheet had to be filled out by the state's attorney prosecuting the case, with the final form signed and approved by the defense attorney (who, if he was doing his job properly, would have done the work sheet calculations independently). The commission had, by design, handed off the task of completing the work sheet to parties it assumed would have every incentive to get the numbers right, but it apparently never accounted for widespread incompetence in Maryland's legal profession.

Maryland's sentencing commission has been responsive to feedback from Bushway, Owens, and Piehl. The commission's executive director, David Soulé, assisted the researchers in combing through the work sheet records to locate the full set of erroneous recommendations. And independent of the researchers' findings, the commission had already been at work on an automated work sheet with the explicit goal of eliminating errors.

One lesson from the case of Maryland's work sheet errors is that multiple levels of evaluation helped to undo some of the damage (though you might not see it that way if you spent an extra couple of months in the slammer because your attorney can't do arithmetic). That's a crucial insight, given that states around the country have limited or abolished the discretion of parole boards through truth-in-sentencing laws. More generally, crime and punishment in America remains rife with prejudice and inconsistency. The poor and uneducated are convicted at rates disproportionate to their crimes; jurors and judges alike are biased and sometimes outright irrational. Keeping some checks and balances in place—like the moderating effect of parole boards—might help keep the justice system a little more just.

Correction, Oct. 26, 2009: The article originally stated that the errors were in sentencing recommendations provided to judges by the Maryland State Commission on Criminal Sentencing Policy. The errors were the result of miscalculations by the person completing the sentencing recommendation work sheet, not the commission.