Errors in Judgment

Were hundreds of criminals given the wrong sentences because lawyers messed up a basic work sheet?

The search for better economic policy.

Oct. 22, 2009, 9:31 AM


In early 2005, Emily Owens was halfway through her Ph.D. thesis in economics at the University of Maryland. Her topic: the deterrence effect of long prison sentences. She had just received data from the Maryland State Commission on Criminal Sentencing Policy on tens of thousands of cases that had appeared in the state's courts over the preceding years, cases she hoped would help her close out her dissertation. But as she started working through the numbers, she came across thousands of inconsistencies and errors in the sentencing recommendations provided to judges. The errors ultimately translated into extra months and years of prison time for unlucky convicts and light sentences for lucky ones. What might have been a run-of-the-mill economic analysis of crime and punishment turned into a shocking account of human error.

In addition to the usual information on defendants and their crimes, Owens' data set included sentence recommendations provided to judges to guide their punishment decisions. The sentencing guidelines—based on a work sheet that graded the severity of a convict's crime and his risk to society—were meant to make the administration of justice a little less arbitrary: Similar cases should lead to similar penalties.


To get acquainted with her data, Owens programmed the work sheet and its scoring system into her computer, fed in the case data, and expected her program to spit out the same sentence recommendations that had been provided to the judges presiding over these cases. Most of the time, the two sets of numbers matched. Yet no amount of checking and rechecking could account for a dismaying number of inconsistencies: In a little over 10 percent of cases, she simply couldn't reconcile her figures with the commission's.
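The check itself is easy to picture. Here is a minimal sketch of that kind of reconciliation in Python; the scoring rule and field names are invented for illustration, since Maryland's actual work sheet grades many more factors:

```python
# Toy reconciliation check: re-score each case with the work sheet rules and
# compare against the recommendation that was actually issued to the judge.
# The scoring formula and field names below are hypothetical stand-ins.

def recommended_months(case):
    """Hypothetical work sheet: grade offense severity and offender risk."""
    return case["offense_severity"] * 12 + case["prior_record_points"] * 3

def find_mismatches(cases):
    mismatches = []
    for case in cases:
        expected = recommended_months(case)
        if expected != case["issued_recommendation_months"]:
            mismatches.append((case["case_id"], expected,
                               case["issued_recommendation_months"]))
    return mismatches

cases = [
    {"case_id": 1, "offense_severity": 3, "prior_record_points": 2,
     "issued_recommendation_months": 42},  # 3*12 + 2*3 = 42: consistent
    {"case_id": 2, "offense_severity": 3, "prior_record_points": 2,
     "issued_recommendation_months": 54},  # same inputs, different number
]

for case_id, expected, issued in find_mismatches(cases):
    print(f"Case {case_id}: work sheet math says {expected} months, "
          f"commission's figure was {issued}")
```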

After reviewing the original work sheets and consulting with the sentencing commission's director, Owens concluded that neither her math nor her data were to blame. A system designed to make justice more predictable was producing errors in one out of every 10 trials.

Errors in recommended sentences don't necessarily translate into actual punishments. Maryland judges preside over thousands of cases per year, which might well lead them to develop "judicial intuition," a sense of when a recommended prison term isn't quite right. And since the state gives judges nearly complete discretion over sentencing, they're free to ignore the recommendations entirely.

Owens, now a professor at Cornell, worked with co-authors Shawn Bushway and Anne Piehl, who had independently noticed the errors in the Maryland sentencing data. They compared trials in which the defendants' characteristics and their crimes should have added up to identical sentence recommendations but, because of mistakes in the work sheet calculations, one ended up with a suggested sentence higher or lower than the other. Since the crimes and the criminals were otherwise quite similar, any systematic difference in the punishments judges handed out could be attributed to the work sheet errors.
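In spirit, the comparison looks something like the sketch below, a toy version with invented column names; the actual study relies on regression analysis with many controls rather than raw group averages.

```python
# Toy version of the comparison: among cases whose *correct* work sheet
# recommendation is identical, does the sentence a judge hands down track the
# erroneous recommendation the judge actually saw? Field names are invented.
from collections import defaultdict
from statistics import mean

def average_sentence_by_error(cases):
    groups = defaultdict(lambda: defaultdict(list))
    for c in cases:
        error = c["issued_rec_months"] - c["correct_rec_months"]
        bucket = "too high" if error > 0 else "too low" if error < 0 else "correct"
        groups[c["correct_rec_months"]][bucket].append(c["actual_sentence_months"])
    for correct_rec, buckets in sorted(groups.items()):
        summary = {k: round(mean(v), 1) for k, v in buckets.items()}
        print(f"Correct recommendation {correct_rec} months -> "
              f"average sentence by error type: {summary}")

cases = [
    {"correct_rec_months": 36, "issued_rec_months": 36, "actual_sentence_months": 30},
    {"correct_rec_months": 36, "issued_rec_months": 48, "actual_sentence_months": 32},
    {"correct_rec_months": 36, "issued_rec_months": 24, "actual_sentence_months": 25},
]
average_sentence_by_error(cases)
```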

The authors found that the recommendations mattered: For each month of incarceration recommended above the correct sentence length, the actual jail term went up by about four days. If the error worked in the other direction, the effect was much bigger: 13 fewer days in prison for each month the recommended sentence was off. The average error in a recommended sentence was nearly two years, meaning the mistakes translated into roughly an extra two months of prison time for convicts whose recommendations overshot the correct values and nearly a year off for convicts whose recommendations undershot them.
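Those per-month figures scale with the size of the mistake. A back-of-the-envelope calculation, using an illustrative 12-month error rather than the study's exact averages:

```python
# Back-of-the-envelope scaling of the reported effects. The per-month rates
# come from the article; the 12-month error size is purely illustrative.
DAYS_PER_MONTH_TOO_HIGH = 4    # extra days sentenced per month recommended too high
DAYS_PER_MONTH_TOO_LOW = 13    # days shaved off per month recommended too low

error_months = 12  # illustrative, not the study's average error
print(f"Recommendation {error_months} months too high -> "
      f"roughly {error_months * DAYS_PER_MONTH_TOO_HIGH} extra days in the handed-down sentence")
print(f"Recommendation {error_months} months too low  -> "
      f"roughly {error_months * DAYS_PER_MONTH_TOO_LOW} fewer days")
```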

(Owens and her co-authors also looked at how the impact of errors differed according to the crime committed and the demographics of the offender. Errors had no greater an effect, for example, on black criminals than on white ones. So justice may have been random, but at least it was blind.)

The story doesn't end with the judges. In Maryland, one more party gets to weigh in on prison terms. Parole boards have a great deal of discretion in making early-release decisions, which gives them the power to undo the work sheet mistakes that make it past Maryland's judges, and they haven't been shy about exercising their authority. When the study's authors looked at time actually spent in jail, they found that each month of recommended sentence above the correct value resulted in only two additional days behind bars. So parole boards cut the extra time from work sheet mistakes in half, from four days per month of error to two. Then again, this is more or less the same proportion by which they cut all prison sentences, whether or not there's been an error in the recommendation.

For prisoners who benefited from shortened sentences, parole boards proved much more effective at taking corrective measures. The 13 days knocked off a prisoner's sentence for each month the recommendation was wrong shrank to only a day and a half of early release once the parole board did its work. So parole boards were very effective at reversing errors that would have led to shortened prison time, and much less so at undoing undeserved extra time.
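The asymmetry is easy to quantify from the figures above; a rough calculation, treating the per-month numbers for sentences and time served as directly comparable:

```python
# Rough share of each error's effect that survives parole, using the
# per-month figures reported above (days in the judge's sentence vs. days
# actually served, per month of work sheet error).
too_high_sentenced, too_high_served = 4, 2     # overshot recommendations
too_low_sentenced, too_low_served = 13, 1.5    # undershot recommendations

print(f"Overshoots: {too_high_served / too_high_sentenced:.0%} "
      f"of the undeserved extra time survives parole")
print(f"Undershoots: {too_low_served / too_low_sentenced:.0%} "
      f"of the undeserved reduction survives parole")
```

By this rough accounting, parole undoes about half of an overshoot's effect but nearly nine-tenths of an undershoot's.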