Future Tense

Convicted by Code

Defendants don’t always have the ability to inspect the code that could help convict them.


Secret code is everywhere—in elevators, airplanes, medical devices. By refusing to publish the source code for software, companies make it impossible for third parties to inspect it, even when that code has enormous effects on society and policy. Secret code risks security flaws that leave us vulnerable to hacks and data leaks. It can threaten privacy by gathering information about us without our knowledge. It may interfere with equal treatment under law if the government relies on it to determine our eligibility for benefits or whether to put us on a no-fly list. And secret code enables cheaters and hides mistakes, as with Volkswagen: The company recently admitted that it used covert software to cheat emissions tests for 11 million diesel cars that spewed pollutants at up to 40 times the legal limit.

But as shocking as Volkswagen’s fraud may be, it only heralds more of its kind. It’s time to address one of the most urgent, if overlooked, tech transparency issues: secret code in the criminal justice system. Today, closed, proprietary software can put you in prison or even on death row, and in most U.S. jurisdictions you have no right to inspect it. In short, prosecutors have a Volkswagen problem.

Take California. Defendant Martell Chubbs currently faces murder charges in a 1977 cold case in which the only evidence against him is a DNA match produced by a proprietary computer program. Chubbs, who ran a small home-repair business at the time of his arrest, asked to inspect the software’s source code in order to challenge the accuracy of its results. He sought to determine whether the code properly implements established scientific procedures for DNA matching and whether it operates the way its manufacturer claims. But the manufacturer argued that the defense attorney might steal or duplicate the code and cause the company to lose money. The court denied Chubbs’ request, leaving him free to examine the state’s expert witness but not the tool the witness relied on. Courts in Pennsylvania, North Carolina, Florida, and elsewhere have made similar rulings.

We need to be able to trust new technologies not only to help us find and convict criminals but also to exonerate the innocent. Proprietary software interferes with that trust in a growing number of investigative and forensic devices, from DNA testing to facial recognition software to algorithms that tell police where to look for future crimes. Inspecting the software isn’t just good for defendants, though—disclosing code to defense experts helped the New Jersey Supreme Court confirm the scientific reliability of a breathalyzer.

Short-circuiting defendants’ ability to cross-examine forensic evidence is not only unjust—it paves the way for bad science. Experts have described cross-examination as “the greatest legal engine ever invented for the discovery of truth.” But recent revelations have exposed an epidemic of bad science undermining criminal justice. Studies have disputed the scientific validity of pattern matching in bite marks, arson, hair and fiber, shaken baby syndrome diagnoses, ballistics, dog-scent lineups, blood spatter evidence, and fingerprint matching. Massachusetts is struggling to handle the fallout from a crime laboratory technician’s forgery of results that tainted evidence in tens of thousands of criminal cases. And the Innocence Project reports that bad forensic science contributed to the wrongful convictions of 47 percent of exonerees. The National Academy of Sciences has blamed the crisis in part on a lack of peer review in forensic disciplines.

Nor is software immune to this kind of error. Coding errors have been found to alter DNA likelihood ratios by a factor of 10, leading prosecutors in Australia to replace 24 expert witness statements in criminal cases. When defense experts identified a bug in breathalyzer software, the Minnesota Supreme Court barred the affected test from evidence in all future trials. Three of the state’s highest justices argued for admitting evidence of additional alleged code defects as well, so that defendants could challenge the credibility of future tests.

Cross-examination can help to protect against error—and even fraud—in forensic science and technology. But for that “legal engine” to work, defendants need to know the bases of the state’s claims. Indeed, when federal district Judge Jed S. Rakoff of Manhattan resigned in protest from President Obama’s commission on forensic science, he warned that if defendants lack access to the information they need for cross-examination, forensic testimony is “nothing more than trial by ambush.”

Rakoff’s warning is particularly relevant for software in forensic devices. Forensic device manufacturers, which sell exclusively to government crime laboratories, may lack the incentive to conduct the obsessive quality testing that eliminating errors from code requires. That is why security experts have endorsed openness to public scrutiny as the surest way to keep software secure—and why requiring the government to rely exclusively on open-source forensic tools would, in effect, crowd-source the cross-examination of forensic device software.

To be sure, government regulators currently conduct independent validation tests for at least some digital forensic tools. But even regulators may be unable to audit the code in the devices they test, instead merely evaluating how these technologies perform in controlled laboratory environments. Such “black box” testing wasn’t enough for the Environmental Protection Agency to catch Volkswagen’s fraud, and it won’t be enough to guarantee the quality of digital forensic technologies, either.

The Supreme Court has long recognized that making criminal trials transparent helps to safeguard public trust in their fairness and legitimacy. Secrecy about what’s under the hood of digital forensic devices casts doubt on this process. Criminal defendants facing incarceration or death should have a right to inspect the secret code in the devices used to convict them.