The release and exoneration this month of Brandon Mayfield, the Portland, Ore., lawyer arrested in connection with the Spanish train bombings, raises important questions about the nature of scientific evidence. Mayfield, a 37-year-old ex-military officer and convert to Islam, was jailed for two weeks after the FBI reported finding his fingerprint on a bag of detonators recovered after the deadly Madrid bombing that killed 191 people in March. Mayfield, it was also quickly disclosed, had represented a defendant in a child custody case who was linked to terrorism. After matching the print and reviewing the evidence, special agent Richard Werder swore out an affidavit and used it to obtain a material-witness warrant. Mayfield was quickly arrested and sent to jail. More quick and aggressive police work in a terrorism case, keeping the homeland secure.
Except for the part about how the fingerprint wasn't Mayfield's at all.
In the affidavit, Werder was unequivocal about whose fingerprint was on the bag—it was Mayfield's. "The FBI lab stands by their conclusion of a 100 percent positive identification," was the way the print match was described. They may have been 100 percent positive, but they were also 100 percent wrong. This sort of forensic puffery, usually practiced by government experts or agents, is both commonplace and deadly. Botched forensics, whether they result from oversight, sloppiness, ego (e.g., Martha Stewart's perjurious ink examiner), or malice, can easily sink an innocent defendant who might be ill-equipped, and sometimes unable, to unscramble the convincing, if false, forensic hash cooked up by the government. And with fingerprint evidence, often elevated to "smoking gun" status by our culture and our courts, the chance for serious mischief is greatly increased.
In the Mayfield case, of course, the government did manage to correct its own error; small solace to Mayfield, who spent two weeks incarcerated as a witness to a crime he knew nothing about. The confusion, said Robert Jordan, the FBI agent in charge of Oregon, resulted from analyzing an image of substandard quality. Now Mayfield's case has a number of disturbing aspects to it, the arrest-first-ask-questions-later approach to the war on terror not alone among them. But one of the most frightening consequences of the Mayfield incident is the bureau's attempt to explain away Mayfield's total misidentification by blaming it on a bad digital print. The reality is that it's not the print that's bad, it's the science.
Clearly the digital image analyzed by the FBI wasn't so awful that examiners felt compelled to consult the original—they seem to have passed up that opportunity when they met with Spanish investigators in Madrid on April 21. Nor was the print so weathered that it couldn't be matched at all. And it didn't degrade in transmission, either. The FBI already has detailed standards for electronic fingerprint transmission, which cover things like geometric image accuracy and modulation transfer function. No one suggested in the affidavit supporting the warrant that there was a degraded-print problem. On the contrary, the FBI ran the print, matched it to Mayfield, claimed total certainty, and set about getting him into custody.
The use of digital prints isn't at all unusual. The FBI has already admitted that it regularly uses digital images of fingerprints, and that, in this case, "it was absolutely acceptable to examine a digital image." Ultimately, as I predict the FBI's internal investigation will conclude, the use of the digital print will turn out to have been just fine. So, what was the real fingerprinting problem, and why doesn't the FBI want to address it?
For generations, until DNA came along, fingerprint evidence was touted as the ultimate forensic tool. So unique and special are our fingerprints that DNA itself is often described as a "genetic fingerprint." And that essential truth remains: Done correctly, fingerprint analysis can be a powerful forensic tool of identification. The problem is that there aren't universal standards for what "done correctly" means. The supposed science of fingerprints is more like an elaborate boys' club of certified examiners who decide—subjectively and not always consistently—what constitutes a match. This absence of basic uniform standards is the dirty little secret of Mayfield's fingerprint problem.
Fingerprint matches are made on the basis of what are known as "points of comparison," as a quick look at your thumb will demonstrate. What you will see are the friction ridges that make up your unique fingerprint. The friction ridges whorl and split, creating unique patterns that ultimately become the biometric data every burglar loves to hate. Comparing prints is a matter of looking for places where the ridges join or split—features that can be compared between prints. These points of comparison are used both to exclude prints (prove they are not the same) and to match them. The problem is this: Print examiners, and even the computers that do the preliminary scans, don't actually match the entire print. In deciding whether a print matches, they almost always decide on the basis of a partial analysis.
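The point-counting logic described above can be sketched in a few lines of code. This is a deliberately crude illustration, not how any real examiner or matching system works: the minutiae coordinates, types, and distance tolerance below are all invented, and real comparison also weighs ridge angles, print quality, and spatial relationships. The sketch only shows the core idea—counting points of comparison between two partial prints rather than matching the whole print.

```python
from math import hypot

# Hypothetical minutia format: (x, y, kind), where kind is a ridge feature
# such as "ridge ending" or "bifurcation". Coordinates are arbitrary units.

def count_matching_points(latent, exemplar, tolerance=5.0):
    """Count minutiae in the latent print that have a same-type minutia
    within `tolerance` units in the exemplar print. A toy stand-in for
    finding 'points of comparison' between two partial prints."""
    matches = 0
    for (x1, y1, kind1) in latent:
        for (x2, y2, kind2) in exemplar:
            if kind1 == kind2 and hypot(x1 - x2, y1 - y2) <= tolerance:
                matches += 1
                break  # each latent minutia is counted at most once
    return matches

# Invented example data: three minutiae per print, two of which line up.
latent = [(10, 12, "bifurcation"), (40, 8, "ridge ending"), (25, 30, "bifurcation")]
exemplar = [(11, 13, "bifurcation"), (60, 70, "ridge ending"), (26, 29, "bifurcation")]
print(count_matching_points(latent, exemplar))  # 2
```

Note that the function happily returns a count from partial data—it has no notion of how many points would be "enough," which is exactly the gap the article goes on to describe.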
Running a fingerprint against the massive database that contains all of the fingerprints from all of the people arrested all across America is a daunting technological task. It is accomplished by the feds with a system they call IAFIS—the Integrated Automated Fingerprint Identification System. IAFIS does the heavy lifting of initially comparing a latent print against the vast database. When IAFIS finds what it considers a match, it spits it out, and a human (more often several) takes over.
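The two-stage pipeline described above—automated search first, human judgment second—can be sketched schematically. Everything here is invented for illustration (the names, the feature sets, the scoring); it is not a description of IAFIS internals, only of the shape of the process: a machine ranks candidates, and a person still makes the final call.

```python
# Toy two-stage search: automated ranking over a database, then human review.
# Prints are modeled as sets of feature labels; similarity is set overlap.

def score(a, b):
    """Toy similarity: number of shared features between two prints."""
    return len(a & b)

# Invented database of exemplar prints keyed by (fictional) person.
database = {
    "person_1": {"f1", "f2", "f3"},
    "person_2": {"f2", "f3", "f4"},
    "person_3": {"f7", "f8"},
}

def automated_search(latent, db, top_n=2):
    """Stage 1: rank the database by similarity and return candidates.
    Like IAFIS, this only narrows the field—it decides nothing."""
    return sorted(db, key=lambda name: score(latent, db[name]), reverse=True)[:top_n]

latent = {"f2", "f3", "f9"}
candidates = automated_search(latent, database)
print(candidates)  # ['person_1', 'person_2']

# Stage 2 is a human examiner reviewing each candidate—the subjective
# step the rest of the article is about, and the one that failed Mayfield.
```

The design point worth noticing: the automated stage produces candidates, not conclusions, so the reliability of the whole system rests on the human stage that follows.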
In his 1892 book, aptly titled Finger Prints, Sir Francis Galton described a method of comparing points of similarity that is still used today. Indeed, despite high-tech labs and CSI: Miami, the process of fingerprint comparison at the human level hasn't advanced much since Galton's day. It still involves magnifying glasses and lots and lots of patience. Examiners comb over two prints, stripped of any identifying information, in order to find and highlight points of comparison. It is generally understood that there are between 35 and 50 such points on a typical finger. How many constitute a match? The rather unscientific answer is: It depends. Some police departments require 10, others 12; some are satisfied with eight. This lack of uniformity can mean that one agency (the FBI, say) may declare a print match while another (the Spanish National Police, say) says no. Ultimately, as Simon Cole, the author of Suspect Identities: A History of Fingerprinting and Criminal Identification, explains—and the FBI acknowledges—the decision to declare a match is a subjective one, based on the totality of the circumstances and the examiner's knowledge and experience.
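The consequence of those varying thresholds is easy to make concrete. In the sketch below, the agency names and point minimums are examples of the kind of agency-by-agency variation the article describes, not actual policy anywhere; the point is that the identical pair of prints can be a "match" under one standard and not under another.

```python
# Illustrative thresholds only—real agencies' rules vary and are not public
# in this tidy form. Each value is the minimum points of comparison required.
THRESHOLDS = {"Agency A": 8, "Agency B": 10, "Agency C": 12}

def verdicts(points_found):
    """Given the number of points of comparison two examiners agree on,
    report whether each (hypothetical) agency would declare a match."""
    return {agency: points_found >= minimum
            for agency, minimum in THRESHOLDS.items()}

# The same evidence—nine points of comparison—splits the agencies.
print(verdicts(9))  # {'Agency A': True, 'Agency B': False, 'Agency C': False}
```

Nothing about the prints changed between the three verdicts; only the standard did. That is the uniformity problem in miniature.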
Those subjective decisions mean that the government can profess certainty and still be dead wrong. Without agreement on essential baseline standards, fingerprinting will remain a practice rather than a science. Make no mistake about it: Fingerprints are valuable forensic evidence, based on unique biometric data. But when the evaluation of that data rests on a because-I-said-so analysis, the door is wide open for injustice. And as Brandon Mayfield's case amply demonstrates, taking the government's say-so as definitive simply isn't enough. And when pseudoscience is turned loose in the context of the war on terror, the results may well terrify.