
Fingerprint Analysis: High Stakes, Low Qualifications

Forensic science was long considered a foolproof means of analyzing evidence to determine the identity of individuals involved in a crime or their methods of committing it. If the people in the lab applied their technical expertise to a case and the results pointed toward a certain suspect, a guilty verdict was almost assured. After all, what jury would argue with the objective standards of science?

A 2009 National Academy of Sciences report cast old assumptions about the field of forensics into serious doubt. The study found that all the pattern-matching disciplines, where evidence from a crime scene is compared to a pattern connected to a suspect, are actually very subjective, meaning that experts examining the same piece of evidence can—and often do—reach conflicting conclusions. It went on to say that, except for DNA analysis, most disciplines of forensics had no solid scientific basis.

Some analyses, such as matching bite marks and handwriting, had already been criticized for uneven results, but perhaps the biggest surprise was that friction ridge, or fingerprint, comparisons lacked objective standards as well. The field rests on the common perception that no two individuals’ prints are identical, but that has never been proven. Even if it were assumed to be true, the bigger problem is that prints taken at a crime scene are rarely whole or clear. Such smudged or “noisy” prints could be matched to any number of individuals; however unique one’s fingertips might be, with over seven billion people in the world there are many whose prints are nearly identical. When investigators lift a print but lack a suspect to compare it against, they run it through a computer database such as the FBI’s Integrated Automated Fingerprint Identification System, which in turn generates a list of “close non-matches.” While the actual source of the print may or may not appear on the list, it is certain that the vast majority of those implicated as potential suspects have no connection to the crime. In one case, faulty fingerprint analysis by the FBI led to false allegations against an Oregon lawyer for the 2004 bombing of a train in Spain.

Considering the toll that errors in this field can take on people’s lives, whether by clearing a guilty party or condemning an innocent one, the examiners tasked with performing latent fingerprint analysis must be forensic experts who are regularly tested and professionally certified.

And they are. Sort of.

Federal sources indicate that 98% of accredited public crime lab employees take proficiency exams. The problem is that almost everyone who sits for the exams—even those with no forensic training—passes them with flying colors, The Intercept reports.

“We looked at the passage rates year in and year out, and they’re all in the mid to high 90s,” said Brendan Max, a public defender in Cook County, Illinois.

To gauge the difficulty of the exam, Max took it in early 2018 along with two of his colleagues. None of them had any background in friction ridge analysis, yet all three missed only one of the 12 questions, a score of roughly 92%. Collaborative Testing Services (“CTS”), the top U.S. provider of proficiency exams for the forensic fields, often receives complaints from technicians that the tests are too easy and do not resemble the actual work done in the lab.

“If they’re trying to test a certain level of competence, it’s not testing that,” said Heidi Eldridge, a research scientist at the nonprofit RTI International.

Test providers like CTS cannot duplicate on a written exam the lab conditions that technicians face, but the real obstacle to more difficult and comprehensive testing is money. With the cost of testing exceeding $300 per person, the crime labs that pick up the bill do not want to pay for something their employees might not pass. The response has been to simplify the tests to a ridiculous degree: of the 360 individuals who took the test along with Max and his colleagues, 348 received perfect scores.

The desired effect is to make forensic technicians appear to be infallible experts whom judges and juries will trust without question.

“The problem ... is that we use this as a shield when we go to court,” Eldridge said. “The moment we make that claim and we use the proficiency tests as evidence of expertise, now we’re claiming it’s measuring something that it’s not measuring.”

The federal government funds over 400 crime labs across the country, most of which operate with little oversight, employ fewer than two dozen people, and are attached to local law enforcement agencies. Only a few labs, such as the Houston Forensic Science Center, operate independently and have enough staff to conduct blind tests and other protocols to ensure competency. Houston’s employees take the same proficiency exams as technicians at other labs, but they can also testify to error rates and the possibility that mistakes can occur.

The Houston lab is asked to perform around 30,000 forensic analyses annually, which is a small fraction of the cases generated nationwide. For the vast majority facing forensic evidence at trial, justice remains a roll of the dice. That’s a far cry from the perception of near-infallibility many Americans have of forensic sciences. 

 
