Faulty Forensics and Lab Scandals Highlight Urgent Need for Enforceable Scientific Standards
by Matt Clarke
A 2009 report by the National Academy of Sciences (“NAS”) criticized the varying quality of crime labs throughout the nation and questioned the scientific basis of several forensic methods routinely used to convict criminal defendants. The report raised hopes that enforceable standards for forensic labs would soon be developed and that the scientific methods introduced in court would be validated. That hope has proven overly optimistic, as prosecutors across the nation continue to introduce forensic testimony that makes unproven claims based on unvalidated and even disproven methods such as bite mark analysis, microscopic hair comparison, and bullet groove and tool mark comparison. Even the gold standard cited in the NAS report—DNA testing—has proven fallible when comparison of complex, multiple-donor crime scene samples depends upon the subjective interpretation of forensic technicians.
The problems with forensic tests themselves are daunting, but they pale in comparison to the problems caused by corrupt and inept crime lab technicians who have tainted tens of thousands of criminal cases in recent years. Even when the technicians do their job honestly and competently, some prosecutors put a spin on the results that is anything but scientific.
Bite Mark Analysis
One common forensic method that is thoroughly debunked in the NAS report is bite mark analysis in which the teeth of a suspect are compared to bite marks left on human skin. The report points out that bite marks on skin change over time and can easily be distorted. Thus, they cannot be used to reliably identify one person as the source of the bite mark.
Despite such a damning report from the nation’s foremost scientific organization, it took until February 2016 for an official governmental entity to propose a moratorium on the use of bite mark evidence in criminal trials. In doing so, the Texas Forensic Science Commission noted that the validity of bite mark comparison has not been scientifically established.
The call for a moratorium came too late for former U.S. Navy sailor Keith Allen Harward, 60, who spent 33 years in prison after his conviction for the murder of a Virginia man and the rape of the man’s wife a few blocks from the Newport News naval shipyard in 1982. Key to his conviction was the testimony of so-called forensic bite mark experts. Lowell J. Levine told a jury that there was “a very, very, very high degree of probability” that Harward’s teeth left a bite mark on the wife’s leg. Another expert, Alvin G. Kagey, also linked the bite mark to Harward, testifying “with all medical certainty” that “there is just not anyone else that would have this unique dentition.” That testimony and the fact that the wife had noticed her assailant was wearing a military uniform were enough to convict Harward.
It turns out that someone else did have the “unique dentition” left on the victim’s leg. DNA evidence not only exonerated Harward, it also revealed the actual perpetrator: Jerry Crotty, who served aboard the U.S.S. Carl Vinson with Harward in 1982 and died in an Ohio prison in 2006 while serving time for an abduction. On April 7, 2016, the Virginia Supreme Court issued a writ of actual innocence declaring Harward innocent of the crimes.
“Mr. Harward is at least the 25th person to have been wrongly convicted or indicted based on discredited bite mark evidence,” according to Chris Fabricant, Director of Strategic Litigation for the Innocence Project, an organization affiliated with the Benjamin N. Cardozo School of Law at Yeshiva University. Fabricant warns: “We have no idea how many other people may have been convicted based on this [type of] evidence, but any conviction based on this grossly unreliable technique is inherently flawed. Every state in the nation should be conducting reviews to see if there are others like Mr. Harward sitting in prison for crimes they didn’t commit. Moreover, that this technique is still used in our justice system, including current capital prosecutions, presents a public safety threat.”
Harward is hardly alone in having been convicted based on bite mark comparison testimony. Eddie Lee Howard has been on Mississippi’s death row for over two decades after having been convicted of the rape and murder of an 84-year-old woman. In his trial, bite marks found on the exhumed body of the victim were compared to Howard’s teeth. The identification was made by Mississippi dentist Dr. Michael West, a for-hire forensic dental expert for the prosecution. His freewheeling methods “put a huge black eye on bite-mark evidence,” according to Dr. Richard Souviron, a Florida dental expert who helped identify serial killer Ted Bundy in 1979. Unfortunately, that identification helped ensure the popularity of the questionable forensic method among prosecutors nationwide.
In a May 26, 2016 opinion, the California Supreme Court overturned Bill Richards’ 1997 conviction for the murder of his wife, determining that false forensic bite mark testimony had influenced the outcome of the trial. During the trial, legendary forensic dentist Norman “Skip” Sperber testified that, based on his 40-plus years of forensic dentistry, only “one or two or less” out of 100 people would have the “unique feature” found both in a bite mark on the wife’s hand and in Richards’ lower teeth.
In 2008, Sperber recanted his testimony, saying he had cited statistics that lacked scientific support. His recantation, along with new DNA evidence that pointed to an unknown male and the testimony of a defense forensic dentist, persuaded the trial court to conclude that the evidence now pointed “unerringly” to Richards’ innocence.
The state appealed that decision, and the California Supreme Court reversed it in a tortured 2012 opinion that California Lawyer magazine labelled the worst decision of the year. Remarkably, the Supreme Court “ruled that expert testimony was merely opinion and therefore could never be considered true or false.”
In response to that baffling decision, the California legislature passed the “Bill Richards Bill,” often referred to as the “junk science” statute. The bill amended the penal code to make an expert’s recantation, or scientific developments invalidating the science underlying the original testimony, a statutory basis for reversing a criminal conviction. Because of the new statute, Richards was able to appeal his case once again, and this time the California Supreme Court unanimously agreed that “it is reasonably probable that the false evidence presented by Dr. Sperber at petitioner’s 1997 jury trial affected the outcome of that proceeding.” Accordingly, it overturned his murder conviction.
California’s “junk science” statute is only the second in the nation; Texas passed the first in 2013.
Microscopic Hair Comparison
Hair analysis is another field of forensic science roundly criticized in the NAS report for lacking scientific validation. An ongoing review of the cases in which FBI hair analysts testified reveals that erroneous statements were made in over 90% of the cases tried before 2000. Analysts often falsely testified that a hair could be matched to a specific person. Some of the reviewed cases had already been overturned due to post-conviction DNA testing.
On February 2, 2016, a Massachusetts court vacated the conviction of George Perrot for a 1992 rape and burglary after finding the conviction was based upon an FBI expert’s erroneously overstated hair analysis. The 79-page opinion marked the first time a court conducted a thorough review of the science of microscopic hair comparison. The court conducted a two-day hearing during which it heard testimony from multiple defense and prosecution experts.
“The decision is vitally important because it will be followed by many other courts around the country which will have to decide how to deal with this erroneous testimony,” according to Fabricant. “While we don’t know how many cases may ultimately be reversed because of the use of this scientifically invalid evidence, we know from the preliminary findings of the review that FBI agents, over a period of more than two decades, erroneously testified or provided erroneous reports in more than 95% of the cases where microscopic hair analysis was used to connect a defendant to a crime.”
The Innocence Project and National Association of Criminal Defense Lawyers urged the FBI to conduct the review following the DNA exonerations of Donald Gates, Santae Tribble, and Kirk Odom, who were convicted in separate cases involving testimony by FBI hair analysts. Erroneous hair analysis testimony contributed to 207 of the more than 337 convictions that were later reversed based upon DNA evidence.
Tribble spent 28 years in prison and later won a $13.2 million award against the District of Columbia. He was convicted after an FBI agent testified that the chances were “1 in 10 million” that a hair from a stocking mask came from someone else. While incarcerated, Tribble developed a heroin addiction and contracted HIV and hepatitis. He suffers from severe depression, and though only 55, he is not expected to survive beyond 2019.
Tribble was held in solitary confinement for periods of up to nine months at a time. Additionally, he was “tasered, tear-gassed, and, at one point, held in four-point restraints and strapped to a concrete bed for four to five days” during a 1999 prison transfer. D.C. Superior Court Judge John M. Mott wrote that “Mr. Tribble’s ordeal did not merely deprive him of his liberty in a constitutional sense—it ruined his life, leaving him broken in body and spirit and, quite literally, dying.”
DNA testing established that none of the 13 hairs found in the stocking cap, which was recovered near the crime scene, came from Tribble or any of his alleged accomplices.
Odom, 54, spent over two decades in prison for rape. A D.C. court ordered the District to pay him $9.2 million. The District settled a lawsuit brought by Gates, 64, for $16.65 million. He alleged that police framed him for a 1981 rape and murder.
Bullet and Shell Casing Tool Mark Comparisons
On January 22, 2016, the D.C. Court of Appeals ruled that claims by a forensic examiner that a bullet or shell casing can be matched to a specific weapon lack a scientific basis and should be barred from criminal trials as misleading. A D.C. police expert had testified in the murder trial of Marlon Williams that three bullets came from a specific gun. Williams was convicted and appealed.
In the opinion, Associate Judge Catherine Easterly wrote that the erroneous testimony in the trial was “more than regrettable [as the government had characterized it]. It was alarming” like “the vision of a psychic” with “foundationless faith in what he believes to be true.” Unfortunately, Williams lost the appeal because his trial lawyer failed to object to the testimony.
Other Forensic Questions
Bite mark and hair analysis are the low-hanging fruit of questionable forensics. Yet many of the methods believed to rest on sounder scientific footing also suffer from a lack of validation and other issues.
Forensic analysis of lead in bullets and matching of voice prints have already been discarded as scientifically useless, but not before they were used to help secure many convictions. Analysis of the burn patterns supposedly caused by liquid accelerants has likewise been discredited as scientifically unsound. Such burn pattern testimony led to the 2004 execution of Todd Willingham in Texas, despite the fact that the so-called science had already been disproven.
Even fingerprint comparison, long accepted in American courts, has problems. The problems lie not with the statistics estimating the probability that one fingerprint matches a randomly chosen fingerprint, but with the subjective determination by individual analysts as to whether a suspect’s fingerprint matches the unknown fingerprint with which it is being compared.
That is “where it gets a little fuzzy,” according to Glenn Langenburg, a fingerprint examiner with the Minnesota Bureau of Criminal Apprehension. When fingerprint examiners look at multiple fingerprints from the same source and different sources for protracted periods, “their brains get calibrated” to some internal threshold of similarity resulting in dissimilarities being ignored and similarities emphasized, Langenburg noted. This is especially true when dealing with the partial or degraded fingerprints typically found at a crime scene. That is an extremely important point because, while it takes multiple points of similarity to consider a fingerprint a “match,” it requires only one unexplained point of dissimilarity to prove they belong to different people.
The subjective nature of fingerprint analysis is demonstrated when fingerprint examiners are given blind tests. In one study of 169 examiners, there were 7.5% false negatives—errors where examiners said prints from the same person came from different people—and 0.1% false positives where examiners concluded prints from different people were from the same person.
Likewise, the recognized gold standard in forensics—DNA testing—loses a little of its luster when the subjective human element is introduced as part of the examination process. This is especially true when there is very little DNA available and/or the available DNA sample contains DNA from two or more donors.
Shannon Morris, Melissa Lee, and Kevin Rafferty have filed a lawsuit against the New York State Police crime lab that formerly employed them. They allege that when they tried to correct errors in DNA testing at the lab, they were silenced and fired because the errors were favorable to the prosecution.
The lab had been implementing a computerized DNA analysis system called TrueAllele, which would have eliminated the errors that occur when a technician subjectively interprets a complex crime scene mixture containing DNA from more than one person. However, they allege, the investigation into their complaints was used as an excuse to cancel the implementation of TrueAllele.
Similarly, in a recently filed civil rights lawsuit, Dr. Marina Stajic alleges she was forced out of her position as director of the forensic toxicology laboratory at the New York City Medical Examiner’s office after she criticized a DNA testing method known as low copy number (“LCN”). Other critics claim that the LCN method, which uses smaller amounts of DNA than recommended by the manufacturer of the testing equipment or by the FBI, is unreliable. Stajic also served on the New York State Commission on Forensic Science and reportedly angered her superiors by voting with the defense attorneys on the commission to require the public release of a study of the LCN method.
Greg Hampikian, a professor of biology and criminal justice at Boise State University and director of the Idaho Innocence Project, has spoken out publicly about contamination issues that plague crime scene DNA samples—especially those tested using smaller sample sizes than recommended by the FBI.
Cross contamination is what happened in the Amanda Knox case. Italian investigators found small amounts of Knox’s DNA on the handle of a knife, a small amount of her roommate’s DNA on the knife’s blade, and a tiny sample of her boyfriend’s DNA on the clasp of her roommate’s bra. They used this to tie both Knox and her boyfriend to the murder of her roommate. But the bra had not been collected until 48 days after the murder. During that time, it had been moved around the residence and repositioned multiple times by investigators photographing the scene. Further, the knife had been used by Knox for cooking and was collected from a kitchen drawer.
Hampikian and nine other prominent geneticists publicly released a demonstration showing that the small amounts of DNA could have been the result of contact transfers. They collected five empty soda cans from the office of the university’s dean of arts and sciences and placed them in evidence bags. Without changing gloves, they placed five newly purchased knives in separate evidence bags. On one knife, they found DNA at below-FBI-recommended levels from a member of the dean’s staff who had never touched or even been in the same room with the knives. Thus, they demonstrated the problems of cross contamination associated with using the LCN technique.
In 2013, National Institute of Standards and Technology (“NIST”) geneticist Michael Coble tested the reliability of 108 forensic labs around the nation. He told the labs that a mix of DNA from several people had been found on a ski mask discovered at the scene of a robbery, then asked them to determine whether a “suspect” was part of the mixture using a separate DNA sample he supplied. 73 of the 108 labs got it wrong, concluding the mix contained the suspect’s DNA when it did not. “It’s the Wild West out there,” warned Coble. “Too much is left up to the individual analyst’s discretion.”
In May 2015, NIST awarded $20 million to a team of approximately 30 legal professionals and statisticians to aid in developing tools for analyzing the strength of an apparent forensic match. The resultant Center for Statistics and Applications in Forensic Evidence is based at Iowa State University. Its primary mission is “to build a statistically sound and scientifically solid foundation under two branches of forensics, pattern evidence (including fingerprints and bullet marks) and digital evidence (including data from cell phones and computers).”
The task is complex and may not even be possible for some items such as shoe soles, which involve a large number of ever-changing products, counterfeits, repaired soles, and shoes sold only overseas that sometimes make their way into this country. But at least a serious attempt is underway to bring scientific standards and rigor to more of the forensic sciences.
Crime Lab Scandals
Even when using scientifically sound methods, forensic labs are subject to all the foibles of the human beings performing the tests. Lab technicians have been caught faking test results because they had fallen behind in their workloads or because they wanted to support the prosecution’s case. They have been busted stealing drug evidence for personal use or sale. And some have proven to be simply incompetent.
Crime labs in Massachusetts are still reeling from the scandals involving former technicians Annie Dookhan and Sonja Farak. Dookhan, who worked at the Hinton crime lab, was convicted of tampering with evidence in 2013. She falsified reports, claiming she had performed tests she had not, and the falsified results invariably favored the prosecution. More than 24,000 drug prosecutions that came through the lab during her tenure were tainted by her work.
“There are not enough lawyers in the state for this,” said Massachusetts ACLU legal director Matthew Segal. “Even if there were only one scandal in town, it would already be a crisis.”
But there is more than one scandal. To support her drug habit while working for nearly a decade at the Amherst crime lab, Farak stole lab reference samples used to identify suspected drugs and replaced them with other compounds such as baking soda. Eventually, she began stealing portions of the samples submitted by law enforcement and even smoking crack at work. This led to the discovery of the thefts in 2013.
Farak’s troubling behavior not only tainted the cases she worked on, it calls into question all of the drug testing done by every technician at the lab, because they all used the same reference samples. Yet defense attorneys were unaware of the widespread contamination of evidence; prosecutors kept the scope of Farak’s actions secret for two years after she was arrested.
“It’s just horrible,” said Segal. “This is why it’s not just about two chemists, it’s about an entire system that allowed this to happen, and once it did happen didn’t take steps to remedy it.”
And Massachusetts is hardly alone in having major crime lab scandals. Lab technicians in California and Montana have been convicted of stealing drugs from submitted evidence. Texas Department of Public Safety crime lab technician Jonathan Salvador was fired for fabricating drug tests, tainting nearly 5,000 cases in 30 counties.
In Minnesota, the St. Paul Police crime lab was shut down for six months in 2013 after inspections found it was filthy, with technicians accidentally contaminating some samples and making fundamental mistakes when testing others. Public defenders identified 1,700 drug cases tainted by the lab’s incompetence.
One former technician with the Oregon State Police crime labs reportedly stole drugs and gave false testimony while employed at four different labs from 2007 through 2015. Her misconduct not only affected the thousands of cases she personally analyzed but other technicians’ cases as well. She had access to all of the drug evidence and did not limit her thefts to her own cases.
“The scope of this alleged crime is massive,” said Deschutes County District Attorney John Hummel. “We have evidence that the suspect in this case would go to the lab late night, late weeknights and on weekends when nobody else was in the lab, and that she had unfettered access.”
After being alerted by a tipster in 2016, Houston television station KHOU obtained dozens of DNA samples from the Houston Police Department crime laboratory and sent them to independent experts for analysis. The results established that lab technicians were routinely misinterpreting even the most basic samples.
“If this is incompetence, it’s gross incompetence ... and repeated gross incompetence,” said attorney and University of California, Irvine Professor of Criminology William Thompson. “You have to wonder if [the technicians] could really be that stupid.”
Josiah Sutton doesn’t wonder—he knows they are that stupid, and he has paid a steep price for their stupidity. When he was 16, Sutton and his 19-year-old neighbor were arrested for the 1998 rape of a 41-year-old Houston woman even though neither matched the woman’s description of her two assailants. At his trial, crime lab analyst Christy Kim testified that Sutton’s DNA was “consistent” with the DNA she had obtained from the victim and her clothing. Sutton was convicted and sentenced to 25 years in prison.
Hearing an earlier KHOU report on problems at the crime lab, Sutton’s mother tried to find help for her son. Most of those she approached turned her away, saying it was impossible to challenge DNA evidence. Finally, KHOU reporter David Raziq, who had uncovered a couple of unrelated cases in which the crime lab’s false DNA evidence nearly produced wrongful convictions, agreed to hand-carry Sutton’s files to Thompson.
Thompson soon discovered that three different DNA profiles obtained from the victim’s blood and saliva differed substantially from one another. If the technician could not properly prepare a simple profile from a single source, Thompson wondered, how could she analyze a complex sample with multiple donors?
Thompson found a profile from semen left at the crime scene. It did not match Sutton, yet he and his lawyers had never been told about that exculpatory evidence. This led to a television report on Sutton’s case. Following the report, a judge ordered the DNA evidence reprocessed at a private testing facility. The results exonerated Sutton, and he was released in 2003, four years after his arrest. In 2006, a cold case hit on the FBI’s DNA database resulted in the arrest of a different man who confessed to the rape.
Sutton’s case is hardly unique. In NYU law professor Erin Murphy’s book “Inside the Cell: The Dark Side of DNA Testing,” Murphy highlights the case of the Phantom of Heilbronn, whose DNA was discovered at over 40 crime scenes throughout Europe. The 15-year hunt for the Phantom tied up untold amounts of police resources. Yet the Phantom turned out not to be the master criminal everyone expected, but a worker at the factory in Austria that manufactured the DNA testing swabs. It was a simple, yet very instructive, case of contamination.
Murphy also reports how California homeless man Lukis Anderson was arrested for murdering millionaire Raveesh Kumra at his mansion outside San Jose. The only problem was that Anderson was in a coma in the hospital when the murder occurred. Nonetheless, police kept him in jail for five months, suspecting him as the murderer. His lawyer was finally able to produce records showing that the same fingertip oxygen-monitoring device used by paramedics on Anderson had later been placed on Kumra’s hand, explaining how Anderson’s DNA turned up under Kumra’s fingernails.
In yet another case of human error resulting in misleading DNA test results, Nevada teenager Dwayne Jackson was exonerated two years after he pleaded guilty to a robbery he did not commit. He pleaded guilty after being presented with seemingly damning DNA evidence. He was exonerated after the police lab admitted it had accidentally swapped the DNA of the real culprit with Jackson’s.
A Different Kind of Problem
Dozens of crime labs have experienced similar scandals caused by corrupt or inept technicians, but one is currently dealing with a scandal caused not by sloppy lab work but by absurd policies. The Michigan State Police (“MSP”) Forensic Science Division crime labs have falsified reports on marijuana because of an MSP policy that allows prosecutors to charge drug defendants with crimes they did not commit. The policy, promulgated in 2013 by Assistant Attorney General Ken Stecker, instructs the labs to report marijuana edibles and oils as containing Schedule I synthetic THC rather than plant-derived THC, the active ingredient in marijuana. That classification turns possession into a felony. Both the law and science classify plant-based marijuana edibles and oil as marijuana, the possession of which is a misdemeanor.
This policy is part of the attorney general’s crusade against medical marijuana patients, which has resulted in such cases making up 40% of the lab’s workload. This is the same lab that recently took five years to clear a backlog of untested rape kits. Arrests for marijuana possession have increased 17% in Michigan since it legalized medical marijuana in 2008. During the same period, overall arrests declined 15%.
The Need for Standards and Accreditation
At a meeting of the National Commission on Forensic Science in December 2015, Deputy U.S. Attorney General Sally Yates announced the U.S. Department of Justice’s new policy requiring its prosecutors to use only evidence processed by accredited crime labs. The commission had overwhelmingly voted in favor of mandatory accreditation in April 2015. Unfortunately, implementation is delayed until 2020. Currently, accreditation is voluntary, except in the few states that actually require crime lab accreditation.
Approximately 17% of publicly-funded crime labs lack accreditation. The rate is believed to be much higher among the nation’s many private labs. The Laboratory Accreditation Board of the American Society of Crime Laboratory Directors is a primary forensics accreditor. It has accredited 356 publicly-funded labs, but only 26 of the hundreds of privately-funded labs have been accredited by the board.
Some commission members complain that universal accreditation is not enough; the accreditation process must also be rigorous. One problem is that some labs are accredited by private entities, often ones associated with a professional forensics association, and each entity sets its own standards. Many have been criticized for lacking rigorous standards and for having little influence in correcting problems once they are identified. Further, many accreditors allow labs to select which cases the accreditors review, rather than selecting cases at random.
“What’s the chance you’ll give them your best-ever casework? It’s high. That’s what I would do,” said Case Western Reserve University law professor and evidence expert Paul Giannelli.
Still, mandatory accreditation is a step in the right direction, according to commission vice chairman Nelson Santos, and it might even encourage lab technicians to step up their professional game.
“I think it’s made our lab credible—not just to prosecutors and defense attorneys and courts, but to the staff themselves,” said commission member Cecelia Crouse, who is the director of the Palm Beach County Sheriff’s crime laboratory in Florida.
Let’s hope she is right. Thanks to the so-called CSI effect, jurors now demand scientific evidence, and many tend to believe that type of evidence even when it is seriously flawed. Thus, until rigorous standards are applied to the nation’s crime labs, we will continue to see wrongful convictions based upon flawed science, bad science, and pseudo-science.
Sources: www.statesman.com, www.bostonherald.com, www.nytimes.com, www.washingtonpost.com, www.sciencemag.org, www.rawstory.com, abcnews.go.com, www.thedailybeast.com, newyork.cbslocal.com, www.innocenceproject.org, www.bbc.com, www.huffingtonpost.com, https://thecrimereport.org, www.pbso.org, www.rsc.org, https://psmag.com, www.ktvz.com, https://archive.theweedblog.com, www.themarshallproject.org, https://theintercept.com, www.theatlantic.com, http://forensicstats.org