
The Pseudoscientific Practice of Blood Spatter Analysis: How the Desire for Convictions Drives Flawed Prosecutions

by Anthony W. Accurso

The forensic science known as Bloodstain Pattern Analysis (“BPA”)—a.k.a. blood spatter analysis—is undergoing significant development after being the object of intense criticism regarding its reliability in the context of criminal prosecutions. Despite being practiced for over 150 years, this field has undergone two periods of dramatic change: first, in the mid-twentieth century, practitioners sought to standardize and popularize the application of BPA in courtrooms, and second, since the dawn of the twenty-first century, scientists and policymakers have sought to reform its application through better scientific understanding and rules to lessen the likelihood of flawed prosecutions.

Yet, the utility of blood spatter analysis in prosecuting emotionally charged, violent crimes often puts its most frequent practitioners—law enforcement officers who often receive as little as 40 hours of training—at odds with the uncertainties inherent to a still-developing science. This conflict results in forensics “experts” opining on blood pattern evidence in criminal trials despite lacking a solid scientific foundation for the conclusions they reach and authoritatively present to juries. Thus, while claiming the authority of science to persuade juries about the guilt (or sometimes innocence) of a defendant, these “experts” risk peddling unreliable pseudoscience, resulting in the subversion of justice. The consequence is that an innocent person can spend decades in prison for crimes they did not commit, while the actual perpetrator remains at large, free to commit more violent offenses.

Some of these failures have been publicized in a way that drew attention to the (mis)use of BPA in courts and prompted reforms. How these failures unfold can be illustrated by the case of David Camm.

Eight Flecks of Blood

Around 7 p.m. on the evening of Sep. 28, 2000, David Camm arrived at the Georgetown Community Church in his hometown of Georgetown, Indiana, to play basketball. David was a former Indiana State Trooper who went out to shoot hoops with 11 other guys around the time his wife and children were returning home from swim practice.

Later that evening, at around 9:30 p.m., David returned home to a horrific scene: his wife, Kimberly, and their two children, Bradley (age 7) and Jill (age 5), had been shot to death in the family’s garage.

Floyd County law enforcement documented the scene of the crime, including having photographer Robert Stites take pictures of all the blood evidence. Among these photos was a shot of the t-shirt worn by David that evening, which bore eight flecks of blood in a small grouping.

The BPA specialist used by Floyd County was Rodney Englert. Though he did not visit the scene of the murder, Englert concluded from the photo of Camm’s t-shirt that the pattern of blood on it was “backspatter”—stains created from a high-velocity impact that deposited on the shirt when its wearer was near the victim(s) during the shooting.

Three days after losing his wife and children, David Camm was charged with their murder. Englert’s interpretation of the eight flecks of blood would not only be the chief piece of evidence upon which the State’s case rested, it also largely drove the police investigation from that point forward.

How this small amount of blood came to be so influential is wrapped up in the history of BPA as a forensic science, and its acceptance in court systems around the country.

Macabre Experiments

While the first mention of blood spatter at trial comes from an 1880 Mississippi Supreme Court decision, it wasn’t until the middle of the twentieth century that BPA was officially admitted as an area of expertise in criminal trials.

Dr. Paul Leland Kirk first testified about the significance of blood evidence in 1954 in the case of Sam Sheppard, an Ohio doctor accused of murdering his wife. Dr. Kirk was a renowned biochemist who, after working on the Manhattan Project helping the U.S. develop its first nuclear weapons, turned his significant scientific expertise toward the field of criminology and established a first-of-its-kind academic program in criminalistics at the University of California at Berkeley in the 1950s.

Sheppard’s attorney hired Dr. Kirk to review the blood spatter evidence, and Kirk offered an interpretation of the events that the defense believed exonerated Sheppard. Though he was convicted at his first trial, Sheppard was acquitted at his retrial in 1966, largely because of Dr. Kirk’s testimony.

Shortly after Sheppard’s first trial, Dr. Kirk also testified in People v. Carter, 312 P.2d 665 (Cal. 1957), a case involving a California man who was accused of beating to death the owner of the Log Cabin Bar in Chico. Though this was only his second time testifying, the California court admitted Dr. Kirk as an expert in BPA, and this classification was upheld on appeal in 1957.

In that decision, the California Supreme Court stated: “Dr. Kirk was permitted to testify that he made certain experiments in blood dynamics: that he had beaten bloody objects with instruments of different shapes in order to grasp the relationship between blood spots and their causes.” This record was “introduced to qualify him as an expert in blood dynamics, to show that by training and experience he was able from an analysis of the size and shape of blood spots to determine their source with some degree of accuracy, and to enable the jury to test the reliability of his opinions by revealing their foundation.”

Dr. Kirk testified that the blood patterns at the scene of the crime enabled him to infer where in the bar the victim had been beaten and that the defendant “must have been not more than two and one-half feet from the source of the blood.”

The Court concluded that, “[t]hese inferences required knowledge and experience beyond those of ordinary jurors and could assist them to weigh the evidence more perceptively than they could unaided.”

It was also in the 1950s that the greatest proselytizer of BPA got his first taste of its possibilities. Herbert MacDonell was a chemistry undergraduate who got a job in a Rhode Island state crime laboratory. MacDonell would later move to Corning, New York, to work for the local corporate giant, Corning Glass Works—a company best known for CorningWare casserole dishes and the Gorilla Glass that provides sturdy screens for modern smartphones. MacDonell worked for Corning during the day while teaching crime lab forensics at night at a community college.

This expertise in forensic science brought him to testify in a New York murder trial in 1968. Steven Shaff, a veterinarian, had shot a former employee and claimed it was an accident. Investigators alleged that Shaff shot the victim in his vehicle, but Shaff claimed the victim had thrown open the vehicle’s door, causing Shaff’s gun to discharge. Though MacDonell’s expertise had the limited use of pointing out that blood present along the inside edge of the car door (where it joins with the door frame) could only have been deposited there while the car door was open, this case ignited MacDonell’s imagination for the possibilities of blood spatter evidence.

MacDonell constructed a lab in the basement of his home and conducted macabre experiments with blood—similar to the ones described by Dr. Kirk—to further his expertise. According to ProPublica, “he described shooting dogs to record the resulting spatter and drenching a female student’s hair in human blood then having her shake it around to photograph the patterns.”

A year after testifying in the Shaff case, MacDonell got the Department of Justice (“DOJ”) to fund his work. This culminated in his 1971 report, published by the DOJ, “Flight Characteristics and Stain Patterns of Human Blood.”

It should be noted that, even though Dr. Kirk and MacDonell had been conducting “experiments,” their activities were merely setting the stage for a true scientific analysis of blood patterns in forensics. MacDonell acknowledged this in the opening to his 1971 report, stating: “[f]inal conclusions should be considered from the legal viewpoint of ‘proof within a reasonable scientific certainty.’ Little attempt has been made to express data in this report in a statistical manner.”

Despite acknowledging the qualitative, rather than quantitative, nature of his work, MacDonell began traveling to conferences and industry meetings to present his research. This spread the word about his claims to be able to derive meaning from patterns of blood, and he was more frequently hired by police and prosecutors to consult on, and testify about, blood spatter evidence. He soon quit his job at Corning Glass in favor of working full time as an instructor and forensic expert for hire.

In 1973, he began teaching a 40-hour class he dubbed “Bloodstain Evidence Institutes.” Rather than hire MacDonell to consult on every violent crime in the country (Dr. Kirk died prior to the publication of MacDonell’s DOJ report, and few other researchers publicized their willingness to testify in this area), police departments preferred to have a trained “expert” on staff to analyze cases.

Though he would emphasize his own scientific background when called to testify, MacDonell would advertise to his students that there were “no minimum educational requirements to be accepted into the class.” In an interview with a reporter for ProPublica, MacDonell could recall failing only five students in his 38 years of teaching classes, compared to the more than 1,000 students he taught. Also, according to ProPublica, “[b]y 1982, [he] taught 19 institutes in eight states (Mississippi, New York, Florida, Alabama, Indiana, Illinois, Louisiana and Colorado)” and “gave single-day seminars in Germany, Italy, England, Switzerland and Canada.”

MacDonell later taught what he called “advanced” classes, and the graduates of these classes would go on to form the International Association of Bloodstain Pattern Analysts (“IABPA”) in 1983. The group would subsequently begin publication of its Journal of Bloodstain Pattern Analysis—similar to how other scientific fields have published peer-reviewed journals.

By the late 1980s, MacDonell was becoming something of a superstar in the forensics world. A 1987 three-part series about MacDonell referred to him as “an American Scotland Yard, all by himself”—a quote lifted straight from the back of his coauthored memoir, The Evidence Never Lies: The Casebook of a Modern Sherlock Holmes.

It was against this background that the Iowa Supreme Court upheld MacDonell’s certification as a reliable expert with regard to BPA in State v. Hall, 297 N.W.2d 80 (Iowa 1980). The Court stated, in regard to BPA, that “the study of blood characteristics is relatively uncomplicated,” and its “observations are largely based on common sense, and in fact, lie close to the ken of an average layman.”

Regarding the reliability of MacDonell’s methods, the Court pointed to: “(1) Professor MacDonell’s considerable experience and his status as the leading expert in the field; (2) the existence of national training programs; (3) the existence of national and state organizations for experts in the field; (4) the offering of courses on the subject at several major schools; (5) use by police departments throughout the country in their day-to-day operations; (6) the holding of annual seminars; and (7) the existence of specialized publications.”

What the Court did not mention, according to reporting by the New York Times, is that “MacDonell himself is the source of almost all of these indicators of reliability.” Indeed, as state after state upheld the use of BPA experts as reliable interpreters of evidence in criminal trials, nearly every case either involved MacDonell or someone he had trained, or referenced such a case when deferring to another court’s determination.

Thus, while Dr. Kirk was the first expert certified by a state court to testify on BPA, MacDonell and his students were responsible for its spread and acceptance in a majority of jurisdictions in the United States. In 2004, a Texas court of appeals wrote: “Have any courts held blood spatter analysis to be invalid? The short answer is no.” Holmes v. State, 135 S.W.3d 178 (Tex. App. 2004).

MacDonell would continue to testify until 2012 when he was charged in Corning Town Court with forcible touching, two counts of endangering the welfare of a child, exposure of a person, and aggravated harassment in the second degree, all stemming from the alleged sexual abuse of two young girls, ages 11 and 16. He pleaded guilty to the harassment charge in exchange for the state dropping the other charges, though he later publicly expressed regret over the decision while claiming his innocence. But this conviction meant that he could no longer testify in court because his credibility would be too easy to undermine on the witness stand.

However, by the time MacDonell retired, he had certified over 1,000 of his students in BPA. And unlike Dr. Kirk—who had enough scientific training to contribute to the Manhattan Project—and MacDonell—who had at least a chemistry degree and himself taught at the college level—these trainees in the field of BPA were mostly law enforcement professionals with little or no scientific background, many of whom lacked a basic understanding of the alleged science underpinning their methods. And even more troubling is that some of these trainees themselves began teaching classes on the subject.

“I think if you were to do a study,” Ralph Ristenbatt, an instructor of forensic science at Pennsylvania State University, said, “of all the people who call themselves bloodstain-pattern experts and you looked at the genealogy, if you will, of how they’ve obtained their training, it’ll all likely come back to Herb MacDonell through some means.”

Pseudoscientific Practice Versus Scientific Study

At David Camm’s first trial, the State’s BPA expert, Rodney Englert, provided key testimony that the eight flecks of blood on Camm’s shirt were consistent with “high-velocity impact spatter,” a pattern he claimed could only have occurred if Camm were near his wife and children when they were shot.

However, Camm’s defense attorneys hired their own BPA expert who—after reviewing the same evidence as Englert—concluded that the blood on the shirt was a “transfer stain,” deposited by his daughter’s hair when Camm tried to save his children.

Later, at his retrial in 2013, forensic scientist Robert Shaler testified that the stain’s meager size was “too little information from which to draw any meaningful conclusion.”

The disparity of interpretations by various BPA experts speaks to the central issue at play: Though BPA started out like many other fledgling fields of scientific inquiry, its current practitioners are more pseudoscientific—they try to appear scientific despite lacking sufficient training and experience, all while peddling unfounded confidence to juries and derailing the pursuit of justice.

But it turns out that, as shocking as this truth is, BPA is not alone among forensic “sciences” facing harsh criticisms about their reliability.

The systemic failure of forensics was revealed after the advent of DNA testing—the gold standard of forensic science—when the retesting of evidence began to overturn convictions. Crediting the Innocence Project, ProPublica concluded that, “[o]f the 250 DNA exonerations that occurred by 2010 throughout the United States, shoddy forensic work—which ranged from making basic lab errors to advancing claims unsupported by science—had contributed to half of them.”

As the mounting number of exonerations exposed the sorry state of forensics in the United States, calls intensified for a systemic review of various fields that had long been criticized as pseudoscientific. This prompted the federal government to conduct an in-depth analysis of the state of the science underpinning the fields of forensic practice.

The first major report bearing on BPA was published in August 2009 by the National Academy of Sciences—a private, non-profit organization created by the U.S. Congress in 1863 with the mission to advise the nation on science, engineering, and medicine. The report, titled “Strengthening Forensic Science in the United States,” was a devastating shot across the bow of the forensics community.

The portion of the report focusing on the reliability of various forensic fields classified them largely into two categories: laboratory based (e.g., DNA analysis, toxicology, and drug analysis) and those based on the “interpretation of observed patterns” (e.g., fingerprints, writing samples, tool marks, bite marks, and specimens such as fibers, hair, and fire debris).

It stated that “[t]he level of scientific development and evaluation varies substantially among the forensic science disciplines,” and leveled its harshest criticisms against the pattern interpretation disciplines.

Even ones that had been near-universally accepted by those within the criminal justice system were questioned. Friction Ridge Analysis—commonly known as fingerprint comparison—is accomplished by the ACE-V method (“Analysis, Comparison, Evaluation, and Verification”). The report stated that ACE-V “is not specific enough to qualify as a validated method”; “does not guard against bias; [and] is too broad to ensure that two analysts following it will obtain the same results.”

Hair examination has long suffered from a lack of scientific rigor, and the report concluded that, “due to the fact that so many of the characteristics coded are subjective—for example, color, texture—it was not possible to get complete reproducibility between two or more examiners coding the same hair.” The report’s authors cautioned against ever trusting a claimed hair sample match based on visual examination alone, saying, “[i]n cases where there seems to be a morphological match (based on microscopic examination), it must be confirmed using mtDNA analysis; microscopic studies alone are of limited probative value.”

Subsequently, in 2015, the FBI admitted that two dozen examiners in one of its hair analysis labs had given flawed testimony in hundreds of cases. Of these, 32 defendants had been sentenced to death, and 14 of them were eventually executed or died in prison, according to Reason.

BPA was similarly criticized, with the bottom-line assessment being that “[t]he uncertainties associated with [BPA] are enormous” and “that some experts extrapolate far beyond what can be supported.” The report notes that some aspects of BPA are supported by scientific studies—“for example, if blood spattered quickly or slowly.” Other conclusions lack a scientific underpinning, with the report stating that, “[a]lthough the trajectories of bullets are linear, the damage that they cause in soft tissue and the complex patterns that fluids make when exiting wounds are highly variable.”

While forensic disciplines such as Forensic Odontology (bite mark analysis) and Polygraphy (lie-detector tests) have “no scientific support” for the claims made by their practitioners, the DOJ was confident enough in the basis of BPA that it funded a follow-up study whose purpose was “to produce the first baseline measure of reliability for the major BPA method of pattern recognition.”

“Reliability Assessment of Current Methods in Bloodstain Pattern Analysis” was published in June 2014 by the National Institute of Justice (“NIJ”), the research arm of the DOJ. The study’s authors invited 27 analysts from North America, Australia, and Europe to identify various patterns created by the research team. All of the analysts had at least 80 hours of training, had been active in BPA casework for a minimum of five years, and had been qualified by a court as a BPA expert to provide testimony. The results were quite damning.

The team produced various stains—drip patterns, transfer patterns, blunt force impact spatter, cast-off patterns, and firearms-related patterns (both back spatter and forward spatter)—onto 16-inch by 16-inch tiles, representing hard surface targets (painted, wallpaper, and chipboard) and fabric surface targets (polyester pants, blue denim jeans, and grey cotton sweatpants).

The study’s authors noted limitations in their methods, notably that some of the stains were produced under “ideal” conditions—meaning that these patterns were less “messy” than real-world examples and should thus be easier to identify, making them less representative of analysts’ competency when conducting actual casework. The team said the “approach used here was designed to help define the upper limit of pattern classification reliability by focusing attention on method reliability rather than on analyst competency.” (emphasis in original).

Even with these caveats in place, the results were stunning. When presented with open-ended pattern identification scenarios, over half of the responses were labeled “inconclusive.” Of patterns where a classification was attempted, success rates were “close to 50%” for hard surfaces and 64% for fabric surfaces. When presented with multiple choice responses, the patterns listed as inconclusive dropped to 17% and 14%, while 14% and 23% “of these classifications did not include the correct pattern type,” again for hard surface and cloth targets, respectively.

Aside from troublingly large rates of unclassified or misclassified patterns, the study’s authors noted two more structural problems with BPA methodology.

The first is that pattern analysis is often performed by investigators involved with other aspects of the case—recall that many experts certified by MacDonell were police—which “means that at the stage of pattern classification, additional case-specific information, such as medical findings, case circumstances and even witness testimony is being allowed to factor into analysts’ interpretations.”

This leads to “the well-known phenomenon of confirmation bias. Where a scenario was offered that deliberately pointed analysts towards an incorrect classification, the proportion of misclassifications that resulted was significantly higher than that observed for patterns with neutral scenarios.”

The authors say this issue is compounded by the second structural problem, “that current pattern classifications used in BPA are described in terms of pattern formation mechanisms, which actually makes them components of a reconstruction theory, rather than a summary of pattern characteristics.”

This poses a significant challenge to reliability because, as the study authors noted, “[i]t is well known that different bloodletting mechanisms can give rise to bloodstain patterns that possess similar or indistinguishable characteristics.” For instance, “impact, expiration and transfer patterns can all feature small bloodstains and can be confused with one another, especially on fabric.” This means that an analyst with insufficient training, or who is influenced by misleading case details, may fail to accurately distinguish whether a bloodstain on a suspect’s clothing was deposited during an attempt to resuscitate a victim or deposited when the suspect beat the victim with a blunt object—two scenarios that speak to diametrically opposite courses of action.

A similar finding was included in a report written by New York criminal defense attorney Michael Litman, which said that “the exact same blood spatter trajectory would appear very differently when it hits an absorbent material (such as a bedsheet) as opposed to a non-absorbent material (such as a solid wall). It has been found that when blood spatters on absorbent materials, the patterns can appear distorted and lead forensics scientists and investigators to misinterpret the direction from which the blood traveled.”

The most recent, comprehensive review of forensic sciences was a 2016 report by the President’s Council of Advisors on Science and Technology (“PCAST”), a group organized at the direction of President Obama to review the use of forensic sciences in criminal prosecutions.

The group found that several common forensic methods “have revealed a dismaying frequency of instances of use of forensic evidence that do not pass an objective test of scientific validity.” For instance, with regard to bite mark analysis, the report said that “available scientific evidence strongly suggests that examiners not only cannot identify the source of [a] bite mark with reasonable accuracy, they cannot even consistently agree on whether an injury is a human bite mark.” The report was no less scathing in its review of BPA, saying, “[o]ur results show that conclusions were often erroneous and often contradicted other analysts.” Also, “[b]oth semantic differences and contradictory interpretations contributed to errors and disagreements, which could have serious implications if they occurred in casework.”

The 2014 NIJ study assessing the reliability of BPA observed that, “[l]ike many other disciplines from the early days of forensic science, its use and acceptance occurred without rigorous validation,” because “very little is known about [BPA’s reliability] beyond the instincts of experienced instructors and investigators who have observed over many crime scene and practical sessions in the classroom.”

Similar studies have attempted to statistically validate error rates in bloodstain-pattern recognition, but each of these acknowledges that lab-created blood patterns made using materials crudely analogous to victims’ bodies (such as shooting a blood-soaked sponge with a pellet gun) fail to capture the complexities of real crime scenes and that “it is generally impossible to know with certainty the ‘true’ mechanistic cause of a bloodstain pattern at a crime scene.”

It was with these limitations in mind that the PCAST report recommended expert witnesses be required to “disclose error rates in their testimony and, where methods haven’t been scientifically verified, not use them at all,” according to Reason.

While the Obama administration took no firm steps to implement this recommendation, the Trump administration’s response was openly hostile. According to Reason, “[i]n 2017, Attorney General Jeff Sessions disbanded the National Commission on Forensic Science, an independent panel of scientists, law enforcement, judges, and defense attorneys created by the Obama administration in 2013 to review the reliability of forensic science used in trials.”

Sessions then replaced it with the Forensic Science Working Group—headed by a former state prosecutor—with the more limited mission of developing “uniform language” for federal experts’ testimony. After Sessions stepped down, Deputy Attorney General Rod Rosenstein complained that “forensic science has been under attack” by critics who took “an erroneously narrow view of the nature of science and its application to forensic evidence.”

The different attitudes reflected in the reactions of the Obama and Trump administrations mirror the two predominant attitudes present within the U.S. population when people are faced with pseudoscientific claims. Confronting the inconvenient truth that a valuable tool used in police work (BPA) was unreliable, the Obama administration tried to determine what about the discipline was salvageable by funding studies into its limitations—a long-term strategy that would take time but would ultimately increase the public’s confidence in BPA’s use by determining what about it was legitimately grounded in science.

The Trump administration’s response was to prize the usefulness of BPA (and other forensic sciences) in securing convictions and to silence rigorous scientific review in service of that prosecutorial value. Though this dichotomy is playing out in our predominant political parties, it is further mirrored in the attitudes of everyday Americans.

Take astrology, for instance. Peddlers of horoscopes claim that the movements of planets, moons, and stars have definite and predictable effects on the lives of individuals and that these effects relate somehow to a person’s birth date (or birth minute, depending on the peddler).

This is clearly pseudoscientific for three reasons, though any one of them would be enough to classify astrology as unscientific: (1) there is no plausible mechanism that has been identified that would cause these effects; (2) there is no error-correcting method by which practitioners can improve their predictions; and (3) the predictions made are often too vague to be tested, or when they are specific enough, are no better at predicting outcomes than random chance.

Yet, according to an Insider poll, a full 13% of Americans believe astrology to be credible. This means that nearly 43 million of our neighbors believe in horoscopes, and many of these people will make decisions based on this belief.

Most people do not believe in astrology but say little about it because it has no perceptible effect on their lives. Fortunately, this pseudoscience is generally harmless.

Tragedy arises when unfounded belief in pseudoscience has real and devastating consequences for others. Most Americans have no vested interest in whether the practice of BPA is grounded in science because it will never affect their lives. But for persons charged with a violent crime—especially innocent persons—the confidence a judge and jury place in a BPA expert can have life-changing consequences.

“An expert who says, ‘This is what the physical evidence shows,’ is extremely persuasive, especially in a criminal case,” said Judy Royal, a staff attorney with the Center on Wrongful Convictions, to ProPublica. “Jurors don’t understand when an expert is overstating findings or going beyond what can be tested and replicated.”

‘A Judge Should Be Able to Recognize Unqualified Charlatans’

Our criminal court system is supposed to be the way that our society collectively determines the truth of our history as it relates to violations of the social contract. As such, we expect it to have a mechanism for determining the reliability of evidence provided by the opposing parties. The system does have such a mechanism, but it is deeply flawed and reflects Americans’ historically ambiguous relationship with science.

In detailing this flaw, we must first recognize that we have not one court system but rather an overlapping patchwork of court systems. The criminal courts in each state and federal judicial district will often share best practices, but they don’t always agree on what constitutes the “best.”

Courts in general have rules about who is allowed to interpret evidence for a jury. It is seen as generally preferable to present evidence to the jury directly, but juries cannot always be expected to possess the knowledge and experience in every field of inquiry. Consequently, experts are allowed in court to present their opinions.

For instance, Idaho Rule of Evidence 702 states: “If scientific, technical, or other specialized knowledge will assist the trier of fact to understand the evidence or to determine a fact in issue, a witness qualified as an expert by knowledge, skills, experience, training, or education may testify thereto in the form of an opinion or otherwise.”

The commentary on this rule states: “Because of the rules’ emphasis on liberalizing expert testimony, doubt about whether an expert’s testimony will be useful should generally be resolved in favor of admissibility.... Expert testimony may also be excluded when it would confuse the jury, be more prejudicial than probative, or be needlessly time-consuming.”

Though these specific examples come from Idaho, they are representative of the analogous rule present in courts all across the country.

The way this plays out then is that one party puts forward a witness for the purposes of interpreting some piece of evidence. That party states the qualifications of the witness in relation to the evidence, while the opposing party attempts to argue that those qualifications are insufficient. In the end, the judge will determine whether the witness is sufficiently qualified.

Given the proliferation of scientific disciplines, as well as practitioners of pseudoscience, some courts in the modern era have sought to establish a review of more than just an expert’s qualifications and instead look at whether the methods underlying their analysis are sufficiently scientific to justify their conclusions.

Many courts seeking a more thorough analysis of the underlying science have adopted the test from Frye v. United States, 293 F. 1013 (D.C. Cir. 1923). The Frye Court was faced with the decision whether to approve testimony by a polygrapher about the lie-detector test performed on the defendant.

In determining that the polygraph was insufficiently reliable to admit into evidence, the Court said the following: “Just when a scientific principle or discovery crosses the line between the experimental and demonstrable stages is difficult to define. Somewhere in this twilight zone the evidential force of the principle must be recognized, and while courts will go a long way in admitting expert testimony deduced from a well-recognized scientific principle or discovery, the thing from which the deduction is made must be sufficiently established to have gained general acceptance in the particular field in which it belongs.”

A review by the Court of the scientific community at the time of the Frye decision showed no consensus that the polygraph could reliably do what its practitioners claimed—that it could distinguish truth from lies. In the intervening 100 years, no court has found otherwise, and polygraph results are generally inadmissible as proof of guilt. Yet this does not stop investigators from subjecting suspects to these tests because law enforcement officers generally believe in this pseudoscientific device despite multiple studies that have shown it to be no better than chance at detecting lies.

The wisdom of the Court’s decision in Frye is that the U.S. Court of Appeals for the D.C. Circuit recognized the difficulty faced by judges in determining whether a scientific expert is credible, especially when these judges lack sufficient scientific training themselves. The Court therefore chose to offload the problem to the scientific community—basically, the expert’s peers. Thus, if sufficient scientific studies have been published that validate the method used by a proposed expert, and the expert is sufficiently qualified, then their opinion may be presented to the jury.

However, not all courts have adopted the Frye standard. Some courts, such as in the state of Idaho, have specifically rejected additional admissibility rules for evidence, preferring instead to trust judges to make these decisions on a case-by-case basis.

This is troubling because there is no reason to believe that judges are any better at sorting credible scientific theories from pseudoscience than the general public, even when assisted in the task by attorneys. In a country where approximately 6 million people believe the Earth is flat (again, according to Insider), should we really allow judges alone to make this decision?

In addition to the problem of judges not having sufficient scientific training, courts often fail to weed out pseudoscience, relying on the findings of other courts to guide them. This practice is called “precedent,” and while it works well enough for matters of legal importance, it fails as a method of excluding pseudoscientific claims.

“Precedent is like a child’s game of telephone,” federal Judge Nancy Gertner said in a statement to ProPublica. “You start off saying something. You whisper it down the line and you continue to whisper it even though it no longer makes sense.”

The first few courts to uphold the admission of BPA either rejected Frye, failed to apply it correctly, or were too dazzled by the claims of expertise of Dr. Kirk and Herbert MacDonell. And MacDonell could dazzle an audience. ProPublica described the “strange sight in the 1985 Texas courtroom, [where] the jurors, the judge and even the defense attorneys watched, rapt, as MacDonell laid the mirror flat and then climbed up on a chair, holding the vial and dropper,” and proceeded to drip blood on the mirror. The mirror had been hastily borrowed from the courthouse bathroom, and the vial contained MacDonell’s own blood, drawn earlier that day at a nearby hospital.

Subsequent courts would look to these decisions to admit BPA evidence and conclude that it “is considered a proper subject of expert testimony.”

A Texas Court of Appeals concluded in 1987 that MacDonell’s testimony was admissible by stating, “MacDonell’s studies are based on general principles of physics, chemistry, biology, and mathematics, and his methods use tools as widely recognized as the microscope; his techniques are neither untested nor unreliable.” Lewis v. State, 737 S.W.2d 857 (Tex. App. 1987).

In light of recent criticisms of BPA and studies demonstrating the clearly unreliable results proffered by many of its experts, the statement of the Court of Appeals in Lewis amounts to a monumental failure akin to saying that an expert is trustworthy because they wear a lab coat and regularly use a microscope.

MacDonell was not able to dazzle everyone, despite his best efforts. In his dissent from the Iowa Supreme Court’s 1980 decision in Hall, Judge Mark McCormick wrote, “I am unable to agree that reliability of a novel scientific technique can be established solely on the basis of the success of its leading proponent in peddling his wares to consumers.”

Similarly, in his dissent to the Idaho Supreme Court’s decision in State v. Rogers, 812 P.2d 1208 (Idaho 1991), Judge Stephen Bistline wrote: “The danger presented by expert testimony interpreting blood-spatter evidence is that the prosecution is provided with an expert who appears to be able to reconstruct precisely what happened by looking at the blood left at the scene of a crime. However, a quick review of the ‘science’ relied upon by the expert suggests that we would be better off proving guilt beyond a reasonable doubt without the help of such experts.” (emphasis in original)

But these judges were the dissenters on panels of judges that otherwise approved of BPA experts. Once these courts of appeals signed off on allowing BPA into courtrooms, precedent—and a few courtroom performances by MacDonell or his students—allowed this forensic pseudoscience to spread through court systems like a virus.

Trial courts now routinely admit BPA experts on their stated qualifications alone, often with little argument from defense attorneys.

“If the courts routinely admit junk science,” Judge Gertner said, “the lawyer with a finite amount of resource is not about to say I will spend this dollar on a challenge if it’s not going to make a difference.”

In an ironic twist, MacDonell himself has been hired on several occasions to refute interpretations of BPA evidence proffered by his former students. He responded to the criticism that he paved the way for unqualified persons to testify in courts around the country by stating in one of his books, “[t]he fault for permitting such individuals to testify as an expert must rest with the opposing attorney” and added that “a judge should be able to recognize unqualified charlatans.”

Turning back to the case of David Camm, several BPA experts signed an ethics complaint in 2003 with the American Academy of Forensic Sciences, claiming that Rodney Englert misrepresented his education, training, and experience. Among them was MacDonell, who referred to Englert as “a forensic whore,” a “liar-for-hire,” “a very smooth charlatan,” and “[t]he Bin Laden of Bloodstains.”

All of these experts expressed opinions in support of David Camm’s innocence, and all of them were sued by Englert for slander.

‘As Accurate as a Ouija Board’

Despite decades of criticism leveled against BPA, including damning government reports and validation studies that show alarming error rates, workshops are still being taught that certify attendees as so-called experts. After a short period of informal apprenticeship to establish the requisite experience, workshop attendees can often meet the minimum requirements to testify as an expert in a criminal trial.

Police investigators who receive this certification will testify on behalf of the State during a criminal trial, saving the State money by not having to outsource its expert witnesses. This savings is no small amount either. Depending on the nature of a person’s expertise and the amount of time spent on the case—including travel and time in the courtroom—such experts can cost the State thousands of dollars per case. And when a certified BPA expert is hired by a defense attorney, the expert themselves will personally profit from these consulting fees. They can earn still more money by teaching these questionable skills to others, like MacDonell did.

This is clearly a viable and profitable business model, as MacDonell was able to support his family on an income based solely on consulting and teaching. There appear to be few hurdles standing in the way of a person considering such a career.

The 2009 NAS report stated that “the interpretation and reconstruction of blood stains requires at minimum, an understanding of applied mathematics, the physics of fluid transfer, and the pathology of wounds.” Most graduates of BPA workshops like those operated by MacDonell are police officers who lack this background in science, and the NAS has specifically said that these workshops are not an adequate substitute for a scientific background with appropriate training, experimentation, and experience. The NAS was further critical of certifications awarded to trainees based on a minimum number of hours spent in a class, because these persons regularly fail to demonstrate a mastery of the “rigorous and objective hypothesis testing” and “complex nature of fluid dynamics” that is “essential to the formation of reliable opinions about the cause of blood patterns.”

Despite these criticisms, governments have mostly failed to make any regulatory changes in this area, so becoming a BPA expert, earning consulting fees, and passing on these skills is as easy as ever.

Tom Bevel spent nearly three decades in the Oklahoma City Police Department, where he developed an interest in blood spatter. He attended both the 40-hour basic and the 80-hour advanced courses taught by MacDonell, and he was the first president of the IABPA. He earned these distinctions despite having no advanced degree in any scientific field relating to BPA nor any other significant scientific training. He regularly contracted his services as a BPA expert, most often testifying for the prosecution.

Bevel was a co-founder of Bevel, Gardner & Associates, an Oklahoma firm that conducts BPA training workshops similar to MacDonell’s. His partner in this venture, Ross Gardner, was also his co-author on the textbook Bloodstain Pattern Analysis with an Introduction to Crime Scene Reconstruction, 3rd Edition. It should be noted, however, that Ralph Ristenbatt, the previously mentioned instructor of forensic science at Penn State, stated in a review of Bevel and Gardner’s textbook that it “lacks scientific integrity.”

Bevel has also testified or consulted for the prosecution in several high-profile cases where his interpretation of BPA evidence was criticized or controverted by analysts with more scientific training.

Bevel was one of several analysts who consulted on David Camm’s case and sided with Englert in his flawed and overconfident interpretation regarding Camm’s guilt, according to an organization called Investigating Innocence. Camm, as will be explained in more detail, was eventually acquitted on retrial.

Bevel was also the lead analyst of blood spatter evidence in the trial of Warren Horinek, a former Fort Worth, Texas, police officer accused of shooting his wife on the night of March 14, 1995. The investigators and analysts initially agreed with Horinek that his wife committed suicide, and the district attorney refused to press charges against him. His wife’s family hired their own private attorney and investigator, and they used a legal loophole in Texas to bring evidence before a grand jury.

At trial, it looked as though Horinek would prevail, at least until Bevel took the stand and confidently asserted that a minute amount of blood on Horinek’s nightshirt had to originate from a “high-velocity occurrence,” such as a gunshot, instead of being expirated by his wife while he performed CPR on her, as he claimed.

After Horinek was convicted and sentenced to 30 years in prison, one juror stated that it was Bevel’s testimony that turned the jury.

Subsequently, based on a request from one of the police officers involved in the case, the evidence was reviewed by MacDonell and Anita Zannin—a prominent lab analyst who worked with MacDonell—both of whom concluded that Bevel was utterly incorrect, because there was no way to determine whether the blood pattern was the result of CPR or a gunshot.

Horinek is still in prison, despite efforts to overturn his conviction. He will be eligible for release in 2026.

Bevel was also responsible for teaching Robert Thorman during a BPA workshop in 1985, and Thorman would testify that year in the trial of Joe Bryan—a Clifton, Texas, high school principal accused of shooting his wife, Mickey, an elementary school teacher. Though no other evidence linked Bryan to the scene of the crime—he was attending a conference out of town the night his wife was murdered—Thorman testified that minute flecks on the lens of a flashlight found in Bryan’s trunk could have only been high-velocity backspatter, deposited there when Joe shot Mickey. Bryan was convicted at trial and sentenced to 99 years in prison.

A subsequent review of the evidence by professionals with more scientific training than Thorman—including Ralph Ristenbatt, the Penn State professor—concluded Thorman’s theory of the crime was “totally specious, and there’s no evidence to support it.”

In 2011, an attorney for Bryan obtained evidence from the case to have it tested for DNA. The report concluded that the partial profile on the lens of the flashlight “was too limited for a meaningful interpretation.” Yet one line in the report stood out: “A presumptive test for blood was negative on the lens.”

After learning that the flecks on the flashlight were not blood, Thorman publicly acknowledged that his conclusions were unsustainable. However, convictions are notoriously difficult to overturn in Texas; Joe Bryan was eventually paroled on March 31, 2020, after serving 33 years in prison. The Innocence Project of Texas filed a petition that same year asking the U.S. Supreme Court to overturn his conviction.

In a Connecticut case, Everett Carr was brutally stabbed 27 times in his own home on the morning of December 2, 1985. Teenagers Ralph Birch and Shawn Henning somehow became the lead suspects in the case despite having no apparent connection to the crime. The teens were found sleeping in a vehicle they had stolen—not a good way to begin an interaction with law enforcement—though none of Carr’s blood was found in the vehicle or on the boys.

BPA expert Henry C. Lee testified at their 1989 trial that bloodstains found on a towel in Carr’s home were deposited in a way that was consistent with Carr’s blood spraying away from the assailants, explaining why they weren’t covered in Carr’s blood when they were found.

Thirty years after the teens had been convicted at trial, the Connecticut Supreme Court vacated their convictions because subsequent retesting of the towel revealed that the stain wasn’t even blood.

By the time the teens’ convictions were overturned, Lee had been a BPA expert for over 50 years and investigated over 8,000 cases, including the murder trial of O.J. Simpson.

He also took part in the investigation of the alleged murder of Lana Clarkson in 2003 by renowned music producer Phil Spector. According to two of Lee’s former colleagues, Lee picked up part of an “acrylic fingernail” at the murder scene and placed it in a vial. Four years later, the trial court ruled that Lee willingly failed to turn over the object—reports alleged that he hid or destroyed it—despite it being crucial to proving that Clarkson could not have committed suicide.

In another example, analyst Rodney Englert made an appearance in a high-profile case—the murder of 10-year-old Joel Kirkpatrick in the early morning hours of October 13, 1997, in Lawrenceville, Illinois. His mother, Julie Rea, awoke to screams in the night. After checking Joel’s room and not seeing him in bed, she ran through the house until she confronted an armed intruder. The two engaged in a protracted struggle in which Rea was injured and the assailant escaped. She then sought help from a neighbor, who called the police.

Investigators failed to find evidence of forced entry, and the knife from Rea’s kitchen used to stab Joel yielded no fingerprints. Without any clues to go on, investigators became suspicious of Rea, a recently divorced single mother.

Englert, reviewing the blood-spatter evidence from the scene, concluded that the crime scene had been “staged and manipulated” and was “not consistent with her story of a struggle.” During his testimony at Rea’s trial, Englert put on a show for the jury, explaining BPA’s basic principles while performing a lengthy demonstration using theatrical blood.

Rea was convicted, but evidence emerged soon after that she and her son had been the victims of a notorious serial killer named Tommy Lynn Sells. Sells was already facing execution in Texas for a nearly identical murder of a young Texas girl when he admitted to Illinois investigators that he had killed Joel as well.

This admission garnered Rea a retrial in 2006. Though the prosecution doubled down on Englert and his interpretation of the evidence, forensic consultant Kenneth Moses mounted detailed challenges to Englert’s methods and conclusions, ultimately stating, “[t]here is no scientific basis for making such a claim.”

Rea was acquitted and later formally exonerated. Though she received a mere $87,057 in compensation for her imprisonment, that is nothing compared to the suffering caused by Englert’s scientifically unsupported testimony.

Englert’s BPA-related failures should have been apparent in David Camm’s case as well. Englert’s interpretation of the eight flecks of blood on Camm’s t-shirt as high-velocity backspatter drove the narrative that Camm murdered his family. This erroneous claim was unsupportable by the evidence, but it caused investigators to misinterpret or overlook real evidence pointing to the actual culprit.

A few months before Camm’s wife and children were murdered, Charles “Backbone” Boney was released from prison. Also known as “The Shoe Bandit,” Boney was known for holding women at gunpoint while stealing their shoes to satisfy his foot fetish.

Though Boney was questioned by investigators and initially denied meeting David Camm at the church that night, he later spun a story in which he sold a gun to Camm, allegedly wrapped in his prison sweatshirt.

However, the sleeve of this sweatshirt—which was found at the scene and had the moniker “BACKBONE” written on the inside collar—tested positive for Kim Camm’s saliva, which meant that Boney had held Kim hostage during their encounter. Also, Boney’s palm print was found on the passenger door of Kim’s Ford Bronco, where he placed his hand to brace himself while leaning into the vehicle to shoot Camm’s children.

At his third retrial in October 2013, the jury heard the full story regarding evidence of Boney’s involvement in the case, including the deal he made with investigators to testify against David Camm. They also heard testimony of forensic experts with more scientific training than Englert, who certified that the eight flecks of blood were so small that no reasonable conclusion could be based on their pattern.

“People see what they want to see,” opined Richard Kammen, Camm’s lead attorney at his third retrial, about BPA. “It’s as accurate as a Ouija board.” Put another way, BPA appears to be forensic science’s equivalent of the Rorschach test, except that people’s lives literally hang in the balance of what various BPA “experts” claim to see in the blood spatter.

Getting to Objective, Repeatable, and Reliable

Even a cursory review of how investigations and lives are derailed by unscientific forensics can leave a person with a sense of dread, especially considering that there are likely hundreds, if not thousands, of innocent people who have been sentenced to decades in prison, or even to death, on the strength of such evidence.

It is worth noting that there have been some developments in relation to BPA that may make a small dent in the number of flawed prosecutions going forward.

The DOJ, through the NIJ, spent $175 million between 2009 and 2017 on forensic research. One research group involving Daniel Attinger, a fluid dynamics specialist at Columbia University in New York, has received over $1.3 million in grants to study blood spatter. He previously studied how minor variations in fluids—such as between Coke and Diet Coke—can alter fluid stains. But blood is the “most complex fluid” he’s ever studied, Attinger said.

Some of this work to date has involved improved analysis of how gravity affects blood in flight, which showed that previous BPA techniques for calculating the locations of victims contained untested assumptions about this effect.
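For context on what those untested assumptions look like in practice, the traditional calculation can be sketched in a few lines. This is an illustrative, simplified version of the textbook impact-angle formula (not any particular lab's software): analysts measure an elliptical stain's width and length, compute an impact angle, and then project straight lines back to a supposed common origin. The straight-line projection, ignoring gravity and air drag, is precisely the simplification the newer research called into question.

```python
import math

def impact_angle_deg(stain_width_mm: float, stain_length_mm: float) -> float:
    """Classic BPA impact-angle estimate: alpha = arcsin(width / length).

    An elliptical bloodstain's width-to-length ratio approximates the sine
    of the angle at which the droplet struck the surface. The traditional
    "stringing" method then projects straight lines back at these angles
    to find a common origin -- assuming no gravity and no air drag, the
    simplification the NIJ-funded research showed was never validated.
    """
    if not 0 < stain_width_mm <= stain_length_mm:
        raise ValueError("width must be positive and no greater than length")
    return math.degrees(math.asin(stain_width_mm / stain_length_mm))

# A circular stain implies a near-perpendicular impact (about 90 degrees);
# a stain twice as long as it is wide implies roughly a 30-degree impact.
print(impact_angle_deg(5.0, 5.0), impact_angle_deg(5.0, 10.0))
```

The formula itself is simple trigonometry; the unreliability enters when straight-line projections from many such angles are treated as a precise reconstruction of where a victim stood.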

A team at the University of Illinois at Chicago—including researcher Patrick Comiskey, who has collaborated with Attinger—showed that blood is different from other fluids in that it becomes runnier as more force is applied to it, whereas most other fluids flow at the same rate regardless of the force applied. According to a PBS report, the team has also shown how complex blood can be due to its composition: “It teems with living cells and active enzymes, and its properties shift under the influence of minute fluctuations in temperature, or the presence of drugs.”
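The shear-thinning behavior described above, blood flowing more readily the harder it is pushed, is commonly captured with a power-law (Ostwald-de Waele) viscosity model. The sketch below uses that generic textbook model, not the team's actual code; the constants are illustrative values of the order reported for whole blood, not their measurements.

```python
def apparent_viscosity(shear_rate: float, k: float = 0.017, n: float = 0.708) -> float:
    """Power-law (Ostwald-de Waele) model: mu = k * shear_rate**(n - 1).

    With a flow-behavior index n < 1, apparent viscosity falls as the
    shear rate rises: the "runnier under force" behavior that separates
    blood from Newtonian fluids like water (for which n = 1). The k and n
    defaults are illustrative values of the order reported for whole
    blood, not measured constants.
    """
    if shear_rate <= 0:
        raise ValueError("shear rate must be positive")
    return k * shear_rate ** (n - 1)

# Blood thins as it is sheared harder; a Newtonian fluid (n = 1) does not.
print(apparent_viscosity(1.0) > apparent_viscosity(100.0))                 # True
print(apparent_viscosity(1.0, n=1.0) == apparent_viscosity(100.0, n=1.0))  # True
```

Because a droplet's viscosity depends on how violently it was ejected, simple assumptions calibrated on water-like fluids do not transfer cleanly to blood, which is part of why reverse-engineering a crime scene from stains is so difficult.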

The researchers have been constructing a model to reverse-engineer “the type of gun, the bullets used, [and] the locations of the people involved, each with a score of the algorithm’s confidence in its results.” This model is incredibly complex, and attempts to assess the variances caused even “by the gas that escapes the muzzle of a gun after it’s fired.”

Attinger and Comiskey have stated their goal is to create a tool or app that investigators can use to reconstruct crime scenes based on blood evidence—including confidence scores in the results—all without investigators understanding the science behind it. However, because of the damage that such tools could wreak in the hands of overconfident experts who don’t understand their own limitations, others have warned about using novel techniques without rigorous validation.

“Forensic science should be treated like any other consumer product,” M. Chris Fabricant, director of strategic litigation at the Innocence Project in New York City, said. “Before it’s allowed to be used on human beings, it should be scientifically tested and clinically demonstrated to be reliable, just like toothpaste.”

Other researchers, like Marilyn Miller, a forensics expert at Virginia Commonwealth University, are skeptical that tools like the algorithm being tested by Attinger and Comiskey could ever be accurate enough to be used in court. Perfect experimental conditions manufactured in a research lab are a far cry from the “highly textured, highly interrupted world of a crime scene,” she said to PBS.

Miller is concerned that even if such tools are made available, “experts” in BPA may use them in court with more certainty than is warranted, especially when they haven’t been trained to understand the published research that bears on their assessments.

Suzanne Bell, a forensics expert at West Virginia University, believes the broader issue relates to communicating the limitations of the evidence and the underlying science, rather than willful ignorance.

“The problem isn’t that labs don’t want to do better,” Bell said. “It’s a resource issue: So much is demanded of [the personnel in forensics laboratories], who are busy doing cases... every minute they give to training, learning, and digesting this research takes away from casework.”

While resources may be an issue in accredited forensics labs, the issue with individual BPA experts—those certified in 40-hour workshops—is a combination of greed and willful scientific illiteracy among the analysts—and sometimes among the judges who allow them to testify.

Pamela Colloff, a journalist reporting jointly for The New York Times Magazine and ProPublica, attended a 40-hour BPA workshop hosted in Yukon, Oklahoma, by the company founded by Tom Bevel and Ross Gardner. She describes days spent “[l]umbering around the garage in the biohazard gear we were provided—hooded Tyvek coveralls, latex gloves, safety goggles and masks secured with duct tape—[trying] to classify each bloodstain according to a dizzying taxonomy of spatters, drips, spurts, swipes and smears.”

When the instructor began teaching how to trace the trajectory of blood to its source, he revealed how such trainings dispense with any scientific foundation: “We’re not really going to focus on the math and physics; it just kind of bogs things down. I’ll teach you which keys on your calculator to press.”

The instructor also coached students on how to avoid difficult questions when called to testify in court on their analysis of evidence, saying they should avoid the word “probably” during testimony—a word which would give the opposing attorney an opening to ask questions about the reliability of the analysis: “You’ll be asked: ‘How probably? Eighty-five percent? Seventy-five percent?’ And you can’t say. It’s better just to say ‘the best explanation is...’”

This alarming disregard for scientific competency and accountability, along with several high-profile cases such as that of Joe Bryan, has prompted the Texas Forensic Science Commission to require changes in how BPA evidence is presented in Texas courts.

The Commission, created by the Texas legislature in 2005, is composed of seven scientists, one defense attorney, and one prosecutor. Their job is to assess the reliability and integrity of forensic sciences and then make recommendations to ensure that criminal convictions based on this evidence are sound.

The Commission held a hearing on BPA on January 22, 2018, inviting leading forensic lab scientists, leading BPA analysts, and representatives from law enforcement. Some police officers who were also BPA analysts mocked the idea that college degrees made any difference in the reliability of BPA testimony. One remarked that “Thomas Edison was self-taught,” and another said of college degrees in science: “It means I can show up for a class, it means I can take a test and it means I can graduate.”

Tom Bevel—who makes money by certifying such police investigators as BPA experts—declined to attend the hearing but submitted a written statement to the commission recommending that analysts take more 40-hour classes, specifically two introductory and two advanced courses with different instructors.

Law enforcement’s resistance to higher standards for experts appeared to frustrate even the lone prosecutor on the Commission, Jarvis Parsons, who exclaimed, “[w]e are talking about the liberty of individuals.”

And while researchers like Attinger have sought to make crime scene reconstruction more available, the Commission’s review stressed the dangers of continuing to allow experts without sufficient scientific expertise to testify in Texas courts. The final report stipulated that a bloodstain-pattern-based analysis must be performed by an accredited organization if it is to be allowed in court. Such accreditation is a labor-intensive process often too cumbersome for individual experts or even small firms and is more often appropriate for dedicated labs that employ well-trained scientists and technicians.

The Commission also mandated that analysts regularly undergo proficiency testing. Further, “[t]heir cases will be reviewed, and there will be an outside audit of their work each year. Their testimony will be monitored to ensure they aren’t overstating their findings in court.”

Though the Commission’s decision only affects Texas courts, other courts look to large states like Texas and may adopt similar rules in response to the Commission’s decision. But there is no guarantee that this kind of accountability for bloodstain-pattern analysts will be replicated everywhere BPA is admitted into courts.

“If we don’t have technologies that are objective, repeatable and reliable, then we have no idea how many times we’re making the wrong decision,” according to Alicia Carriquiry, director of the Center for Statistics and Applications in Forensic Evidence, a government-funded project to measure the limits of forensic methods. “We don’t even have a way to estimate how many times we’re making the wrong decisions.”  

Anthony Accurso is incarcerated at a federal prison in Seagoville, Texas. He was born in Kansas City, Missouri. Prior to incarceration, he ran his own tech consulting company, Accurso Technologies. He is a paralegal. He is a consultant with Reset Missouri, a re-entry project. He is working towards a future where prisons don’t exist, and people are not put on lists of ignominy.

Sources: propublica.org, ncjrs.gov, crime-scene-investigator.net, nycriminaldefenders.com, pbs.org, news-leader.com, blog.expertpages.com, reason.com, washingtonpost.com, signalscv.com, thenextweb.com, nytimes.com, buting.com, nbcnews.com, texasobserver.org, investigatinginnocence.org, tampabay.com, davisvanguard.org, theadvocate.com, deathpenaltyinfo.org, expertinstitute.com, wcojp.org, metro.us, thedailybeast.com, lawofficer.com, rrcc.edu
