
Study Indicates Racial Bias Skews Criminality Risk Assessment Tool

by David Reutter

The use of software to predict future criminality is growing in popularity. However, a study by ProPublica, an independent, nonprofit news organization that produces investigative journalism, found that the most widely used prediction program is racially skewed.

Risk assessments are gaining traction in courtrooms across the United States; some jurisdictions use them to set bond amounts, while others use them to make sentencing decisions. Arizona, Colorado, Delaware, Kentucky, Louisiana, Oklahoma, Virginia, Washington and Wisconsin provide assessment results to judges during sentencing hearings. [See: PLN, July 2017, p.22].

“Evidence has a better track record for assessing risks and needs than intuition alone,” Wisconsin Assistant Attorney General Christine Remington wrote in a brief defending the state’s use of risk assessments in criminal cases.

The most popular assessment program, COMPAS, comes from Northpointe, Inc., a for-profit company that uses answers to 137 questions to make risk assessments. The firm refuses to disclose the calculations used to arrive at its conclusions, calling them proprietary information.
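
Northpointe has never published how those 137 answers become a score, so any concrete example is necessarily hypothetical. The sketch below shows only the general shape such actuarial instruments tend to take, a weighted sum of questionnaire items binned into risk categories; every item name, weight and cut point in it is invented, not COMPAS's.

```python
# Hypothetical illustration only: the actual COMPAS calculation is
# proprietary, so this generic weighted-sum scorer stands in for it.
# Every item, weight and threshold below is invented.
from typing import Dict

WEIGHTS: Dict[str, float] = {
    "prior_arrests": 1.5,         # invented weight
    "age_at_first_arrest": -0.8,  # invented weight
    "unstable_housing": 0.6,      # invented weight
}

def risk_category(answers: Dict[str, float]) -> str:
    """Map questionnaire answers to a coarse risk label."""
    score = sum(w * answers.get(item, 0.0) for item, w in WEIGHTS.items())
    if score >= 5.0:  # made-up cut points
        return "high"
    if score >= 2.0:
        return "medium"
    return "low"

print(risk_category({"prior_arrests": 3, "age_at_first_arrest": 2}))  # medium
```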

That refusal raises due process questions, critics say. “Risk assessments should be impermissible unless both parties get to see all the data that go into them,” said Christopher Slobogin, director of the criminal justice program at Vanderbilt Law School. “It should be an open, full-court adversarial proceeding.”

ProPublica’s May 2016 findings support that position. The news organization found that only 20 percent of the people predicted to commit violent crimes actually went on to do so. When crimes of all kinds were taken into account, Northpointe’s “algorithm was somewhat more accurate than a coin flip”: “of those deemed likely to reoffend, 61 percent were arrested for any subsequent crimes.”

The assessment program “was particularly likely to falsely flag black defendants as future criminals, wrongly labeling them this way at almost twice the rate as white defendants,” while “white defendants were mislabeled as low risk more often than black defendants,” according to ProPublica.

Specifically, 23.5 percent of white defendants and 44.9 percent of black defendants were labeled higher risk but did not reoffend, while 47.7 percent of white defendants and 28 percent of black defendants were labeled lower risk yet did reoffend.
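
ProPublica reached figures like these by comparing each defendant’s risk label against two years of subsequent arrest records and tallying the two kinds of error separately for each group. A minimal sketch of that tally, using a handful of invented records in place of the thousands of real Broward County, Florida cases in the study:

```python
# Sketch of the error-rate comparison ProPublica describes: among
# defendants who did NOT reoffend, what share were labeled higher risk
# (false positives), and among those who DID, what share were labeled
# lower risk (false negatives)? The records below are invented.

records = [
    # (race, labeled_higher_risk, reoffended_within_two_years)
    ("white", True,  False),
    ("white", False, True),
    ("white", False, False),
    ("black", True,  False),
    ("black", True,  False),
    ("black", False, True),
]

def error_rates(rows, group):
    rows = [r for r in rows if r[0] == group]
    non_reoffenders = [r for r in rows if not r[2]]
    reoffenders = [r for r in rows if r[2]]
    false_positive = sum(r[1] for r in non_reoffenders) / len(non_reoffenders)
    false_negative = sum(not r[1] for r in reoffenders) / len(reoffenders)
    return false_positive, false_negative

for group in ("white", "black"):
    fp, fn = error_rates(records, group)
    print(f"{group}: higher risk but didn't reoffend: {fp:.0%}; "
          f"lower risk yet did reoffend: {fn:.0%}")
```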

ProPublica cited cases in which judges raised bonds and increased prison sentences based on criminal risk assessments. The practice in Wisconsin was challenged by Eric L. Loomis, who received a six-year sentence in a drive-by shooting case based in part on his COMPAS risk score. Loomis claimed the assessment violated his due process rights because the software program is proprietary and because it used his gender as an assessment factor.

In July 2016 the Wisconsin Supreme Court ruled against him; the U.S. Supreme Court declined to review that decision on June 26, 2017. See: State v. Loomis, 371 Wis.2d 235, 881 N.W.2d 749 (Wis. 2016), cert. denied.

“Advocates say these data-driven tools remove human bias from the system, making it more fair as well as more effective,” Laurel Eckhouse, a researcher with the Human Rights Data Analysis Group’s Policing Project, wrote in a February 2017 editorial in the Washington Post.

“But even as they have become widespread, we have little information about exactly how they work. Few of the organizations producing them have released the data and algorithms they use to determine risk.

“We need to know more, because it’s clear that such systems face a fundamental problem: The data they rely on are collected by a criminal justice system in which race makes a big difference in the probability of arrest, even for people who behave identically. Inputs derived from biased policing will inevitably make black and Latino defendants look riskier than white defendants to a computer. As a result, data-driven decision-making risks exacerbating, rather than eliminating, racial bias in criminal justice.”
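
Eckhouse’s point can be made concrete with a toy simulation: give two groups identical offending behavior but unequal chances of being arrested for it, and the arrest records any risk model trains on will already contain the disparity. Every rate below is invented for illustration; nothing here comes from ProPublica’s data.

```python
# Toy simulation of the biased-inputs problem Eckhouse describes:
# identical behavior plus unequal policing produces unequal "risk"
# in the training data. All parameters are invented.
import random

random.seed(0)

OFFEND_RATE = 0.30            # same true behavior in both groups
P_ARREST_GIVEN_OFFENSE = {    # hypothetical policing disparity
    "group_a": 0.70,
    "group_b": 0.35,
}

def observed_arrest_rate(group: str, n: int = 100_000) -> float:
    """Arrest rate a model would see in historical records."""
    arrests = sum(
        1
        for _ in range(n)
        if random.random() < OFFEND_RATE
        and random.random() < P_ARREST_GIVEN_OFFENSE[group]
    )
    return arrests / n

for group in ("group_a", "group_b"):
    # Trained on these records, a model "learns" that group_a is about
    # twice as risky, even though both groups offend at the same rate.
    print(group, f"{observed_arrest_rate(group):.1%}")
```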

In January 2017, Northpointe, along with Constellation Justice Systems, Inc. and CourtView Justice Solutions, Inc., rebranded to collectively form a company called “equivant,” based in Canton, Ohio.

Sources: www.propublica.org, Wall Street Journal, www.equivant.com, www.jsonline.com 
