San Francisco launches experiment to curb racial bias
San Francisco is taking a step toward making justice colorblind.
The city is using artificial intelligence to strip police reports of identifying details — such as a suspect’s name, race, hair and eye color, and neighborhood or district, as well as the names of witnesses — in an effort to reduce racial bias.
The redacted reports will “take race out of the equation” by leaving only key facts when a prosecutor decides whether to charge a suspect with a crime, District Attorney George Gascón announced.
Once prosecutors decide on a preliminary charge, they will have access to the full, unredacted report and police body-camera video. If they then change their minds, they will be required to explain what “led to any changes in their charges,” cnet.com reports.
Gascón’s office worked with data scientists and engineers at the Stanford Computational Policy Lab to develop the AI system.
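In broad strokes, the redaction step can be pictured as a masking pass that blanks out identifying fields before a prosecutor sees the report. The sketch below is only a toy illustration of that idea in Python; the field names and masking scheme are assumptions for demonstration, not the Stanford lab’s actual tool, which must also catch contextual cues in free-text narratives that a simple field-level pass would miss.

```python
# Illustrative sketch only: masks identifying fields in a structured report.
# Field names ("suspect_name", "race", etc.) are hypothetical assumptions,
# not the real system's schema.

REDACT_FIELDS = [
    "suspect_name",
    "race",
    "hair_color",
    "eye_color",
    "neighborhood",
    "district",
    "witness_names",
]

def redact_report(report: dict) -> dict:
    """Return a copy of the report with identifying fields replaced."""
    redacted = dict(report)
    for field in REDACT_FIELDS:
        if field in redacted:
            redacted[field] = "[REDACTED]"
    return redacted

if __name__ == "__main__":
    sample = {
        "incident": "Reported theft of a bicycle from a locked rack.",
        "suspect_name": "John Doe",
        "neighborhood": "Mission",
        "witness_names": ["Jane Roe"],
    }
    print(redact_report(sample))
```

As the article notes, the harder problem is the one Goff raises below: contextual details scattered through a narrative can still signal race or ethnicity even after the obvious fields are removed.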
“The criminal-justice system has had a horrible impact on people of color in this country, especially African Americans, for generations,” Gascón told The Modesto Bee. “If all prosecutors took race out of the picture when making charging decisions, we would probably be in a much better place as a nation than we are today.”
It’s no surprise that “bias exists nationwide at all levels of the criminal justice system, from police making arrests and prosecutors deciding whether to charge suspects to court convictions and sentencing.”
“Hats off for trying new stuff,” said Phillip Atiba Goff, president of the Center for Policing Equity, about Gascón’s experiment. “There are so many contextual factors that might indicate race and ethnicity that it’s hard to imagine how even a human could take that all out.”
Sources: The Modesto Bee, washingtonpost.com, cnet.com