Study: Technology Creates and Embeds Bias in the Criminal Justice System
by Douglas Ankney
Automatic License Plate Readers (“ALPR”), facial recognition technology, and predictive policing are some of the new weapons in the arsenal of the police state. And minority communities are caught in the crosshairs.
The failures of facial recognition technology are widely known. According to a study by the Massachusetts Institute of Technology, the three most advanced systems had error rates of 1 percent for light-skinned males, 12 percent for darker-skinned males, and a whopping 35 percent for darker-skinned females. Systems have even mistaken black members of Congress for criminal suspects.
Predictive policing uses algorithms built on historical crime data, and the programs predict that future crimes will occur where crimes were recorded in the past. But that historical data is biased: police in the past concentrated enforcement in low-income and minority neighborhoods. Guided by the algorithms, departments then surveil those same communities more heavily to find the predicted crimes, record more offenses there, and reinforce the “high-crime area” designation, which in turn draws still more surveillance.
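That self-reinforcing loop can be made concrete with a small simulation. The sketch below is illustrative only and is not drawn from any actual predictive-policing product; it assumes two neighborhoods with identical underlying offense rates, a patrol budget allocated in proportion to previously recorded crime, and a detection rate that rises with patrol presence. All numbers are invented for illustration.

```python
# Toy model of the feedback loop described above (hypothetical figures).
# Neighborhoods A and B have the SAME underlying offense rate, but A starts
# with more recorded crime because it was patrolled more heavily in the past.
# Each round, patrols follow the recorded numbers, and more patrols mean more
# offenses are observed and added to the record.

import random

random.seed(0)

TRUE_RATE = 100              # actual offenses per round in EACH neighborhood
DETECTION_PER_PATROL = 0.02  # fraction of offenses observed per patrol unit
TOTAL_PATROLS = 20

# Historical bias: neighborhood A was over-policed, so its record starts higher.
recorded = {"A": 60, "B": 20}

for rnd in range(1, 11):
    total_recorded = sum(recorded.values())
    for hood in recorded:
        # Patrols assigned in proportion to past recorded crime (the "prediction").
        patrols = TOTAL_PATROLS * recorded[hood] / total_recorded
        # Chance of observing any given offense rises with patrol presence.
        detection_prob = min(1.0, patrols * DETECTION_PER_PATROL)
        observed = sum(random.random() < detection_prob for _ in range(TRUE_RATE))
        recorded[hood] += observed
    print(f"round {rnd:2d}: recorded A={recorded['A']:5d}  B={recorded['B']:5d}")

# The underlying rates never differ, yet the initial disparity never corrects
# itself: A keeps drawing most of the patrols, its recorded total keeps pulling
# further ahead, and the "high-crime" label is continually reinforced.
```

Running the sketch shows the recorded gap between the two neighborhoods widening round after round even though the true offense rates are identical, which is the dynamic the article describes.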
And ALPRs create massive databases tracking people’s movements that are accessible to hundreds of police agencies. With almost no oversight, the potential for abuse is astounding. Members and friends of groups unpopular with law enforcement, such as Black Lives Matter, can be tracked without their knowledge, have false criminal evidence planted against them, or worse.
Technology can be useful in the criminal justice system, but it must be held to demanding standards of accountability. Criminal justice personnel must be made aware of its limitations, including its capacity to create and perpetuate racial bias.
---
Sources: techdirt.com, themindunleashed.com, slate.com