Facial Recognition Software Gives Unreliable Results with Black Individuals and Leads to Unlawful Arrests

by Jo Ellen Nott

Two faculty members in the Department of Criminal Justice & Criminology at Georgia State University in Atlanta wrote in the May 18, 2023, technology section of Scientific American that artificial-intelligence-powered facial recognition will lead to increased racial profiling. In their research, Thaddeus Johnson and Natasha Johnson found that law enforcement agencies that use Facial Recognition Technology (“FRT”) arrest a disproportionate number of Black people compared with white individuals, a disparity they attribute to inherent flaws in the design and deployment of FRT.

The use of FRT to identify suspects has already infringed on civil rights in particularly egregious cases. Civil rights experts say Robert Julian-Borchak Williams, 42, of Farmington Hills, Michigan, is the first documented example of a Black man wrongfully arrested because of an incorrect match produced by facial recognition technology. Williams was arrested in 2020 after police, using FRT as their only lead, allowed a security guard who had not been present at the time of the theft to identify him from a photo lineup of candidates generated by the software and to match it against blurry security camera footage.

During post-arrest questioning, Detroit police showed Williams his in-focus driver’s license photo and the grainy image of the Black suspect captured by a security camera in the store where the watches had been stolen. Williams later said of the interrogation: “I picked it up and held it to my face and told him, ‘I hope you don’t think all Black people look alike.’”

Williams was unjustly arrested in his front yard as his two daughters, both under the age of six, cried and his wife looked on in disbelief. He was detained for 30 hours before finally being released on bail to await his court appearance. At the hearing, a Wayne County prosecutor announced that the charges against Williams had been dropped for insufficient evidence.

According to the Georgia State researchers, “a quarter of local and state police departments nationwide and almost half of federal law enforcement agencies regularly access facial recognition systems, despite their faults.” They warn that this widespread use poses a real threat to Americans’ constitutional right to be free from unlawful search and seizure.

On the federal level, efforts have been made to manage the use of FRT. The Biden administration released the “Blueprint for an AI Bill of Rights” in 2022, which outlines practices to protect civil rights in the design and use of AI technologies. While a helpful primer, the blueprint is nonbinding. The Facial Recognition and Biometric Technology Moratorium Act, first introduced by Rep. Pramila Jayapal in 2021, was reintroduced in 2023 by congressional Democrats and remains in committee.

H.R. 3907 aims to halt law enforcement’s use of FRT until policymakers can create regulations and standards that balance constitutional concerns and public safety. As the Georgia State academics point out, “both efforts fall short. The 2022 blueprint doesn’t cover law enforcement’s use of AI, and the moratorium only limits the use of automated facial recognition by federal authorities—not local and state governments.” Even so, the measures are the start of a national conversation about the use of intrusive image retention and retrieval technology.

Because FRT has its flaws, the researchers recommend a set of guidelines to help prevent unnecessary arrests and protect constitutional freedoms:

Technology companies must employ a balanced representation of software designers. Most software developers are white men who consciously or subconsciously program their preferences and racial biases into the algorithms. The software needs to learn more about darker faces and their distinguishing features.

Technology companies need to develop and use diverse image training sets. A disproportionate representation of white males in training images produces skewed algorithms. Because Black people are overrepresented in mugshot databases and other image databases used by law enforcement, AI is more likely to mark Black faces as criminal.

Law enforcement needs to use uniform similarity standards when identifying matches. After the software generates a lineup of potential suspects, it assigns a rank to each candidate based on how similar the algorithm believes the images are. Currently, police departments decide their own similarity-score criteria instead of using the software’s ranking, a practice that increases the chances of wrongful arrests and missed identifications (a simplified illustration of how the choice of threshold changes results appears after this list).

Law enforcement agencies need to provide police officers and detectives with training on FRT’s pitfalls, human biases, and historical discrimination.

Law enforcement officials and prosecutors should disclose that they used automated facial recognition when seeking a warrant.
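To make the similarity-standards recommendation concrete, here is a minimal sketch in Python of how a threshold applied to algorithm-assigned similarity scores determines which lineup candidates are surfaced. The candidate names, scores, and the surface_matches function are hypothetical illustrations, not part of any actual FRT product, and real systems are far more complex:

```python
# Hypothetical sketch: how a similarity-score threshold changes which
# candidates a face-matching system surfaces. All names and scores are
# invented for illustration.

from dataclasses import dataclass

@dataclass
class Candidate:
    name: str
    similarity: float  # 0.0 (no match) to 1.0 (identical), per the algorithm

def surface_matches(candidates, threshold):
    """Return candidates at or above the threshold, best match first."""
    hits = [c for c in candidates if c.similarity >= threshold]
    return sorted(hits, key=lambda c: c.similarity, reverse=True)

# Hypothetical lineup scores returned by a matching algorithm.
lineup = [
    Candidate("Person A", 0.91),
    Candidate("Person B", 0.88),
    Candidate("Person C", 0.72),
    Candidate("Person D", 0.55),
]

# A strict, uniform standard surfaces only the strongest match ...
print([c.name for c in surface_matches(lineup, threshold=0.90)])
# ['Person A']

# ... while a department that sets its own looser criterion sweeps in
# weaker matches, raising the risk of a wrongful identification.
print([c.name for c in surface_matches(lineup, threshold=0.50)])
# ['Person A', 'Person B', 'Person C', 'Person D']
```

The two calls show why the researchers press for a uniform standard: the same candidate list yields very different lineups depending on the threshold each department happens to choose.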

The authors conclude that the racial profiling seen so far with FRT results from “a lack of Black faces in algorithm training data sets, a belief that these software programs are infallible and the tendency of officers’ own biases to magnify these issues.”

Sources: Congress.gov; National Public Radio; Scientific American
