
Federal Facial Recognition Technology Fails Again

by Michael Dean Thompson

The federal government has again discovered that its use of facial recognition technology (“FRT”) harms Americans. Agencies are often deploying the technology with little oversight or training, as the Government Accountability Office reported in 2023. U.S. Commission on Civil Rights Chair Rochelle Garza pointed out, “Unregulated use of facial recognition technology poses significant risks to civil rights, especially for marginalized groups who have borne the brunt of discriminatory practices.”

The technology is particularly harmful to Black women. The National Institute of Standards and Technology (“NIST”) found in 2019 that even the best FRT tools, from Amazon and Microsoft, were 100 times more likely to misidentify Black women than White men. Misidentification rates also grew dramatically across all systems when comparing “wild” photos drawn from sources like social media. Concerns about these civil rights problems led both developers to stop, at least temporarily, providing FRT to law enforcement.

There are at least 18 federal agencies using facial recognition technology, yet there are no federal laws governing its use. Into this vacuum, the Justice Department has fed millions of dollars in awards for law enforcement agencies to acquire and use FRT. The consequences for everyday citizens of FRT use include wrongful arrests, discriminatory bias in policing, and unwarranted surveillance.

The Commission’s recently published 184-page report points out, “While a robust debate exists surrounding the benefits and risks associated with the federal use of FRT, many agencies already employ the use of this technology.” One of the agencies the Commission studied was the Department of Housing and Urban Development, hardly the first agency that comes to mind for FRT use. Yet it has used FRT to identify and evict residents for even minor infractions.

The U.S. Marshals Service is one of the many agencies that use Clearview AI’s FRT. Unlike Microsoft and Amazon, Clearview did not submit its FRT to NIST for testing. Clearview also claims to draw billions of images from public spaces on the internet, the same “wild” sources that confounded even the best systems.

People passing through airports, seaports, and pedestrian border crossings are subjected to FRT scrutiny. U.S. Customs and Border Protection claims its FRT system has an accuracy rate of over 99% across different ethnicities. The report, however, notes that in 2023 civil rights groups complained that Black asylum seekers struggled to schedule appointments due to FRT failures.

The Commission on Civil Rights would like to see a uniform testing protocol that could be applied to the various law enforcement FRT systems. Beyond evaluating a system’s accuracy, as NIST testing does, such a protocol would also verify privacy safeguards and equitable use. But that would require action from lawmakers.

Commissioner Stephen Gilchrist declared, “Our nation has a moral and legal obligation to ensure that the civil rights and civil liberties of all Americans are protected.”

Source: USA Today
