
Police Departments Are Now Using AI to Write Reports

by Anthony W. Accurso

Axon, a company that sells products (including weapons) and services to police departments, has begun offering a new product that uses artificial intelligence (“AI”) to turn bodycam audio into police reports. A recent ACLU white paper, however, expresses skepticism about the software and cautions against its wider deployment.

The November 2024 white paper, written by Jay Stanley, Senior Policy Analyst with the ACLU’s Speech, Privacy, and Technology Project, is entitled “Police Departments Shouldn’t Allow Officers to Use AI to Draft Police Reports.” It begins by discussing how Axon’s new software, Draft One, is being used by police departments.

The software is backed by OpenAI’s GPT-4 large language model (“LLM”). It takes only the audio from an encounter and uses AI to generate a transcript. It then feeds the transcript through a separate AI process that drafts a police report in the form of a first-person narrative, prompting the officer to add details and context in various places. The officer is then supposed to review the report and swear to its accuracy before submission.

Police in several jurisdictions have been using the program, sold as an add-on to their Axon package—which often includes tasers, body and dash cams, and virtual reality training headsets—and many are enthusiastic about the time it saves.

“It takes the audio and puts it all together for you,” said Meriden, Connecticut, Officer Danny Cruz. “It pushes the officer to read it thoroughly. It allows you to put in extra information. You can save an officer an hour of time for typing. To remember everything you said, the back and forth, takes hours. Things an officer forgets, you’ll find in the report.”

According to Gov Tech, “[f]or police in Fort Collins, Colo., the tool resulted in a 67 percent reduction in the time spent on report writing, which in turn frees up officers for more street duty, the spokesperson said, adding that ‘success stories’ about the product ‘mostly center on how much quicker officers can complete their paperwork while maintaining quality.’”

But maintaining report “quality” isn’t the only concern voiced by the ACLU. The white paper breaks down three main areas of concern about using AI for this function of policing.

First are the problems that plague all AI tools. The paper notes that “[b]ecause LLMs are trained on something close to the entire Internet, they inevitably absorb the racism, sexism, and other biases that permeate our culture.”

“Studies have found that automatic speech recognition systems perform much more poorly in accurately transcribing African American Vernacular English and other dialects of Black English,” wrote Stanley. “Errors in interpreting Black English can and have hurt people legally.” And it’s not just Black English that AIs have trouble with. Accents can cause problems as well.

This presents an issue when the officer is swearing to the veracity of a defendant’s statements as interpreted by the AI, which may be relied upon in court. It then places the onus on a defense attorney to review both the audio and the report to check for accuracy.

Further, analysts at the independent surveillance trade publication IPVM noted customers report that Axon’s product “performs worse with longer interviews, pursuits, and traffic accidents” and “in loud environments or when unrelated conversations interject (such as radio chatter).”

The errors present in the generated transcript are then “laundered” by being fed through the second stage of the software to generate the report. And the software does not maintain a separate copy of the original transcript, which, according to the company, is “by design.”

To counter the reliability problems of AI, the white paper notes that “Axon’s product includes two purported safeguards.” First, the AI always produces “insert statements” where it instructs the reporting officer to fill in the blank with needed details. Second, a police department can set the product to randomly insert silly sentences into the AI-produced draft, such as mentioning a flying squirrel entered the scene, which is intended to ensure that police officers are actually reading and editing the draft. However, Axon’s CEO conceded that “We’re generally getting not-great feedback I would say on that—most agencies are saying no, they wouldn’t use it.”

The second area of concern involves corporate and technological transparency and privacy. Any time evidence is used against a defendant in court, that defendant has the right to review the evidence against them and the processes by which it was procured. This is rooted in the Sixth Amendment’s right to confront one’s accuser but has expanded into a right to review technology-driven evidence. AI is inherently a “black box” in that we cannot say why it makes certain decisions, such as why it chooses one transcribed word over another similar word.

As one Axon executive put it in a webinar for prosecutors, “We turn off all the creativity, so that it sticks to the facts, and then we build basically a law enforcement layer on top.” That “law enforcement layer,” he explained, “is where the LLM is instructed in such things as what a good police report looks like, what kinds of information to include and not to include, and when to give police officers prompts to add additional information.”

“But what is the prompt that accompanies that text?,” asks Stanley. “That’s an example of the kind of element of an AI tool that ought to be public. If it’s not, a police AI system could well contain an instruction such as, ‘[m]ake sure that the narrative is told in a way that doesn’t portray the officer as violating the Constitution.’”

Because of how the models are “trained,” it’s not hard to “foresee computer scientists who work for OpenAI being pulled into a murder trial or the like due to questions raised by the defense about how the company’s LLM was trained.” These instructions can also change frequently, and there is no requirement that companies keep detailed logs of which prompts were in use at any given time, further complicating the job of defense counsel.

The same webinar presentation also mentioned the issue of transcript drafts: “we don’t store the original draft, and that’s by design … because the last thing we want to do is create more disclosure headaches for our customers, and our attorneys’ offices.”

While Axon has an incentive to reduce inconveniences to its customers, the original transcript that underpins a police report may be crucial for a defense attorney and may be constitutionally required as part of disclosure rules. Laws that protect our civil liberties are not designed to maximize convenience for law enforcement.

Along these same lines, absent strict controls and transparency requirements, there is nothing stopping Axon from training the model to portray the use of its products in the field, such as tasers, in a more favorable light.

Such a concern was expressed by Anthony Tassone, CEO and co-founder of Truleo, a competitor to Axon. In his view, a “weapons manufacturer”—Axon sells Tasers—”should not be in the business of AI-generated reports, as that can lead to conflicts of interests in the case of mishaps or fatalities. The AI could help an officer or department to basically clean up a report in favor of law enforcement.”

The other major concern about police using such a product is that all the data is transmitted to OpenAI for processing. “Contractual arrangements or corporate policies may govern data handling,” noted the white paper, “but if they exist at all they can vary widely, change over time, and lack enforcement.” No corporation should have unlimited access to police data, especially where such data could be used against anti-corporate activists and labor unions.

The final area of concern raised by the ACLU’s white paper focuses on the essential functions of traditional methods for generating police reports. “It’s important to also capture the officer’s subjective experience and memory of an incident—which may be pivotal to determining whether to file charges and later, in any prosecution—which will be based on all five of an officer’s senses, as well as their perception of human nuances of the situation such as whether somebody is hostile or meek, frightened or bold,” wrote Stanley.

Courts already have access to body and dash camera video and audio, but having a separate account based on an officer’s impressions and recollection, however imperfect, contributes to the quality of evidence available for use in the courtroom.

“If the police report is just an AI rehash of the body camera video,” he continued, “then you no longer have two separate pieces of evidence—you have one, plus a derivative summary of it that can reshape and contaminate the officer’s memory or immunize the officer from accountability for illegal acts.”

Another concern with automating report generation is that it performs a less obvious role as a check on police power. “In addition to the instrumental role that police reports play in the justice system’s machinery, forcing police to write down reasons for their exercise of discretionary power (like stops, frisks, searches, consent searches, etc.) reminds them of the legal limits of their authority,” according to Stanley. “The act of writing reports functions as a form of internal mental discipline for police that continually reminds them of limits on their power.” Having supervisors review these reports allows them “to identify when an officer might not know or observe those limits” and creates a legal requirement for the supervisor to address potential issues.

Representatives for Axon, when pressed for comment, did not challenge any of the issues raised. The company instead mentioned that “[b]y default, Axon’s software is set not to be used for any incidents involving felonies or arrests.” But Axon’s executives have publicly stated that “the DAs and the actual agencies that are doing these reports … are rapidly turning off the restrictions” and that “most of the agencies that are live right now are using it on all incidents.”

The ACLU has suggested a different approach to meet the goal of saving police officers time on administrative tasks, such as writing reports. It’s an approach that has been implemented by Truleo, the aforementioned competitor to Axon. Officers use Truleo “to dictate a narrative of an incident—something they can do while driving—and the company’s technology then uses AI to ‘enhance’ that information and come back with ‘suggestions.’” Officers then “make edits and finish the report on their own.”

While this process is similar in some respects to Axon’s method, it takes all the determination tasks away from the AI and places them squarely where they belong—on the officer. It also means that the only audio being transcribed is the officer’s own speech, and they are unlikely to allow errors to persist where their own words are concerned.

“They are asking AI to make determinations,” warned Tassone, adding that Truleo has opened its AI to “random studies” and privacy checks. “You can’t ask AI to properly attribute criminality to people. That’s an officer’s job.”

Truleo has been advertising itself as the ethical alternative to Axon, and its engineers seem to have spent more time considering the larger issues when developing their product than Axon’s developers did.

“Police reports,” according to Stanley, “which are often the only official account of what took place during a particular incident, are central to the criminal proceedings that determine people’s innocence, guilt, and punishment.”

Such functions should not be left to the whims and errors of a nascent technology whose inner workings are intentionally hidden from the public.  

Sources: aclu.org, govtech.com
