Chula Vista Police to Use AI for Writing Reports

The Chula Vista City Council has approved a nearly $1 million deal for AI software that writes police reports, a move police say boosts efficiency.

By Gabriel Serrano

Gabriel Serrano is a technology policy analyst for Neurozzio, specializing in the governance of emerging technologies. He reports on the intersection of artificial intelligence, data privacy, and regulatory affairs, with a focus on competition and consumer rights.

The Chula Vista City Council has unanimously approved the purchase of artificial intelligence software to assist police officers in writing official reports. The city will spend nearly $1 million over four years on a system from Axon, a company known for body cameras and Tasers, setting off a debate that pits law enforcement efficiency against public transparency.

The software, called Draft One, transcribes audio from an officer's body-worn camera to generate a preliminary report. While police officials claim this will save significant time, civil rights organizations and privacy advocates have raised concerns about the accuracy, accountability, and potential for misuse in the criminal justice system.

Key Takeaways

  • Chula Vista's City Council unanimously approved a four-year, nearly $1 million contract with Axon for its AI report-writing software, Draft One.
  • The software analyzes body camera audio to create draft police reports, which officers must then review and finalize.
  • A pilot program involving 30 officers from November 2024 to January 2025 produced over 200 reports using the system.
  • Privacy advocates from the Electronic Frontier Foundation and the ACLU criticize the tool for potentially reducing transparency and officer accountability.
  • The decision comes as California considers a statewide bill to regulate law enforcement's use of AI in official documentation.

Details of the Axon Contract and Software

The Chula Vista Police Department (CVPD) will implement Axon's Draft One software across its force following the city council's approval. The technology is designed to streamline the documentation process, which is often a time-consuming part of an officer's duties.

According to department officials, the system uses a modified version of OpenAI's ChatGPT to process audio from body cameras. It transcribes conversations and actions, then organizes the information into a standard police report format. The officer is then required to review the draft, make corrections, and add any necessary context.

How Draft One Works

  1. The software analyzes audio from an officer's body-worn camera recording of an incident.
  2. Using AI, it transcribes the audio and generates a narrative summary of the events.
  3. The officer receives this draft report and is responsible for reviewing it for accuracy, making edits, and removing sensitive information.
  4. Before finalizing, the officer must agree to an acknowledgement stating they have reviewed the report and attest to its accuracy.
  5. Once the officer copies the text to finalize the report, the original AI-generated draft is deleted.
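
To make the reported workflow concrete, the five steps above can be boiled down to a short illustrative sketch. The Python example below is hypothetical: the function and field names are invented for this article and do not come from Axon's Draft One product; it only mirrors the sequence the department described.

```python
# Hypothetical sketch of the report-drafting workflow described above.
# None of these names come from Axon's actual software; they exist only
# to illustrate the five reported steps.

from dataclasses import dataclass


@dataclass
class DraftReport:
    transcript: str             # raw transcription of body-camera audio
    narrative: str              # AI-generated narrative summary
    reviewed: bool = False      # set once the officer has reviewed and edited
    acknowledged: bool = False  # officer attests to accuracy before finalizing


def transcribe_audio(audio_path: str) -> str:
    """Stand-in for the speech-to-text step (step 1); a real system would call a model here."""
    return f"[transcript of {audio_path}]"


def generate_narrative(transcript: str) -> str:
    """Stand-in for the language-model step (step 2) that turns a transcript into a draft."""
    return f"Draft narrative based on: {transcript}"


def finalize_report(draft: DraftReport, officer_edits: str) -> str:
    """Officer reviews, edits, and attests; the edited text becomes the report of record."""
    draft.reviewed = True       # step 3: officer reviews and corrects the draft
    draft.acknowledged = True   # step 4: mandatory attestation of accuracy
    final_text = officer_edits
    # Step 5: in the system described above, the original AI draft is then deleted;
    # this sketch simply stops using it.
    return final_text


if __name__ == "__main__":
    transcript = transcribe_audio("incident_audio.wav")                   # step 1
    draft = DraftReport(transcript, generate_narrative(transcript))       # step 2
    final = finalize_report(draft, draft.narrative + " [officer edits]")  # steps 3-5
    print(final)
```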

During a presentation to the council, Chula Vista's Assistant Police Chief, Dan Peak, emphasized that officers retain full responsibility for the final document. "Our officers are still accountable for everything they put in this police report, leading up to testimony in court," Peak stated.

Before the city-wide adoption, the CVPD conducted a pilot program. Sgt. Anthony Molina, a department spokesperson, confirmed that 30 officers participated in the beta test, using the software to generate more than 200 investigative and arrest reports.

Concerns from Privacy and Civil Rights Groups

Despite the police department's assurances, the use of AI in creating official legal documents has drawn sharp criticism from watchdog organizations. The Electronic Frontier Foundation (EFF) and the American Civil Liberties Union (ACLU) have voiced significant concerns about the technology's implications for the justice system.

Matthew Guariglia, a senior policy analyst with the EFF, highlighted the issue of accountability. He noted that because the AI's contribution is not clearly delineated in the final report, the authorship of any given passage becomes ambiguous.

"The judge or the prosecutor doesn't know which portions were written by the AI and which portions were written by the officer," Guariglia explained. "It interjects a lot of uncertainty — and a lot of deniability for the officer."

A primary point of contention is that Draft One deletes the initial AI-generated text after the officer finalizes their report. Critics argue this lack of an audit trail could allow an officer to blame any inaccuracies or fabrications on the software. "They can just say, ‘Actually, that part in the report, the AI wrote that part,’" Guariglia added.

ACLU Position on AI in Law Enforcement

In a November 2024 report, the ACLU described AI tools for criminal justice work as "too untested, too unreliable, too opaque, and too biased." The organization warns against the premature adoption of such technologies without robust safeguards and independent oversight.

Another concern is the software's customizability. Axon allows agencies to disable features that label a report as having been generated with AI assistance. While Axon spokesperson Kristin Lowman stated that some safeguards, like a mandatory officer sign-off and a secure audit trail, cannot be disabled, critics remain worried about the potential for agencies to obscure the technology's role.
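
The distinction between what agencies can toggle and what stays locked might look something like the hypothetical sketch below. The setting names are invented for illustration and do not reflect Axon's actual product options; the sketch only separates the disclosure label, which the article reports agencies may disable, from the sign-off and audit trail, which Axon says cannot be turned off.

```python
# Hypothetical illustration of the configurability described above; the
# setting names are invented and are not Axon's actual options.

LOCKED_SAFEGUARDS = {
    "require_officer_signoff": True,   # mandatory attestation cannot be turned off
    "keep_secure_audit_trail": True,   # audit trail likewise cannot be disabled
}

AGENCY_CONFIGURABLE_DEFAULTS = {
    "label_report_as_ai_assisted": True,  # agencies may disable this disclosure label
}


def build_agency_config(overrides: dict) -> dict:
    """Merge agency overrides, but never weaken the locked safeguards."""
    return {**AGENCY_CONFIGURABLE_DEFAULTS, **overrides, **LOCKED_SAFEGUARDS}


# Example: an agency that switches off the AI-assistance label still keeps
# the mandatory sign-off and audit trail.
print(build_agency_config({"label_report_as_ai_assisted": False}))
```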

Legislative and Official Responses

The move by Chula Vista aligns with a broader trend of law enforcement agencies exploring AI, which has prompted legislative action. The California state legislature recently passed a bill that would impose stricter rules on the use of AI for police reports. If signed by Governor Gavin Newsom, it would require agencies to clearly mark AI-produced reports and, crucially, to store all AI-generated drafts.

This potential state law would directly address the concerns raised by the EFF about deleted drafts. At least one other state, Utah, has already enacted a law requiring law enforcement to label AI-written reports and attest to their accuracy.

Locally, Chula Vista city officials largely supported the police department's proposal. Councilmember Jose Preciado framed it as a practical measure. "I think we need to take stock of the fact that this technology is going to assist us in making more policing resources available," he said. "That's something we all need to support."

The San Diego County District Attorney's Office has not taken a formal position on the technology. Spokesperson Tanya Sierra said the office has no policy on AI transcription of body camera video. "We will continue to evaluate cases submitted to us for prosecution based upon the standard of proof beyond a reasonable doubt," Sierra stated. "We expect police departments to submit accurate police reports."

Axon defended its product in a statement, emphasizing its commitment to ethical AI. "As with narrative reports not generated by Draft One, officers remain fully responsible for the content," spokesperson Kristin Lowman wrote, though the company declined to name other agencies in the San Diego area using the software.