Police departments across the U.S. are using artificial intelligence to write crime reports.
- A growing number of companies are offering AI tools to help police departments with administrative tasks.
- Axon, known for its Taser devices and body cameras, is testing its AI tool for the most common police task, report writing, with departments in California, Colorado, and Indiana.
- While police officers are impressed with AI's ability to draft reports in 10 seconds, legal experts are raising concerns about accuracy, transparency, and potential bias. These challenges could significantly impact the future of AI in both law enforcement and the courtroom.
Police departments are increasingly turning to artificial intelligence tools to ease administrative burdens, as law enforcement faces budget pressures and the need to recruit and retain staff.
Axon, well-known for its Taser devices and body cameras, was one of the first companies to develop AI specifically for the most common police task: report writing. Its tool, Draft One, generates police narratives directly from Axon's bodycam audio. Currently, the AI is being tested by 75 officers across several police departments, including Fort Collins, Colorado; Lafayette, Indiana; and East Palo Alto, California.
According to Axon CEO Rick Smith, Draft One is currently limited to drafting reports for minor incidents so that agencies can become comfortable with the tool before expanding to more complex cases. Early feedback suggests the tool reduces report-writing time by more than 60%, potentially cutting the average completion time from 23 minutes to 8 minutes.
Sergeant Robert Younger of the Fort Collins Police Department said the tool saves approximately 45 hours per officer per month. When he first tested it, he was struck by the accuracy of the draft report: it did not rely on suppositions or guesses about anyone's thoughts, feelings, or appearance. Instead, the report was well written, balanced, chronological, and fact-based, with an introduction and an outcome, and it was produced in under 10 seconds.
Lawyers are concerned about AI reports in court
As AI becomes more prevalent in law enforcement, legal experts are raising concerns about accuracy, transparency, and potential bias. The future of AI in both policing and the courtroom will largely depend on how heavily these tools are used and how they are implemented.
"I believe that the potential issues with AI technology in terms of evidence admissibility, transparency, and bias mitigation are not worth the effort," stated Utah State Senator Stephanie Pitcher, a defense attorney at Parker & McConkie.
Pitcher and other experts agree that to ensure accuracy, AI in police reporting must be used with clear protocols and careful oversight.
"According to New York trial attorney David Schwartz, if a police officer relies on artificial intelligence to draft a report, the report should be reviewed. The police officer should sign off and attest that the facts are truthful to the best of their knowledge. However, this could create problems for the police officer and prosecution at trial during cross-examination."
According to Smith, Axon's Draft One has built-in safeguards that require officers to review and sign off on each report before submission. The system also includes controls, such as placeholders for key information that officers must edit, to ensure that no critical details are missed. Additionally, the report undergoes multiple levels of human oversight by supervisors, report clerks, and others to ensure it meets agency standards before it's finalized.
Some law enforcement officials, such as Keith Olsen, a retired New York detective and president and CEO of consulting firm KO Solutions & Strategies, believe that there are no advantages to using AI for police reports.
"Olsen stated that the problem-solving approach being taken may not be necessary as it appears to be addressing a non-existent issue. Writing a police report does not take long, and the officer's perspective may be missed. Additionally, the officer still has to make changes and additions. There may not be any time-saving benefits, and a skilled defense attorney could identify several issues with the approach."
Other companies, such as Truleo and 365Labs, are marketing their AI tools as quality aids for officers rather than as time-saving devices.
Truleo, which introduced its AI technology for auto-generated narratives in July, captures real-time recorded voice notes from the officer in the field instead of relying on bodycam footage like Axon. "We believe dictation and conversational AI is the fastest, most ethical, responsible way to generate police reports. Not just converting a body camera video to a report. That's just nonsense. Studies show it doesn't save officers any time," said Truleo CEO Anthony Tassone.
While 365Labs uses AI for grammar and error correction, CEO Mohit Vij emphasizes the importance of human judgment for reports involving complex interactions. He stated, "If it's burglary or assault, these are serious matters. It takes time to write police reports, and some who join the police force are there because they want to serve the communities and writing is not their strength. So, we focus on the formulation of sentences and grammar."
Accuracy in criminal investigations
Cassandra Burke Robertson, director of the Center for Professional Ethics at Case Western Reserve University School of Law, has concerns about the accuracy of AI in police reporting. AI-generated reports can produce plausible text quickly, she notes, but their accuracy must be scrutinized, especially in criminal investigations.
She believes that AI tools will remain a part of life, but she wants more than just a confirmation that the reports are thoroughly reviewed and verified.
In the courtroom, AI-generated police reports could create additional complexities, particularly when they rely solely on video footage instead of officer dictation. Schwartz believes that while AI reports could be admissible, they may lead to intense cross-examination. "If there is any discrepancy between what the officer recalls and what the AI report shows, it is an opportunity for the defense to question the report's reliability," he said.
If officers rely too heavily on AI and don't conduct thorough reviews, any resulting inconsistencies could create a perception of laziness or lack of diligence.
Adam Rosenblum, a lawyer based in New Jersey, stated that "hallucinations" or inaccurate information generated by AI could distort context. He suggested that courts may need new standards to ensure that AI's decision-making process is documented in detail and transparently before allowing reports into evidence. Such measures could help protect due process rights in cases where AI-generated reports are used.
Axon and Truleo both confirmed their auto-generated reports include a disclaimer.
Pitcher said many attorneys would likely agree that overcomplicating the process or inviting potential admissibility challenges is not worth it.
Sergeant Younger of Fort Collins believes the key is understanding that AI is a process. He has seen officers weigh whether to continue in law enforcement because they were not expecting the heavy administrative burden that comes with the job.