Captain Jason Bussert demonstrates Draft One, AI-powered software that creates police reports from body camera audio, at Oklahoma City police headquarters on Friday, May 31, 2024, in Oklahoma City, Oklahoma.
OKLAHOMA CITY — A body camera captured every word and bark uttered as police Sgt. Matt Gilmore and his K-9 partner, Gunner, searched for a group of suspects for nearly an hour.
Normally, the Oklahoma City police sergeant would grab his laptop and spend another 45 minutes writing a report about the search. But this time, he had artificial intelligence write the first draft.
Pulling from all the sounds and radio chatter picked up by the microphone attached to Gilmore’s body camera, the AI tool churned out a report in eight seconds.
“It was a better report than I could have ever written, and it was 100% accurate. It flowed better,” Gilmore said. It even documented a fact he didn’t remember hearing.
Oklahoma City’s police department is one of a handful to experiment with AI chatbots to produce the first drafts of incident reports. Police officers who’ve tried it are enthused about the time-saving technology, while some prosecutors, police watchdogs and legal scholars have concerns about how it could alter a fundamental document in the criminal justice system that plays a role in who gets prosecuted or imprisoned.
Built with the same technology as ChatGPT and sold by Axon, best known for developing the Taser stun gun and as the dominant U.S. supplier of body cameras, it could become what Gilmore describes as another “game changer” for police work.
“They become police officers because they want to do police work, and spending half their day doing data entry is just a tedious part of the job that they hate,” said Axon’s founder and CEO Rick Smith, describing the new AI product — called Draft One — as having the “most positive reaction” of any product the company has introduced.
“Now, there’s certainly concerns,” Smith added. In particular, he said district attorneys prosecuting a criminal case want to be sure that police officers — not solely an AI chatbot — are responsible for authoring their reports because they may have to testify in court about what they witnessed.
The technology relies on the same generative AI model that powers ChatGPT, made by San Francisco-based OpenAI. OpenAI is a close business partner with Microsoft, which is Axon’s cloud computing provider.
“We use the same underlying technology as ChatGPT, but we have access to more knobs and dials than an actual ChatGPT user would have,” said Noah Spitzer-Williams, who manages Axon’s AI products. Turning down the “creativity dial” helps it stick to facts so it “doesn’t embellish or hallucinate in the same ways that you would find if you were just using ChatGPT on its own,” he said.
Axon won’t say how many police departments are using the technology. It’s not the only vendor, with startups like Policereports.ai and Truleo pitching similar products. But given Axon’s deep relationship with police departments that buy its Tasers and body cameras, experts and police officials expect AI-generated reports to become more common in the coming months and years.
Before that happens, legal scholar Andrew Ferguson would like to see more of a public discussion about the benefits and potential harms. For one thing, the large language models behind AI chatbots are prone to making up false information, a problem known as hallucination that could add convincing and hard-to-notice falsehoods into a police report.
“I am concerned that automation and the ease of the technology would cause police officers to be sort of less careful with their writing,” said Ferguson, a law professor at American University working on what’s expected to be the first law review article on the emerging technology.
Ferguson said a police report is important in determining whether an officer’s suspicion “justifies someone’s loss of liberty.” It’s sometimes the only testimony a judge sees, especially for misdemeanor crimes.
Human-generated police reports also have flaws, Ferguson said, but it’s an open question as to which is more reliable.
For some officers who’ve tried it, the technology is already changing how they respond to a reported crime. They’re narrating what’s happening so the camera better captures what they’d want to put in writing.
As the technology catches on, Bussert expects officers will become “more and more verbal” in describing what’s in front of them.
After Bussert loaded the video of a traffic stop into the system and pressed a button, the program produced a narrative-style report in conversational language that included dates and times, just like an officer would have typed from his notes, all based on audio from the body camera.
“It was literally seconds,” Gilmore said, “and it was done to the point where I was like, ‘I don’t have anything to change.’”
At the end of the report, the officer must click a box that indicates it was generated with the use of AI.
Copyright 2024 The Associated Press. All rights reserved. This material may not be published, broadcast, rewritten or redistributed without permission.