Cops are using AI software to write police reports

Police departments are often among the tech industry’s earliest adopters of new products like drones, facial recognition, predictive software, and now, artificial intelligence. After already embracing AI audio transcription programs, some departments are testing a new, more comprehensive tool: software that leverages technology similar to ChatGPT to auto-generate police reports. According to an August 26 report from the Associated Press, many officers are already “enthused” by the generative AI tool, which claims to shave 30-45 minutes off routine office work.

Initially announced in April by Axon, Draft One is billed as the “latest giant leap toward [the] moonshot goal to reduce gun-related deaths between police and the public.” The company—best known for Tasers and law enforcement’s most popular lines of body cams—claims its initial trials cut an hour of paperwork per day for users.

“When officers can spend more time connecting with the community and taking care of themselves both physically and mentally, they can make better decisions that lead to more successful de-escalated outcomes,” Axon said in its reveal.

The company stated at the time that Draft One is built on Microsoft’s Azure OpenAI platform and automatically transcribes police body camera audio before “leveraging AI to create a draft narrative quickly.” Reports are “drafted strictly from the audio transcript” following Draft One’s “underlying model… to prevent speculation or embellishments.” After adding any additional key information, officers must sign off on a report’s accuracy before it is sent for another round of human review. Each report is also flagged if AI was involved in writing it.
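Axon has not published Draft One’s internals, so purely as an illustration, a generic transcript-to-draft pipeline on Azure OpenAI might look something like the sketch below. The endpoint, credentials, deployment names, and prompt are all placeholders, not Axon’s.

```python
# Hypothetical sketch of a bodycam-transcript-to-draft-report pipeline on Azure OpenAI.
# Draft One's actual code, models, and prompts are not public; everything named here
# (deployments, prompts, file paths) is illustrative only.
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://example-resource.openai.azure.com",  # placeholder endpoint
    api_key="YOUR_API_KEY",                                      # placeholder credential
    api_version="2024-02-01",
)

def transcribe_bodycam_audio(audio_path: str) -> str:
    """Transcribe body camera audio with a speech-to-text deployment (e.g., Whisper)."""
    with open(audio_path, "rb") as audio_file:
        result = client.audio.transcriptions.create(
            model="whisper-deployment",  # placeholder Azure deployment name
            file=audio_file,
        )
    return result.text

def draft_report(transcript: str) -> str:
    """Generate a draft narrative using only what appears in the transcript."""
    response = client.chat.completions.create(
        model="gpt-4o-deployment",  # placeholder Azure deployment name
        temperature=0,              # low "creativity" to discourage embellishment
        messages=[
            {
                "role": "system",
                "content": (
                    "Write a draft incident narrative using only facts stated in the "
                    "transcript. Do not speculate or add details."
                ),
            },
            {"role": "user", "content": transcript},
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    transcript = transcribe_bodycam_audio("bodycam_clip.wav")
    print(draft_report(transcript))  # an officer would then review and sign off on this draft
```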

[Related: ChatGPT has been generating bizarre nonsense (more than usual).]

Speaking with the AP on Monday, Axon’s AI products manager, Noah Spitzer-Williams, claimed that Draft One uses the “same underlying technology as ChatGPT.” Developed by OpenAI, ChatGPT’s baseline generative large language model has been frequently criticized for its tendency to provide misleading or false information in its responses. Spitzer-Williams, however, likened Axon’s setup to having “access to more knobs and dials” than are available to casual ChatGPT users. Adjusting its “creativity dial” allegedly helps Draft One keep its police reports factual and avoid generative AI’s ongoing hallucination issues.
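In public LLM APIs, the closest analogue to that “creativity dial” is the sampling temperature (along with related parameters such as top_p). Axon has not said which settings Draft One actually uses; the snippet below only shows, in general terms, what turning such a dial down looks like, with placeholder deployment and prompt text.

```python
# Illustrative only: in the Azure OpenAI chat API, "creativity" is largely governed by
# sampling parameters such as temperature and top_p. Draft One's real settings are unknown.
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://example-resource.openai.azure.com",  # placeholder
    api_key="YOUR_API_KEY",                                      # placeholder
    api_version="2024-02-01",
)

response = client.chat.completions.create(
    model="gpt-4o-deployment",  # placeholder deployment name
    temperature=0.0,            # the "creativity dial" turned all the way down
    top_p=1.0,                  # leave nucleus sampling neutral
    messages=[
        {"role": "system", "content": "Use only facts present in the transcript."},
        {"role": "user", "content": "Officer audio transcript goes here."},
    ],
)
print(response.choices[0].message.content)
```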

Draft One’s scope currently appears to vary by department. Oklahoma City police Capt. Jason Bussert claimed his 1,170-officer department currently only uses Draft One for “minor incident reports” that don’t involve arrests. But in Lafayette, Indiana, the AP reports the police who serve the town’s nearly 71,000 residents have free rein to use Draft One “on any kind of case.” Faculty at Lafayette’s neighboring Purdue University, meanwhile, argue generative AI simply isn’t reliable enough to handle potentially life-altering situations such as run-ins with the police.

“The large language models underpinning tools like ChatGPT are not designed to generate truth. Rather, they string together plausible sounding sentences based on prediction algorithms,” says Lindsey Weinberg, a Purdue clinical associate professor focusing on digital and technological ethics, in a statement to Popular Science.

[Related: ChatGPT’s accuracy has gotten worse, study shows.]

Weinberg, who serves as director of the Tech Justice Lab, also contends “almost every algorithmic tool you can think of has been shown time and again to reproduce and amplify existing forms of racial injustice.” Experts have documented many instances of race- and gender-based biases in large language models over the years.

“The use of tools that make it ‘easier’ to generate police reports in the context of a legal system that currently supports and sanctions the mass incarceration of [marginalized populations] should be deeply concerning to those who care about privacy, civil rights, and justice,” Weinberg says.

In an email to Popular Science, an OpenAI representative suggested inquiries be directed to Microsoft. Axon, Microsoft, and the Lafayette Police Department did not respond to requests for comment at the time of writing.
