
Human vs AI: The Lawsuit Against ShotSpotter

A core issue with AI systems used for security is their susceptibility to bias. Many studies have shown that the data used to train algorithmic systems inevitably carries human prejudice. After all, no training dataset or methodology perfectly represents every trait across all demographics.

Moreover, over-reliance on AI undermines evidence-based criminal investigation: it not only encourages "blind faith" in the AI's output but can also stop police from pursuing the deeper lines of inquiry that the justice process demands.

We are already witnessing such events unfold, especially since the police started utilizing an AI system known as “ShotSpotter.”


ShotSpotter is an AI system initially developed to detect gunshots. Improvements to its data-driven tools, the company claims, have since made it a more efficient, effective, and equitable system that improves outcomes for all types of crime.

ShotSpotter has remained controversial. The MacArthur Justice Center reported in 2021 that 89% of ShotSpotter alerts turned up no on-site evidence of a gun-related crime. Worse still, critics claim the system reflects racial prejudice, pointing to the fatal police shooting of 13-year-old Adam Toledo in March 2021 after officers responded to a ShotSpotter alert.

Inevitably, another person suffered because of this AI system. Michael Williams, a 65-year-old Chicago resident, was arrested last year over his alleged connection to the shooting of Safarian Herring, whom Williams was giving a ride home from a police brutality protest. Williams has filed a lawsuit over his "evidence-less" arrest.

The suit claims the police never established a motive for Williams to carry out the shooting. Their grounds for arrest were soundless security footage of a car and a ShotSpotter alert.

The Lawsuit

Michael Williams's central ground for the litigation is his "evidence-less" arrest. However, the federal suit also claims that officers blindly relied on the AI system. That is detrimental not only because it produces groundless arrests, but because it can stop police from pursuing other leads in a case.

The tech company maintains that ShotSpotter was not responsible for Williams's incarceration. It claims the prosecutor theorized that Williams shot Mr. Herring inside the car, so his incarceration stemmed from a defect in the criminal investigation, not from the alert generated by the AI system.

If Williams wins the lawsuit, ShotSpotter operations in Chicago could come to a halt. Given that the city has a $33 million contract with the tech company, his victory could forever change AI-assisted policing.


The most intriguing part of this case is that ShotSpotter's reputation remains under the spotlight: a new civil rights class action lawsuit, filed in the United States District Court for the Northern District of Illinois, takes aim at illegal stops and false charges.

The company behind ShotSpotter remains resolute that the system is neither inaccurate nor racist. It claims the AI maintains a 97 percent aggregate accuracy rate for real-time detections across all customers and works by mapping "objective historical data on shootings and homicides."

Nonetheless, the two sides' claims remain poles apart, and it will be fascinating to see who emerges victorious in the end.



Alessandra “Max” Maxine, Digital Administrator

Contact W3IP Law on 1300 776 614 or 0451 951 528 for more information about any of our services or get in touch at
Disclaimer. The material in this post represents general information only and should not be taken to be legal advice.
