Why the Chicago OIG Report is Misunderstood


In August 2021, the Chicago Office of Inspector General (OIG) published a report, “The Chicago Police Department’s Use of ShotSpotter Technology.” As the name implies, it was an audit of CPD’s use of the tool, not of the efficacy of the tool itself. The report attempted to examine the outcomes of all ShotSpotter alerts that occurred in Chicago between January 1, 2020, and May 31, 2021.

Unfortunately, the report suffered from incomplete and irreconcilable data, a fact that it acknowledged explicitly. Consequently, the OIG concluded “it may not be possible at present to reach a well-informed determination as to whether ShotSpotter is a worthwhile operational investment as an effective law enforcement tool for the City and CPD.”

Nonetheless, critics have often cited this report to characterize ShotSpotter as ineffective. That characterization rests on two main mistakes:

  1. It assumes, incorrectly, that ShotSpotter produces a high number of “false” alerts.
  2. It misinterprets the alerts that the OIG could link to gun-related criminal evidence.

“FALSE ALERTS”

The OIG report concluded that ShotSpotter alerts rarely produce documented evidence of a gun-related crime. It states that, “Of the 50,176 confirmed and dispatched ShotSpotter alerts, 41,830 report a disposition – the outcome of the police response to an incident. A total of 4,556 of those 41,830 dispositions indicate that evidence of a gun related criminal offense was found, representing 9.1% of CPD responses to ShotSpotter alerts.”   
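For readers who want to trace the arithmetic behind that quote, the short sketch below simply reproduces the percentages from the figures the OIG cites (the variable names are ours, for illustration only):

```python
# Figures quoted in the OIG report for January 1, 2020 through May 31, 2021
dispatched_alerts = 50_176        # confirmed and dispatched ShotSpotter alerts
alerts_with_disposition = 41_830  # alerts that report a disposition
evidence_dispositions = 4_556     # dispositions indicating gun-crime evidence

# The OIG's 9.1% is measured against all dispatched alerts ...
print(f"{evidence_dispositions / dispatched_alerts:.1%}")        # -> 9.1%
# ... measuring only against alerts with a recorded disposition gives roughly 10.9%
print(f"{evidence_dispositions / alerts_with_disposition:.1%}")  # -> 10.9%
```

Either way, the figure measures how often responding officers documented evidence, not how often gunfire actually occurred, which is the distinction the rest of this section addresses.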

SoundThinking asserts that a ShotSpotter alert is itself digital evidence that gunfire occurred: it includes the specific location, a precise timestamp, an audio recording, and other forensic elements. The OIG report focuses instead on the percentage of ShotSpotter alerts for which police did not find physical evidence or a witness willing to corroborate the digital evidence made available to CPD via the ShotSpotter service. In fact, with access to the audio recording and the other data listed above, CPD becomes a virtual witness to the gunfire.

Unfortunately, the findings referenced in the OIG report have been twisted by a few critics to spread a false narrative suggesting that any ShotSpotter alert must be a false alert (“false positive”) if no physical evidence or witnesses were found to validate that a gun crime occurred.

In the real world, many factors can affect whether law enforcement can locate physical evidence or a witness to corroborate that a gun crime was committed. These factors include but are not limited to:

  • the time required for an officer to arrive at the scene;
  • the amount of time that officer has to investigate the scene;
  • whether the perpetrator picked up their own shell casings;
  • the type of firearm involved;
  • whether cooperative witnesses or victims remained at the scene, and more.

These challenges are further compounded by the fact that the dataset reflects public safety responses during a global pandemic and a period of heightened civil unrest. And they apply whether officers are responding to a ShotSpotter gunfire alert or to a citizen’s 911 call.

Although evidence collection is challenging, research shows that ShotSpotter improves evidence collection by officers responding to shooting incidents. For example, according to the Urban Institute, police departments using ShotSpotter find shell casings at a rate up to three times higher, thanks to the precise location provided by ShotSpotter alerts. Shell casings are critical investigative evidence that can be used to identify the gun that was fired and, ultimately, to identify and prosecute a suspect.

Assuming that any ShotSpotter alert that does not result in documented evidence at the scene is a false alert would imply that ShotSpotter’s false positive rate in Chicago was roughly 90% during the period of this analysis.

In fact, SoundThinking goes to great lengths to identify and minimize “false positives.” Definitions matter here: a “false positive” has a very specific definition that can be, and is, measured. It means that a ShotSpotter alert was sent to CPD for a specific time and location when in fact there was no actual gunfire. ShotSpotter maintains an extremely low false positive rate of just 0.5% across all customers over the last four years, according to the independent analytics firm Edgeworth Analytics. That accuracy allows police to coordinate safe and efficient responses that require fewer resources, in a way that improves community trust.
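To make that definition concrete, here is a minimal illustrative sketch (our own toy code, not a SoundThinking tool or dataset) of the two different rates that get conflated: the false positive rate, which asks whether gunfire actually occurred, and the evidence-recovery rate the OIG measured, which asks whether responding officers documented physical evidence.

```python
from dataclasses import dataclass

@dataclass
class Alert:
    """One dispatched gunfire alert (illustrative fields, not ShotSpotter's schema)."""
    gunfire_occurred: bool      # ground truth: was there actual gunfire?
    evidence_documented: bool   # did responding officers document gun-crime evidence?

def false_positive_rate(alerts: list[Alert]) -> float:
    """Share of alerts issued when no actual gunfire occurred."""
    return sum(not a.gunfire_occurred for a in alerts) / len(alerts)

def evidence_recovery_rate(alerts: list[Alert]) -> float:
    """Share of alerts whose disposition documented physical evidence (what the OIG measured)."""
    return sum(a.evidence_documented for a in alerts) / len(alerts)

# Toy example: four real gunfire incidents, only one of which yields documented evidence.
alerts = [
    Alert(gunfire_occurred=True, evidence_documented=True),
    Alert(gunfire_occurred=True, evidence_documented=False),  # casings never recovered
    Alert(gunfire_occurred=True, evidence_documented=False),  # no cooperative witnesses
    Alert(gunfire_occurred=True, evidence_documented=False),  # scene cleared before arrival
]
print(false_positive_rate(alerts))     # 0.0  -- no alert fired without real gunfire
print(evidence_recovery_rate(alerts))  # 0.25 -- yet only 1 in 4 produced documented evidence
```

A low evidence-recovery rate, in other words, says nothing by itself about how often the system alerts on sounds that were not gunfire.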

Further, while we fully recognize that no technology is perfect in the real world and false positives will never be zero, there has been no indication from the Chicago Police Department that false positives are an ongoing problem or concern. In fact, if ShotSpotter alerts were sending officers on dead-end calls, they would quickly lose confidence in the technology and would stop relying on it to guide them to the scene of these crimes. This is simply not the case.

Lastly, it is important to note that SoundThinking contractually guarantees 90% accuracy, with significant financial penalties if we do not meet that specification. Nationally, the ShotSpotter system has been audited as operating at greater than 97% accuracy.

INCOMPLETE ANALYSIS OF EVIDENCE COLLECTED DUE TO SHOTSPOTTER

More importantly, the OIG Report failed to provide context for the 4,556 incidents that did have dispositions indicating physical evidence of a gun crime was found. For instance, SoundThinking’s analysis of ShotSpotter data and publicly available OEMC data for this same period reveals:

  • ShotSpotter alerted CPD to over 800 more gunshot wound victims than were reported to 911.
  • ShotSpotter alerted CPD to almost 1,700 more instances of illegal use of a firearm than were reported to 911.
  • ShotSpotter alerted CPD to over 150 more homicides than were reported to 911.

Ultimately, the OIG concluded that more information was needed to fully assess the value (costs & benefits) of the ShotSpotter program:

  • The OIG Report states that, “For this weighing of costs and benefits to accrue in favor of continued use of ShotSpotter technology, CPD and City would be well-served by being able to clearly demonstrate its law enforcement value. Such a value is not clearly demonstrated by presently available data.”
    • In other words, the Office of Inspector General itself determined that the presently available data was insufficient to assess the program’s value.
  • The OIG Report also concluded that, “Better data on law enforcement outcomes from ShotSpotter alerts would be valuable to support the City’s future assessment of whether to further extend, amend, or discontinue its contractual relationship with ShotSpotter.”  
    • SoundThinking agrees with this conclusion, but also contends that any analysis of outcomes must include a more thorough investigation of the ShotSpotter alerts that do lead to physical evidence of a gun crime or, more importantly, to a gunshot victim, rather than focusing simply on the percentage that do not.

Unfortunately, even though this report was produced with weak data and lacked vital context, some have used it to conclude that ShotSpotter does not produce a value that justifies its cost to the City of Chicago.

They are simply misinformed.

SoundThinking contends that an analysis of complete data with appropriate context would clearly show the system’s value. Too many observers (including the OIG) assume that immediate arrests and the collection of physical evidence are the only measures of that value.

But many ShotSpotter critics entirely overlook the system’s greatest use: saving lives.
