With gun violence soaring, police technology is justifiably receiving more scrutiny than ever. As police departments and cities spend ever more to reduce rising levels of gun violence, residents may understandably feel at a loss as to how the technology works and whether it is applied fairly. The most recent example of a company caught in the crossfire is ShotSpotter.
Negative reports have undermined the public’s trust in St. Louis city’s acoustic gunshot detection system. While it is true that we need more information about the technology’s efficacy and impact on communities of color, much of the recent criticism fails to foster a substantive debate.
ShotSpotter is accused of fabricating cases out of thin air for police, but my experience with the data suggests a more likely explanation. ShotSpotter sensors can be activated by any number of loud noises, all of which are processed through algorithms. Noises initially determined not to be gunfire are recorded but not forwarded to police.
Legitimate reasons exist why acoustic systems may have trouble identifying a small number of gunfire incidents: interference from wind, loud cars, thunder and the like. Cases identified as “not gunfire” therefore are not immediately forwarded to police. These false-negative cases do get recorded and retained by ShotSpotter, and they can easily be retrieved by law enforcement for investigative purposes. A match can thus occasionally be made between the place and time of a violent crime and a ShotSpotter activation that was initially dismissed.
While such cases may appear fabricated because they do not exist in a police department’s calls-for-service log, all it really means is that the case received no initial dispatch alert from the technology.
Some also have suggested that addresses of ShotSpotter alerts are being changed by ShotSpotter to serve police interests. What may seem nefarious is actually most likely a matter of which data source you examine. ShotSpotter alerts include both spatial coordinates (latitude/longitude) and the nearest postal address. Police calls-for-service data typically records the address forwarded by ShotSpotter even if officers use the spatial coordinates to guide their response.
This can create discrepancies with some parcels, such as parks where the physical address can be far from the actual location where an alert occurred. For this reason, many agencies include specific language in their standard operating procedures that instructs officers to respond to the spatial coordinates, not the address.
The purpose of acoustic technology is to quickly detect and forward geographically accurate gunfire alerts. This means police response time is minimized for most legitimate gunfire. Studies, for example, have demonstrated that ShotSpotter can accurately identify about 80% of outdoor gunfire. This is, by all accounts, a substantial improvement on citizen reporting, which is pegged to miss around 80% of gunfire.
Even though some gunfire may not get forwarded by acoustic systems, revising the algorithms to detect all gunfire would increase the number of false positives — cases labeled as gunfire that aren’t. Nobody is in favor of that.
In that regard, both the Chicago inspector general’s report and the MacArthur Justice Center faulted ShotSpotter for producing many false positives. Indeed, only a small percentage of ShotSpotter incidents lead to an assaultive gun crime, but the same is true for most citizen calls reporting gunfire. That doesn’t mean no crime occurred, since discharging a firearm inside city limits is illegal.
Linking acoustic alerts to concrete evidence in gunfire incidents without injury can be difficult. Even in cases where shell casings are retrieved, that evidence may not appear in the original calls-for-service record. The fact that the inspector general’s report found evidence of gunfire in only 9.1% of ShotSpotter cases is therefore certainly a serious undercount.
What is wrong is to call every alert without immediate evidence of a crime a “false positive.” That label implies the alert was not gunfire, when it most likely was.
I understand why many people may be disinclined to support ShotSpotter technology after reading the recent coverage, but I hope that my added context may soften that stance. My evaluation in St. Louis has been frequently cited in the media as evidence that ShotSpotter does not reduce crime, but my more recent work in Cincinnati shows strong crime reductions.
Such mixed results suggest that the nature of the police response can be an important factor in success. As with many violence prevention programs, it is key that we identify best practices and watch for unintended consequences. We certainly deserve accountability and answers about the efficacy of expensive violence prevention programs, be they ShotSpotter or Cure Violence.
Many federal grant programs already have such requirements in place, but local and state governments continue to lag. It is time we ask them to be more committed to finding out what works to reduce gun violence.
Dennis Mares is a professor of criminal justice at Southern Illinois University Edwardsville.