ShotSpotter’s incident-review room is like any other call centre.
Analysts wearing headsets sit by computer screens, listening intently.
Yet the people working here have an extraordinary responsibility.
They make the final decision on whether a computer algorithm has correctly identified a gunshot – and whether to dispatch the police.
Making the wrong call has serious consequences.
ShotSpotter has garnered much negative press over the last year. Allegations range from claims that its technology is inaccurate to charges that it is fuelling discriminatory policing.
In the wake of those negative news stories, the company gave BBC News access to its national incident-review centre.
ShotSpotter is trying to solve a genuine problem.
“What makes the system so compelling, we believe, is a full 80-95% of gunfire goes unreported,” chief executive Ralph Clark says.
People don’t report gunshots for several reasons – they may be unsure what they have heard, think someone else will call 911 or simply lack trust in the police.
So ShotSpotter’s founders had an idea. What if they could bypass the 911 process altogether?
They came up with a system.
Microphones are fixed to structures around a neighbourhood. When a loud bang is detected, a computer analyses the sound and classifies it as either a gunshot or something else. A human analyst then steps in to review the decision.
In the incident-review room, former teacher Ginger Ammon allows me to sit with her as she analyses these decisions in real time.
Every time the algorithm flags a potential shot, it makes a “ping” sound.
Ms Ammon first listens to the recording herself and then studies the waveform it produces on her computer screen.