Of Public Safety Facts and Figures
After a brief stint of noncommittal part-time programming gigs that much resembled my own college experience, it was decided: I needed a path forward that afforded economic stability and job security. So, I joined the NYPD, a decision which, I’ve been told, caused more than a little trepidation among some of my family at the time.
I did traditional patrol police work in Harlem and the Bronx for years before being pulled back to my roots in data and technology when I was asked to transition “inside” into an administrative position that, among other things, included quite a bit of crime analysis. It was here that I cut my teeth working on vast multidimensional datasets and rediscovered not only my natural talent for working with data, but also my passion for finding its most efficient use.
Much of my time was spent compiling the precinct’s crime statistics, which were then compared side by side with those of other precincts city-wide to determine appropriate strategies to control or mitigate crime. Eventually, this data was not just reported on, but used in more sophisticated applications to determine when and where major felonies like shootings and robberies might occur. This mass crime tracking and reporting strategy became known as predictive policing.
Somewhat controversial, and not entirely convincing to critics or even to police themselves, predictive policing leveraged historic crime data to make projections about which specific neighborhoods and times of day were more likely to experience spikes in crime. Yet, as with all models, there were limitations. For example, it is widely known to police who the violent offenders are, since they are generally repeat offenders (known colloquially as “career criminals”).
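The core mechanic described above can be sketched in a few lines of Python. This is a toy illustration of counting historic incidents by area and hour to flag likely hot spots, not the NYPD’s actual system; the field names and sample records are invented:

```python
from collections import Counter

def hot_spots(incidents, top_n=2):
    """Return the top_n (area, hour) pairs ranked by historic incident count."""
    counts = Counter((i["area"], i["hour"]) for i in incidents)
    return [pair for pair, _ in counts.most_common(top_n)]

# Invented sample records: each dict stands in for one historic incident report.
history = [
    {"area": "A", "hour": 22},
    {"area": "A", "hour": 22},
    {"area": "A", "hour": 22},
    {"area": "B", "hour": 14},
    {"area": "B", "hour": 14},
    {"area": "C", "hour": 3},
]

print(hot_spots(history))  # → [('A', 22), ('B', 14)]
```

Note what the sketch captures and what it misses: the projection is driven entirely by historic counts, with no input about the individuals behind those counts, which is precisely the kind of limitation discussed here.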
Whether a particular individual responsible for a disproportionate share of a neighborhood’s crime was incarcerated at a given point in time was not accounted for in the model. This may seem circumstantial, but it is exactly the type of enhanced data that predictive policing models do not currently account for. I was unable to solve this particular challenge in my crime analyst role, but I did get the chance to solve another when I transitioned into a role within the Strategic Technology division of a newly minted citywide department, the Information Technology Bureau (ITB).
There I worked on what was probably one of the leading handicaps in policing data integrity: the manual record-keeping system known as the Activity Log. One thing I learned from my years on patrol is that the challenges police once faced in appropriately logging crime data when making reports were significant. Human error was dwarfed only by the missed opportunity to leverage the efficiency and accuracy of an automated process. In the absence of more sophisticated digital tools and technologies, the Activity Log was a prime example of the legacy systems and manual processes that materially impacted the integrity of any data that would ultimately feed into the precinct’s crime statistics.