TV, Data, and Crime

As I sit down on my couch watching the crime television series Person of Interest, passively listening to the show as I munch on some snacks, I suddenly hear a line that piques my interest:

“The intelligence the [AI] machine produces has already foiled a half dozen major terrorist plots.” The machine in question is a surveillance artificial intelligence system that the main characters use to identify individuals involved in potential crimes.

As an MSA student just beginning my journey into the world of analytics, I always find it exciting to stumble unexpectedly upon something data-related. As it turns out, crime shows like Person of Interest, though fictional and dramatized, do highlight a real fact: data analytics is an important part of the law enforcement industry. Police departments across the country generally take a data-driven approach to detecting and reducing criminal activity.

Crime analytics software is generally used in conjunction with GIS (geographic information system) data and historical crime data from a police records management system (RMS). The software helps police departments first identify and track crime patterns and potential suspects, then use that information to develop strategy and allocate resources to particular areas of interest. Data analytics tools let law enforcement officials efficiently analyze large amounts of data to track down anything from local petty crime to large-scale terrorism (as in Person of Interest) and other nationwide criminal threats.

However, the use of data analytics as a resource for law enforcement is not always beneficial. I first became aware of this during Communication Week here at the MSA. As part of a group project, my team and I explored the impacts of a predictive policing AI program meant to serve as a crime-reduction resource in a hypothetical midwestern American city (full case study linked here). Predictive policing, as the name suggests, uses predictive analytics to identify potential offenders and estimate the likelihood of criminal activity in particular areas, then allocates police resources to those specific areas of interest.

Upon researching predictive policing further, however, we discovered a myriad of issues with this forecasting method, chiefly a wide range of biases (primarily with regard to race and income) that can actually perpetuate cycles of violence and harassment rather than weaken them. Predictive policing algorithms generally use geographical arrest data to identify areas with a historically high number of arrests. That data, however, is often biased against minority neighborhoods, which in turn biases the algorithms: because more arrests are recorded in minority neighborhoods, predictive policing tools direct more policing to those areas, which produces even more arrests there. Trained on this skewed data, predictive policing algorithms have been shown to under-predict crime in more affluent areas and instead recommend concentrating police resources almost exclusively in lower-income ones, so they cannot accurately identify all locations where crimes are likely to occur. And by driving up the number of minority individuals arrested, these algorithms can worsen the racial divisiveness that already exists in the incarceration system.
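The feedback loop described above can be sketched in a few lines of Python. This is a toy model of my own, not any real predictive-policing system: two neighborhoods have the same true crime rate, but the historical arrest record is skewed toward one of them, and patrols are sent wherever past arrests are highest.

```python
# Toy simulation of the predictive-policing feedback loop (an illustration,
# not any real system). Neighborhoods A and B have IDENTICAL underlying
# crime rates, but the historical arrest record over-represents A. Each
# round, the hypothetical algorithm sends all patrols to the neighborhood
# with the most recorded arrests -- and arrests can only be observed where
# patrols are present, so the initial skew locks in and compounds.

true_crime_rate = [0.5, 0.5]   # same real crime level in A and B
arrests = [60, 40]             # biased starting record: A is over-policed

for _ in range(10):
    # "predicted hotspot" = neighborhood with the most recorded arrests
    hotspot = 0 if arrests[0] >= arrests[1] else 1
    # 100 patrols, all sent to the hotspot; new arrests happen only there
    arrests[hotspot] += round(100 * true_crime_rate[hotspot])

share_a = arrests[0] / sum(arrests)
print(f"A's share of recorded arrests after 10 rounds: {share_a:.2f}")
```

Even though both neighborhoods commit crime at the same rate, neighborhood A ends up with over 90% of recorded arrests, purely because that is where the algorithm kept looking.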

Learning about predictive policing through this MSA team project opened my eyes to a specific example of how the use of biased data leads to biased algorithms, which can have wide-reaching impacts (potentially on a societal scale).

Algorithms truly are only as “good” (i.e., unbiased) as the data they receive.

In a world that is becoming increasingly data-driven, it is all the more important to be aware of, and correct for, biases that can exist in data. Although no algorithm can ever be completely accurate and unbiased, reworking crime algorithms to rely less heavily on demographic data (including factors like race, gender, etc.) is a start.
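One concrete form that "reworking" can take is simply withholding protected attributes from the model. A minimal sketch (the field names here are hypothetical, and note that this alone does not eliminate bias, since other features can act as proxies):

```python
# Minimal sketch of excluding demographic features before modeling.
# Field names are hypothetical. zip_code is included in the exclusion set
# because location often acts as a proxy for race and income.

SENSITIVE = {"race", "gender", "zip_code"}

record = {
    "prior_incidents": 2,
    "time_of_day": "night",
    "race": "...",       # protected attribute
    "gender": "...",     # protected attribute
    "zip_code": "...",   # proxy for race/income
}

# Keep only the non-sensitive fields for the model's feature set
features = {k: v for k, v in record.items() if k not in SENSITIVE}
print(sorted(features))  # ['prior_incidents', 'time_of_day']
```

Dropping columns is only a start, as the paragraph above notes: remaining features correlated with demographics can reintroduce the same bias.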

It is important to note that, despite its implementation-related flaws, data analytics has helped police agencies track down perpetrators of both nationwide and small-scale crime. Big data has found its way into the criminal justice space as well, with some 21 American jurisdictions using algorithms to assess the likelihood of recidivism and assist courts in making pretrial decisions (regarding bail, length of sentence, etc.).

It is evident that the use of data is here to stay in the law enforcement industry. By increasing awareness of, and taking steps to mitigate, the biases in crime data, data analytics can become an even more valuable tool to help law enforcement officials catch bad guys both on-screen and in real life.

Columnist: Anika Rally