Crime-Prediction Software: Justice or Just Unfair?

17 Feb 2016

From shaping the way we shop online (Why, yes I AM interested in that waffle maker, Amazon!) to determining our next disastrous date (NOT OK, Cupid), the role algorithms play in our lives continues to grow as tech progresses. But along with affecting our consumer experience, social lives and…well, everything else…what if they could play a significant role in making us safer as well?

The creators of crime-prediction software—and the police departments embracing it—think that might be possible.

One such company working in this sphere is Philadelphia-based startup Azavea, whose "web-based predictive policing system," HunchLab, is being rolled out in certain cities (St. Louis, for one), helping officers identify patterns and, ideally, cut down crime. But how exactly does it work, and what factors does it take into account?

According to The Verge, "HunchLab primarily surveys past crimes, but also digs into dozens of other factors like population density; census data; the locations of bars, churches, schools, and transportation hubs; schedules for home games — even moon phases." This provides insight into when, where and even, in some cases, why crimes are occurring, and helps departments recognize crime catalysts they might never have identified on their own. (Officers were unlikely to notice, for example, that aggravated assault rates in Chicago have historically dropped on windier days, or that cars parked near schools in Philadelphia were more likely to be stolen. Both, apparently, are real patterns.)
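
To make that concrete, here's a minimal sketch of the kind of feature blending The Verge describes, written as a simple weighted model. HunchLab's actual features, weights, and math are proprietary, so every feature name and number below is an illustrative assumption, not the real system:

```python
# Hypothetical sketch: blending environmental factors into a risk score
# for one map cell. All feature names and weights are invented for
# illustration; a real system would learn them from historical crime data.
import math

def risk_score(cell):
    """Combine a cell's features into a 0-1 risk score via a weighted sum."""
    weights = {
        "past_crimes_per_month": 0.8,
        "population_density": 0.3,
        "bars_nearby": 0.4,
        "schools_nearby": 0.2,
        "transit_hubs_nearby": 0.3,
        "home_game_today": 0.5,
        "wind_speed_mph": -0.05,  # e.g. windier Chicago days saw fewer assaults
    }
    linear = sum(weights[k] * cell.get(k, 0.0) for k in weights)
    return 1.0 / (1.0 + math.exp(-linear))  # squash to a 0-1 score

cell = {"past_crimes_per_month": 2, "bars_nearby": 3, "wind_speed_mph": 15}
print(f"risk: {risk_score(cell):.2f}")
```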

And Azavea isn't the only company producing products of this nature. Another, Hitachi, has created Predictive Crime Analytics (PCA) software which, the Daily Mail writes, "[blends] tweets, CCTV camera feeds, gunshot detectors, and traffic systems with historical crime and incident data" to generate "heat maps of potential crime hotspots." Based on this information, each area's threat level is then rated on a scale from 1 to 100.
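
Again purely for illustration, here's one way blended signals per map cell could be normalized onto that 1-100 scale. The signal names and the min-max scaling rule are assumptions, not Hitachi's actual method:

```python
# Hypothetical sketch: turning blended per-cell signals into 1-100 threat
# levels for a heat map. Data sources and scaling are illustrative guesses.

def threat_levels(cells):
    """Map each cell's raw blended signal onto an integer scale from 1 to 100."""
    raw = {name: sum(signals.values()) for name, signals in cells.items()}
    lo, hi = min(raw.values()), max(raw.values())
    span = (hi - lo) or 1  # avoid dividing by zero if all cells are equal
    return {name: 1 + round(99 * (val - lo) / span) for name, val in raw.items()}

# Each cell blends several invented signals (tweet mentions, gunshot alerts,
# past incidents) into one raw number before scaling.
cells = {
    "downtown":  {"tweet_mentions": 40, "gunshot_alerts": 3, "past_crimes": 12},
    "riverside": {"tweet_mentions": 5,  "gunshot_alerts": 0, "past_crimes": 2},
    "uptown":    {"tweet_mentions": 18, "gunshot_alerts": 1, "past_crimes": 6},
}
print(threat_levels(cells))  # {'downtown': 100, 'riverside': 1, 'uptown': 38}
```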

Now, to clarify, crime-prediction software isn't completely transforming how cops typically operate. Instead, it often reaffirms the way departments have already been functioning: sending patrols to areas deemed "high-risk" in an attempt to deter crime, catch criminals quickly, and act as a community presence. And in many cases, the numbers show that this kind of predictive policing can reduce crime.

However, this isn't a picture-perfect solution guaranteed to result in a crime-free utopia, and, as is often the case with Big Brother-esque technology of this kind, it raises some interesting ethical questions. After all, sending cops to designated high-risk areas will likely lead to more arrests, which in turn will cause those areas to be rated even higher risk. Are we creating a sort of self-fulfilling prophecy? While the software simply uses data to flag areas with more crime, there's a legitimate worry that officers may take this information and unfairly administer justice based on the assumption that crime is more likely to occur.

When discussing an area of St. Louis where predictive policing is used, John Chasnoff, program director of the ACLU chapter for Eastern Missouri, expressed his concern:

“It’s a vicious cycle. The police say, ‘We’ve gotta send more guys to North County,’ because there have been more arrests there, and then you end up with even more arrests, compounding the racial problem.”
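
A toy simulation makes that cycle easy to see. Assume two areas with identical underlying crime, arrests that scale with patrol presence, and patrols allocated according to past arrest counts (all numbers below are invented):

```python
# Toy feedback-loop simulation: two areas with the SAME true crime rate,
# but patrols are allocated by past arrests, and more patrols produce
# more recorded arrests. All quantities are illustrative assumptions.

def simulate(rounds=5, total_patrols=10):
    arrests = {"north": 6.0, "south": 4.0}      # small initial imbalance
    true_rate = {"north": 1.0, "south": 1.0}    # identical underlying crime
    for _ in range(rounds):
        total = sum(arrests.values())
        for area in arrests:
            patrols = total_patrols * arrests[area] / total
            # More patrols -> more recorded arrests, even at equal true rates.
            arrests[area] += true_rate[area] * patrols
    return arrests

print(simulate())
```

Run it and the arrest gap between the two areas widens every round, even though the true crime rates never differ: the data reflects where police looked, not just where crime happened.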

Along with that worry, there's the criticism that predictive technology does nothing to address the root causes of crime, protect the public, or improve police-community relations; instead, it simply treats the symptoms of deeper problems.

All in all, there's no clear answer. We may simply have to wait and see whether, as the software matures, it succeeds in reducing crime and making our streets safer, or whether predictive policing creates a more segmented world and entrenches unfair bias.

And along with the ethical questions at hand, there’s another that always needs answering: “Bad boys, bad boys…whatchu gonna do…”