Perspective | Magazine

Can predictive policing help stamp out racial profiling?

Some algorithms assume crime in one location leads to others nearby. Others identify specific people most likely to be involved in violence. Both can lead to discrimination.


One day in 2012, a data analyst for the Cambridge Police Department noticed a striking pattern of thefts in recent crime reports: laptops and purses were repeatedly being stolen on Tuesday and Thursday afternoons at a Harvard Square café. So detectives sent a decoy — their intern — to the café with a backpack that had a computer hanging out of it. Within an hour, the thief tried to steal the laptop and was arrested.

Inspired, Cambridge police partnered with MIT researchers to develop an algorithm that could analyze patterns of burglaries and search for crimes that seemed related. The system was never fully deployed in Cambridge, where burglaries have significantly declined over the past several years. But the New York City Police Department adopted a version of the algorithm in 2016, which it uses hundreds of times each week.
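
The article doesn't detail how the Cambridge-MIT algorithm works. As a rough illustration of the general technique it describes, linking incidents that recur in the same place, at the same time of week, and with the same method, here is a minimal Python sketch; every field name and threshold is hypothetical.

```python
from dataclasses import dataclass
from math import hypot

@dataclass
class Incident:
    x: float          # easting in km (hypothetical local coordinates)
    y: float          # northing in km
    day_of_week: int  # 0 = Monday ... 6 = Sunday
    hour: float       # hour of day, 0-24
    mo: str           # coarse modus operandi label, e.g. "unattended-bag"

def related(a: Incident, b: Incident,
            max_dist_km: float = 0.5, max_hour_gap: float = 3.0) -> bool:
    """Heuristic link rule: two incidents look related if they are close in
    space, recur at a similar time of week, and share a modus operandi."""
    close = hypot(a.x - b.x, a.y - b.y) <= max_dist_km
    same_rhythm = (a.day_of_week == b.day_of_week
                   and abs(a.hour - b.hour) <= max_hour_gap)
    return close and same_rhythm and a.mo == b.mo

def link_series(incidents: list[Incident]) -> list[list[Incident]]:
    """Greedily group incidents: join a group as soon as any member links."""
    series: list[list[Incident]] = []
    for inc in incidents:
        for group in series:
            if any(related(inc, member) for member in group):
                group.append(inc)
                break
        else:
            series.append([inc])
    return series
```

In a real system the link rule would presumably be learned from solved crime series rather than hand-tuned like this.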

Cambridge and New York had plenty of company in adopting machine learning to improve operations: A 2016 report from the technology policy nonprofit Upturn found that, in the past decade, more than half of the nation’s 50 largest police forces had used or explored using algorithms to forecast crimes, a technique known as predictive policing. The most common approach, developed by technology companies like PredPol, is based on the assumption that crime in one location can lead to others nearby. Other algorithms identify specific people most likely to be involved in crimes or violence.
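
Place-based systems of the kind the article describes typically formalize that "near-repeat" assumption as a risk score that decays with distance and elapsed time. The sketch below is a toy version of the idea, not PredPol's actual model; the decay constants, grid, and events are invented for illustration.

```python
from math import exp, hypot

def cell_risk(cell: tuple[float, float],
              past_events: list[tuple[float, float, float]],
              time_scale: float = 7.0,    # days; invented decay constant
              dist_scale: float = 0.3,    # km; invented decay constant
              ) -> float:
    """Toy near-repeat score: every past event (x, y, days_ago) adds risk
    that decays exponentially with distance from the cell and with time."""
    cx, cy = cell
    return sum(
        exp(-days_ago / time_scale) * exp(-hypot(x - cx, y - cy) / dist_scale)
        for x, y, days_ago in past_events
    )

# Rank the cells of a small grid for patrol, highest risk first
# (coordinates and events are made up for the example).
events = [(0.1, 0.2, 1.0), (0.15, 0.22, 3.0), (2.0, 2.0, 0.5)]
grid = [(x / 10, y / 10) for x in range(30) for y in range(30)]
hotspots = sorted(grid, key=lambda c: cell_risk(c, events), reverse=True)[:5]
print(hotspots)
```

Real deployments fit such parameters to historical data; the point here is only the shape of the assumption: each recorded incident temporarily raises predicted risk nearby, steering patrols back toward places with recent recorded crime.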

Predictive policing is often hailed as a scientific solution to otherwise-intractable problems in policing, such as racial profiling. Yet as with many technologies that appear to offer a quick fix for complex social and political challenges, predictive policing promises far more than it can deliver — and can actually exacerbate the problems it claims to solve. Studies by the RAND Corporation, a think tank, conducted in Shreveport, Louisiana, in 2012 and in Chicago in 2013, found no statistical evidence that these programs reduce crime. And despite proclamations that the algorithms are objective and race neutral, they have the potential to entrench and legitimize discriminatory police practices.

The first issue with predictive policing is that the algorithms re-create past police practices rather than predict future crime. These systems are trained on records of arrests and reported incidents, which capture where police have looked for crime rather than where crime actually occurs, so the data can overestimate crime in minority neighborhoods and underestimate it in white neighborhoods. Even if a predictive policing algorithm is not hard coded to consider race or income, it can still lead police to be dispatched predominantly to low-income and minority neighborhoods. In Los Angeles, one of the first cities to adopt predictive policing, community members in low-income and minority neighborhoods report seeing the police “all the time,” while residents in affluent neighborhoods see them “rarely,” according to the Stop LAPD Spying Coalition, an activist group.
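
That feedback loop is easy to demonstrate in simulation. In the deliberately simplified model below, two neighborhoods have identical true offense rates, but one starts with more recorded incidents because it was patrolled more heavily; dispatching patrols to wherever records are highest then compounds the disparity indefinitely. All numbers are invented, and this is not a model of any specific department.

```python
import random

random.seed(0)

# Two neighborhoods with the SAME true offense rate. Neighborhood 0 starts
# with more recorded incidents only because it was patrolled more heavily.
true_rate = [0.10, 0.10]
recorded = [30, 10]  # historical counts shaped by past patrol patterns

for _ in range(200):
    # "Predictive" dispatch: patrol wherever past records are highest.
    patrolled = 0 if recorded[0] >= recorded[1] else 1
    # Offenses are only recorded where an officer is present to observe them.
    if random.random() < true_rate[patrolled]:
        recorded[patrolled] += 1

# Neighborhood 0's count keeps growing while neighborhood 1's never can,
# even though both neighborhoods offend at exactly the same rate.
print(recorded)
```

Because officers record more of whatever happens wherever they are sent, the model's predictions look self-confirming no matter what the underlying crime rates are.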

These algorithms are often deployed in secret, making it impossible for the public to scrutinize them. Police in New Orleans quietly used predictive policing algorithms for several years without ever announcing they were doing so; even members of the City Council were left in the dark. In Chicago, the police department has resisted countless calls to disclose the details of how its algorithm predicts which people are most likely to be involved in gun violence. Several years ago, for example, police showed up unannounced at Robert McDaniel’s home on the west side of the city and warned him not to commit any more crimes. McDaniel, who had multiple arrests on suspicion of minor offenses but only one misdemeanor conviction, learned that he had made the department’s “heat list,” meaning he was considered among those most prone to violence — either as a perpetrator or victim — according to the Chicago Tribune.

By providing the appearance of a value-neutral solution to policing issues without addressing the underlying problems, these algorithms grease the wheels of an already discriminatory system. They may make policing more efficient for officers, but they don’t evaluate whether the current system actually helps address social disorder. Yet algorithms can still play a productive role in our criminal justice system by keeping vulnerable people with pressing social needs out of jail. That’s the goal behind the Data-Driven Justice Initiative, a national program started by the Obama White House in 2016, which aims to use data to prevent people with behavioral and physical health conditions from cycling through jail, emergency rooms, and other crisis services. More than 150 jurisdictions have signed on to the initiative, and Middlesex County is one of three sites selected to participate in a two-year pilot program that began last May.

One of the first communities involved was Johnson County, Kansas, where officials have spent the last decade developing programs that divert low-level offenders with mental illness away from the criminal justice system and into treatment. Because these programs are often hindered by limited knowledge about which people need services, the county partnered with the University of Chicago in 2016 to develop a machine learning model that can guide proactive outreach from mental health caseworkers.
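
The details of the University of Chicago model aren't given in the article. As a hedged sketch of the general approach, training a classifier on service histories and then ranking a caseload by predicted risk so caseworkers reach the highest-need people first, here is a toy version on synthetic data. The features, weights, and the choice of scikit-learn's LogisticRegression are all assumptions made for illustration.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical features per person: prior ER visits, prior jail bookings,
# missed appointments. Label: 1 if the person later had a crisis episode.
rng = np.random.default_rng(0)
X = rng.poisson(lam=[2.0, 1.0, 1.5], size=(500, 3)).astype(float)
y = (X @ np.array([0.6, 0.8, 0.4]) + rng.normal(size=500) > 2.5).astype(int)

model = LogisticRegression().fit(X, y)

# Rank a (synthetic) caseload so caseworkers contact the highest-risk
# people first; the output is an outreach list, not a dispatch order.
caseload = rng.poisson(lam=[2.0, 1.0, 1.5], size=(20, 3)).astype(float)
priority = np.argsort(model.predict_proba(caseload)[:, 1])[::-1]
print(priority[:5])  # indices of the five people to reach out to first
```

The output being a list for caseworker outreach rather than for police dispatch is what distinguishes this use from the person-based policing lists described above.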

Technology alone cannot solve today’s most pressing social challenges. There are no easy fixes for police discrimination. Rather than rush to devise technological solutions to every social problem, we must engage in the difficult political task of reshaping police practices, priorities, and power. We must develop nonpunitive and rehabilitative approaches to addressing community needs. That is a feat that no algorithm, no matter how sophisticated, could ever perform.


Ben Green is the author of “The Smart Enough City: Putting Technology in Its Place to Reclaim Our Urban Future.” Send comments to magazine@globe.com.