Predictive policing and human rights on the table at second annual Bernstein Institute conference

The 2002 film Minority Report imagines a future in which police use predictive tools to prevent crime and identify suspects before they can act. The knotty questions it raised about assessing guilt and impinging on free will figured prominently at the opening event of the Robert L. Bernstein Institute for Human Rights conference, “Tyranny of the Algorithm? Predictive Analytics & Human Rights.”


Latanya Sweeney, professor of government and technology at Harvard University, explored how algorithms, despite their apparent objectivity, can incorporate bias. For example, one study found that an Internet search for a name typically associated with a black baby has an 80 percent chance of returning ads suggestive of arrest records, compared with only a 20 percent chance for a name associated with a white baby, Sweeney said. “Not only are experiences shaped by the stereotyping in algorithms,” she added, “but this stereotyping is another name for these algorithms.”

When it comes to the predictive analytics used in policing, Jennifer Lynch, senior staff attorney at the Electronic Frontier Foundation, echoed Sweeney, suggesting that algorithms are only as good as the information that goes into them. Certain crimes, such as rape, are chronically underreported, and certain neighborhoods are overpoliced. As a result, historical data fed into algorithms will serve only to identify more of the same types of crimes in the same locations, producing a “feedback loop,” Lynch said.
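The feedback loop Lynch describes can be illustrated with a toy simulation. The numbers and the detection model below are hypothetical, not any vendor's actual algorithm: two neighborhoods have identical true crime rates, but historical records skew toward one of them, and patrols allocated from those records then generate the next year's records.

```python
# Toy sketch of a predictive-policing feedback loop (hypothetical
# numbers, not any real department's or vendor's model).

true_rate = [1.0, 1.0]   # two neighborhoods with identical true crime rates
recorded = [60.0, 40.0]  # historical records skewed by past overpolicing

for year in range(10):
    total = sum(recorded)
    # Allocate 100 patrol-units in proportion to recorded crime.
    patrols = [100.0 * r / total for r in recorded]
    # More patrol presence means more of the (equal) underlying crime is
    # detected and recorded; the 1.1 exponent is an assumed model of
    # mildly increasing detection returns to police presence.
    recorded = [true_rate[i] * patrols[i] ** 1.1 for i in range(2)]

# Despite equal true rates, patrols end up even more concentrated in
# neighborhood 0 than the biased historical data alone would suggest.
print(round(patrols[0]), round(patrols[1]))
```

The skewed inputs are self-confirming: the model "finds" crime where it sends officers, which justifies sending more officers there, matching Lynch's point that the data identifies more of the same crimes in the same locations.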

Compounding the problem, trade secrets and traditional police secrecy can obscure whether an algorithm is being used appropriately. Lynch asked, “How can you challenge the accuracy and reliability of a system when you don’t know how it actually works?”

Professor Jeff Brantingham of UCLA’s Department of Anthropology is chief of research and development for PredPol, which creates predictive policing software. Brantingham used PredPol’s algorithms as an example of responsible use. PredPol factors in location, time, and type of crime, not “who,” to avoid racial profiling and targeting individuals. He also emphasized that the data does not dictate what action to take. The police and other state actors still need to make constitutional and ethical decisions.

Rachel Levinson-Waldman, senior counsel for the Liberty and National Security Program at the Brennan Center for Justice, agreed that there may be opportunities to use algorithms to help people. In the context of refugees, she said, sharing migrants’ data with organizations like Europol can spook them into going off the grid. However, that data may also provide insight into how to manage the flow of refugees, chart pathways for setting up support stations, and find human traffickers.

Lynch, revisiting Minority Report, plunged deeper into the issue. “A system that makes it a foregone conclusion that a particular person will commit a crime, or even that a crime will occur in a particular community, takes away a person’s free will and threatens the notion that we should be allowed to choose our own destinies.”

The panelists left these questions for conference participants to begin unraveling over that day and the next.

Posted March 31, 2016