News Release

Field-data study finds no evidence of racial bias in predictive policing

Peer-Reviewed Publication

Indiana University

INDIANAPOLIS -- While predictive policing aims to improve the effectiveness of police patrols, there is concern that these algorithms may lead police to target minority communities and produce discriminatory arrests. A computer scientist in the School of Science at IUPUI conducted the first study to examine real-time field data, from Los Angeles, and found that predictive policing did not result in biased arrests.

"Predictive policing is still a fairly new field. There have been several field trials of predictive policing where the crime rate reduction was measured, but there have been no empirical field trials to date looking at whether these algorithms, when deployed, target certain racial groups more than others and lead to biased stops or arrests," said George Mohler, an associate professor of computer and information science in the School of Science at IUPUI.

Mohler, along with researchers at UCLA and Louisiana State University, worked with the Los Angeles Police Department to conduct the experimental study. Each day, a human analyst produced one set of predictions of where officers should patrol and an algorithm produced another; which set officers used in the field that day was selected at random.

The researchers measured differences in arrest rates by ethnic group between the predictive policing algorithm and the hot spot maps created by LAPD analysts that were in use before the experiment.

"When we looked at the data, the differences in arrest rates by ethnic group between predictive policing and standard patrol practices were not statistically significant," Mohler said.

The study examined data both at the district level and within LAPD officers' patrol areas and found no statistically significant difference in arrest rates by ethnic group at either geographic level. Finally, the researchers looked at overall arrest rates in patrol areas and found they were significantly higher in the algorithmically selected areas; when adjusted for the higher crime rate in those areas, however, arrest rates were lower or unchanged. "The higher crime rate, and proportionally higher arrest rate, is what you would expect since the algorithm is designed to identify areas with high crime rates," Mohler said.
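To make the comparison concrete, here is a minimal sketch of the kind of test described: a chi-square test of whether the ethnic composition of arrests differs between algorithm-selected and analyst-selected patrol days. All counts below are hypothetical placeholders, and this generic test is an illustration rather than the paper's exact methodology.

    # Minimal sketch (hypothetical counts): test whether the ethnic
    # composition of arrests differs between the two patrol conditions.
    from scipy.stats import chi2_contingency

    # Rows: patrol condition; columns: arrest counts by ethnic group.
    arrests = [
        [120, 95, 60],   # algorithm-selected hot spot days
        [115, 100, 55],  # analyst-selected hot spot days
    ]

    chi2, p_value, dof, expected = chi2_contingency(arrests)
    print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p_value:.3f}")
    # A large p-value is consistent with no statistically significant
    # difference in arrest rates by ethnic group between conditions.

The crime-rate adjustment mentioned above amounts to comparing arrests per recorded crime rather than raw arrest counts, since the algorithm deliberately directs patrols toward higher-crime areas.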

Mohler said that in the developing field of predictive policing, there continue to be lessons learned from each study and implementation. A recent simulation study of predictive policing with drug arrest data from Oakland, California, showed there is potential for bias when these algorithms are applied in certain contexts. Mohler hopes the Los Angeles study is a starting point to measure predictive policing bias in future field experiments.

"Every time you do one of these predictive policing deployments, departments should monitor the ethnic impact of these algorithms to check whether there is racial bias," Mohler said. "I think the statistical methods we provide in this paper provide a framework to monitor that."

"Does Predictive Policing Lead to Biased Arrests? Results from A Randomized Control Trial" is published in the journal Statistics and Public Policy. Additional authors are P. Jeffrey Brantingham of UCLA, corresponding author, and Matthew Valasik of Louisiana State University.

###

