Police departments around the country are increasingly turning to predictive policing software to help determine when and where a crime is likely to occur — and who will commit it.
Yet the practice is not without criticism.
Supporters say it helps resource-strapped police departments, while critics say it reinforces existing racial biases and discriminatory policing practices.
Rashida Richardson, director of policy research at the AI Now Institute at New York University and author of a forthcoming study on predictive policing, says there’s a flaw in the data.
“All of them rely on police data,” she said. “They’re using years and months of this data to train the system.”
Years and months, Richardson said, shaped by discriminatory policing practices like stop, question and frisk policies or planting evidence. The data is gathered through software provided by third-party vendors like HunchLab and PredPol.
Richardson’s research looked at 13 jurisdictions around the country that have used, are currently using or have piloted predictive policing and have histories of discriminatory and unlawful practices.
“In all but four of those cases we found that the data generated during those times of unlawful and discriminatory police practices did in fact skew or influence the outcomes of the technology,” she said.
In the other four, which include Milwaukee, Richardson said there was not enough public information to draw a conclusion.
Those findings raise questions about whether the tool can be used in an objective way that doesn’t perpetuate discrimination and bias, she said.
Take stop, question and frisk policies that disproportionately targeted certain communities and resulted in those communities being overrepresented in the data, Richardson said.
“These systems sort of create a feedback loop, so if you’re sending officers to a certain neighborhood telling them that a system said a crime may occur, then it’s more likely that that’s what they’re looking for when in fact that may not be the case,” she said.
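To make that mechanism concrete, here is a minimal Python sketch of the kind of feedback loop Richardson describes. The numbers, the allocation rule and the detection rate are illustrative assumptions, not anything drawn from HunchLab, PredPol or a real department’s data; the point is only that when patrols follow recorded incidents and recorded incidents follow patrols, an initial skew in the record reproduces itself even though the underlying offense rates are identical.

```python
# A deliberately simplified sketch of the feedback loop Richardson describes,
# not any vendor's actual model. Both neighborhoods have the same true rate
# of offenses, but neighborhood A starts with more recorded incidents because
# of heavier past enforcement. Patrols follow the record, and the record
# grows where the patrols go, so the skew never corrects itself.

TRUE_OFFENSES = 100            # actual offenses per period, equal in both areas
DETECTION_PER_PATROL = 0.02    # assumed share of offenses recorded per patrol unit
TOTAL_PATROLS = 50

recorded = {"A": 80, "B": 40}  # historical incident counts, skewed toward A

for period in range(1, 11):
    total = sum(recorded.values())
    for area in recorded:
        # "Predictive" allocation: send patrols where past incidents were recorded.
        patrols = TOTAL_PATROLS * recorded[area] / total
        # More patrols in an area means more of the same true offenses get recorded.
        newly_recorded = TRUE_OFFENSES * min(1.0, DETECTION_PER_PATROL * patrols)
        recorded[area] += newly_recorded
    print(f"period {period}: " + ", ".join(f"{k}={round(v)}" for k, v in recorded.items()))
```

Run period after period, the two-to-one gap in the record persists even though both neighborhoods generate the same number of offenses, which is the skew Richardson says the technology can inherit.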
Yet Capt. Jason Melby of the city of La Crosse Police Department says the software is not a replacement for good community policing.
“There’s no substitute for … good community relations, good police reports and professional practice in the field,” Melby said. “As we move forward with this and look at it, it’s going to be part of a total approach and how we police our community.”
The department will be doing a two-month pilot of the program in the near future before deciding whether to adopt the practice, Melby said.
Melby sees it as a way to use resources wisely and as one tool in the toolbox for the department — and a small one at that.
La Crosse is a different place than Chicago or New York, Melby said, and he envisions using the technology more for identifying the areas and times where traffic accidents and property crimes are most likely to occur.
For example, Melby said he knows anecdotally that when University of Wisconsin-La Crosse students go home for Christmas break, there is an elevated number of burglary reports while they’re gone.
“This type of analytics may help us pinpoint a little better locations and patterns associated with those type of burglaries, so that when we go to disperse our resources … they will be more informed,” Melby said.
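As a rough illustration of the kind of pattern-counting Melby describes, the short Python sketch below tallies burglary reports by neighborhood and month to surface recurring seasonal spikes, such as a break-period pattern. The record format, field names and sample dates are hypothetical, not the La Crosse department’s or any vendor’s actual data.

```python
# A rough sketch of seasonal pattern analysis on burglary reports: tally
# reports by neighborhood and month to surface recurring spikes, such as a
# break-period pattern. The records below are hypothetical examples, not the
# department's or any vendor's actual data.
from collections import Counter
from datetime import date

# Hypothetical incident records: (report date, neighborhood, offense type)
reports = [
    (date(2018, 12, 27), "campus-adjacent", "burglary"),
    (date(2018, 12, 29), "campus-adjacent", "burglary"),
    (date(2019, 1, 3),   "campus-adjacent", "burglary"),
    (date(2019, 6, 14),  "downtown",        "burglary"),
]

# Count burglary reports for each (neighborhood, month) pair.
counts = Counter(
    (neighborhood, report_date.month)
    for report_date, neighborhood, offense in reports
    if offense == "burglary"
)

# Rank the pairs by volume to suggest where and when patrols or
# crime-prevention outreach could be concentrated.
for (neighborhood, month), n in counts.most_common():
    print(f"{neighborhood}, month {month}: {n} reports")
```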
While Richardson said she thinks there’s a way to use predictive policing in the future that would benefit public safety, the data isn’t ready yet.
“There needs to be more work done on both the government side and by predictive policing vendors to validate and really scrutinize these data sources before putting these technologies into public use,” she said.