Medical advancements and public health improvements are lowering death rates for many people, but not everyone is benefiting. A doctor explains why white women are seeing a rise in deaths. We also learn about an artist’s appreciation for the Wisconsin landscape, and look at bias in crime prediction software.
Featured in this Show
-
Investigation Finds Criminal-Prediction Software Has Bias Against African-Americans
A new investigation out this week from ProPublica found that criminal-prediction software used by judges is more likely to falsely flag black defendants as future criminals.
These risk assessment tools are becoming increasingly common in courtrooms across the nation, including in Wisconsin, and are often used as guidance before sentencing, as well as for a variety of other purposes.
ProPublica’s investigation, which looked at thousands of risk scores of individuals who were previously arrested, found that the formula was particularly likely to give high-risk scores to African-Americans, wrongly labeling them this way at almost twice the rate as white defendants.
Julia Angwin, who led the investigation for ProPublica, said the software has burst onto the scene in recent years with good intentions.
“The idea is that humans and our criminal justice system have been biased racially for a while, and that if we were to use more objective measures of risk we might take some of that bias out of the system,” she said.
But good intentions, added Angwin, don’t help black people who are receiving stiffer sentences due to the unintentional programming bias that is inflating scores for African-Americans.
Angwin said the investigation was originally inspired by former U.S. Attorney General Eric Holder, who in 2014 asked for an independent assessment of whether there was racial bias in risk assessment scores. ProPublica analyzed the future criminality of 7,000 people for two years after they were given a risk score, a widely accepted window of time for recidivism. White defendants were twice as likely to receive an unjustifiably low risk score, meaning that they went on to reoffend within the next two years.
“Weirdly, the overall accuracy of the algorithm is about 61 percent, meaning it’s a little better than a coin toss,” said Angwin. “But the way that it fails is different by race, and that’s something I think we all need to think about when adopting such tools.”
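Angwin's point can be made concrete with a small sketch. The numbers below are hypothetical, not ProPublica's actual data: two groups are constructed so that the prediction tool scores about 61 percent accurate on each, yet one group is flagged as high risk when it will not reoffend far more often than the other.

```python
def rates(records):
    """records: list of (predicted_high_risk, reoffended) boolean pairs."""
    # Overall accuracy: how often the prediction matched the outcome
    accuracy = sum(p == a for p, a in records) / len(records)
    # False positive rate: share of non-reoffenders wrongly flagged high risk
    non_reoffenders = [p for p, a in records if not a]
    fpr = sum(non_reoffenders) / len(non_reoffenders)
    return accuracy, fpr

# Hypothetical groups of 100 people each, built so overall accuracy is
# identical (0.61) but the *kind* of error differs between groups.
group_a = ([(True, True)] * 30 + [(True, False)] * 25 +
           [(False, False)] * 31 + [(False, True)] * 14)
group_b = ([(True, True)] * 30 + [(True, False)] * 12 +
           [(False, False)] * 31 + [(False, True)] * 27)

acc_a, fpr_a = rates(group_a)
acc_b, fpr_b = rates(group_b)
print(f"Group A: accuracy={acc_a:.2f}, false positive rate={fpr_a:.2f}")
print(f"Group B: accuracy={acc_b:.2f}, false positive rate={fpr_b:.2f}")
```

Both groups see the same 61 percent accuracy, but Group A's non-reoffenders are wrongly flagged at a much higher rate, which is the kind of disparity a single accuracy figure hides.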
The software developers have disputed ProPublica’s findings, saying the investigation reached the wrong conclusion.
Angwin said they haven’t backed up their claims.
“We asked them repeatedly to provide some specific feedback about things they thought were wrong, and we did not get that from them,” she said. “I am hopeful that maybe they will still come forward with something that will allow us to understand their criticism.”
-
Report: Criminal Prediction Software Often Biased Against Blacks
A new investigation out this week from ProPublica takes a look at software used across the country to predict future criminal acts. These ‘risk assessments’ are becoming increasingly common in courtrooms across the nation, and are used for a variety of purposes, but the investigation found that the software was particularly likely to falsely flag black defendants as future criminals. A reporter discusses the results and the implications for the criminal justice system.
-
White Women's Mortality Rate Is Getting Worse
While overall death rates are improving, white people, and white women in particular, are losing ground. We find out why, and what can be done from a public health perspective.
-
Cultivating Wonder: Rural Wisconsin Landscapes As Inspiration
Wisconsin’s poet laureate shares the inspiring wonder of Wisconsin’s natural and farm landscapes.
Episode Credits
- Rob Ferrett Host
- Veronica Rueckert Host
- Chris Malina Producer
- Judith Siers-Poisson Producer
- Julia Angwin Guest
- Dr. Patrick Remington Guest
- Max Garland Guest
Wisconsin Public Radio, © Copyright 2024, Board of Regents of the University of Wisconsin System and Wisconsin Educational Communications Board.