In "Machine Bias: There's Software Used Across the Country to Predict Future Criminals. And It's Biased Against Blacks," published by ProPublica, the authors (Angwin, Larson, Mattu, and Kirchner) purport to assess the racial bias of a widely used risk assessment tool, ultimately expanding their conclusions to imply that risk assessments commonly used in criminal justice settings are inherently biased against African Americans.
In their response to the story, researchers Anthony Flores, Chris Lowenkamp, and CJI's Kristin Bechtel highlight the flaws and erroneous assumptions in the ProPublica article and use the same data employed by Larson et al. to conduct an independent analysis. Their findings reveal that the conclusions reached by Larson et al. were not supported by the data.