Slight tangent: the article cites the ProPublica study of Northpointe's COMPAS software, in which journalists (not statisticians) reported the software as biased. What it leaves out is that an independent study found that claim of bias to be wrong.
Source: Flores, Bechtel, and Lowenkamp, "False Positives, False Negatives, and False Analyses: A Rejoinder to 'Machine Bias: There's Software Used Across the Country to Predict Future Criminals. And It's Biased Against Blacks.'", Federal Probation Journal, September 2016. URL: http://www.uscourts.gov/statis...
In fact, the ProPublica analysis was so flawed that the rejoinder's authors wrote: "It is noteworthy that the ProPublica code of ethics advises investigative journalists that "when in doubt, ask" numerous times. We feel that Larson et al.'s (2016) omissions and mistakes could have been avoided had they just asked. Perhaps they might have even asked...a criminologist? We certainly respect the mission of ProPublica, which is to "practice and promote investigative journalism in the public interest." However, we also feel that the journalists at ProPublica strayed from their own code of ethics in that they did not present the facts accurately, their presentation of the existing literature was incomplete, and they failed to "ask." While we aren't inferring that they had an agenda in writing their story, we believe that they are better equipped to report the research news, rather than attempt to make the research news."
The authors of the ProPublica piece are no longer with the organization, yet it still shows up in nearly every news story about AI bias. The debunked story just doesn't want to die...
With all that said, I have some hope that algorithms will help make truly race-blind decisions in criminal justice. It's far easier to test an algorithm for bias than a human decision-maker, and an algorithm's decisions are made in a consistent, repeatable manner. To make "easier to test" concrete, here's a minimal sketch below.
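A minimal sketch of what such an audit could look like, assuming a hypothetical CSV of a risk tool's outputs with columns group, predicted_high_risk (0/1), and reoffended (0/1). The file and column names are illustrative, not from COMPAS or any real dataset; the point is that a disparate-error-rate check (the kind of false-positive-rate comparison at the center of the ProPublica dispute) is a few lines of code you can rerun on demand:

```python
import pandas as pd

# Hypothetical table of a risk tool's predictions and observed outcomes.
df = pd.read_csv("risk_scores.csv")

def error_rates(sub: pd.DataFrame) -> pd.Series:
    """False positive and false negative rates for one group."""
    fp = ((sub["predicted_high_risk"] == 1) & (sub["reoffended"] == 0)).sum()
    fn = ((sub["predicted_high_risk"] == 0) & (sub["reoffended"] == 1)).sum()
    negatives = (sub["reoffended"] == 0).sum()
    positives = (sub["reoffended"] == 1).sum()
    return pd.Series({
        "FPR": fp / negatives if negatives else float("nan"),
        "FNR": fn / positives if positives else float("nan"),
    })

# One table, one pass: the same audit is trivially repeatable,
# which is exactly what you can't do with thousands of individual judges.
rates = df.groupby("group")[["predicted_high_risk", "reoffended"]].apply(error_rates)
print(rates)
```

Note that even this toy version surfaces the real subtlety the Flores et al. rejoinder turns on: a tool can be calibrated (equal accuracy across groups) while still showing unequal FPR/FNR when base rates differ, so you have to decide up front which fairness metric you're testing.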