What If Predictive Algorithms Are Biased?
The Wisconsin Supreme Court tackles sentencing based on proprietary risk-assessment algorithms.
- By James E. Powell
- June 6, 2016
To be most effective, enterprises using predictive analytics must constantly reevaluate their algorithms to make sure their predictions are still accurate. What happens when those very algorithms are the subject of a lawsuit?
We’re about to find out. The Wisconsin Supreme Court will soon rule (in State v. Loomis) whether algorithms used by the state’s court system for sentencing and bail decisions are biased.
The algorithms, part of a tool called the Correctional Offender Management Profiling for Alternative Sanctions (COMPAS), are intended to estimate the risk that a defendant will reoffend. The lawsuit concerns whether these algorithms assess risk accurately, or, as the suit claims, are “inherently biased” and rate lower-income individuals as more risky.
The case began in 2013, when Eric Loomis pleaded guilty to driving a stolen vehicle and fleeing from police. (The vehicle had been part of a shooting, but Loomis claimed he was not involved in that crime.) The judge used results from COMPAS to set his sentence, noting that the COMPAS assessment suggested Loomis was an “extremely high risk” to reoffend. In his appeal, the defendant claims that use of the software violated his right to due process because the details of the algorithm are not released by COMPAS’ maker, Northpointe.
According to a report in today’s Wall Street Journal, Northpointe claims its evaluations of recidivism risk are 68 to 70 percent accurate but will not release the formulas used.
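At its core, a claim like "68 to 70 percent accurate" treats the risk tool as a binary classifier scored against observed outcomes. The sketch below is a purely hypothetical illustration of that idea; the `risk_score` formula, the score threshold, and the sample records are invented for this example and bear no relation to Northpointe's actual (undisclosed) model or data.

```python
# Hypothetical illustration only -- NOT Northpointe's model or data.
# A risk-assessment tool maps defendant features to a risk score;
# its "accuracy" can be measured as the fraction of cases where the
# predicted risk level matched the observed outcome.

def risk_score(age, priors):
    """Toy score on a 1-10 scale: more priors and younger age score higher."""
    return min(10, max(1, priors * 2 + (3 if age < 25 else 0) + 1))

def accuracy(records):
    """Share of records where a high score (>= 5) matches actual reoffense."""
    hits = sum(1 for age, priors, reoffended in records
               if (risk_score(age, priors) >= 5) == reoffended)
    return hits / len(records)

# Entirely made-up (age, prior_offenses, reoffended) records.
sample = [
    (22, 3, True),   # score 10: predicted high risk, did reoffend  -> hit
    (45, 0, False),  # score 1:  predicted low risk, did not        -> hit
    (30, 4, True),   # score 9:  predicted high risk, did reoffend  -> hit
    (50, 3, False),  # score 7:  predicted high risk, did not       -> miss
]
print(accuracy(sample))  # -> 0.75 on this toy sample
```

Even in this toy version, the last record shows the kind of case the lawsuit worries about: a defendant flagged high-risk who would not in fact have reoffended.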
University of Wisconsin Law School professor Michele Lavigne told the university’s student newspaper, The Badger Herald, that “while there is ‘nothing wrong’ with helping judges make informed decisions, COMPAS relies on ‘static factors beyond the control of the defendant’ when determining appropriate prosecutions.”
Also according to the school newspaper, Jeremy Newman, another UW Law School professor, said that “technically, COMPAS is not meant to sway a sentencing decision by a judge. Instead, the risk-assessment tool was designed to help the Department of Corrections determine what level of supervision a defendant should receive. ... ‘A main factor at sentencing is protection of the public.’”
If the software predicts that the offender is at high risk to reoffend, “it’s not a stretch to think that judges are sentencing people more harshly than necessary based on a software tool that isn’t designed to be used by judges to sentence people.”
James E. Powell is the editorial director of TDWI publications, including its research reports, the Business Intelligence Journal, and the Upside newsletter.