
Lite DePalma Greenberg Law Blog


June 15, 2017


Should Risk-Assessment Technology Be Part Of Our Criminal Justice System?

The age-old adage that nothing is certain in life but death and taxes seems ready for an update. The persistence of the digital revolution is proving to be another of life’s inevitabilities: an unstoppable tidal wave of “advancement” that wipes out people and industries unwilling or unable to adapt. Unsurprisingly, the legal industry has had to adapt to survive in the digital world, but the digitization of the criminal justice system is now calling into question whether technological “advancement” equals “progress.” Across the country, the criminal justice system is becoming automated. Many states have started using algorithms to help determine sentencing, parole, and bail. These algorithms assess a defendant’s risk of recidivism or of jumping bail by analyzing data about that defendant.

State v. Loomis, 881 N.W.2d 749 (Wis. 2016), a case recently decided by the Supreme Court of Wisconsin, highlights the dangers surrounding the use of algorithms in sentencing. The State alleged that the defendant was the driver in a drive-by shooting. Loomis (the defendant) entered into a plea agreement. During sentencing, the State argued aggravating factors based on the allegation that Loomis drove the vehicle during the shooting. The circuit court accepted Loomis’s plea and ordered a pre-sentence investigation that included a risk assessment by a program called COMPAS. As the circuit court explained:

COMPAS is a risk-need assessment tool designed by Northpointe, Inc. to provide decisional support for the Department of Corrections when making placement decisions, managing offenders, and planning treatment. The COMPAS risk assessment is based upon information gathered from the defendant's criminal file and an interview with the defendant. A COMPAS report consists of a risk assessment designed to predict recidivism and a separate needs assessment for identifying program needs in areas such as employment, housing and substance abuse. The risk assessment portion of COMPAS generates risk scores displayed in the form of a bar chart, with three bars that represent pretrial recidivism risk, general recidivism risk, and violent recidivism risk. Each bar indicates a defendant's level of risk on a scale of one to ten. COMPAS provides a prediction based on a comparison of information about the individual to a similar data group.
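COMPAS’s actual algorithm is a trade secret, so no one outside Northpointe knows how it works. But the general approach the court describes — comparing an individual to a similar group in historical data and reporting risk on a one-to-ten scale — can be illustrated with a deliberately simplified, entirely hypothetical sketch. Every name, factor, and threshold below is an assumption made up for illustration, not anything drawn from COMPAS itself:

```python
from dataclasses import dataclass

@dataclass
class Record:
    """One case from a hypothetical historical sample."""
    age: int
    prior_offenses: int
    reoffended: bool  # known outcome for this past case

def risk_score(age: int, priors: int, sample: list[Record],
               age_tolerance: int = 5) -> int:
    """Toy 1-10 risk score: the share of 'similar' past cases that
    reoffended, mapped onto a ten-point scale. Purely illustrative —
    not COMPAS's method, which is undisclosed."""
    # "Similar" here means: close in age, same number of priors.
    similar = [r for r in sample
               if abs(r.age - age) <= age_tolerance
               and r.prior_offenses == priors]
    if not similar:
        return 1  # no comparable group: default to lowest risk
    rate = sum(r.reoffended for r in similar) / len(similar)
    return min(10, max(1, round(rate * 10)))

sample = [
    Record(30, 2, True),
    Record(32, 2, True),
    Record(28, 2, False),
    Record(31, 2, True),
]
print(risk_score(30, 2, sample))  # three of four similar cases reoffended
```

Even in this toy version, the hidden choices matter: who counts as “similar” (the age tolerance, which factors are compared) and how the rate maps to a score drive the result entirely. Those are exactly the kinds of undisclosed design decisions a defendant like Loomis cannot examine or challenge.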

Loomis's COMPAS scores indicated that there was a high risk of recidivism. The circuit court used the COMPAS score to sentence Loomis to six years in prison. Loomis subsequently filed a motion for post-conviction relief arguing that the circuit court's consideration of the COMPAS risk assessment at sentencing violated his due process rights. The circuit court upheld the sentence and Loomis appealed to the Wisconsin Supreme Court.

The question before the Court was whether the use of a COMPAS risk assessment at sentencing "violates a defendant's right to due process, either because the proprietary nature of COMPAS prevents defendants from challenging the COMPAS assessment's scientific validity, or because COMPAS assessments take gender into account." The Court concluded that “a circuit court's consideration of a COMPAS risk assessment at sentencing does not violate a defendant's right to due process . . . because the circuit court explained that its consideration of the COMPAS risk scores was supported by other independent factors, its use was not determinative in deciding whether Loomis could be supervised safely and effectively in the community.”

Among his arguments, Loomis claimed that his due process rights were violated because COMPAS’s algorithm was proprietary. According to Loomis, COMPAS’s proprietary algorithm made it impossible to test its validity or challenge the factors used to find that he was a high risk defendant. “Northpointe, Inc., the developer of COMPAS, considers COMPAS a proprietary instrument and a trade secret. Accordingly, it does not disclose how the risk scores are determined or how the factors are weighed.” Loomis argued that because COMPAS does not disclose this information, he had been denied information which the circuit court considered at sentencing. The Court did not agree with Loomis’s position because “Loomis had the opportunity to verify that the questions and answers listed on the COMPAS report were accurate.”

Despite the Court’s holding, the fact that algorithms like the one COMPAS uses are protected trade secrets presents a host of potential problems for defendants facing prison time. Without knowing how an algorithm interprets the data it’s fed, how can a defendant verify that the data has been analyzed and weighed fairly? The Court acknowledged that there are serious issues with programs like COMPAS, and cited studies and reports that found, among other things, that “there is concern that risk assessment tools may disproportionately classify minority offenders as higher risk, often due to factors that may be outside their control, such as familial background and education.”

In light of these issues, the Court held that “[f]ocusing exclusively on its use at sentencing and considering the expressed due process arguments regarding accuracy, we determine that use of a COMPAS risk assessment must be subject to certain cautions in addition to the limitations set forth herein.” The Court held that any pre-sentencing investigation containing a COMPAS risk assessment must disclose to the sentencing court that:

“(1) the proprietary nature of COMPAS has been invoked to prevent disclosure of information relating to how factors are weighed or how risk scores are to be determined; (2) risk assessment compares defendants to a national sample, but no cross-validation study for a Wisconsin population has yet been completed; (3) some studies of COMPAS risk assessment scores have raised questions about whether they disproportionately classify minority offenders as having a higher risk of recidivism; and (4) risk assessment tools must be constantly monitored and re-normed for accuracy due to changing populations and subpopulations.”
 
Notwithstanding the guidelines set forth by the Court in Loomis, the continued use of technology like COMPAS presents serious questions for the criminal justice system and the legal profession generally. As this technology becomes more prevalent and courts become more reliant on it to determine issues as important as liberty, the legal system must decide how big a role this technology should play.

Loomis has filed a petition for certiorari in the United States Supreme Court. In March 2017, the Supreme Court asked the Solicitor General to file an amicus curiae brief, suggesting that it may be considering granting review. In light of the continued use of this kind of technology in courtrooms across America, it seems like a good idea for the highest Court in the land to provide some guidance on how best to incorporate this technology into our courts, especially since these algorithms are here to stay.