Law in the Internet Society
The Problem with RAIs

Increasingly implemented but continually misunderstood, risk assessment instruments, or RAIs, have become a common presence in the criminal legal process: judges across all fifty states use RAIs in decisions that set bail amounts, assess flight risk, and even determine sentence lengths for criminal defendants. A major problem with the adoption of this software, however, is what appears to be a fundamental misunderstanding by courts and legislators alike of how these algorithmic tools actually work. From legislation to court decisions on the topic, there is a persistent misconception that an RAI acts as an infallible legal arbiter, capable of making more precise decisions based on the data at its disposal. The evidence is more troubling. Last August, the Journal of Criminal Justice published a study of the validation techniques private companies use to measure both the accuracy and the risk of bias of nine of the most popular RAIs used nationwide. After evaluating the numerous efficacy measurements reported for each of the nine tools, the study concluded that the “extent and quality of the evidence in support of the tools” was typically “poor.”

As Shoshana Zuboff argues in her book The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power, the illusion of accuracy created by decision-support software causes avoidable public harm, and that harm is unequally distributed across class and race. The courts’ misunderstanding of RAIs is a prime example of the dynamic Zuboff describes. Consider the courts’ treatment of legal challenges to RAIs. Although fairly new, risk assessment instruments have already faced accusations of opacity, discrimination, and potentially rights-infringing practices. Perhaps most notable is State v. Loomis, a 2016 discrimination and due process challenge to the technology before the Wisconsin Supreme Court.

The case arose from the early 2013 arrest of Wisconsin resident Eric Loomis, who, after pleading guilty to two charges in connection with a drive-by shooting, was sentenced to six years in prison based in large part on the findings of COMPAS, a risk assessment tool whose evaluation of Loomis was included in his presentence investigation report. Loomis moved for post-conviction relief on the ground that, because the source code behind COMPAS’s risk assessment is a trade secret and therefore unknowable to the defendant, he could not properly challenge the evidence against him, in violation of his due process rights to be sentenced on accurate information, to know the evidence against him, and to receive an individualized sentence. Loomis further argued that the trial court violated his due process rights by allowing the RAI to consider gender in its methodology, thereby introducing an impermissible consideration of gender into the court’s sentencing. The trial court denied the motion, and the Wisconsin Court of Appeals certified the case to the Wisconsin Supreme Court, which ultimately affirmed, rejecting Loomis’s arguments on several grounds. Most important here was the court’s reasoning that the algorithm’s use of gender served a nondiscriminatory purpose: accuracy.

Here it is clear that the court has fallen for the software’s “illusion of precision”: it signals that it is willing to tolerate discrimination against private citizens in the interest of what it believes to be a more accurate, more precise decisionmaker. The Wisconsin Supreme Court thus errs in precisely the way Zuboff predicts. Avoidable discrimination on the basis of gender, race, and socioeconomic status is and will continue to be enforced so long as there remains a fundamental misunderstanding of what AI is, what it entails, and how decision-support software like RAIs differs from artificial intelligence.


r3 - 20 Jan 2024 - 01:25:46 - JasmineBovia