
Lifting the Veil on the Design of Predictive Tools in the Criminal Legal System

Recently, the National Institute of Justice (NIJ) — the research arm of the Department of Justice (DOJ) — put out a call for researchers to participate in what it called the “Recidivism Forecasting Challenge”. The challenge was designed to use information about people on parole in Georgia to “improve the ability to forecast recidivism using person- and place-based variables,” encourage “non-criminal justice forecasting researchers to compete against more ‘traditional’ criminal justice researchers,” and provide “critical information to community corrections departments.” Challenge contestants were awarded a collective total of $723,000 for their submitted models.

While heralded by the NIJ as a successful effort that “demonstrate[d] the value of open data and open competition,” the challenge was in reality marked by serious and fundamental flaws. One of the winning papers captured these issues best, with its authors writing, “We are hesitant to accept any insights gained from submitted models and question the reliability of their performance. We would also discourage the use of any submitted models in live environments.” Six of the other 25 winning papers likewise raised concerns about deploying models created for the challenge in real-world environments.

So, what contributed to the challenge’s failures?

We argue in a new research study critiquing the challenge that its failures stem in part from a failure to engage impacted communities (those whose data was used for the challenge), public defenders, and other advocates for impacted communities. Going forward, the standard for developing predictive tools should draw on recent resources from the federal government to inform decision-making about whether to develop such tools at all. These efforts should center on strong protections both for the people whose data is used to build automated systems and for the people who may ultimately be evaluated by those systems if they are deployed.

So, why does this matter?

The NIJ has a lot of power, given its position within the Department of Justice, to shape the way that local community corrections departments think about recidivism. We submitted a Freedom of Information Act request to the DOJ to try to better understand how the results of the challenge have been or will be used but have not yet received a response to our request. While it is not fully clear yet how the results of the challenge will be used by the DOJ, the NIJ has already signaled that these types of tools are important to it by spending close to $1 million creating and executing the challenge. Furthermore, the DOJ, through the Bureau of Prisons, already uses a risk assessment tool, PATTERN, to make critical decisions about incarcerated populations. The use of this tool has been roundly criticized by several civil rights organizations.

Beyond influencing decisions about imprisonment and government surveillance, the data produced by law enforcement agencies and the predictions generated from risk assessment tools are often used in making decisions that can have a catastrophic impact on people’s lives — including loss of parental rights, homelessness, prolonged job insecurity, immigration consequences (including deportation), and inability to access credit. The voices of those impacted by these tools should be embedded in the design and implementation of these tools, as they are the individuals who will have to suffer the consequences of poorly designed systems. By involving impacted communities in the development of predictive tools, the design of these types of systems may look dramatically different, or these tools may be determined to not be useful at all.

For more information about the NIJ’s Recidivism Forecasting Challenge and its shortcomings, check out our paper below. Our paper was presented at the Association for Computing Machinery’s Conference on Equity and Access in Algorithms, Mechanisms, and Optimization at the end of October, where it won an Honorable Mention for the New Horizons Award.
