Artificial intelligence is poised to change our world, and the criminal justice system is no exception. In the United States, pre-trial risk assessments increasingly rely on data-driven decision-making to calculate a defendant's risk of recidivism. Proponents say this removes human bias from the criminal justice system.
A new paper calls that assertion into question. According to its author, Chugh, there are a number of red flags the justice system needs to address before adopting such tools, and she says Indigenous defendants are particularly vulnerable to the tools' weaknesses.
The landmark case Ewert v Canada illustrates the problems posed by risk assessment tools in general. Jeffrey Ewert was convicted of murder and attempted murder and sentenced to life in prison. He successfully argued before the Supreme Court of Canada that the assessment tests used by Correctional Service Canada are culturally biased against Indigenous inmates, keeping them in prison longer and in more restrictive conditions than non-Indigenous inmates.
"Ewert tells us that data-driven decision-making needs an analysis of the information going in and of the social science contributing to the information going in and how biases are affecting information coming out."
If we know our communities face discrimination, how can we be sure the data we use will produce the right outcomes? Subjectivity, she argues, remains necessary.
Using artificial intelligence to drive risk assessments, she says, would simply transfer biases from humans to machines: bad data in means bad data out. And proponents of using artificial intelligence in this way merely shift responsibility onto the designers of the algorithms.
Some Canadian courts are already considering the use of artificial intelligence. Chugh, a member of the Board of Governors of the Law Commission of Ontario, admits she has reservations about its use in police investigations.
One of the main issues Chugh identifies with an over-reliance on artificial intelligence is the loss of subjective discretion and deference, which she notes are important pillars of an independent judiciary. Laws and statutes give judges room to exercise discretion and to weigh the factors relevant to each case.
Sentencing and bail, she believes, are community-driven processes.
Judges and other decision-makers are appointed in part for their knowledge of the community. Do we want to hand those decisions over to an automated system, or rely on one in which decision-makers talk with offenders? In her view, courts can have a significant impact on the people who come before them.
She is not against using artificial intelligence in the court system; she simply believes that more research needs to be done first.
Are we there yet? Not yet, in her view; but she says she is open to being proven wrong.
More information: Risk Assessment Tools on Trial, IEEE Technology and Society Magazine (2022). DOI: 10.1109/MTS.2022.3197123
Citation: It is still too early to use artificial intelligence for criminal justice, claims new paper (2022, November 22), retrieved 22 November 2022 from https://phys.org/news/2022-11-early-artificial-intelligence-criminal-justice.html