Artificial Intelligence in Criminal Justice: How AI Impacts Pretrial Risk Assessment


Artificial intelligence (AI) has had a far-reaching impact on society and is used in a multitude of ways by individuals, businesses and governments. One of the most consequential applications of AI is in pretrial risk assessment.

AI in Criminal Justice

AI-powered tools and machine learning can provide deep insights into people and their behavior. These technologies excel at detecting patterns that humans sometimes miss, using algorithms to analyze large sets of data, find solutions and make predictions.

The predictive nature of AI has driven its adoption across criminology, law, forensics and forensic psychology. Algorithmic risk assessments are in widespread use in law enforcement today: predictive policing and risk assessments are used to forecast someone’s likelihood of showing up for a court date or committing a crime.
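To make the idea concrete, the sketch below shows how such a forecast could be produced in principle: a simple statistical model trained on historical case records to estimate the probability of a missed court date. The feature names, data and model choice here are hypothetical and greatly simplified; they are not drawn from any actual pretrial tool.

```python
# Minimal sketch (hypothetical data): estimating the chance of a missed court date.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Invented features: [pending_charge, prior_failures_to_appear, prior_incarcerations]
X = rng.integers(0, 4, size=(500, 3))
# Synthetic stand-in for historical outcomes (1 = failed to appear for court)
y = (X @ np.array([0.6, 0.9, 0.4]) + rng.normal(0, 1, 500) > 2.5).astype(int)

model = LogisticRegression().fit(X, y)

# Score a new case: one pending charge, two prior failures to appear, no incarcerations
new_case = np.array([[1, 2, 0]])
print(f"Estimated failure-to-appear risk: {model.predict_proba(new_case)[0, 1]:.2f}")
```

In practice, jurisdictions that deploy such models train them on far larger, locally collected datasets and validate the predictions before they inform decisions in court.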

Furthermore, some jurisdictions have adopted AI tools for almost every stage of the criminal justice process. Algorithms are informing decisions about bail, sentencing and parole. In the case of pretrial risk assessments, many judges have turned to predictive analytics when deciding whether to incarcerate a suspect pending trial or release them.

AI and its predictive analytics have been touted by some as carrying the potential to improve a flawed criminal justice system. However, many legal experts, technologists and community activists believe these tools may actually exacerbate the problems they are meant to help solve.

Limitations of Risk Assessment Tools

AI technologies do not necessarily have greater predictive accuracy than other risk assessment instruments. However, they can improve pretrial assessments by weighing many risk factors before determining whether an individual should be released. Risk factors that may be considered include pending charges at the time of the offense, prior convictions (including violent convictions), past failures to appear for court hearings, and prior sentences of incarceration.
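As a rough illustration of how factors like these might be combined, the sketch below uses a simple additive, point-based score. The weights, caps and factor definitions are assumptions invented for this example; actual instruments derive and validate their scoring rules empirically.

```python
# Hypothetical point-based pretrial score; weights and caps are invented for illustration.

def pretrial_risk_score(pending_charge: bool,
                        prior_violent_convictions: int,
                        prior_failures_to_appear: int,
                        prior_incarcerations: int) -> int:
    """Return a simple additive risk score (higher = higher assessed risk)."""
    score = 0
    score += 2 if pending_charge else 0
    score += 3 * min(prior_violent_convictions, 2)  # cap so no single factor dominates
    score += 2 * min(prior_failures_to_appear, 3)
    score += 1 * min(prior_incarcerations, 2)
    return score

# Example: one pending charge, no violent history, one prior missed court date
print(pretrial_risk_score(True, 0, 1, 0))  # -> 4
```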

It is also important for law enforcement and court systems to recognize the biases that can arise when predictive tools are used in pretrial risk assessment:

  • Bias and data. Risk assessment tools are driven by algorithms informed by historical crime data, using statistical methods to find patterns and connections. These tools detect patterns associated with crime, but patterns do not reveal the root causes of crime. Often, they reflect existing problems in the justice system: data can encode social inequities even when variables such as gender, race or sexual orientation are removed. Populations that have historically been targeted by law enforcement are therefore at risk of receiving algorithmic scores that label them likely to commit crimes.
  • Bias and humans. AI technology can also reinforce human biases. Some assessments may perpetuate the misconceptions and fears that drive mass incarceration, and court decisions may be influenced by implicit biases. Therefore, users of AI in the justice system need to watch for negative feedback loops that cause an algorithm to become increasingly biased over time; one concrete check is sketched after this list.
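One practical way to watch for such feedback loops is to routinely audit a tool's error rates across demographic groups. The sketch below fabricates a handful of records purely to show the shape of a false-positive-rate comparison, one of several fairness checks analysts use; it is not a complete or authoritative audit.

```python
# Minimal sketch of a bias check: false positive rates by group (fabricated data).
import numpy as np

group      = np.array(["A", "A", "A", "B", "B", "B", "B", "A"])
flagged    = np.array([1,   0,   1,   1,   1,   0,   1,   0])  # 1 = tool labeled "high risk"
reoffended = np.array([0,   0,   1,   0,   0,   0,   1,   0])  # 1 = outcome actually occurred

for g in ("A", "B"):
    did_not_reoffend = (group == g) & (reoffended == 0)
    false_positive_rate = flagged[did_not_reoffend].mean()  # share wrongly flagged
    print(f"Group {g}: false positive rate = {false_positive_rate:.2f}")
```

If the rates diverge substantially between groups, that is a signal to examine the training data and the decisions the tool is informing before the disparity feeds back into future data.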

AI can improve critical decision-making in criminal justice, particularly in pretrial risk assessments. Algorithms, when used carefully, can make decisions more consistent and transparent. Ultimately, humans need to be aware of biases and must ensure predictive analytics support legal and ethical standards with fairness in mind.

Make a Difference in the Criminal Justice Field

The legal system and many other fields need practitioners skilled in gathering data-based research and observing people to make well-informed decisions. You can learn to use the tools of psychology in a wide variety of criminal and civil legal applications in Carlow’s Master of Arts in Psychology (MAP) program, which offers a Forensic Psychology concentration. The Forensic Psychology degree curriculum includes courses on the Criminal Justice System and Psychology of Deviance.

