WASHINGTON — The use of artificial intelligence to determine the need for pretrial detention and to predict the risk that ex-felons will commit more crimes doesn’t work because the underlying data can be inaccurate or biased, experts on justice reform said Wednesday.

“We don’t have well-designed instruments, and we as a public should demand better,” said Faye Taxman, a professor at George Mason University, during a panel discussion at the Brookings Institution, a centrist think tank.

AI tools have raised questions about accuracy, transparency and bias, especially racial discrimination.

AI tools replicate institutional bias in the criminal justice system. Because they make predictions from historical arrest data, the tools end up basing sentencing decisions on records of biased police activity, said Sakira Cook, program director of justice reform at The Leadership Conference on Civil & Human Rights.

“What we know about the criminal legal system is that it is inherently biased,” said Cook. “That data is flawed because the system is flawed.”
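
That feedback loop is easy to demonstrate. The following is a minimal sketch with entirely hypothetical numbers, not a reconstruction of any deployed tool: two groups reoffend at the same underlying rate, but one is policed more heavily, so a model trained on arrest records scores it as riskier.

```python
# Minimal sketch with hypothetical numbers: both groups reoffend at the
# same true rate, but Group B is policed more heavily, so its offenses
# are recorded as arrests more often. A model trained on those arrest
# records then scores Group B as higher "risk."
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000
group = rng.integers(0, 2, n)            # 0 = Group A, 1 = Group B
reoffend = rng.random(n) < 0.20          # identical true rate for both

# Biased labels: offenses lead to a recorded arrest 90% of the time in
# Group B but only 45% of the time in Group A.
arrest_rate = np.where(group == 1, 0.90, 0.45)
arrested = reoffend & (rng.random(n) < arrest_rate)

model = LogisticRegression().fit(group.reshape(-1, 1), arrested)
risk_a, risk_b = model.predict_proba([[0], [1]])[:, 1]
print(f"Predicted 'risk' -- Group A: {risk_a:.2f}, Group B: {risk_b:.2f}")
# Group B scores roughly twice as risky despite identical behavior.
```

The gap between the two scores comes entirely from the difference in arrest rates, the kind of flaw Cook said makes the underlying data unreliable.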

In addition, because only the creators of the AI tools understand the underlying arrest data used, lawyers can’t question the information to defend their clients in court, said Cook.

Predictive analytics also don’t account for environmental variables that can increase the likelihood of recidivism. The tools ignore barriers to housing, healthcare and education in an individual’s community, barriers that make reintegration into society more difficult, said Cook.
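
The omitted-variable problem can be sketched the same way. In this hypothetical simulation, unstable housing drives both recidivism and police contact, but the tool sees only prior arrests, so the weight on arrests silently absorbs the housing effect.

```python
# Minimal sketch with hypothetical features: recidivism here is driven
# mainly by unstable housing, which also produces more police contact.
# A "risk tool" that sees only prior arrests misattributes the housing
# effect to arrest history.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 10_000
unstable_housing = rng.random(n) < 0.30
prior_arrests = rng.poisson(np.where(unstable_housing, 2.5, 1.0))

# True process: housing instability matters far more than arrest history.
logit = -2.0 + 0.2 * prior_arrests + 1.5 * unstable_housing
reoffend = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

X_full = np.column_stack([prior_arrests, unstable_housing])
X_tool = prior_arrests.reshape(-1, 1)    # all a typical tool observes

full = LogisticRegression().fit(X_full, reoffend)
tool = LogisticRegression().fit(X_tool, reoffend)
print("full model coefficients:", full.coef_[0])   # recovers both drivers
print("arrests-only coefficient:", tool.coef_[0])  # inflated by housing
```

The arrests-only model penalizes police contact while hiding the barrier that services could actually address.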

The criminal justice system should secure housing and healthcare before releasing felons, which would encourage ex-convicts to comply with the terms of their parole, said Taxman.

Aaron Mannes, a senior policy advisor at the Department of Homeland Security, said getting the criminal justice system to collaborate with the healthcare system or public housing is “unbelievably hard.”

Taxman recommended redesigning the instruments of predictive analysis so that felons could use the tools themselves. Those due to be released should be able to choose among rehabilitation programs using AI tools, she said.

“We just haven’t seen that kind of tool. It doesn’t exist,” said Cook in response to Taxman’s suggestion. “So until it exists, we can’t even really have this debate. [AI tools] shouldn’t be used. Period.”