A prosecutor’s decision to charge someone with a crime can have severe consequences for that person’s life, family, and community. Yet we have strong evidence across many types of decision-making that when humans have discretion, racial bias can seep in. It’s unsurprising that this is a risk with prosecutors as well. A big, difficult question is how to avoid such bias in practice. One idea is to hide information related to race from the person making the decision — to make those decisions “race blind.”
Accordingly, researchers have created an algorithm that automatically redacts race-related information from crime reports to reduce the likelihood that race will influence prosecutorial charging decisions. The technology was piloted in two California jurisdictions and, in 2022, the state passed a new law that mandates the use of race-blind charging by the beginning of 2025. Such policies have promise, but removing information may have unintended consequences. Because of this, measuring the intervention’s real-world effects will be crucial to ensuring the policy delivers what lawmakers are hoping for.
Arnold Ventures (AV) is supporting a rigorous evaluation of race-blind charging policies as they are implemented. Alex Chohlas-Wood, assistant professor at NYU Steinhardt and faculty co-director of the Computational Policy Lab, is working with Sharad Goel, Todd Rogers, and Joe Nudell of the Harvard Kennedy School; Julian Nyarko of Stanford Law School; and Charles Dorison of Georgetown University to run a pre-registered randomized controlled trial (RCT) with up to a dozen prosecutors’ offices in Missouri, Washington, and California. The evaluation will study how the new race-blind review procedure affects charging decisions, including charging rates by racial group.
“Racial biases are a well-known problem within our criminal justice system. However, decision-making processes within prosecutors’ offices related to this issue have been understudied,” says Tyrell Connor, criminal justice research manager at AV. “Additionally, more evidence is needed to understand how AI technology may be helpful in reducing or eliminating harmful biases, and this research will help us better understand how AI may be used as a tool to create a fairer justice system.”
AV spoke with Chohlas-Wood about bias in prosecutorial charging decisions, how technology can reduce bias and improve fairness, and the importance of creating evidence about the outcomes of this new policy.
This conversation has been edited for clarity.
Arnold Ventures
Tell us about race-blind charging.
Alex Chohlas-Wood
We started this project about five years ago with the idea that prosecutors should do whatever they can to make charging decisions fairly. The intervention uses technology to automatically redact race-related information from police reports.
Specifically, we use artificial intelligence to ingest crime reports and remove all race-related information, including explicit mentions of race or ethnicity, people’s physical descriptions, names, and locations, which can all be clues about race or ethnicity. Then, we present the redacted report to the prosecutor, and they record an initial decision about whether they will proceed with charges. Next, they revisit the case and look at things like photo or video evidence (which would be infeasible to redact) before filing a final charging decision. If their second decision differs from the first, we ask them to explain, on the record, why they changed their mind.

We ran pilots for this intervention in San Francisco and Yolo Counties in California. In 2022, California passed a law saying that prosecutors across the state should use race-blind charging by the beginning of 2025. Now, we are launching a large-scale RCT that will study the actual impacts of this intervention, and the new law will give us enough data to understand how the policy affects charging decisions.
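To make the redaction step more concrete, here is a minimal illustrative sketch using off-the-shelf named-entity recognition (spaCy). This is not the research team’s actual algorithm; the `redact_report` function, the placeholder labels, and the keyword list are assumptions made purely for illustration.

```python
import re
import spacy

# Entity labels that can serve as proxies for race or ethnicity:
# PERSON (names), NORP (nationalities / ethnic groups), GPE / LOC / FAC (places).
PROXY_LABELS = {"PERSON", "NORP", "GPE", "LOC", "FAC"}

# A crude keyword pass for explicit descriptors (illustrative, not exhaustive).
EXPLICIT_TERMS = ["black", "white", "hispanic", "latino", "asian"]

def redact_report(text: str, nlp) -> str:
    """Replace race-related spans in a crime report with neutral placeholders."""
    doc = nlp(text)
    redacted = text
    # Work backward through entity spans so character offsets stay valid after each replacement.
    for ent in sorted(doc.ents, key=lambda e: e.start_char, reverse=True):
        if ent.label_ in PROXY_LABELS:
            redacted = redacted[:ent.start_char] + f"[{ent.label_}]" + redacted[ent.end_char:]
    # Catch explicit race or ethnicity terms the entity pass may have missed.
    for term in EXPLICIT_TERMS:
        redacted = re.sub(rf"\b{term}\b", "[DESCRIPTOR]", redacted, flags=re.IGNORECASE)
    return redacted

if __name__ == "__main__":
    nlp = spacy.load("en_core_web_sm")  # small English pipeline with built-in NER
    report = ("Officer Lee detained John Smith, a Black male, "
              "near 5th and Mission in San Francisco.")
    print(redact_report(report, nlp))
```

A production system would need far more care than this sketch, such as keeping placeholders consistent so a prosecutor can follow each person through the narrative, and measuring how often readers can still guess race from what remains, but the basic redact-then-review flow is the same.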
Arnold Ventures
Why is race-blind charging important?
Alex Chohlas-Wood
It makes a big difference in somebody’s life if you decide to charge them with a crime. It imposes huge consequences and costs on them. That underscores why this should be a fair process. There is evidence of implicit and even explicit bias in certain charging situations, and this policy is intended to be a reasonable guard against that. Aside from decision-making itself, this policy can help prosecutors better communicate to the public that they are making an effort to make charging decisions fairly.
Arnold Ventures
What drew you to this issue?
Alex Chohlas-Wood
I am interested in doing what I can to improve the justice system by leveraging my expertise in data and technology. In the justice system, technology is often used in an enforcement context — such as body-worn cameras or crime mapping for policing — but here we are talking about a very different application. I am intrigued by the ways that technology can be used to improve the fairness of the criminal justice system, and that goal is explicit in the design of this algorithm.
Arnold Ventures
How are you conducting your research?
Alex Chohlas-Wood
We know from earlier work that redaction makes it pretty hard to guess the race of an arrestee. That is in contrast to the status quo procedure, where somebody’s race or physical description is directly listed on the incident report. We validated our algorithm against what prosecutors actually read and found that it makes it a lot harder to guess an arrestee’s race. The big open question is whether this intervention actually changes charging decisions, and how. Does it reduce bias in charging decisions? Are there unexpected adverse impacts? We need to know whether it is having the intended impact and whether it is causing any harm.
In terms of our research design, there is a technological development piece. We are revamping our algorithm so that we can support prosecutors in complying with this new law, and their offices can work with us for free to adopt it. For the RCT, it is a pretty classic design where we will flip a simulated coin on incoming cases. Half of the cases will go through a status quo review procedure, and the other half will go through race-blind charging. Then, we will compare charging outcomes between the two groups.
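As a rough illustration of that design, the sketch below shows one way to randomize incoming cases between the two review procedures and compare charging rates. It is not the study’s actual code; the `assign_arm` and `charging_rates` helpers, the deterministic per-case seeding, and the simple rate comparison are assumptions made for this example.

```python
import random
from collections import defaultdict

def assign_arm(case_id: str, seed: int = 2025) -> str:
    """Flip a simulated coin for an incoming case: blind review vs. status quo review."""
    rng = random.Random(f"{seed}:{case_id}")  # seeded per case so the assignment is reproducible
    return "blind" if rng.random() < 0.5 else "status_quo"

def charging_rates(records) -> dict:
    """records: iterable of (arm, was_charged) pairs; returns the charging rate in each arm."""
    counts = defaultdict(lambda: [0, 0])  # arm -> [number charged, number of cases]
    for arm, was_charged in records:
        counts[arm][0] += int(was_charged)
        counts[arm][1] += 1
    return {arm: charged / total for arm, (charged, total) in counts.items()}

# Example with made-up outcomes: compare how often each arm filed charges.
cases = [("case-001", True), ("case-002", False), ("case-003", True), ("case-004", True)]
print(charging_rates([(assign_arm(cid), charged) for cid, charged in cases]))
```

The real evaluation would, of course, analyze outcomes by racial group and use pre-registered statistical tests rather than a raw rate comparison; the point here is only the coin-flip structure of the trial.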
We are working with some partners who are very courageous about being a part of this research. Prosecutors are often wary of getting involved in research because it can be a study of their decision-making process. The partners we are working with are interested in doing the right thing for the broader public.
Arnold Ventures
How could this study lead to better policy?
Alex Chohlas-Wood
There are a couple of big pieces here. In terms of policymaking, we absolutely need to evaluate new technologies and policies. We are using AI, and we are changing the way the charging process has been done for a very long time. A lot of the prosecutors we are working with have never used AI in a routine way before.
I am a big believer in evaluating policy and sharing the findings with policymakers so they can make informed decisions about legislation. This study could affect the future of charging across the country.
For arrestees, this study could have immediate impacts. These are real decisions about real people who are being arrested every day. If we see fairer outcomes in charging decisions, that could improve people’s lives — and those of their communities. It could also potentially improve public trust in the criminal justice system, which could have downstream benefits for everybody by helping to create safer communities.
Arnold Ventures
Why is rigorous, evidence-based research so important?
Alex Chohlas-Wood
It is hard to predict what will happen in implementation. You can have the best of intentions — with this law, for example, it feels like a very natural thing to want prosecutors to use race-blind charging in their day-to-day practice — but we need to research policies to know whether they work as intended. That can give us confidence that we are moving forward in the right direction. The nightmare scenario for me is that we have a very well-intentioned policy that gets implemented, but it makes things worse or runs counter to our intentions.