San Francisco DA launches unprecedented experiment using AI to reduce racial bias in charging

The San Francisco District Attorney announced an unprecedented high-tech effort to remove the potential for implicit bias: redacting a suspect's race and other identifying details in an experiment to find out whether the color of someone's skin is a factor in charging decisions.

George Gascon announced this "bias mitigation" initiative, which uses artificial intelligence, on Wednesday, saying the project would formally begin on July 1. First, a suspect's name, race, hair and eye color, and the location of the crime will be redacted from the police reports a prosecutor reviews before making an initial charging decision. Next, prosecutors will receive the full report, including body camera video and photos, where they will inevitably see all those factors, and consider charging again. If the prosecutor sees a discrepancy between the two decisions, he or she will have to provide a rationale for the change.

This effort is thought to be the first of its kind in the nation.

"The criminal-justice system has had a horrible impact on people of color in this country, especially African Americans, for generations," Gascon said in an interview ahead of the announcement. "If all prosecutors took race out of the picture when making charging decisions, we would probably be in a much better place as a nation than we are today."

He added later at a news conference: “We cannot continue to think a problem that only impacts one community, mostly the African-American community, doesn't impact the rest of us. Because it does, from a point of dignity as a country, it does from a point of social impact in our communities.”

Gascon teamed up with the Stanford Computational Policy Lab, which is using open-source AI to redact a suspect's race and name, as well as the race of the victims. Prosecutors also won't be able to see the specific locations and neighborhoods where crimes were said to have been committed. Stanford is providing the technology free of charge. The project will initially be used for general felonies, about 80 percent of the DA's caseload excluding homicides and domestic violence, and it will be rolled out office-wide and include misdemeanors in the coming months.
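The lab's actual tool has not been described in detail here, but the redact-and-placeholder idea it relies on can be illustrated with a minimal sketch. Everything below is a hypothetical simplification: the pattern lists, function name, and placeholder tags are invented for illustration, and a production system would use trained language models rather than keyword matching.

```python
import re

# Hypothetical illustration only: simple keyword patterns standing in for
# the more sophisticated NLP a real redaction tool would use.
REDACTION_PATTERNS = {
    "RACE": r"\b(?:Black|White|Hispanic|Latino|Asian|African[- ]American)\b",
    "HAIR": r"\b(?:blond|blonde|brown|black|red|gray)\s+hair\b",
    "EYES": r"\b(?:blue|brown|green|hazel)\s+eyes\b",
}

def redact_report(text, names=(), locations=()):
    """Replace race descriptors, known names, and locations with neutral tags."""
    # Swap each matched descriptor for a generic label like [RACE].
    for label, pattern in REDACTION_PATTERNS.items():
        text = re.sub(pattern, f"[{label}]", text, flags=re.IGNORECASE)
    # Replace each known name and location with a numbered placeholder.
    for i, name in enumerate(names, 1):
        text = re.sub(re.escape(name), f"[PERSON-{i}]", text, flags=re.IGNORECASE)
    for i, loc in enumerate(locations, 1):
        text = re.sub(re.escape(loc), f"[LOCATION-{i}]", text, flags=re.IGNORECASE)
    return text
```

In this sketch, a sentence like "John Doe, a Black male, fled the Mission District" would come back as "[PERSON-1], a [RACE] male, fled [LOCATION-1]", letting a reviewer weigh the alleged conduct without the demographic cues.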

Civil rights attorney Adante Pointer, who has represented Alex Nieto of San Francisco and several other African-American men killed by police, questioned what would happen to prosecutors who are "prone to engage in discriminatory charging decisions" and how the DA's office would track and discipline those attorneys. He also wondered whether the DA's office would be tracking metrics on the police officers who bring these cases to prosecutors in the first place.

That said, Pointer added, Gascon seems to be heading in the right direction.

"I welcome all tools that are directed at eradicating the bias that has been proven to exist in the criminal justice system, be it in San Francisco, California or Little Rock, Arkansas," Pointer told KTVU.

Brian Hofer of Oakland, executive director of Secure Justice, said that he's in the "wait and see category. If it's truly open source, and we can verify the inputs and how the values are weighted, I'm cautiously optimistic we will see improvements in sentencing."

However, he added, "algorithms are programmed by humans, and rely on data input by other humans. There is still great risk and potential for abuse present, and we must maintain careful oversight of and demand maximum transparency into such tools." 

The technology relies on humans – namely police officers - to collect the initial facts, which can still be influenced by racial bias. 

The DA's office will collect and review these metrics weekly, identifying the volume and types of cases where charging decisions changed between the blind review and the full report, in order to refine the tool and take further steps to keep implicit bias out of charging decisions, Gascon said.

The effort is not meant to be disciplinary, Gascon’s office said. His spokesman, Max Szabo, said the effort is meant to root out implicit biases that everyone has. "You can't discipline people for unconsciously held beliefs that are a result of social conditioning," Szabo said. He added that prosecutors who show explicit bias could be disciplined. The office regularly conducts implicit bias trainings. 

The project comes at a time when black people and Latinos continue to be arrested and criminally charged more often than white people throughout the country.

In 2016, Latinos made up 41 percent of arrests in California, 36 percent were white and 16 percent were African Americans, despite black people making up just 6 percent of the population, according to a study by the Public Policy Institute of California.

Racial disparities in San Francisco are even more pronounced. African Americans accounted for 41 percent of people arrested between 2008 and 2014, while making up only 6 percent of the city’s population, according to a recent study by UC Berkeley and University of Pennsylvania researchers.

Gascon, who is retiring at the end of the year, is considered one of the country's most progressive district attorneys and has been using technology to reform the criminal justice system.

Earlier this year, he announced that he had wiped out 9,300 marijuana convictions through a partnership with Code For America, a nonprofit that used an algorithm to identify the eligible cases.


The Associated Press contributed to this report.