The San Francisco justice system has apparently been testing an algorithm that can make release recommendations and compute a bail amount based on several factors. In particular, it looks at a defendant's pending charges, age and record of showing up to court. What it doesn't take into account is a person's educational background or financial status. The city began testing the algorithm in May, months after critics accused the local government of running a bail system that favors the rich and unjustly penalizes the poor and people of color.
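The factor-based approach described above resembles a points-style risk score. The foundation's actual model and weights are not public, so the following is only a minimal sketch with invented factor weights and thresholds, purely to illustrate how such a tool might turn a defendant's record into a recommendation:

```python
# Hypothetical sketch of a points-based pretrial risk score.
# The real tool's factors, weights and cutoffs are not public;
# everything below is invented for illustration.

def risk_score(age, pending_charges, failures_to_appear):
    """Return a toy risk score; higher means higher assessed risk."""
    score = 0
    if age < 23:                         # youth treated as a risk factor (assumption)
        score += 2
    score += min(pending_charges, 3)     # pending charges, contribution capped
    score += min(failures_to_appear, 4)  # missed court dates, contribution capped
    return score

def recommendation(score):
    """Map a score to a release recommendation (thresholds invented)."""
    if score <= 2:
        return "release on recognizance"
    elif score <= 5:
        return "release with supervision"
    return "detain / set bail"
```

Note that education and income never enter the function, mirroring the article's point that the tool deliberately omits them. A judge would receive the recommendation as advisory input, not a binding decision.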
In response to those accusations and to a lawsuit filed by a civil rights group, District Attorney George Gascón lobbied to change the bail system using high-tech tools. This algorithm, developed and provided to SF for free by the Texas-based Laura and John Arnold Foundation, is one of them. To make its assessments, it draws on the case histories of 1.5 million people who were once incarcerated, looks at how they fared after release, and predicts which defendants could re-offend and which pose no risk to society. That said, risk assessment tools like this are nowhere near perfect: a ProPublica report from May warns that algorithms predicting violent crime tend to be biased against African Americans.
According to the San Francisco Chronicle, some judges have been ignoring the algorithm's release recommendations, to the point that the project's backers are disappointed by how little data is coming in. Still, Matt Alsdorf, vice president of criminal justice at the Arnold Foundation, remains hopeful. "The idea is to provide judges with objective, data-driven, consistent information that can inform the decisions they make," he told the SF Chronicle, "…[W]hat I hope to see in San Francisco is that over time… people will start to see the validity of the tool and start to buy in more and more."
Source: San Francisco Chronicle