Aug. 4, 2020
Despite flaws, some continue to push for bail algorithm
Jeffrey J. Clayton
Executive Director, American Bail Coalition
As the new millennium began, the power of human innovation, manifested largely through advances in science and technology, seemed endless. Solving all of the world's problems seemed not only possible, but probable. Through technology we would find a cure for various terminal diseases, eliminate the use of fossil fuels, eradicate poverty and fix the many other ills that plague our society.
Advances over the last 50 years have also found their way into social science. Today, massive computing power allows us to run endless correlations between any number of variables collected about human subjects. Such analysis, we are told, can predict with pinpoint scientific accuracy the behavior of the criminal mind or that of a bail-jumping scofflaw.
One of those grand advances in social science technology was a computer algorithm that, it was posited, could tell judges the risk that an arrested person would commit new crimes or fail to appear in court as required.
Called pretrial risk assessment or public safety assessment tools, they were intended to help judges identify individuals considered risky in order to hammer them with higher bails and conditions of release, such as house arrest, GPS monitoring or drug screening. Many observers said use of the algorithms would break the endless cycle of mass incarceration; as we will see, the opposite is what actually happened.
Against this backdrop, the Conference of Chief Justices, a prominent national group composed of 58 chief justices throughout the U.S. and its territories, adopted a policy resolution in 2015 that advocated for "evidence-based assessment of risk in setting pretrial release conditions." The organization also adopted a second resolution titled "Supporting Federal Efforts to Promote Pretrial Risk Assessment."
CCJ also endorsed a resolution from the Conference of State Court Administrators, which read in relevant part, "The use of a validated pretrial risk assessment tool when making a judicial decision to release or not, and the attendant conditions on release based on that assessment, fits within a well-functioning case management regimen."
CCJ's position on a national policy to push pretrial risk assessment algorithms onto the justice system was demonstrated in the decision by California Supreme Court Chief Justice Tani G. Cantil-Sakauye to support California Senate Bill 10. If passed, the bill would embed the pretrial risk assessment process deep in the state's criminal justice system. While lip service might subsequently be paid to judicial discretion, it is clear that the results of the tool would significantly drive outcomes for all defendants.
Cracks in the system first emerged in 2016, when ProPublica published its series "Machine Bias." Its research determined that the error rates of the risk assessment algorithm used by the COMPAS system were out of balance across racial groups: when the system erred, it consistently predicted that Black defendants were higher risk and that white defendants were lower risk, even though neither prediction was actually true. It was later discovered that Los Angeles County was using the same COMPAS system to inform bail decisions.
The bottom really fell out when the Leadership Conference on Civil and Human Rights, whose members include the ACLU, NAACP and many other national civil rights groups, issued a statement in 2018 challenging the algorithm-based system of pretrial justice. The statement also called into question the tools' inherent lack of transparency, as well as the mounting evidence that racial bias was creeping into the system.
Despite the nearly complete withdrawal of support for the algorithms by those who had previously been their greatest advocates, the California Legislature and Chief Justice Cantil-Sakauye continued to push for implementation. Together, they mounted an effort to pass California Senate Bill 10 in 2018, which required that all defendants be assessed by a yet-to-be-developed pretrial risk assessment tool and mandated statewide funding to cover its hefty price tag. Sponsors of the legislation noted in a press release that "Chief Justice Cantil-Sakauye's unprecedented commitment to see these reforms to the finish line" was critical in gaining passage of the bill.
A group of 27 prominent academics from Harvard, MIT, Princeton, UC Berkeley and Columbia addressed a letter in July 2019 to California's Judicial Council, chief justice and sponsors of the legislation, warning of the fatal flaws of pretrial risk assessments.
In urging the state to turn away from the use of such tools, the letter noted judges' reliance on risk assessments despite their "serious methodological flaws that undermine their accuracy and effectiveness." It went on to say that their use could not increase the likelihood of better pretrial outcomes, much less guarantee them.
It also emphasized that risk assessment's inherent technical flaws could not be readily resolved in the implementation phase, as they were hard-baked into its very theoretical foundation. It concluded by admonishing, "Other reforms must be considered to improve pretrial justice in California."
Attempts by this office to contact the Conference of Chief Justices asking that it revisit its prevailing policies in light of the developing revelations regarding pretrial risk assessment tools have been met with silence. What's more, CCJ's policy remains in full force and effect, while its members continue to push for pretrial risk assessments in nearly every state across the country. Particularly outspoken on the issue have been Chief Justice Maureen O'Connor of Ohio (past chair of CCJ), along with the chief justices of California, Colorado, Kentucky, New Jersey and Texas.
Advancements in technology have brought countless improvements to society. Unfortunately, risk assessment algorithms are not the panacea that criminal justice reformers would have us believe. Despite numerous warnings that the costly tools may increase racial bias, and solid evidence that they simply don't work, the Conference of Chief Justices and our states' chief justices appear hellbent on an agenda that no one else seems to want.
Legislators and local officials across the country must stand up and resist these attempts to hijack our criminal justice system by forcing us into a regime of algorithmic justice. What's at stake is the chance to move toward an honest, effective and transparent methodology that protects both persons accused of crimes and the public. Both deserve better.