A crime-prediction algorithm may be biased against black defendants

The "risk assessment" score, a tool used to estimate the odds that a criminal defendant will commit another crime, is becoming a common sight, informing sentencing and other steps of the criminal justice system across the country. But critics have long charged that such assessments, potentially a way to ease an overburdened system, disproportionately harm minorities.

Black defendants were falsely predicted to commit crimes at twice the rate of white defendants

Hoping to add some data to the debate, the nonprofit investigative team at ProPublica analyzed risk scores for more than 7,000 people arrested over two years in Broward County, Florida, tracking who was charged with new crimes over the next two years. The publication reports a few troubling findings from that analysis.

The algorithm used in the county, according to ProPublica, incorrectly predicted future crimes by black defendants at nearly twice the rate it did for white defendants. Controlling for factors like criminal history, age, and gender still left considerably higher rates for black defendants. Overall, the publication writes, only 20 percent of the defendants predicted to commit violent crimes actually went on to do so.
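The disparity ProPublica describes is, at its core, a gap in false positive rates between groups: defendants flagged as high risk who were not charged with a new crime. As a minimal sketch using made-up numbers (this is not ProPublica's dataset or Northpointe's method), the per-group calculation looks like this:

```python
# Illustrative sketch with hypothetical data, not ProPublica's dataset.
# A "false positive" here is a defendant labeled high risk who was
# not charged with a new crime during the follow-up period.

def false_positive_rate(high_risk, reoffended):
    """FPR = high-risk non-reoffenders / all non-reoffenders."""
    false_positives = sum(
        1 for h, r in zip(high_risk, reoffended) if h and not r
    )
    non_reoffenders = sum(1 for r in reoffended if not r)
    return false_positives / non_reoffenders

# Hypothetical records: (predicted high risk, actually reoffended)
black = [(1, 0), (1, 0), (1, 1), (0, 0), (0, 0),
         (0, 1), (0, 0), (1, 1), (0, 0), (0, 0)]
white = [(1, 0), (0, 0), (1, 1), (0, 0), (0, 0),
         (0, 1), (0, 0), (1, 1), (0, 0), (0, 0)]

for name, group in [("black", black), ("white", white)]:
    preds = [h for h, _ in group]
    outcomes = [r for _, r in group]
    print(name, round(false_positive_rate(preds, outcomes), 2))
```

With these invented numbers the first group's false positive rate comes out at roughly twice the second's, mirroring the shape (though not the actual figures) of the disparity the article reports.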

The county uses a system provided by a company called Northpointe. To arrive at the assessment scores, ProPublica writes, the company uses a questionnaire with items such as, "Was one of your parents ever sent to jail or prison?" The questionnaire does not ask about race, but some of its questions may correlate with race. Northpointe disputed the publication's analysis, though it says its specific calculations are proprietary.

Despite such controversy, the idea of risk assessment is gaining considerable traction. As ProPublica points out, a pending sentencing reform bill would require federal prisons to use risk assessments.