An algorithm used by police to predict crime flagged black people twice as often as white people

In courtrooms across the United States, it has become commonplace for judges to be provided with “risk assessment” reports: algorithmically generated scores assigned to defendants that are meant to gauge the likelihood of a person reoffending in the future. These scores are marketed as an objective and fair tool that helps determine the appropriate severity of a criminal’s sentence.

According to a new study from ProPublica, though, the algorithm used to generate these reports carries an internalized bias that leads it to incorrectly predict that black people are more likely to become repeat offenders.

Analyzing the COMPAS (Correctional Offender Management Profiling for Alternative Sanctions) scores assigned to 7,000 people who were arrested in Florida between 2013 and 2014, ProPublica found three key markers indicative of the system’s deep-seated failings:

  • Only 20% of the people who were flagged as being at high risk of committing violent crimes within two years of being arrested actually went on to do so.
  • Black defendants were twice as likely to be incorrectly flagged as committing crimes in the future, a false positive rate illustrated in the sketch after this list.
  • White people were more likely to be identified as being at low risk of committing crimes in the future.
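
To make “incorrectly flagged” concrete: it refers to the false positive rate, the share of defendants labeled high risk who did not go on to reoffend. The Python sketch below is illustrative only; the column names (race, high_risk, reoffended) are hypothetical stand-ins, not ProPublica’s code or the actual COMPAS data schema.

```python
# Illustrative only: not ProPublica's code and not the real COMPAS data schema.
import pandas as pd

def false_positive_rate(df: pd.DataFrame, group: str) -> float:
    """Share of defendants in `group` who did NOT reoffend but were still
    flagged high risk. Assumes hypothetical columns: `race`, `high_risk`
    (bool), and `reoffended` (bool, within two years of arrest)."""
    did_not_reoffend = df[(df["race"] == group) & ~df["reoffended"]]
    return did_not_reoffend["high_risk"].mean()

# The disparity claim amounts to comparing, for example:
#   false_positive_rate(scores, "Black")  vs.  false_positive_rate(scores, "White")
```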

To look at the vast disparity in how COMPAS scores were assigned, one might assume that black people and white people were merely committing different kinds of crimes, or that they came into the system with different criminal histories that the algorithm weighted differently. ProPublica claims to have accounted for that as well and found that the algorithm still showed bias against black people.

“We ran a statistical test that isolated the effect of race from criminal history and recidivism, as well as from defendants’ age and gender,” ProPublica said. “Black defendants were still 77% more likely to be pegged as at higher risk of committing a future violent crime and 45% more likely to be predicted to commit a future crime of any kind.”
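
ProPublica doesn’t spell out its model in that quote, but one standard way to “isolate the effect” of one variable while controlling for others is a regression. The sketch below shows the general technique using a logistic regression with hypothetical column names; it is not ProPublica’s published analysis code.

```python
# Sketch of the general technique, not ProPublica's published analysis code.
# All column names below (high_risk, priors_count, recidivated, age, gender)
# are hypothetical stand-ins for the real dataset's fields.
import statsmodels.formula.api as smf

def adjusted_race_effect(df):
    """Logistic regression of the high-risk flag on race, controlling for
    criminal history, recidivism, age, and gender."""
    model = smf.logit(
        "high_risk ~ C(race) + priors_count + recidivated + age + C(gender)",
        data=df,
    )
    result = model.fit()
    # Exponentiating the race coefficient gives an odds ratio, the kind of
    # "X% more likely" figure quoted above.
    return result.params
```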

Northpointe, the company responsible for the risk assessment algorithm used by law enforcement in the sample that ProPublica analyzed, uses a series of more than 100 questions that can be answered by a defendant during an interview or pulled from their criminal records.

Some of the questions, though, rely on an interviewer’s perceptions of a person to determine an answer.

In a response to ProPublica, Northpointe stood by the methodology it uses to generate COMPAS scores, saying that it did not agree with “the results of the analysis, or the claims being made based upon that analysis.”

Though Northpointe is one of the largest companies marketing algorithms as the future of crime prevention in the U.S., it’s far from being the only one. Microsoft is currently working on a similar program that pulls the bulk of its information about a person from their prison record and claims to be able to predict which people will return to prison within six months of their release.

Even people who haven’t actually committed crimes (yet) are being targeted by programs that crunch big data. PredPol, a popular piece of predictive software used by many local police forces, feeds crime statistics gathered about particular neighborhoods into predictions of when and where crimes are most likely to occur again.
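
PredPol’s model itself is proprietary, so the sketch below only illustrates the general shape of place-based prediction under simplifying assumptions: score small map-grid cells by how many incidents were recently reported there, and flag the highest-scoring cells. The incidents input and the cell_size and top_n parameters are hypothetical.

```python
# Deliberately simplified sketch of place-based crime prediction, not
# PredPol's actual model: score small map-grid cells by how many incidents
# were recently reported there, then flag the top cells for extra patrols.
from collections import Counter

def hotspot_cells(incidents, cell_size=0.005, top_n=10):
    """Rank grid cells by historical incident count.

    incidents : iterable of (latitude, longitude) pairs from past reports
    cell_size : grid resolution in degrees
    """
    counts = Counter(
        (round(lat / cell_size), round(lon / cell_size))
        for lat, lon in incidents
    )
    return counts.most_common(top_n)

# The feedback loop critics worry about: patrols sent to flagged cells file
# more reports there, which pushes those same cells' scores up again.
```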

What PredPol lacks in the ability to actually predict crimes before they occur, it makes up for by giving law enforcement a reason to go into neighborhoods under the auspices of prevention.

Individually, programs like PredPol and the COMPAS algorithm are unreliable predictors of crime that rely on statistical modeling to justify the disproportionate harms they inflict upon people of color. With more widespread adoption and tighter integration between the platforms, though, that could change for the worse.

It isn’t difficult to imagine a future in which people of color living in neighborhoods with larger communities of former inmates end up being subjected to over-policing and even more systemic harassment from law enforcement than they already deal with.
