
The COMPAS Algorithm – It’s not that black and white

A closer look at the COMPAS recidivism algorithm and its application in the US justice system.

The following is an opinion piece written by an LDS contributor.

Algorithmic decision-making is purported by the mainstream media to reduce the impact of human bias in our social systems. Yet the insidious and secretive nature of algorithms allows them to replicate and amplify the social inequalities brought about by ingrained systemic oppression. As a society, we seem content with the knowledge that we are not only failing to resolve systemic inequalities such as racism but actively promoting them. Such technology excuses us from addressing these complex issues by letting us retreat behind a facade of objectivity. Far from advancing equity, the disturbing reality is that algorithms are racist unless you are white.

The above is an example of the sensationalised rhetoric promoted by certain groups that ignores the complexity of these nuanced technologies. An infamous example is Equivant’s COMPAS software, which generates recidivism risk scores used across the US criminal justice system in pre-trial, sentencing and parole decisions, and claimed to promote fairness and efficiency. This is not the case. The use of COMPAS is indeed unfair, as indicated by the racial disparities in predictive accuracy that reinforce the entrenched racial biases which see black defendants sentenced far more harshly than their white counterparts (Angwin et al., 2016). But is it the algorithm that is unfair? Or the way it is being used?

COMPAS is built around predictive parity: a given risk score should correspond to the same likelihood of reoffending for black and white defendants alike. But the historical data it learns from reflect unequal policing, so black defendants are more likely to have a prior offence on record (a direct input variable) even though race itself is not one. When recorded base rates of reoffending differ between groups, holding predictive parity fixed mathematically forces a higher rate of false positives for the group with the higher base rate. The mathematical logic is sound, yet the biased data it is applied to fail our social notion of fairness, because the context behind the data chosen is ignored.
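To see why this trade-off is unavoidable, consider a minimal numerical sketch. The figures below are hypothetical and are not Equivant's model or the ProPublica data; they simply show that if two groups have different recorded base rates, fixing the same predictive accuracy and detection rate for both forces their false positive rates apart.

```python
# A minimal sketch, not COMPAS itself: two groups with different recorded
# base rates of reoffending. Holding predictive parity (equal PPV) and
# sensitivity fixed for both groups forces unequal false positive rates.

def false_positive_rate(base_rate: float, ppv: float, sensitivity: float) -> float:
    """False positive rate implied by a fixed positive predictive value (PPV)
    and sensitivity, given a group's base rate of recorded reoffending.

    From PPV = TP / (TP + FP) with TP = base_rate * sensitivity and
    FP = (1 - base_rate) * FPR, solving for FPR gives:
        FPR = base_rate * sensitivity * (1 - ppv) / ((1 - base_rate) * ppv)
    """
    return base_rate * sensitivity * (1 - ppv) / ((1 - base_rate) * ppv)

# Hypothetical numbers for illustration only.
shared_ppv = 0.6          # same predictive accuracy for both groups
shared_sensitivity = 0.7  # same chance of flagging a true reoffender

for group, base_rate in [("group A", 0.5), ("group B", 0.3)]:
    fpr = false_positive_rate(base_rate, shared_ppv, shared_sensitivity)
    print(f"{group}: base rate {base_rate:.0%} -> false positive rate {fpr:.0%}")

# group A: base rate 50% -> false positive rate 47%
# group B: base rate 30% -> false positive rate 20%
```

Even with race never entering the calculation, the group with the higher recorded base rate ends up with more than twice the false positive rate, which is precisely the pattern the ProPublica analysis reported.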

We see then how the algorithm is claimed to be biased, when in fact it was never able to satisfy both notions of fairness at once (Hardt, Price & Srebro, 2016; Kleinberg, Mullainathan & Raghavan, 2016). Equivant warns that the recidivism risk score should not be used to classify individuals without a comprehensive assessment of their social situation to identify their rehabilitation needs (Northpointe, 2015). The racial bias, it seems, is not entirely the fault of the algorithm, but rather of its misuse by the US criminal justice system.

The design and use of data and algorithms must be accompanied by an understanding of the complexities of the social context. Continuing to use algorithms and data to make manifestly perverse decisions about individuals is irresponsible. This misuse, and the mismatched understandings of fairness behind it, have produced devastating long-term social effects that must be accounted for (Angwin, Larson, Mattu & Kirchner, 2016). Although different groups have attempted to address the issues of algorithmic bias, they are failing. This must be resolved.
