Racism in Computer Algorithms: What Needs to Change


Image: Sanders, Jill K. “CC0 1.0 Universal (CC0 1.0) Public Domain Dedication.” Pappalardo & Pappalardo LLP, https://pappalardolaw.com/2022/11/predicate-felony-sentencing-ny-criminal-courts/.

By Annalise Steinmann

To study computer science is to achieve something remarkable. You’re acquiring an admirable skill, and by undergoing rigorous coursework, you’ve earned yourself an insignia of intelligence and an open door to a high-paying career.

To be a software engineer is to carry an untold responsibility, one that doesn’t need to be unearthed or discovered, but realized. If we don’t monitor our work, we implement algorithms that not only perpetuate racism, but actually automate it.

COMPAS

This need for responsibility is demonstrated by COMPAS, software sold by Northpointe and used in criminal proceedings all across the U.S. At any point from first arrest to prosecution, the software assigns an algorithmically generated score meant to predict the likelihood that a defendant will commit another crime (Angwin et al.).

Since this algorithm can be applied at any point in the judicial process, its score is treated as a reasonable basis for increasing or decreasing a sentence. This is in spite of ProPublica’s statistical analysis, which found that Black Americans receive disproportionately high risk scores compared to white Americans (Angwin et al.).

ProPublica’s study sampled 18,610 criminal defendants scored in Broward County, Florida, over a two-year period and evaluated their COMPAS scores in the two categories assessing risk of recidivism—or reoffense—and risk of violence. ProPublica interprets Northpointe’s definition of recidivism to mean a “criminal offense that resulted in a jail booking” occurring after the defendant had already received a COMPAS risk score (Angwin et al.). The 18,610 defendants were then narrowed down to the 11,757 assessed during pretrial, excluding those who were incarcerated for the two-year follow-up period (since reoffense cannot be observed while someone is in custody). Histograms of both the recidivism and violent recidivism scores revealed a skew toward lower-risk scores for white defendants, while Black defendants’ scores were distributed evenly from low to high (Angwin et al.).
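ProPublica published both its data and its methodology, so this comparison can be reproduced at home. Below is a minimal Python sketch, assuming pandas and matplotlib are installed and relying on the column names (race, decile_score) in ProPublica’s public compas-scores-two-years.csv; treat it as an approximation of their analysis, not a verbatim reproduction:

```python
import pandas as pd
import matplotlib.pyplot as plt

# ProPublica's published COMPAS dataset (github.com/propublica/compas-analysis).
url = ("https://raw.githubusercontent.com/propublica/"
       "compas-analysis/master/compas-scores-two-years.csv")
df = pd.read_csv(url)

# Side-by-side histograms of the 1-10 recidivism decile scores by race.
fig, axes = plt.subplots(1, 2, figsize=(10, 4), sharey=True)
for ax, race in zip(axes, ["African-American", "Caucasian"]):
    scores = df.loc[df["race"] == race, "decile_score"]
    ax.hist(scores, bins=range(1, 12), edgecolor="black")
    ax.set_title(f"{race} defendants")
    ax.set_xlabel("COMPAS recidivism decile score")
axes[0].set_ylabel("Number of defendants")
plt.tight_layout()
plt.show()
```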

ProPublica also used logistic regression models to compare race against other factors, showing that Black offenders were 45% more likely to receive a higher risk score for recidivism and 77.3% more likely for violent recidivism (Angwin et al.).
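ProPublica’s regression was written in R; the sketch below approximates it in Python with statsmodels, using the same public dataset. The covariates mirror the published model, but the restriction to two racial groups and the exact variable encodings here are simplifications of mine, so the output will approximate rather than exactly reproduce the 45% figure:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

url = ("https://raw.githubusercontent.com/propublica/"
       "compas-analysis/master/compas-scores-two-years.csv")
df = pd.read_csv(url)

# Simplification: keep only the two groups being compared.
df = df[df["race"].isin(["African-American", "Caucasian"])].copy()

# 1 if COMPAS rated the defendant Medium or High risk, else 0.
df["high"] = (df["score_text"] != "Low").astype(int)
df["black"] = (df["race"] == "African-American").astype(int)

# Logistic regression controlling for gender, age bracket, prior
# crimes, charge degree, and whether the person actually reoffended.
model = smf.logit(
    "high ~ black + sex + age_cat + priors_count + "
    "c_charge_degree + two_year_recid",
    data=df,
).fit()

# exp(coefficient) is the odds ratio for being Black, holding the
# other covariates fixed; a value of 1.45 would mean "45% more likely."
print(np.exp(model.params["black"]))
```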

Furthermore, the COMPAS algorithm is twice as likely to misidentify a Black defendant as a greater threat to reoffend, and three times as likely to do so for high-risk offenders (Angwin et al.). Even when the statistical analysis controlled for other factors such as age, gender, prior crimes, and future recidivism, Black defendants were 77% more likely to receive a higher violent-recidivism risk score (Angwin et al.).
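The misidentification claim comes down to error rates: among defendants who did not reoffend within two years, how often were they still labeled higher risk? A short sketch of that comparison, under the same assumptions about the dataset’s columns as above:

```python
import pandas as pd

url = ("https://raw.githubusercontent.com/propublica/"
       "compas-analysis/master/compas-scores-two-years.csv")
df = pd.read_csv(url)

# "Higher risk" here means a Medium or High COMPAS label.
df["high"] = df["score_text"] != "Low"

# False positive rate: share of non-recidivists labeled higher risk.
for race in ["African-American", "Caucasian"]:
    non_recid = df[(df["race"] == race) & (df["two_year_recid"] == 0)]
    fpr = non_recid["high"].mean()
    print(f"{race}: {fpr:.1%} of non-recidivists labeled higher risk")
```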

Another analysis by Bernard E. Harcourt confirms that “prior criminality” has become the paramount consideration in sentencing decisions, which has proven disastrous for Black communities (Harcourt). After demonstrating a trend in which incarceration has disproportionately affected nonwhite communities between the 1940s and the 2000s, Harcourt concludes that predictive instruments are ineffective and harmful, doing more to increase incarceration than to combat it (Harcourt). As such, Harcourt calls for their elimination.

What happens if we keep using this destructive software and keep creating similarly damaging programs? What does the future hold if racism in our software remains unchecked? This widely used algorithm reinforces the vilification of Black Americans, contributing to higher rates of incarceration. And COMPAS is just one algorithm. We must ask ourselves how we contribute to this problem. Computer science is flexible in its application to every field, creating the space for software either to augment discrimination and hatred, or to become part of the solution.

References

  1. Angwin, Julia, et al. “Machine Bias.” ProPublica, 23 May 2016, https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing.

  2. Harcourt, Bernard E. “Risk as a Proxy for Race.” SSRN, 16 Sept. 2010, https://papers.ssrn.com/sol3/papers.cfm?abstract_id=1677654.

  3. Larson, Jeff, et al. “How We Analyzed the COMPAS Recidivism Algorithm.” ProPublica, 23 May 2016, https://www.propublica.org/article/how-we-analyzed-the-compas-recidivism-algorithm.
