
Algorithmic Data Analysis

Algorithms Are Awesome!

Who does not love algorithms? These sophisticated technologies can make remarkable predictions. “Hey Kris, our algorithm sees you like the series Even Stevens. We think you will enjoy The Even Stevens Movie!” Algorithms also tell websites never to show me advertisements for Ugg boots, but that I am likely to click an ad for nail fungus treatments. They are a win for everyone! But if you have read any posts on BS, you know that dangers run parallel to the benefits of many technologies. Perhaps the most negative energy you expend on algorithms concerns the order of your social media feed. Get ready, because they do a lot of other crummy things too.

Algorithms Are (Not Always) Awesome!

Algorithms increasingly make decisions that significantly impact lives. Mathematical models determine employment, prison sentences, loans, and the cost of goods. Often the data entered into these systems are unknown to the people they affect and cannot be challenged. In theory, these models increase efficiency and eliminate bias because everyone is judged by uniform standards; in practice, bias is often embedded in the algorithms without accountability, since the conclusions are drawn by computers rather than humans. The models also fail to consider relevant information that falls outside the algorithm.

Algorithms and Employment

In Weapons of Math Destruction, Cathy O’Neil describes an algorithm used by the Washington, D.C. school system to fire teachers with low scores. Sarah Wysocki was one of 205 teachers fired because of the algorithm. Empirical evidence suggested that Wysocki was an excellent educator. Her students, their parents, her colleagues, and her principal all held her in high regard, but her score necessitated her removal. One factor heavily weighted by the model was her students’ standardized test scores relative to their scores from the previous year. The model did not recognize that many students’ prior-year scores had been artificially inflated by teachers who corrected wrong answers, a discovery made by investigations by the Washington Post and USA Today.[1] Wysocki was unable to effectively challenge her score because, as O’Neil observed, “the human victims…we’ll see time and again, are held to a far higher standard of evidence than the algorithms themselves.”[2]
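To make the mechanics concrete, here is a minimal sketch of how a value-added style score can punish a teacher for a dirty baseline. Everything in it is hypothetical: the function name, the expected-growth parameter, and the numbers are invented for illustration, not taken from the district's actual model, which, as noted above, teachers could not inspect.

```python
# A minimal, hypothetical sketch of a value-added style score.
# All names, weights, and numbers are invented for illustration.

def value_added_score(current_scores, prior_scores, expected_growth=5.0):
    """Score a teacher by average student growth versus an expected baseline."""
    growth = [curr - prior for curr, prior in zip(current_scores, prior_scores)]
    return sum(growth) / len(growth) - expected_growth

# Same students, same current-year results, two different baselines.
current = [72, 76, 81]
honest_prior = [60, 65, 70]
inflated_prior = [80, 85, 90]  # prior-year scores padded by cheating

print(value_added_score(current, honest_prior))    # ~6.3: looks excellent
print(value_added_score(current, inflated_prior))  # ~-13.7: looks like a failure
```

The point of the toy: the model has no way to know the baseline is inflated, so an excellent year of teaching can register as the worst.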

Proponents of the algorithm argue that Wysocki is an outlier and that most, if not all, of the other fired teachers likely underperformed. Further, they believe the model provides an efficient way to determine whether an educator is meeting uniform standards set by the administration.

Algorithms in the Criminal Justice System

In states throughout the U.S., including Virginia, judges are given defendants’ risk assessment scores. The scores are generated by an algorithm that purports to predict how likely a defendant is to reoffend. The score is provided during criminal sentencing and may influence the length of incarceration. An investigation conducted by ProPublica concluded that the results were unreliable: only 20 percent of individuals predicted to commit violent crimes actually went on to commit violent offenses. Furthermore, the formula exhibited a racial bias that disproportionately affected African Americans, incorrectly flagging them as likely reoffenders at nearly twice the rate of white defendants. “[T]he questionnaire judges the prisoner by details that would not be admissible in court, [and therefore,] it is unfair. While many may benefit from it, it leads to suffering for others.”[3]
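ProPublica's core check can be stated simply: among people who did not reoffend, how often was each racial group nonetheless flagged high risk? Here is a minimal sketch of that comparison; the records below are invented stand-ins for the real Broward County court data ProPublica analyzed.

```python
# A minimal sketch of a false-positive-rate comparison across groups.
# These eight records are invented; they only illustrate the arithmetic.

records = [
    # (group, flagged_high_risk, reoffended)
    ("black", True,  False),
    ("black", True,  True),
    ("black", False, False),
    ("black", True,  False),
    ("white", True,  True),
    ("white", False, False),
    ("white", False, True),
    ("white", True,  False),
]

def false_positive_rate(records, group):
    """Share of non-reoffenders in `group` who were flagged high risk."""
    non_reoffenders = [r for r in records if r[0] == group and not r[2]]
    flagged = [r for r in non_reoffenders if r[1]]
    return len(flagged) / len(non_reoffenders)

for group in ("black", "white"):
    print(group, round(false_positive_rate(records, group), 2))
# black 0.67, white 0.5 -- the flag is wrong far more often for one group
```

A gap in these rates is exactly the kind of disparity the investigation reported: the model's mistakes are not distributed evenly.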

Proponents suggest that any racial bias in the system is coincidental because the defendant’s race is not entered into the algorithm. Additionally, they argue, the system is likely to reduce the racial biases judges hold during sentencing because a computer determines the score.[4]
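Whether that defense holds is the crux of the debate: excluding race as an input does not guarantee a race-blind output, because other inputs can act as proxies for it, a point O’Neil develops at length. A toy sketch, with invented zip codes and weights:

```python
# Toy illustration of a proxy variable. Race never appears as an input,
# but if zip code correlates with race, a zip-code weight reproduces the
# disparity anyway. The zip codes and point values here are invented.

zip_risk_points = {"20001": 9, "20016": 2}  # hypothetical, e.g. from arrest data

def risk_score(defendant):
    """Inputs are zip code and prior arrests only; race is never consulted."""
    return zip_risk_points[defendant["zip"]] + defendant["priors"]

# Two defendants with identical criminal histories, different neighborhoods:
print(risk_score({"zip": "20001", "priors": 1}))  # 10
print(risk_score({"zip": "20016", "priors": 1}))  # 3
```

If neighborhood correlates with race, two defendants with identical records receive very different scores, and a racial disparity can emerge without race ever being “entered into” the model.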

BS Solution

Algorithmic systems should receive heightened scrutiny when employed by the government. The Washington, D.C. public school system, a recipient of government funding, should provide a formal process for challenging performance results. Fired teachers should learn what factors the algorithm used, be able to challenge incorrect data, and be permitted to present relevant information the model excluded. The need for a formal method of challenging algorithmic models is particularly apparent in the criminal justice system. If courts insist on the continued use of algorithms, the data and methodology that determine the scores should be transparent; otherwise, courts risk impeding defendants’ due process rights.

While the proliferation of algorithms may seem scary, there is a bright side: you might not get a loan because of one algorithm, but at least another will shave three minutes off your car ride home.

BS


[1] Cathy O’Neil, Weapons of Math Destruction (2016), pp. 7-10.

[2] Id. at 10.

[3] Julia Angwin, Jeff Larson, Surya Mattu and Lauren Kirchner, There’s software used across the country to predict future criminals. And it’s biased against blacks, ProPublica (May 23, 2016).

[4] O’Neil, supra note 1, at 29.

Kristin
