New York University Law Review

Volume 94, Number 3

June 2019

Challenging Racist Predictive Policing Algorithms Under the Equal Protection Clause

Renata M. O’Donnell

Algorithms are capable of racism, just as humans are capable of racism. This is particularly true of an algorithm used in the context of the racially biased criminal justice system. Predictive policing algorithms are trained on data that is heavily infected with racism because that data is generated by human beings. Predictive policing algorithms are coded to identify patterns in massive data sets and then dictate whom or where to police. Because of the realities of America's criminal justice system, a salient pattern emerges from the racially skewed data: Race is associated with criminality in the United States. Because of the "black-box" nature of machine learning, a police officer could naively presume that an algorithm's results are neutral when they are, in fact, infected with racial bias. In this way, a machine learning algorithm is capable of perpetuating racist policing in the United States. An algorithm can exacerbate racist policing because of positive feedback loops, wherein the algorithm learns that it was "correct" in associating race and criminality and relies more heavily on this association in its subsequent iterations.
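The feedback-loop dynamic can be made concrete with a minimal simulation. The following Python sketch is purely illustrative and does not reproduce any deployed vendor's system; the districts, crime rates, patrol counts, and starting arrest totals are invented assumptions. Two districts have the same true crime rate, but one begins with more recorded arrests because it was historically policed more heavily. Because arrests can only be recorded where patrols are sent, and patrols are sent where past arrests are concentrated, the initial disparity compounds over time.

# Toy simulation of a predictive-policing feedback loop.
# Assumptions (not drawn from any real system): two districts with an
# identical true crime rate, but District A starts with more recorded
# arrests because it was historically policed more heavily.

import random

random.seed(0)

TRUE_CRIME_RATE = 0.10          # identical in both districts
history = {"A": 60, "B": 30}    # biased historical arrest counts
PATROLS_PER_DAY = 10

for day in range(1000):
    total = history["A"] + history["B"]
    # "Predict" crime from recorded data; allocate patrols proportionally.
    patrols_a = round(PATROLS_PER_DAY * history["A"] / total)
    patrols_b = PATROLS_PER_DAY - patrols_a
    # Arrests are recorded only where patrols are present, so more
    # patrols produce more recorded crime, regardless of the true rate.
    for district, patrols in (("A", patrols_a), ("B", patrols_b)):
        for _ in range(patrols):
            if random.random() < TRUE_CRIME_RATE:
                history[district] += 1

share_a = history["A"] / (history["A"] + history["B"])
print(f"Share of recorded arrests in District A after 1000 days: {share_a:.0%}")
# The initial disparity compounds: District A absorbs an ever-larger
# share of patrols even though the underlying crime rates are equal.

In this sketch, each round of patrol allocation treats the skewed arrest record as ground truth, so the algorithm "confirms" its own prior predictions and eventually starves the less-policed district of patrols entirely, freezing its recorded crime at an artificially low level.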

This Note is the first piece to argue that machine learning-based predictive policing algorithms are a facial, race-based violation of the Equal Protection Clause. There will be major hurdles for litigants seeking to bring an equal protection challenge to these algorithms, including attributing algorithmic decisions to a state actor and overcoming the proprietary protections surrounding these algorithms. However, if courts determine that these hurdles eclipse the merits of an equal protection claim, they will render all algorithmic decision-making immune to equal protection review. Such immunization would be a dangerous result, given that the government is funneling a growing number of decisions into black-box algorithms.