Publication Date
2017
Abstract
Data-driven decision systems are taking over. No
institution in society seems immune from the
enthusiasm that automated decision-making generates,
including, and perhaps especially, the police. Police
departments are increasingly deploying data mining
techniques to predict, prevent, and investigate crime.
But all data mining systems have the potential for
adverse impacts on vulnerable communities, and
predictive policing is no different. Determining
individuals' threat levels by reference to commercial
and social data can improperly link dark skin to higher
threat levels or to greater suspicion of having
committed a particular crime. Crime mapping based
on historical data can lead to more arrests for nuisance
crimes in neighborhoods primarily populated by people
of color. These effects are an artifact of the technology
itself, and will likely occur even assuming good faith on
the part of the police departments using it. Meanwhile,
predictive policing is sold in part as a "neutral" method
to counteract unconscious biases when it is not simply
sold to cash-strapped departments as a more cost-
efficient way to do policing.
The degree to which predictive policing systems have
these discriminatory results is unclear to the public
and to the police themselves, largely because there is no
incentive in place for a department focused solely on
"crime control" to spend resources asking the question.
This is a problem for which existing law does not
provide a solution. Finding that neither the typical
constitutional modes of police regulation nor a
hypothetical anti-discrimination law would provide a
solution, this Article turns toward a new regulatory
proposal centered on "algorithmic impact statements."
Modeled on the environmental impact statements of
the National Environmental Policy Act, algorithmic
impact statements would require police departments to
evaluate the efficacy and potential discriminatory
effects of all available choices for predictive policing
technologies. The regulation would also allow the
public to weigh in through a notice-and-comment
process. Such a regulation would fill the knowledge
gap that makes future policy discussions about the
costs and benefits of predictive policing all but
impossible. Being primarily procedural, it would not
necessarily curtail a department determined to
discriminate, but by forcing departments to consider
the question and allowing society to understand the
scope of the problem, it is a first step towards solving
the problem and determining whether further
intervention is required.
Recommended Citation
Selbst, Andrew D. (2017), "Disparate Impact in Big Data Policing," Georgia Law Review: Vol. 52: No. 1, Article 6.
Available at: https://digitalcommons.law.uga.edu/glr/vol52/iss1/6