Big Data Blacklisting

2016; Florida Law Review, Fredric G. Levin College of Law; Volume: 67; Issue: 5; Language: English

ISSN

1045-4241

Authors

Margaret Hu

Topic(s)

Privacy, Security, and Data Protection

Abstract

“Big data blacklisting” is the process of categorizing individuals as administratively “guilty until proven innocent” by virtue of suspicious digital data and database screening results. Database screening and digital watchlisting systems are increasingly used to determine who can work, vote, fly, and more. In a big data world, the deployment of these big data tools may threaten both substantive and procedural due process protections in new and nearly invisible ways. Substantive due process rights safeguard fundamental liberty interests. Procedural due process rights prevent arbitrary deprivations by the government of constitutionally protected interests. This Article frames the increasing digital mediation of rights and privileges through government-led big data programs as a constitutional harm under substantive due process, and identifies the obstruction of core liberties with big data tools as rapidly evolving and systemic.

To illustrate the mass scale and unprecedented nature of the big data blacklisting phenomenon, this Article undertakes a significant descriptive burden to introduce and contextualize big data blacklisting programs. Through this descriptive effort, this Article explores how a commonality of big data harms may be associated with nonclassified big data programs, such as the No Work List and No Vote List (programs that the government uses to establish or deny an individual’s eligibility for certain benefits or rights through database screening). The harms of relying on big data tools to make eligibility decisions are not, of course, limited to nonclassified programs. This Article also suggests how the same consequences may be at play with classified and semiclassified big data programs such as the Terrorist Watchlist and No Fly List. This Article concludes that big data blacklisting harms interfere with and obstruct fundamental liberty interests in a way that now necessitates an evolution of the existing due process jurisprudence.

© 2015 by Margaret Hu. * Assistant Professor of Law, Washington and Lee University School of Law. I would like to extend my deep gratitude to those who graciously offered comments on this draft, or who offered perspectives and expertise on this research through our thoughtful discussions: John Bagby, Jack Balkin, Kate Bartlett, Lawrence Baxter, Jody Blanke, Joseph Blocher, danah boyd, Franziska Boehm, Rachel Brewster, Sam Buell, Howard Chang, Guy Charles, Andrew Christensen, Danielle Citron, Adam Cox, Jennifer Daskal, Josh Fairfield, Susan Franck, Michael Froomkin, Mark Graber, Jennifer Granick, David Gray, Mark Grunewald, Woody Hartzog, Janine Hiller, Jeff Kahn, Anil Kalhan, Margot Kaminski, Orin Kerr, J.J. Kidder, Anne Klinefelter, Robert Koulish, Corinna Lain, Stephen Lee, Maggie Lemos, Sandy Levinson, Rachel Levinson-Waldman, Jamie Longazel, Erik Luna, Tim MacDonnell, Peter Margulies, Russ Miller, Steve Miskinis, Hiroshi Motomura, Brian Murchison, Mark Noferi, Jeff Powell, Angie Raymond, David Robinson, Mark Rush, Pam Saunders, Mark Seidenfeld, Andrew Selbst, Victoria Shannon, Shirin Sinnar, Ben Spencer, Juliet Stumpf, Dan Tichenor, Steve Vladeck, Russ Weaver, John Weistart, Ben Wittes, and Ernie Young, with apologies to those whom I may have inadvertently failed to acknowledge. In addition, this research benefited greatly from the discussions generated at the 2015 8th Annual Privacy Law Scholars Conference, co-hosted by the Berkeley Center for Law & Technology and George Washington Law; the 2015 Law and Ethics of Big Data Research Colloquium, hosted by the University of Indiana; the 2015 Culp Colloquium, hosted by the Duke Law Center on Law, Race and Politics; the 2015 Constitutional Law Schmooze on “The Public/Private,” hosted by the Francis King Carey School of Law, University of Maryland; the 2014 Washington College of Law, American University, Faculty Workshop; the 2014–2015 Emroch Faculty Colloquy Series, Richmond School of Law; the Transnational Dialogue on Surveillance Methods, hosted by the Max Planck Institute; the 2013 Earle Mack School of Law, Drexel University, Faculty Workshop; and the 2013 Duke Law Summer Faculty Workshop. Much gratitude to the Florida Law Review for their editorial care, including Trace Jackson, Editor in Chief, and Megan Testerman, Executive Managing Editor. Many thanks for the research assistance of Rossana Baeza, Emily Bao, Lauren Bugg, Russell Caleb Chaplain, Jessica Chi, Cadman Kiker, Kirby Kreider, Oscar Molina, Markus Murden, Kelsey Peregoy, Joe Silver, and Cole Wilson. All errors and omissions are my own.

INTRODUCTION
I. OVERVIEW OF THE BIG DATA BLACKLISTING INQUIRY
   A. What Is Big Data Blacklisting?
   B. Big Data Blacklisting and Due Process Liberty Interests
II. DATABASE SCREENING AND DIGITAL WATCHLISTING SYSTEMS
   A. Nonclassified Big Data Programs
      1. No Work List
      2. No Vote List
      3. No Citizenship List
   B. Classified and Semi-Classified Big Data Programs
      1. Terrorist Watchlist
      2. No Fly List
   C. Commonality of Big Data Consequences
III. BIG DATA BLACKLISTING RISKS
   A. Risks of Nonclassified Big Data Programs
      1. No Work List
      2. No Vote List
      3. No Citizenship List
   B. Risks of Classified and Semi-Classified Big Data Programs
      1. Terrorist Watchlist
      2. No Fly List
IV. BIG DATA BLACKLISTING AND THE DUE PROCESS INQUIRY
   A. Substantive Due Process and Informational Privacy Rights
   B. Substantive Due Process Approach to Systemic Big Data Blacklisting Harms
CONCLUSION
