Facing the Constitution: How Facial Recognition Technology Threatens Fourth Amendment Protections and Deepens Racial Inequality

  • Bailey Mandell
  • Mar 21
  • 7 min read

It is very unlikely you have ever consented to a police lineup.  Yet if you have a driver’s license or a social media account, you may already be in one.  A new kind of digital lineup has emerged, one that never ends and requires neither your knowledge nor consent.[1]  Facial recognition technology (“FRT”) is a biometric identification method that creates a digital template of a person’s face using measurements such as the distance between the eyes, the width of the nose, the length of the jawline, and the shape of the cheekbones.[2]  Algorithms then compare that template either to a second image for verification or to a large database of known images to identify a potential match.[3]  Law enforcement agencies across the country use FRT to identify suspects, victims, and witnesses.[4]  Today, roughly half of all American adults are already included in a police facial recognition database.[5]
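For readers unfamiliar with the matching step, the verification and identification modes described above can be sketched in simplified form. The templates, names, and threshold below are hypothetical illustrations only; real systems use proprietary algorithms and templates built from hundreds of facial measurements.

```python
import math

def cosine_similarity(a, b):
    """Score how closely two facial templates (feature vectors) match."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# Hypothetical three-number templates; real templates are far larger.
probe = [0.61, 0.80, 0.12]           # template from a surveillance image
gallery = {                          # 1:N identification database
    "license_photo_A": [0.60, 0.79, 0.15],
    "license_photo_B": [0.10, 0.95, 0.70],
}

THRESHOLD = 0.99  # match cutoff; vendors tune this trade-off themselves

# Verification compares the probe to one image; identification, as here,
# compares it against every image in the database and keeps scores above
# the threshold as candidate matches.
scores = {name: cosine_similarity(probe, tmpl) for name, tmpl in gallery.items()}
candidates = [name for name, s in scores.items() if s >= THRESHOLD]
```

The threshold is the critical policy lever: lowering it surfaces more candidates at the cost of more false matches, a trade-off each vendor sets internally and rarely discloses.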


One company alone, Clearview AI, has scraped billions of images from public websites to build an enormous searchable database that law enforcement has already searched nearly one million times.[6][7]  For as little as a few thousand dollars per year, police departments can use the program to identify virtually anyone whose picture is on the internet.[8]  Founded by Hoan Ton-That and Richard Schwartz, the company has built a surveillance infrastructure that the Framers of the Constitution could scarcely have imagined.[9]  The rapid, largely unregulated proliferation of this technology raises profound constitutional questions.[10]  As currently deployed, the warrantless use of FRT violates the Fourth Amendment’s protections against unreasonable searches and disproportionately harms communities of color.[11]

The Supreme Court has increasingly recognized that digital surveillance demands heightened constitutional scrutiny.[12]  In Carpenter v. United States, the Court held that police generally need a probable cause warrant to acquire cell-site location information, emphasizing that individuals retain a reasonable expectation of privacy in the totality of their physical movements.[13]  Chief Justice Roberts grounded the holding in the depth, breadth, and comprehensive reach of digital surveillance, recognizing that digital surveillance differs in kind from earlier physical intrusions.[14]  The Court explicitly warned against applying analog-era precedents to modern technologies, reasoning that such mechanical application does not preserve the status quo but dramatically enhances police power at the expense of personal liberty.[15]  Building on United States v. Jones[16] and Riley v. California,[17] the Court established that when digital data collection enables law enforcement to amass near-infinite quantities of personal information, store it indefinitely, and reveal an intimate portrait of a person’s life, the Fourth Amendment’s protections are triggered.[18]


FRT presents an even more expansive threat than the surveillance at issue in Carpenter.[19] Where cell-site data reveals a person’s movements, facial recognition can reveal their identity, associations, habits, and presence at specific locations, all without their knowledge.[20]  The Court’s concern in Carpenter stemmed not from physical tracking alone but from the government’s ability to cheaply and retroactively retrieve “an intimate window into a person’s life.”[21] Clearview AI grants police that same power and more.[22]  The program captures billions of scraped images, allows retroactive analysis at any time, and effectively turns every digital camera into a government surveillance device.[23]  A system that scans and identifies individuals in crowds without individualized suspicion constitutes precisely the kind of pervasive, suspicionless dragnet surveillance the Fourth Amendment exists to prevent.[24]


Compounding these Fourth Amendment concerns, law enforcement agencies routinely conceal their use of FRT from criminal defendants, potentially violating discovery obligations under Brady v. Maryland.[25]  When a facial recognition search generates the lead that prompts an arrest, the reliability of that search constitutes material evidence for the defense.[26]  Yet many departments assert they bear no obligation to disclose their use of FRT in arrest affidavits.[27] Without disclosure, defendants cannot challenge the accuracy of the technology that led to their arrest, and courts cannot assess whether the search provided constitutionally sufficient probable cause.[28] This concealment undermines the adversarial process at the heart of the criminal justice system.[29]


These constitutional problems grow dramatically worse in communities of color, confirming that the warrantless use of FRT not only violates the Fourth Amendment but deepens the very racial inequalities the Constitution was amended to address.[30]  In December 2019, the National Institute of Standards and Technology (“NIST”) released a major study evaluating FRT’s demographic effects across numerous algorithms.[31]  NIST found that false-positive rates ran highest for West African, East African, and East Asian individuals, with some algorithms producing false-positive rates up to one hundred times higher for some demographic groups than for others.[32] Among domestic law enforcement images, the highest false-positive rates appeared among American Indians, with elevated rates in Black and Asian populations.[33]  These disparities persist even after controlling for image quality.[34]  Developers initially designed facial recognition systems on homogeneous populations of white men, and camera technologies compound the problem by capturing lower-quality images of people with darker skin tones.[35]  For example, Detroit’s police chief admitted that the department’s facial recognition software produced incorrect results 96% of the time, yet officers continue to use it today.[36]


The consequences of this bias have devastated real lives.  As of February 2024, at least seven confirmed wrongful arrest cases have resulted from FRT misidentification, including Robert Williams,[37] arrested at his home in front of his family in Detroit, and Porcha Woodruff, arrested while eight months pregnant.[38]  Almost all publicly reported wrongful arrests resulting from facial recognition misidentification have involved Black individuals.[39]  These cases likely represent only the tip of the iceberg, given law enforcement’s refusal to disclose when it uses FRT in investigations.[40] Each wrongful arrest represents both a violation of individual liberty and a systemic failure that erodes public trust in the communities that already bear the greatest burden of over-policing.[41]


The convergence of constitutional vulnerability and documented racial bias makes the case for legislative action overwhelming.  The U.S. Commission on Civil Rights is actively urging Congress to establish legal redress mechanisms for individuals harmed by FRT misuse, to implement comprehensive field testing across demographic groups, and to consult with affected communities to help design better systems.[42]  Some have even proposed banning generalized facial surveillance outright for ordinary law enforcement purposes, while permitting narrower uses only under probable-cause warrants and rigorous accuracy requirements.[43]  At a minimum, Congress should enact federal legislation that bans suspicionless facial surveillance, requires a probable-cause warrant for any law enforcement use of facial recognition, mandates disclosure of FRT use in all criminal proceedings, and establishes independent oversight mechanisms to ensure accountability.[44]  States should follow suit with complementary legislation tailored to local law enforcement practices.  Without such reforms, the use of facial recognition technology will continue to be defined by constitutional uncertainty, regulatory gaps, documented bias, and an untenable tension between public safety interests and fundamental civil liberties.


[1] Riya Anchi, Facial Recognition Technology: A Fourth Amendment Violation?, Pᴇɴɴ Sᴛ. L. Rᴇᴠ. (Feb. 24, 2020), https://www.pennstatelawreview.org/the-forum/facial-recognition-technology-a-fourth-amendment-violation [https://perma.cc/J7HE-YRKX].

[2] Id.

[3] Kᴇʟsᴇʏ Y. Sᴀɴᴛᴀᴍᴀʀɪᴀ, Cᴏɴɢ. Rsᴄʜ. Sᴇʀᴠ., R46541, Fᴀᴄɪᴀʟ Rᴇᴄᴏɢɴɪᴛɪᴏɴ Tᴇᴄʜɴᴏʟᴏɢʏ ᴀɴᴅ Lᴀᴡ Eɴғᴏʀᴄᴇᴍᴇɴᴛ: Sᴇʟᴇᴄᴛ Cᴏɴsᴛɪᴛᴜᴛɪᴏɴᴀʟ Cᴏɴsɪᴅᴇʀᴀᴛɪᴏɴs 4 (Sept. 24, 2020).

[4] Katie Evans, Half of All American Adults Are in a Police Face Recognition Database, New Report Finds, Gᴇᴏ. L. (Oct. 18, 2016), https://www.law.georgetown.edu/news/half-of-all-american-adults-are-in-a-police-face-recognition-database-new-report-finds [https://perma.cc/RR5B-9MUZ].

[5] Terence Liu, How We Store and Search 30 Billion Faces, Cʟᴇᴀʀᴠɪᴇᴡ AI: Bʟᴏɢ (Apr. 18, 2023), https://www.clearview.ai/post/how-we-store-and-search-30-billion-faces [https://perma.cc/S7PF-PY5T].

[6] James Clayton & Ben Derico, Clearview AI Used Nearly 1m Times by US Police, It Tells the BBC, BBC Nᴇᴡs (Mar. 27, 2023), https://www.bbc.com/news/technology-65057011 [https://perma.cc/9CQZ-YFFP].

[7] See Hays County Sheriff Approved to Acquire Facial Recognition Software, Iɴᴅᴜs. Iɴsɪᴅᴇʀ (Dec. 21, 2023), https://insider.govtech.com/texas/news/hays-county-sheriff-approved-to-acquire-facial-recognition-software [https://perma.cc/M2XQ-HBFE]; see also Kashmir Hill, Clearview AI, Used by Police to Find Criminals, Is Now in Public Defenders’ Hands, N.Y. Tɪᴍᴇs (June 21, 2023), https://www.nytimes.com/2023/06/21/technology/clearview-ai-public-defenders.html [https://perma.cc/4AVE-4WAJ].

[8] See Kashmir Hill, The Secretive Company That Might End Privacy as We Know It, N.Y. Tɪᴍᴇs (Jan. 18, 2020), https://www.nytimes.com/2020/01/18/technology/clearview-privacy-facial-recognition.html [https://perma.cc/Q27S-PCJG].

[9] See Sᴀɴᴛᴀᴍᴀʀɪᴀ, supra note 3.

[10] See Kevin Johnson, The Use of Clearview AI to Support Warrants Violates the Fourth Amendment, 34 Fᴏʀᴅʜᴀᴍ Iɴᴛᴇʟʟ. Pʀᴏᴘ. Mᴇᴅɪᴀ & Eɴᴛ. L.J. 991, 1021–24 (2024).

[11] Carpenter v. United States, 585 U.S. 296, 311 (2018).

[12] Id. at 312.

[13] Id. at 315.

[14] United States v. Jones, 565 U.S. 400, 417 (2012).

[15] Riley v. California, 573 U.S. 373, 393 (2014).

[16] See Carpenter, 585 U.S. at 310–16.

[17] See Johnson, supra note 10, at 1021–24.

[18] See Andrew Guthrie Ferguson, Facial Recognition and the Fourth Amendment, 105 Mɪɴɴ. L. Rᴇᴠ. 1105, 1113–14 (2021).

[19] Id. at 1130; see also Carpenter, 585 U.S. at 296.

[20] See Johnson, supra note 10, at 1024.

[21] See Ferguson, supra note 18, at 1197–200.

[22] Brady v. Maryland, 373 U.S. 83 (1963).

[23] See Nᴀᴛ’ʟ Ass’ɴ ᴏғ Cʀɪᴍ. Dᴇғ. Lᴀᴡʏᴇʀs, Advisory: Defense Use of Facial Recognition Technology 1 (2023).

[24] See Johnson, supra note 10, at 1003.

[25] See Sᴀɴᴛᴀᴍᴀʀɪᴀ, supra note 3, at 12–15 (discussing Fourth Amendment implications of FRT-related concealment in criminal proceedings).

[26] See Ferguson, supra note 18, at 1188–91.

[27] See U.S. Commission on Civil Rights Releases Report: The Civil Rights Implications of the Federal Use of Facial Recognition Technology, U.S. Cᴏᴍᴍ’ɴ ᴏɴ C.R. (Sept. 19, 2024), https://www.usccr.gov/news/2024/us-commission-civil-rights-releases-report-civil-rights-implications-federal-use-facial [https://perma.cc/R8ZQ-V7J8].

[28] Nᴀᴛ’ʟ Iɴsᴛ. ᴏғ Sᴛᴀɴᴅᴀʀᴅs & Tᴇᴄʜ., Face Recognition Vendor Test (FRVT) Part 3: Demographic Effects (Dec. 2019), https://nvlpubs.nist.gov/nistpubs/ir/2019/NIST.IR.8280.pdf [https://perma.cc/MD75-CJAX].

[29] U.S. Commission on Civil Rights Releases Report, supra note 27.

[30] Id.

[31] Id.

[32] See Ferguson, supra note 18, at 1170–71.

[33] Timothy B. Lee, Detroit Police Chief Cops to 96-Percent Facial Recognition Error Rate, Aʀs Tᴇᴄʜɴɪᴄᴀ (June 30, 2020), https://arstechnica.com/tech-policy/2020/06/detroit-police-chief-admits-facial-recognition-is-wrong-96-of-the-time [https://perma.cc/9P57-HJSB].

[34] See Nathan Freed Wessler, Police Say a Simple Warning Will Prevent Face Recognition Wrongful Arrests. That’s Just Not True., ACLU (Apr. 30, 2024), https://www.aclu.org/news/privacy-technology/police-say-a-simple-warning-will-prevent-face-recognition-wrongful-arrests-thats-just-not-true [https://perma.cc/H7DT-Q343].

[35] See Kashmir Hill, Wrongfully Accused by an Algorithm, N.Y. Tɪᴍᴇs (June 24, 2020), https://www.nytimes.com/2020/06/24/technology/facial-recognition-arrest.html [https://perma.cc/5RPF-6HR3]; see also Kashmir Hill, Eight Months Pregnant and Arrested After False Facial Recognition Match, N.Y. Tɪᴍᴇs (Aug. 6, 2023), https://www.nytimes.com/2023/08/06/business/facial-recognition-false-arrest.html [https://perma.cc/ZPF4-UU4F].

[36] U.S. Commission on Civil Rights Releases Report, supra note 27.

[37] Id.

[38] See Ferguson, supra note 18, at 1181–85 (discussing how FRT bias compounds existing structural inequities in policing and erodes public trust in over-policed communities).

[39] U.S. Commission on Civil Rights Releases Report, supra note 27.

[40] Id.

[41] See Ferguson, supra note 18, at 1181–85.

[42] U.S. Commission on Civil Rights Releases Report, supra note 27.

[43] See Ferguson, supra note 18, at 1197–1200.

[44] Id. at 1197–210 (proposing comprehensive legislative changes for error, bias, fairness, and transparency with FRT use).
