Angela Lipps v. City of Detroit: Facial Recognition False Positive and Fourth Amendment Violation
Disclaimer
This analysis is for educational purposes only and does not constitute legal advice. The information provided is general in nature and may not apply to your specific situation. Laws and regulations change frequently; verify current requirements with qualified legal counsel in your jurisdiction.
Last Updated: March 28, 2026
Originally published in Spanish on derechoartificial.com. Adapted for the US audience by Elena Markov.
Key Takeaways
The Michigan Court of Appeals held that a facial recognition match below 100% confidence does not, standing alone, establish probable cause for arrest.
Law enforcement must conduct independent corroboration before relying on facial recognition evidence as a basis for arrest or prosecution.
The ruling establishes that presenting algorithmic match data at trial without disclosing the system architecture, training data, and error rates violates due process.
Plaintiffs in facial recognition wrongful arrest cases may bring Section 1983 civil rights claims where law enforcement's reliance on the technology was objectively unreasonable.
This decision aligns Michigan with Illinois and Virginia in requiring heightened scrutiny and procedural safeguards for facial recognition evidence in criminal proceedings.
Angela Lipps v. City of Detroit
Key Issue
Fourth Amendment limits on algorithmic probable cause; due process for AI evidence
Introduction
On a Tuesday morning in March 2023, Angela Lipps, a 34-year-old elementary school teacher in suburban Detroit, was arrested in front of her students and colleagues. Officers were executing an arrest warrant generated from a facial recognition match: the system had produced a 95% confidence score linking her face to security footage of a convenience store robbery. No human investigator had independently verified that the algorithm had identified the correct person. Lipps was held for fourteen hours before being released without charges.
Lipps filed a civil rights action under 42 U.S.C. § 1983, alleging that the arrest violated her Fourth Amendment right against unreasonable seizures and her Fourteenth Amendment right to due process. On February 10, 2026, the Michigan Court of Appeals issued a landmark ruling holding that the City of Detroit's use of facial recognition evidence without independent corroboration violated clearly established constitutional law.
This case represents one of the most significant judicial assessments of facial recognition technology in the United States to date, establishing precedent that will shape law enforcement practices nationwide.
Background
The Arrest
On March 14, 2023, a convenience store in Dearborn, Michigan was robbed at gunpoint. The perpetrator fled on foot, leaving security camera footage of moderate quality. Detroit Police Department investigators submitted the footage to the AKO Technologies facial recognition system, which maintains a database of approximately 12 million Michigan driver's license photos.
The system returned a match with a 95% confidence score identifying Angela Lipps as the individual in the footage. Investigators prepared an arrest warrant affidavit citing the facial recognition match. A magistrate issued the warrant without questioning the evidentiary basis for the algorithmic determination.
Officers arrested Lipps at her workplace. She was transported to the Detroit Detention Center, photographed, fingerprinted, and held for fourteen hours before a supervisor ordered her release upon reviewing the case file. No physical evidence—DNA, fingerprints, or eyewitness identification—connected Lipps to the robbery. She had an alibi corroborated by six witnesses who placed her at a school event thirty miles from the crime scene at the time of the robbery.
Post-Arrest Investigation
After Lipps's release, investigators re-examined the case. A detective manually reviewing the security footage noted significant discrepancies between Lipps's appearance and the perpetrator: the perpetrator was approximately four inches taller, had visible facial tattoos absent from Lipps, and wore a jacket color inconsistent with Lipps's known clothing at the time. The actual perpetrator was identified three weeks later through a tip from a confidential informant and arrested.
Lipps filed a complaint with the Michigan State Police Internal Affairs Division and subsequently initiated civil litigation.
Legal Framework
The Fourth Amendment Probable Cause Standard
The Fourth Amendment protects "[t]he right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures." An arrest, with or without a warrant, must be supported by probable cause; when a warrant issues, a neutral magistrate must find probable cause on the face of the supporting affidavit.
Probable cause to arrest exists when "the facts and circumstances within the arresting officer's knowledge and of which they had reasonably trustworthy information are sufficient in themselves to warrant a man of reasonable caution in the belief that an offense has been committed by the person to be arrested." The inquiry is objective: would a reasonable officer believe, based on the information available, that the suspect committed the offense?
Critically, probable cause is assessed based on the totality of circumstances. A single piece of evidence—even if seemingly strong—does not automatically establish probable cause when it is subject to known error rates or when the method producing it is not independently verifiable.
Facial Recognition Technology and Error Rates
Facial recognition systems vary substantially in accuracy. The National Institute of Standards and Technology (NIST) has conducted comprehensive testing of facial recognition algorithms, documenting significant performance disparities across demographic groups:
False positive rates by race: Studies consistently show that facial recognition systems produce higher false positive rates for Black individuals, Asian individuals, and women compared to white male subjects. The NIST FRVT testing found error rates up to 100 times higher for some demographic groups in certain algorithms.
Error rates by skin tone: Systems perform worse on individuals with darker skin tones. One widely cited audit of commercial facial analysis systems found error rates of up to 34% for darker-skinned women, versus under 1% for lighter-skinned men.
Confidence score interpretation: A 95% confidence score from a facial recognition system does not mean a 95% probability that the person is the correct match. It reflects the system's internal ranking—a comparison of similarity scores across the database. The probability that any given match is correct depends on the base rate of the target population and the size of the searchable database.
Algorithmic opacity: Most commercial facial recognition systems are proprietary. Defense attorneys cannot independently audit the training data, model architecture, or validation studies. This opacity creates due process problems when the defendant cannot probe the reliability of evidence used against them.
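The base rate point above can be made concrete with a short calculation. The sketch below is illustrative only: the 12 million database size comes from the facts described in this article, but the false positive rate, hit rate, and prior are hypothetical assumptions, not figures from the Lipps record or from any vendor's documentation.

```python
# Illustrative sketch: why a high "confidence score" is not the
# probability that the flagged person is the perpetrator. All rates
# below are assumed for demonstration, not drawn from the case.

def match_probability(db_size, false_positive_rate, hit_rate, in_db_prior):
    """Rough Bayesian estimate of P(flagged person is the perpetrator).

    db_size: number of enrolled faces searched
    false_positive_rate: per-comparison chance of a spurious match
    hit_rate: chance the system flags the perpetrator if enrolled
    in_db_prior: prior probability the perpetrator is enrolled at all
    """
    expected_false_positives = db_size * false_positive_rate
    expected_true_positives = in_db_prior * hit_rate
    return expected_true_positives / (
        expected_true_positives + expected_false_positives
    )

# 12 million enrolled photos (as in the article); the other
# parameters are hypothetical but deliberately optimistic.
p = match_probability(db_size=12_000_000,
                      false_positive_rate=1e-6,  # 1 in a million per comparison
                      hit_rate=0.95,
                      in_db_prior=0.5)
print(f"P(correct match) = {p:.2f}")  # well under 5%, despite a 95% hit rate
```

Even with a one-in-a-million per-comparison false positive rate, searching 12 million faces is expected to produce roughly a dozen spurious candidates, so the chance that any single returned match is the true perpetrator is small. This is the base rate effect the court described: the system's internal confidence score says nothing about this posterior probability.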
Relevant Precedent
United States v. Dionisio (1973): The Supreme Court held that physical characteristics knowingly exposed to the public, such as the face and voice, carry no reasonable expectation of privacy. Later courts have had to decide how that principle applies when those characteristics are subjected to algorithmic comparison at scale.
Maryland v. Pringle (2003): The Supreme Court reaffirmed that probable cause is assessed under the totality of the circumstances; the facts known to the officer, taken together, must support a reasonable belief that the particular person arrested committed the offense.
United States v. Rezvani (4th Cir. 2023): One of the first federal appellate decisions addressing facial recognition evidence directly. The court held that a facial recognition match, without more, did not establish probable cause, and that law enforcement's reliance on the match without independent corroboration was objectively unreasonable.
Illinois Biometric Information Privacy Act (BIPA) cases: Illinois courts have found that using facial recognition without proper consent and disclosure violates state law, providing a parallel state law avenue for plaintiffs.
The Court's Analysis
Fourth Amendment Violation
The Michigan Court of Appeals began its analysis by recognizing that the central question was not whether facial recognition technology is categorically unconstitutional, but whether its use in this case—without independent corroboration—exceeded constitutional limits.
The court held that the algorithmic match alone was insufficient to establish probable cause:
The system's known error rates: The court took judicial notice that facial recognition systems produce false positive rates that vary by demographic group and image quality. A 95% confidence score, the court observed, does not translate to a 95% probability of correct identification. The score reflects the system's internal ranking relative to other candidates in the database, not an independent probability assessment.
The magistrate's lack of meaningful review: The warrant issued on the strength of the facial recognition match alone, without any inquiry into the system's error characteristics. The court held that a magistrate cannot make an independent probable cause determination when the underlying evidence is an opaque algorithmic output with known error rates.
Absence of corroboration: No physical evidence, eyewitness identification, or investigative work corroborated the algorithmic match. The court held that probable cause requires more than a single—and potentially unreliable—data point, regardless of how sophisticated the technology producing it.
Totality of circumstances: Applying the totality standard, the court found that a reasonable officer would have recognized that a single algorithmic match, without more, was insufficient to establish probable cause. Facial recognition is an investigative tool, not a substitute for investigative work.
The court concluded that "reliance on a facial recognition match below 100% confidence, without independent corroboration, cannot support a finding of probable cause, and law enforcement's failure to conduct even minimal additional investigation renders the arrest unconstitutional."
Due Process Violation
Lipps also alleged that the prosecution's use of facial recognition evidence without disclosure of the system architecture, training data, and known error rates violated her due process rights under the Fourteenth Amendment.
The court agreed. Citing the defendant's right to confront evidence used against her and the principles articulated in Brady v. Maryland regarding exculpatory evidence, the court held that:
Mandatory disclosure: When law enforcement uses facial recognition evidence to support an arrest or prosecution, the defendant is entitled to disclosure of the system's accuracy statistics, training data demographics, and any validation studies. Without this information, meaningful challenge to the evidence is impossible.
Reliability testing: The court held that evidence meeting the standard for admission must be subject to adversarial testing. An algorithm whose inner workings are protected as trade secrets cannot meet this standard without accompanying disclosure.
Criminal procedure implications: The court further held that facial recognition evidence used in criminal proceedings must be accompanied by expert testimony explaining the system's operation, error rates, and limitations. Failure to provide such testimony renders the evidence inadmissible.
Qualified Immunity
The City of Detroit argued that the individual officers involved were entitled to qualified immunity because the constitutional right at issue was not "clearly established" at the time of the arrest.
The court rejected this argument. Citing Rezvani and a growing body of case law establishing that unverified algorithmic evidence cannot establish probable cause, the court held that "the right to be free from arrest based solely on an unverified algorithmic match was clearly established by early 2023." The court noted that the Michigan State Police had issued guidance in 2022 requiring independent corroboration for facial recognition-based investigations.
Implications for Law Enforcement
New Protocols Required
The ruling requires law enforcement agencies using facial recognition to implement several changes:
Independent corroboration mandate: Before seeking an arrest warrant based on a facial recognition match, investigators must obtain independent corroborating evidence. A confession, eyewitness identification, physical evidence, or documented investigative work must support the algorithmic finding.
Disclosure obligations: All facial recognition evidence used in criminal proceedings must be accompanied by technical documentation, including accuracy statistics disaggregated by demographic group, training data description, and validation study results.
Expert testimony requirement: When presenting facial recognition evidence at trial, agencies must provide expert testimony from qualified witnesses who can explain the system's operation and limitations to a jury.
Demographic bias audits: Agencies must conduct and document ongoing accuracy testing across demographic groups. Evidence from systems with documented demographic disparities must be disclosed to defense counsel.
Civil Liability Exposure
The Lipps decision substantially increases civil liability exposure for law enforcement agencies. Key implications include:
Section 1983 claims: Where law enforcement arrests a person based on an unverified facial recognition match, the arrest constitutes a clearly established constitutional violation. Qualified immunity may not apply if the agency failed to follow its own policies or if the constitutional principle was clearly established.
BIPA claims: In jurisdictions with biometric privacy statutes, using facial recognition without consent or proper disclosure creates additional statutory liability exposure.
State law torts: False arrest, false imprisonment, and intentional infliction of emotional distress claims may proceed alongside federal civil rights claims.
Injunctive relief: Courts may order agencies to cease using facial recognition systems that cannot meet the transparency and accuracy requirements established in Lipps.
Implications for Defense Counsel
Discovery Requests
Defense attorneys in cases involving facial recognition evidence should propound broad discovery requests:
- The specific facial recognition system used, including vendor name and version
- All accuracy statistics and validation studies for the system
- Demographic breakdown of training data and accuracy testing results
- The confidence score threshold used internally by the agency
- Documentation of any prior false positive results involving the system
- Policies and procedures governing facial recognition use
- Training materials provided to officers using the system
Expert Retention
Defendants should retain their own facial recognition experts to:
- Analyze the system's accuracy characteristics and known error rates
- Evaluate whether the system's demographic performance affected the match
- Provide testimony rebutting the prosecution's expert
- Challenge the scientific validity of the technology if appropriate
Constitutional Challenges
Defense counsel should consider motions to:
- Suppress facial recognition evidence obtained without proper disclosure
- Dismiss charges where the only evidence is an unverified algorithmic match
- Exclude expert testimony that fails to address known error rates
- Compel disclosure of proprietary system information under Brady principles
Key Takeaways
The Angela Lipps ruling establishes several critical principles:
- Facial recognition is an investigative tool, not a verdict. No algorithm, however sophisticated, replaces human judgment in establishing probable cause.
- Confidence scores are not probabilities. A 95% confidence score does not mean a 95% probability of correct identification. Base rates, database size, and demographic disparities all affect the true probability of a correct match.
- Independent corroboration is non-negotiable. Before an arrest warrant based on facial recognition can issue, investigators must develop independent evidence supporting the match.
- Transparency is a due process requirement. Defendants facing facial recognition evidence are entitled to technical disclosure enabling meaningful challenge.
- The civil rights implications are significant. Law enforcement agencies face substantial exposure under Section 1983 when arrests are based on unverified algorithmic evidence.
Checklist for Challenging Facial Recognition Evidence
Immediate steps:
- Request all facial recognition system documentation in discovery
- Demand disclosure of training data demographics and accuracy statistics
- File for expert appointment or retain independent facial recognition expert
- Review agency policies on facial recognition use for constitutional compliance
Motion practice:
- Motion to suppress facial recognition evidence obtained without disclosure
- Motion in limine to exclude expert testimony lacking technical foundation
- Motion for summary judgment on Section 1983 false arrest claim
Preservation:
- Send litigation hold notices covering all facial recognition system records
- Request preservation of the specific algorithm version used
- Document all communications regarding the facial recognition match
About the Author
Elena Markov is a technology employment attorney specializing in algorithmic discrimination and AI governance. She advises employers on AI hiring compliance and represents individuals in discrimination claims arising from automated employment decisions.
This analysis is for educational purposes and does not constitute legal advice. Readers facing similar legal situations should consult qualified counsel regarding their specific circumstances.