US Woman Wrongly Jailed for Nearly Six Months After Facial Recognition Error

Photo by Ron Lach (Pexels)

Angela Lipps had never been on an airplane. She had never been to North Dakota. She was at home in Tennessee, caring for children, when officers arrived and arrested her at gunpoint.

The charge: bank fraud committed in Fargo, North Dakota. The evidence: a match produced by Clearview AI, a commercial facial recognition system, that a detective used to link Lipps to surveillance footage of the actual suspect. Lipps, 50, spent nearly six months in jail before the case against her collapsed. No one in the chain of custody had caught the error. No one had been assigned to catch it.

The investigation's breakdown did not begin with bad intent. It began with a shortcut. According to krdo.com, the detective identified Lipps by cross-referencing her facial features, body type, hairstyle, and hair color against surveillance footage, then confirming the match against her social media profile and driver's license photo. The AI output pointed in one direction. The detective followed it.

Bank records obtained by Lipps' defense attorney showed she was in Tennessee when the alleged North Dakota crimes occurred, directly contradicting the facial recognition match. That alibi evidence existed from the start. It was not acted on. According to fox5atlanta.com, the assigned detective was unaware of Lipps' custody status for over a month after her arrest.

When no agency owns the tool, no agency owns the error

The accountability structure surrounding the arrest was fractured before it began. The Fargo Police Department (FPD) does not own facial recognition technology and relies on external agencies to run such queries.

Fargo Police Chief Dave Zibolski publicly acknowledged he had not known that West Fargo police had purchased the Clearview AI system used in the investigation. Zibolski later admitted to "a couple of errors" in the case and issued a formal apology, announcing new parameters for facial recognition use within his department.

Lipps was jailed as a fugitive without bail for nearly six months. She was released on Christmas Eve. She left the facility in summer clothes, with no coat, no money, and no immediate way to return to Tennessee.

The case fits a documented pattern. The American Civil Liberties Union (ACLU) has recorded at least seven wrongful arrests in the United States attributable to police facial recognition errors, with nearly all victims being Black.

Civil liberties advocates have argued consistently that police departments deploy AI identification tools without adequate human validation or institutional accountability structures. The ACLU has also warned that AI-assisted documentation in criminal investigations risks contaminating officer memory and undermining transparency in prosecutorial proceedings.

Photo by RDNE Stock project (Pexels)

Reforms are moving, but unevenly. The San Diego Police Department (SDPD) formally banned officers from using generative AI to write police reports under a new California state law, according to cbs8.com. No equivalent federal standard governs how local agencies may use third-party facial recognition systems or what verification steps must precede an arrest.


For Lipps, the legal outcome offered no structural remedy. A tool her arresting agency did not own produced a match no one adequately verified. The detective who pursued the match did not know she was still in custody a month into her incarceration. Her alibi was provable on paper throughout.

Zibolski's apology established that mistakes were made. It did not establish who, institutionally, was responsible for preventing them.

Disclaimer: This article was produced with the assistance of artificial intelligence.
