Facial Recognition Error Lands Tennessee Grandmother in North Dakota Jail for Six Months in Bank Fraud Case
Key Takeaways
- Fargo police relied on facial recognition software to identify Angela Lipps as a bank fraud suspect, without independently verifying the match or contacting her
- Lipps was arrested at gunpoint and held without bail for nearly six months on the strength of a faulty AI match, despite living in Tennessee and never having traveled to North Dakota
- Bank records proved her innocence as soon as her attorney requested them, raising questions about why police did not pursue basic verification before the arrest
Summary
Angela Lipps, a 50-year-old Tennessee grandmother, spent nearly six months in jail after Fargo police mistakenly identified her as a suspect in a bank fraud case using facial recognition software. Lipps was arrested at gunpoint in July 2024 while babysitting four young children, despite having never traveled to North Dakota and having no connection to the state. Police used facial recognition to match her to surveillance footage from the fraud case, then sought to corroborate the match by reviewing her social media and driver's license photo, but never contacted her directly to verify their suspicions.
After being held without bail for four months in Tennessee as a fugitive, Lipps was transferred to North Dakota in October 2024, where her court-appointed attorney immediately requested her bank records. When those records were presented to police in December, they proved she was in Tennessee at the time of the fraudulent transactions. The charges were dismissed, but Lipps had already lost her home, car, and dog during her incarceration. The case highlights serious vulnerabilities in law enforcement's use of facial recognition technology without adequate verification procedures.
Editorial Opinion
This case is a sobering reminder that facial recognition technology, while potentially useful as one investigative tool among many, is dangerously unreliable when treated as primary evidence. The Fargo Police Department's decision to arrest Angela Lipps based on an AI match without basic verification steps, such as a simple phone call or a financial record check, represents a fundamental failure of investigative procedure. As facial recognition becomes more prevalent in law enforcement, jurisdictions must establish strict protocols requiring human verification, independent corroborating evidence, and clear warnings about the technology's error rates before any arrest warrant is issued.



