Authorities reportedly used facial recognition software to identify Lipps as a suspect. Instead of treating that match as a starting point for deeper investigation, it appears law enforcement treated it as near-conclusive evidence.
The situation escalated dramatically when U.S. Marshals showed up and arrested Lipps at gunpoint while she was babysitting four children. The scene was chaotic, frightening, and, according to Lipps, completely unjustified.
“I’ve never been to North Dakota. I don’t know anyone from North Dakota,” Lipps said.
She added, “It was so scary. I can still see it in my head, over and over again.”
It’s hard to imagine a more traumatic situation—a grandmother caring for children suddenly surrounded by armed officers over a crime she says she didn’t commit.
But the problems didn’t stop with the arrest.
Investigators in North Dakota had been looking into a series of bank fraud incidents that took place in April and May 2025. The suspect reportedly used a fake U.S. Army ID to withdraw large sums of money. During that investigation, AI software flagged Lipps as a potential match—even though she lived hundreds of miles away.
Instead of thoroughly verifying that identification, a detective allegedly relied on a cursory review of Lipps’ driver’s license and social media profiles to confirm the AI’s suggestion. That minimal effort was apparently enough to move forward with serious criminal charges.
Lipps was hit with four counts of unauthorized use of personal identifying information and four counts of theft. She then spent four months in a Tennessee jail, with no meaningful chance to contest the charges, before being extradited to North Dakota, where she remained behind bars even longer.
Only later, when she finally had her day in court, did the truth begin to come out. Her attorney presented evidence showing that Lipps was going about her normal life—depositing checks and making purchases—at the exact time the crimes were being committed elsewhere.
Eventually, she was released. But by then, the damage had already been done.
Stranded far from home in North Dakota, Lipps had no easy way to return to Tennessee. When she finally made it back, her ordeal had already triggered a cascade of personal losses. She fell behind on her bills, ultimately losing her home, her car, and even her dog.
To make matters worse, she reportedly received no apology and no financial assistance from the authorities responsible for the mistake.
What happened to Lipps isn’t just an isolated error—it exposes a deeper breakdown in how technology is being used within the justice system.
An algorithm produced a “match,” and instead of treating it with caution, officials treated it as fact. From there, the process unraveled quickly: limited investigation, superficial confirmation, and a rush to prosecute. While AI may have pointed the finger, it was still human decisions that put Lipps in handcuffs and behind bars.
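The danger of treating a match as fact comes down to base rates. A back-of-the-envelope sketch makes the point—the error rate and database size below are purely hypothetical assumptions for illustration, not figures from the Lipps case or from any specific facial recognition system:

```python
# Illustrative arithmetic only: the false-match rate and database size
# below are hypothetical assumptions, not figures from this case.

def expected_false_matches(false_match_rate: float, database_size: int) -> float:
    """Expected number of innocent people a one-to-many face search flags."""
    return false_match_rate * database_size

# Even a seemingly tiny per-comparison error rate yields many false hits
# when one probe image is compared against millions of enrolled faces.
hits = expected_false_matches(false_match_rate=1e-5, database_size=30_000_000)
print(hits)  # on the order of hundreds of innocent people flagged
```

With those assumed numbers, a search would be expected to flag roughly 300 innocent people—which is exactly why a "match" is a lead to be verified, never proof on its own.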
That should concern every American.
If someone with no connection to a crime can be identified, arrested, and jailed for months based largely on an algorithm’s output, then the principle of due process is in serious trouble. The justice system is supposed to rely on evidence, verification, and careful scrutiny—not shortcuts driven by convenience or overconfidence in technology.
When those safeguards are ignored, the system stops protecting the innocent and starts endangering them.
Now, Lipps may have legal options. Claims of wrongful arrest, false imprisonment, and negligence could all come into play. And many would argue that accountability is not just appropriate—it’s necessary.
Because without consequences, there’s little reason for institutions to change.
Lipps may have regained her freedom, but the life she once had was effectively dismantled. And that raises a critical question: if this can happen to her, who’s next?
Until there are real reforms, real oversight, and real accountability, stories like this won’t just remain possible—they’ll become inevitable.