Whether it was AI that flagged her, a witness who saw her, or her IP address appearing in the logs, did anybody bother to ask her, "Where were you the morning of July 10th, between 3 and 4 pm?" But that's not what happened: they saw the data and said "we got her."
But this is the worst part of the story:
> And after her ordeal, she never plans to return to the state: “I’m just glad it’s over,” she told WDAY. “I’ll never go back to North Dakota.”
That's the lesson? Never go back to North Dakota. No, challenge the entire system. A few years back it was a kid accused of shoplifting [0]. Then a man dragged while his family was crying [1]. Unless we fight back, we are all guilty until cleared.
[0]: https://www.theregister.com/2021/05/29/apple_sis_lawsuit/
"[I]t’s not just a technology problem, it’s a technology and people problem."
I can't. I just can't.
First, the detective used the FaceSketchID system, which has been around since roughly 2014. It is not new or uniquely tied to modern AI.
Second, the system only suggests possible matches. It is still up to the detective to investigate further and decide whether to pursue charges. And then it is up to the court to issue the warrant.
The real question is why she was held in jail for four months. That is the part I do not understand. My understanding is that there is a 30-day limit: the requesting state must pick up the defendant within 30 days. As for the individual involved, Angela Lipps has reportedly been arrested before, so it is possible she was on parole. Maybe they were holding her because of that?
Can someone clarify how that process works?
https://www.clearview.ai/privacy-and-requests
I have suddenly become very interested in New York's S1422 Biometric Privacy Act.
The incentive is to prosecute and prove the charges.
Speaking from the experience of being falsely accused after calling 911 to stop a drunk woman from driving.
The narrative they "investigated" was so obviously false that bodycam evidence directly contradicted multiple key facts. Officials were interested only in proving the case. Thankfully the jury came to the right verdict.
We could sit here all day arguing “you should always validate the results”, but even on HN there are people loudly advocating that you don’t need to.
Better just to apply Musk or Altman software to the problem and avoid it entirely.
"The trauma, loss of liberty, and reputational damage cannot be easily fixed,” Lipps' lawyers told CNN in an email.
That sounds a LOT like a statement you make before suing for damages, not to mention the article literally says "Her lawyers are exploring civil rights claims but have yet to file a lawsuit, they said."
This lady probably just wants to go back to normal life and get some money for the hell they put her through. She had never even been on an airplane before; I doubt she is going to take on the entire system like you suggest. "Challenge the entire system" is easier said than done. What does that even mean, exactly?
It absolutely was. There's no question of this. Now we need to ask how the system was marketed, what the police paid for it, and how they were trained to use it.
> anybody bothered to ask her "where were you the morning of july 10th between 3 and 4pm.
Legally that amounts to "hearsay" and carries no evidentiary weight. Those statements probably won't even be admissible in court without other supporting facts entered first.
> we are all guilty until cleared.
This is not a phenomenon that started with AI. If you scratch the surface, even slightly, you'll find that this is a common strategy used against defendants who are perceived as not being financially or logistically capable of defending themselves.
We have a private prison industry. The line between these two outcomes is very short.
As the article gestures towards, challenging the extradition can greatly extend the timeline, from 30 days after the arrest to 90 days after a formal identity hearing. Which isn't fair and isn't intuitive, but is unfortunately a long-standing part of the system. (Even worse, this kind of mistaken identity can't be challenged in an extradition hearing; the question isn't whether she's the person who committed the crime but whether she's the person identified in the warrant.)
The truth is much more complicated and involves politics. For example, Seattle (and possibly other cities?) enacted a law that requires paying damages when certain types of charges are brought wrongly. But that has resulted in some widely publicized cases where the prosecutor erred by being overly cautious.
This is how it should work, but I still think it is important to discuss these failures in the context of AI risks.
One of the largest real-world dangers of AI (as we define that now) is that it is often confidently wrong and this is a terrible situation when it comes to human factors.
A lot of people are wired in such a way that perceived confidence hacks right through their amygdala and they immediately default to trust, no matter how unwarranted.
https://pub.towardsai.net/the-air-gapped-chronicles-the-cour...
...Unable to pay her bills from jail, she lost her home, her car and even her dog.
There is not a jury in the country that will side against this woman. I am not even sure who will make the better pop-culture mashup: John Wick or a country songwriter? (Also, what happened to journalism? No Oxford comma?)
How is that hearsay if she's directly testifying to her own whereabouts?
Hearsay would be someone else testifying "she was in X location on July 10th between 3 and 4 pm" based on what they were told, without the original speaker being available for cross-examination.
They probably filed an "identity challenge" arguing that she was not the right person. But from Tennessee's perspective, she was the correct person to arrest, so there was no "mistaken identity" in their system. In other words: North Dakota wanted person X, and here is person X.
Once a judge in North Dakota reviewed the full evidence (and found that the person they had issued the arrest warrant for was not the one they wanted), the case was dismissed.
They picked her up in TN and held her for 4 months, even after:
The ND police knew the ID was fake and the person using it was not her. The ND police knew she had been in TN before, during, and after the crime.
She is still technically a suspect, even after all of this has come out.
Source: I live in Fargo and have been following this story closely. Everyone here is pissed.
Maybe she objected to the extradition order without good counsel.
"I ain't never been to North Dakota." She found out the hard way how the law works.
What about the banks that were hit? Surely they have good cameras. This was bad mojo. I would think a Wells Fargo or BoA has a unit for this stuff.
Financial crimes, handled like this? The banks will be sued too, I suspect. Deep pockets settle out.
To the extent people trust AI to be infallible, it's just laziness and rapport. AI is rarely if ever rude without prompting, nor does it criticize extensive question-asking as many humans would; it's the quintessential enabler [1]. That leads people to assume that because it's useful and helpful for so many things, it'll be right about everything.
The models all have disclaimers that state the inverse. People just gradually lose sight of that.
[1] This might be the nature of LLMs, or it might be by design, similar to social media slop driving engagement. It's in AI companies' interest to have people buying subscriptions to talk with AIs more. If AI goes meta and critiques the user, that's bad for business.
It's like when your home is lost to foreclosure because one JUDGE did not look at the paperwork.
There should be a way to personally sue somebody when they don't do their job of protecting the innocent. The JUDGE failed badly here.
Flimsy evidence should mean no warrant. Do your basic investigation, please. A rubber-stamping JUDGE caused this.
Why are they not named? As if they were a spectator. In fact, they are the cause.
"I was at the library" is firsthand testimony.
"I saw her at the library" is firsthand testimony.
"I saw her library card in her pocket" is firsthand testimony.
"She was at the library - Bob told me so" is hearsay. Just look at the word - "hear say". Hearsay is testifying about events where your knowledge does not come from your own firsthand observations of the event itself.
If you look at examples of people quoting on the internet, lots are out of context, paraphrased, or made up.
AI is just mimicking what it has seen.
The timer starts from when you invoke it, though.
The two issues, either of which she may be caught in, are that "speedy" is judged from the perspective of a court, and that it really means "free from undue delays."
There is no general definition of a speedy trial, but I think the shortest period any state defines is a month (with some states considering several months to still be “speedy”).
A trial can still be speedy even past that window if the prosecution can make a case that they genuinely need more time (like waiting for lab tests to come back).
It’s basically only ever not speedy if the prosecution is just not doing anything.
I wonder who is slandering her more... WOW
Maybe the city's insurance carrier hired a FIRM...
They will be taking a hit.
Cops did not do a proper investigation and the judge green-lighted it.
It is all on the JUDGE or possibly a magistrate who approved a faulty warrant.
The judge failed the poor woman. FIRE him.
Then sue Clearview for big bucks.
Actually most criminal defense attorneys recommend not waiving your speedy trial rights. Yes, the defense goes in blind. But so does the prosecution, and they're the ones that have to make a case.
The usual result for defendants who don't waive their speedy trial rights is an acquittal if the case goes to trial (between 50-60%), which doesn't sound like a lot, but prosecutors are expected to win >90% of their trials. Additionally, many counties don't have sufficient courtrooms to handle all the criminal trials within the speedy-trial timeframe, so if the trial date comes and a courtroom is not available, the case is dismissed with prejudice. Nonviolent misdemeanors are the lowest priority for a courtroom (and by that I mean even family law cases have priority over nonviolent misdos in most counties), so those cases are frequently dismissed a day or two before the trial date. Consequently, most prosecutors will offer better and better plea bargains as the trial date approaches.
This is even more true for murders, which is why murder suspects don't usually get charged for a year or two after the crime.
What I still do not understand is why she spent more than five months in jail, most of it in Tennessee. That part remains unclear and needs further explanation.
Only one small problem --- there is no way to tell if you are using it "correctly".
The only way to be sure is to not use it.
Using it basically boils down to, "Do you feel lucky?".
The Fargo police didn't get lucky in this case. And now the liability kicks in.
I see all kinds of people being told that AI-based AI detection software used for detecting AI in writing is infallible!
You want to make sure people aren't using fallible AI? Use our AI to detect AI! What could possibly go wrong?
Now, if I misused a hammer and it hurt everyone's thumb in my country, then maybe what you said would have some merit.
Otherwise, I'd say it's an extremely lazy argument.
The use case here is police facial recognition. Not hitting nails. The parent wasn't saying "AI is a liability" with no context.
This situation likely resulted from either sloppy investigative work or an honest mistake: the detective believed her booking photo matched the individual captured on camera.
Her booking photo from a prior arrest can be found here: https://mugshots.com/US-States/Tennessee/Carter-County-TN/An...
Do we have a recording of the suspect they used for the match?
A Tennessee grandmother spent more than five months in jail after police used an AI facial recognition tool to link her to crimes committed in North Dakota – a state she says she’d never been to before.
Police in Fargo, North Dakota, have acknowledged “a few errors” in the case and pledged changes in their operations but stopped short of issuing a direct apology.
Angela Lipps, 50, was first arrested in Tennessee on July 14, according to a statement from the Fargo Police Department and a verified GoFundMe for Lipps.
Unbeknownst to Lipps, a warrant had been issued for her arrest weeks earlier – in Fargo, over 1,000 miles away from her Tennessee home. Months before, several instances of bank fraud had occurred in and around Fargo, according to police.
In their search for a suspect in the bank fraud cases, investigators used “our partner agency’s facial recognition technology” as well as “additional investigative steps independent of AI to assist in identification” before submitting the report to the Cass County State Attorney’s Office, Fargo Police Department Chief Dave Zibolski told CNN in an email.
But Zibolski said at a Tuesday news conference that his police department’s reliance on some of the information from a neighboring agency’s AI system is “part of the issue,” referring to errors made in Lipps’ case.
“At some point, our partner agency over at West Fargo purchased their own AI facial recognition system that we were not aware of at the executive level …, and we would not have allowed that to be used, and it has since been prohibited,” he said.
The West Fargo Police Department told CNN that they use Clearview AI, a startup with a database of billions of photos scraped from the internet, including social media. Clearview “identified a potential suspect with similar features to Angela Lipps” and West Fargo police shared that report with Fargo police, reads a statement from the police department. The statement notes that West Fargo police didn’t forward any charges and didn’t have enough evidence to charge anyone for the fraud case in West Fargo.
CNN has reached out to Clearview AI for comment. It’s unclear what other evidence was used in the investigation to tie Lipps to the crimes.
Lipps’ case comes as police departments across the country have rapidly integrated new technologies, including AI. But police use of the novel technology has attracted criticism – and it’s been linked to other cases of misidentification.
On July 1, a North Dakota judge signed a warrant for Lipps’ arrest, with nationwide extradition. She was arrested July 14 and spent over three months in a Tennessee jail before being extradited, according to Fargo police and her lawyers.
It wasn’t until October that Tennessee law enforcement told the Cass County Sheriff’s Office in North Dakota they had Lipps’ extradition waiver. She was facing multiple charges, including felony theft and felony unauthorized use of personal identifying information, according to her lawyers.
It’s unclear why it took so long for Tennessee authorities to notify their North Dakota counterparts about Lipps’ arrest. Lipps’ attorneys told CNN they “have seen a July 14, 2025 email notifying various North Dakota law enforcement personnel that Angela had been arrested in Tennessee.”
Fargo police, for their part, told CNN, “We have been unable to determine based on available information if the length of time Ms. Lipps was in jail in Tennessee before being transported to North Dakota was due to serving time for a probation violation or if it was because she fought extradition.”
CNN has reached out to Tennessee authorities for comment.
Lipps’ extradition to North Dakota, she said in her GoFundMe, was terrifying: “It was the first time I had ever been on an airplane,” she wrote. “I was terrified and exhausted and humiliated.”
In Fargo, she was given a lawyer who found bank records showing she had been in Tennessee during the time of the crimes, according to the GoFundMe. Fargo police say on December 12, the State’s Attorney’s Office informed the Fargo detective that the defense had produced “potential exculpatory evidence.”
On December 23, the Fargo detective, the state’s attorney and the judge “mutually agreed to dismiss the charges without prejudice to allow for further investigation,” according to Fargo police. Lipps was released from custody on Christmas Eve.
For Lipps, her months of incarceration were devastating.
“The trauma, loss of liberty, and reputational damage cannot be easily fixed,” Lipps’ lawyers told CNN in an email. Her lawyers said Lipps was unavailable to speak for an interview.
Lipps, a mother of three and grandmother of five, had never been to North Dakota before her extradition, according to CNN affiliate WDAY.
And after her ordeal, she never plans to return to the state: “I’m just glad it’s over,” she told WDAY. “I’ll never go back to North Dakota.”
Her legal team says they’re investigating why Lipps was held in custody for so long when “it appears that exculpatory bank records were readily available.”
“We believe that Angela’s lengthy detention was unnecessary and should have been avoided with a proper investigation by law enforcement,” they said.
Her lawyers are exploring civil rights claims but have yet to file a lawsuit, they said.
Zibolski, Fargo’s police chief, said authorities had identified a “couple of errors” in the investigative process that led to Lipps being identified as a potential suspect in the fraud cases.
At a Tuesday news conference, police said that the Fargo Police Department doesn’t have any AI-powered facial recognition tools of its own, but neighboring West Fargo does – and their system identified Lipps as a “potential suspect” based on the image on a fake ID used in a West Fargo fraud case.

“They forwarded that information to our detectives, who then assumed wrongly that they had also sent in the surveillance photos with that photo ID,” Zibolski said.
The chief said Fargo police will no longer be “sending or utilizing information” from West Fargo’s AI system because “it’s their own system – we don’t know how it’s run or how it’s overseen.”
Instead, Fargo police will work with state and federal authorities, including the North Dakota State and Local Intelligence Center, he said. Additionally, all facial recognition identifications will be submitted to the Investigations Division commander on a monthly basis, “so that we can keep a closer eye on this evolving technology,” he said.
Fargo police also erred, according to Zibolski, by not submitting surveillance photos associated with the fraud cases to the North Dakota State and Local Intelligence Center, which he said is certified and trained in facial recognition. Police “immediately began measures to address that,” and the center has since provided police with other potential suspects based on the surveillance footage, Zibolski said.
He also addressed the months between Lipps’ extradition and her first interviews with Fargo authorities.
“In talking with Cass County and the State’s Attorney’s Office, there’s not an easy mechanism for them to notify us if someone arrested on our felony warrant is taken into custody,” he said. The department is considering improvements, including a daily review of the booking roster.
Asked if the department plans to apologize to Lipps, the chief said, “At this juncture, we still don’t know who’s involved and who’s not involved” in the fraud cases.
“We’re going to have to whittle through all of this kind of vast network of people and who’s involved,” he said.
Zibolski added police are still considering any disciplinary measures for officers involved in the investigation.
“What I can tell you, from what I know right now, is that the persons involved are also very upset by this, because they pride themselves on their thoroughness,” he said. “No one wants to see someone detained, arrested unnecessarily.”
The State’s Attorney’s Office is also “very interested” in attending training about facial recognition with the North Dakota State and Local Intelligence Center, “so that they have a better perspective also on the prosecutorial side,” the chief said.
Fargo police previously told CNN the case is still “open and active” and that “the charges may be refiled if additional investigation supports doing so.”
Lipps’ attorneys said that they appreciated the police department’s efforts toward correcting AI-related issues in the future but criticized what they characterized as a lack of “basic investigative efforts” before issuing Lipps a warrant.
“Officers knew that Angela was a Tennessee resident, and we have seen no investigation by officers to determine whether she traveled to or was in North Dakota at the time of the bank thefts,” they said in a news release after the Tuesday news conference. “Instead, an officer used AI facial recognition as a shortcut for basic investigation, resulting in an innocent woman being detained and transported halfway across the country to answer for charges that she had nothing to do with.”
It’s not the first time that the use of AI in policing has attracted scrutiny.
Last year, armed police handcuffed and searched a Baltimore County high school student after an AI-driven security system flagged the teen’s empty bag of Doritos as a possible firearm.
The incident sparked criticism of the school’s safety protocols and calls for accountability.
Ian Adams, an assistant professor in the department of Criminology and Criminal Justice at the University of South Carolina, told CNN that police are currently rapidly adopting new technologies, including AI – with little evidence for their efficacy.
“We’re doing it so quickly that all agencies really have to rely on is vendor promises,” he said.
He added that most mistakes involving AI in policing involve human error, too.
“The overwhelming amount of the time, it’s not just a technology problem, it’s a technology and people problem,” Adams said. “We get nightmare scenarios when we don’t have people doing what they’re supposed to do, with technology that they’re using inappropriately.”
Because AI tools are so powerful, “it’s very easy to get lulled into a sense of complacency,” he said.
But “your detectives need to be really, really careful to make sure that they’re putting their human eyes on these algorithmic results.”
CNN’s Diego Mendoza contributed to this report.
[1] The reason being that she was found in Tennessee while being sought for a crime in another state, thus allowing them to treat her as an interstate fugitive.
But...
> there is no way to tell if you are using it "correctly".
This simply isn't true, at least in cases like this.
I know common sense isn't really all that common, but why would you give more credence to an untested tool than an untested crack-addled human informant?
The entire point of the informant, or the AI in this instance, is to generate leads. Which subsequently need to be checked.
The problem here is incidental to the tool; it was done by the cops and therefore nobody will be held accountable.
https://www.grandforksherald.com/news/north-dakota/ai-error-...
Some people are just weird
https://www.lawlegalhub.com/how-much-is-a-wrongful-arrest-la...
But this approach negates much of the incentive to pay for questionable results.
That would be the vendors, the system planners, and the institutions that greenlit this. It would also include the larger financial-tech circle that is trying to drive large-scale AI adoption, like Peter Thiel, who sees technology as an "alternative to politics", i.e., a way to circumvent democracy [1].
[1] https://stavroulapabst.substack.com/p/techxgeopolitics-18-te...
Look for something similar to play out elsewhere --- using unreliable tools for decision-making is not a good, responsible business plan. And lawyers are just waiting to press the point.
I’m very opposed to AI in general, but this one is clearly human failure.
The noteworthy AI angle is the undeserved credence police gave to AI information. But that is ultimately their failure; they should be investigating all information they receive.
Absolutely.
The failure starts with tool vendors who market these statistical/probabilistic pattern searchers as "intelligent". The Fargo police failed to fully evaluate these marketing claims before applying them to their work.
So in the same way that the failure rolled down hill, liability needs to roll back up.
At some point, you have to decide if wasting good money on bad intel makes sense.