406 comments

  • mitchbob 3 days ago
  • whack 3 days ago

    > According to the court documents, the Fargo detective working the case then looked at Lipps' social media accounts and Tennessee driver's license photo. In his charging document, the detective wrote that Lipps appeared to be the suspect based on facial features, body type and hairstyle and color.

    > Once they were in hand, Fargo police met with him and Lipps at the Cass County jail on Dec. 19. She had already been in jail for more than five months. It was the first time police interviewed her.

    How is this the fault of AI? It flagged a possible match. A live human detective confirmed it. And the criminal justice system, for reasons that have nothing to do with AI, let this woman sit in jail for 5 months before even interviewing her or doing any due diligence.

    There's a reason why we don't let AI autonomously jail people. Instead of scapegoating an AI bogeyman, maybe we should look instead at the professional human-in-the-loop who shirked all responsibility, and a criminal justice system that thinks it is okay to jail people for 5 months before even starting to assess their guilt.

    • rglover 3 days ago

      > How is this the fault of AI? It flagged a possible match. A live human detective confirmed it.

      Because we're seeing the first instances of what reality looks like with AI in the hands of the average bear. Just like the excuse was "but the computer said it was correct," now we're just shifting to "but the AI said it was correct."

      Don't underestimate how much authority and thinking people will delegate to machines. Not to mention the lengths they'll go to weasel out of taking responsibility for a screw up like this (saw another comment in this thread about the Chief of Police stepping down but it being framed as "retirement").

      • mrweasel 2 days ago

        It's only recently that some have come to terms with the fact that DNA evidence sometimes returns false positives. Society, and law enforcement, assumed that DNA was infallible. No one apparently wondered whether millions of people could really be reduced to a tiny number of genetic markers with no overlap.

        Danish police had to redo 20,000 DNA tests with a larger set of markers being tested, because they had jailed someone based solely on a DNA test and didn't consider that they might have gotten the wrong person, despite the DNA match. It's essentially a human hash collision.
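
        The "human hash collision" framing can be made concrete with the birthday paradox: once millions of profiles are drawn from a fixed space of marker combinations, a chance collision somewhere in the population becomes likely even though any two specific people almost certainly differ. A rough sketch (the profile-space size and population below are illustrative assumptions, not the actual Danish figures):

```python
import math

def collision_probability(population, profile_space):
    """Birthday-paradox estimate: probability that at least two people in
    `population` end up with the same reduced DNA profile, assuming profiles
    are uniformly distributed over `profile_space` possible combinations."""
    # P(no collision) is approximately exp(-n^2 / (2m)) for n people, m profiles
    return 1 - math.exp(-population**2 / (2 * profile_space))

# Illustrative only: a marker set yielding ~10^13 distinct profiles,
# checked across a population of 5 million people
print(round(collision_probability(5_000_000, 10**13), 2))  # 0.71
```

        The trap of database-driven matching is exactly this: a collision between two named individuals is astronomically unlikely, but a collision between some pair in a large database is more likely than not.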

        Identification by AI is going to be the same, except worse, because it's frankly less scientific. Law enforcement, the judicial system and especially the public are simply too uninterested in learning the limitations of these types of systems. Even in the more civilized parts of the world, police would love to just have the computer tell them who to pick up and where.

        • Gibbon1 2 days ago

          There was a man arrested in Santa Clara County because his DNA was tracked to a murder scene by the paramedics who had treated him before they were called to the scene of the murder. He only got away with it because the public defender realized that he was in the hospital's detox unit at the time of the murder.

          • lotsofpulp 2 days ago

            Typically, the “it” in the phrase “got away with it” refers to an action that broke the rules.

            • jackyinger 2 days ago

              “Got off” would be more appropriate

              • Eddy_Viscosity2 a day ago

                "got off" implies he was guilty but got away with it. I'd say "vindicated" or "absolved" fit the bill here.

        • RobotToaster 2 days ago
        • hpdigidrifter 2 days ago

          Closed-source DNA testing software and hardware are a travesty imo

      • crashbunny 2 days ago

        Not the first instance.

        This was 2023: https://www.youtube.com/watch?v=lPUBXN2Fd_E&t=19s

        A dude in the USA was arrested in a casino by police because the casino's facial recognition software said he had been trespassed before. He hadn't. I think there were height and eye colour differences. The police still arrested him and booked him. I think the prosecutors took it to trial.

      • pj_mukh 3 days ago

        I'm sorry, but this is a piss-poor excuse. When I have Claude Code write broken features, I'm 100% responsible.

        Why are cops not treated the same way? OP is right, AI is totally irrelevant in this story.

        If the point is "cops can't be trusted". Why do they have GUNS?! AI is the least of your problems.

        I feel like I'm going crazy with this narrative.

        • jacquesm 3 days ago

          > I feel like I'm going crazy with this narrative.

          We're only getting warmed up. There are programmers on HN that will take the output of their favorite AI, paste it and run it. And we're supposed to be the ones that know better.

          What do you think an ordinary person is going to do in the presence of something they cannot relate to anything else except an oracle, assuming they know the term? You put anything in there and out pops this extremely polished-looking document, something that looks better than whatever you would put together yourself, with a bunch of information on it that contains all kinds of juicy language geared up to make you believe the payload. And it does that in a split second. It's absolutely magical to those in the know, let alone to those who are not.

          They're going to fall for it, without a second thought.

          And they're going to draw consequences from it that you thought could use a little skepticism. Too late now.

        • heavyset_go 3 days ago

          When you foster a culture of impunity and passing the buck, don't be surprised when they pass the buck to the inscrutable black box they bought.

          You might even argue that's the purpose of the inscrutable black box.

        • dml2135 3 days ago

          The “I” in “AI” stands for “intelligence”. Cops are using AI facial recognition because it is being sold to them as being smarter and better than what they are currently capable of. Why are we then surprised that they aren’t second-guessing the technology?

          • wongarsu 2 days ago

            AI facial recognition is smarter than what they are capable of. That's not the issue. It is much faster than a human, and state-of-the-art models make fewer errors than a human (though the types of errors are not the same).

            The issue is that facial recognition is just not very reliable. Not for humans and not for machines. If you look at millions of people, some of them just look incredibly similar. Yet police apparently thought that was all the evidence they would ever need. A case so watertight there's no point in even talking to the suspect.
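
            The numbers behind "some of them just look incredibly similar" are worth spelling out, because this is the base-rate fallacy in action: even a very accurate matcher, searched against millions of faces, is expected to return innocent look-alikes. A back-of-the-envelope sketch (both figures are illustrative assumptions, not specs for any real system):

```python
# Illustrative assumptions, not vendor figures:
database_size = 10_000_000     # faces compared against one probe image
false_positive_rate = 1e-5     # one spurious match per 100,000 comparisons

# Expected number of innocent people flagged by a single search,
# even when the actual perpetrator is not in the database at all:
expected_false_matches = database_size * false_positive_rate
print(expected_false_matches)  # ~100 innocent "matches" per search
```

            On its own, then, a hit from such a search narrows the pool but cannot identify a person.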

            • wartywhoa23 2 days ago

              So the sane solution here is just leaving unreliable stuff to humans and reliable stuff to machines. Especially so when human wellbeing and freedom are at stake.

              To define the line between the two, calculate the percentage of cases when mainstream CPUs return anything but integer 4 after addition of integer 2 and integer 2, and use that as the threshold to define "reliable".

          • re-thc 2 days ago

            > The “I” in “AI” stands for “intelligence”

            By that logic the “I” in Siri is 2x more intelligent.

          • lotsofpulp 3 days ago

            Because they are supposed to possess minimum levels of intelligence found in homo sapiens, which includes not believing anything a salesperson says.

            Also, their whole job is dealing with people who constantly lie to them.

            • pixl97 3 days ago

              There are two things occurring here.

              Police get raises and recognition for closing cases. In general they don't care if you're guilty or not; that's someone else's problem. Same with the detective, same with the DA. The more cases they close, the 'tougher they are on crime'.

              The next thing occurring is

              https://en.wikipedia.org/wiki/Computer_says_no

              • mapt 2 days ago

                If you have a broken system whose injustice is checked only by the limitations of the human elements, and you start replacing those human elements and powerscaling them, you have an unlimited downside.

              • heavyset_go 3 days ago
            • mschild 2 days ago

              Some police departments seem to actively reject candidates who score higher on IQ tests. Not that I think IQ test scores and actual intelligence are closely related, but it clearly shows their intended target candidate group.

              https://abcnews.com/US/court-oks-barring-high-iqs-cops/story...

              • tptacek 2 days ago

                This came up a few weeks ago. I don't think it's true. This lawsuit from 26 years ago is the only example anybody has come up with. Among the problems with this claim:

                * Nobody can find a police department that administers any kind of general cognitive test.

                * There are large states with statewide written police aptitude tests that are imperfect but correlated to general cognitive ability, and maximizing scores on that test is the universal correct strategy.

                * It's a luridly stupid policy and most municipalities aren't luridly stupid.

                I think this happened like, once or twice, in one or two of the 20,000 police departments across the United States, many of which are like one goober and his sidekick (no offense to them; just, you live in gooberville, you're a goober), and now it's an Internet meme that police departments specifically hire for midwittery. Nah.

                • ahazred8ta 2 days ago

                  In different states, police use cognitive aptitude tests such as the Wonderlic -- https://jobdescriptionandresumeexamples.com/10-important-fac... -- https://www.practice4me.com/lst-police-exam/ -- these are not strictly 'IQ' tests, but they're very similar.

                  • tptacek 2 days ago

                    The Wonderlic might as well be an IQ test (I'm using the term "general cognitive test").

                    The LST isn't; it's a domain-specific occupational exam.

                    If you find a place that (1) uses the Wonderlic and (2) has recently (like, not all the way back in 2000) claimed there was a high-end cut-off for applicants, you'll have disproven my claim. I don't think giving general cognitive tests to prospective police officers is common; this is why there are things like the LST, the PELLETB, and the POST.

            • tharkun__ 3 days ago

              You're over-selling the minimum level of intelligence in homo sapiens.

              What you're stating is your wishful thinking. Don't get me wrong. I'd also like what you say to be true. It very much is not. Quite the opposite, which is why salespeople "work".

              The amount of AI bullshit Senior+ level developers just paste to me as truth is astonishing.

        • caconym_ 3 days ago

          As soon as we start to see a pattern of shitty vibe-coded software actually harming people via defects etc. (see: therac-25), I would hope that the conversation is about structural change to mitigate risk in aggregate rather than just punitive consequences for the individual programmers who are "responsible". The latter would be a fantastically stupid response and would do little or nothing to reduce future harm.

          • pj_mukh 3 days ago

            All accountability need not be punitive; we can certainly talk about systemic guardrails. What I can't believe is someone saying the Chief of Police saying "We are not going to talk about that today?" is not the biggest scandal, but the AI is.

            • toraway 3 days ago

                "Among his accomplishments has been establishing the department’s Real Time Crime Center that leverages technology and data to support officers in responding more effectively to incidents," the city's release said. "Zibolski also prioritized officer wellness initiatives to strengthen mental health resources and resilience within the department. He reinstituted the Traffic Safety Team to focus on roadway safety and proactive enforcement, and ... played an active role in statewide discussions on various issues affecting law enforcement."
              
              From the same article... He spearheaded a push to "leverage technology and data to support officers in responding more effectively to incidents", then that same technology mistakenly ruins a woman's life by passing along a hit to an officer who compared it with her FB photos and said "sure, seems right".

              The technology seems highly relevant here. Plus, as we've seen in the software world, when a mandate comes from the top to use the shiny new magic AI tools as much as possible, the officer may have felt pressured to make arrests using the new system they paid a bunch of money for instead of second guessing whatever it spits out.

            • caconym_ 3 days ago

              > someone saying the Chief of Police saying "We are not going to talk about that today?" is not the biggest scandal, but the AI is.

              Who is this "someone"? OP's article and the discussion here are absolutely not neglecting the human factors and general institutional failure that made this possible. But it's also true that without these "AI" tools, it would never have happened.

              • pj_mukh 3 days ago

                Yea, but this feels like when a Waymo ran over a cat and a human driver ran over a toddler, and both got the same level of coverage in the media (actually the cat got more follow-up coverage). And I'm supposed to believe both issues are equally important.

                No. That's gaslighting, and totally misplaced political activation.

                • abustamam 2 days ago

                  What do you propose we do in the latter situation? The news isn't the value of the life that was (presumably) lost. The news is the circumstances that made that loss possible. The human driver was maybe careless, or maybe didn't look. The child safety classes I took emphasized over and over again to look around your car and yard before backing your car out. This is a problem with a known solution that unfortunately still happens despite the best efforts to prevent it.

                  Waymo hitting a cat is obviously less tragic, but if it can hit a cat, what else can it hit? A toddler? A human? The wall of your kitchen? This is a problem that has no known solution; furthermore, it's a problem that the engineers at Waymo don't seem overly keen on solving quickly.

                  • 2 days ago
                    [deleted]
                  • pj_mukh 2 days ago

                    "This is a problem with a known solution that unfortunately still happens despite the best efforts to prevent it."

                    Great, let's just apply that logic to Waymo as well and call it a day (see how silly that sounds?). Waymo has engineers... so does the Department of Transportation.

                    • abustamam 2 days ago

                      I'm really not sure how to respond to this because it seems like you're insinuating that the Dept of Transportation has the same level of control over ALL cars in the country as Waymo has over their cars.

        • malwrar 3 days ago

          You are right, IMO, to question why North Dakota police were able to detain this Tennessean woman in the first place; you'd think something like that should require far more substantial evidence than facial recognition.

          But, then what good is facial recognition for? Would it have been okay for this woman’s life to have been merely invaded because she matched a facial recognition system? Maybe they can just secretly watch you so you’re not consciously aware of being investigated? Should that be our new standard, if a computer thinks you look like a suspect you can be harassed by police in a state you’ve never even been in?

          I just don’t see a legitimate way for AI to empower officers here without risking these new harms. That’s why I lean towards blaming the AI tech, rather than historically intractable problems like the reality of law enforcement.

          • mlinsey 3 days ago

            Having a facial recognition match make you a suspect and cause the police to ask you some questions doesn't seem completely unreasonable to me. Investigations can certainly begin with weak forms of evidence (like an anonymous tip), you just require a higher standard of evidence for a search warrant, surveillance, or an arrest. A facial recognition match shouldn't be probable cause for an arrest warrant, but it still might be a useful starting point for a detective looking for actual evidence.

            • crooked-v 3 days ago

              It is absolutely not reasonable to use low-quality photos to decide someone halfway across the country with no history of even leaving their local area is 'a suspect'.

              • kitd 2 days ago

                You wouldn't know they had no history of leaving their local area unless you interviewed them.

                • Y-bar 2 days ago

                  Why does not the investigator have to supply some sort of evidence that she has a history of leaving their local area rather than putting the onus on the accused? This line of argument is halfway to "guilty until proven otherwise".

                  • mlinsey 2 days ago

                    You and the GP that replied to me are way overstating what it means to be a "suspect". It just means the police are investigating you and consider it a possibility that you've committed the crime. On its own, it is not a sufficient status to search your home, subpoena your ISP, or arrest you - all of those things require a much higher burden of evidence, and often a third party's (a judge's) approval. People routinely become "suspects" on much flimsier evidence than an unreliable software match - if I call in an anonymous tip that I saw you acting suspicious near the crime scene, you will probably become a suspect.

                    If you'd like, you can replace the term "suspect" in my post with "person of interest", which colloquially implies a lot less suspicion but isn't practically any different in terms of how the police interacts with you.

        • Brian_K_White 2 days ago

          I feel like I'm going crazy that anyone tries to suggest the AI and the producers and promulgators and apologists of AI played no part and bear none of the responsibility in this narrative.

          • JuniperMesos 2 days ago

            Because the responsibility lies on the part of the criminal justice system who used the flimsy AI facial recognition evidence to arrest and hold her for months. If AI didn't exist, and this same incident happened because a human looked at a photograph of the woman and said "I think this might be the same person who committed the crime in the video", it would be insane to blame the people who invented photographs or video recording for her arrest.

            • Anamon 2 days ago

              The problem is in how these tools are sold to them. Not everybody can be an expert in every topic. Like in every other application area, these AI systems are promoted as being able to do about a thousand times more, and a million times more reliably, than they actually can. Of course the departments can be expected to do some due diligence and instruct their officers, but the lies by AI system suppliers are where a large part of the blame belongs. Manufacturers of cameras or CCTV systems never told the police department that the system would do their job for them.

        • jfengel 3 days ago

          You are exactly correct. Cops cannot be trusted. We spent a lot of time pointing that out in 2020. AI is the least of our problems with policing.

          Unfortunately, a lot of people are certain it won't happen to them, and it has been practically impossible to establish any kind of accountability. It has only gotten worse since 2020.

          • akimbostrawman 2 days ago

            Are we just gonna pretend the wide rollout of bodycams hasn't shown that in the overwhelming majority of cases the cops weren't in the wrong, to the point that the same people who demanded them now want them gone?

            • FireBeyond 2 days ago

              Citation needed. Who are these people who wanted improved police oversight who are supposedly now fighting for the removal of bodycams?

        • smcl 3 days ago

          You’re on the right track here but I don’t think it should be hand-waved away as “the least of your problems” - it’s yet another weapon that police in the USA can use against the population with impunity. They’re going to have to reckon with all of this in the coming years - cops having guns and armored cars, “qualified immunity”, the “stop resisting” workaround for brutality and now this AI

        • rapnie 2 days ago

          The AI is the authority having so much knowledge, that we hear a reassuring "Please continue" [0].

          https://en.wikipedia.org/wiki/Milgram_experiment

        • stego-tech 3 days ago

          You’re going crazy because up until this exact moment you’ve never had to confront the reality that these tools, placed into the hands of the common man, are viewed as authoritative and lack any accountability or consequence for misuse.

          For anyone who has been victimized by law enforcement or governments before, we’ve been warning about this shit for decades. About the lack of consequence for police brutality. The lack of consequence for LPR abuse. The lack of consequence for facial recognition failures and AI mismatches.

          You need to understand that by using these systems correctly and holding yourself accountable, you are in the minority. Most people do not think that critically, and are all too happy to finger the computer when things go badly.

          And until you accept that, and work to actually hold folks accountable instead of deflecting blame away from the tool, then this won’t actually change.

          • Nimitz14 2 days ago

            Your answer presumes we cannot hold people accountable. I think that is incorrect.

            • throwaway17_17 2 days ago

              Do you mean hypothetically could society hold law enforcement personnel accountable for mistakes, bad judgement, flagrant criminal conduct, horrendous abuse of any and everyone? Certainly, a large scale and comprehensive restructuring of America’s law enforcement and prosecutorial system is legally possible.

              However, I hold to the opinion that if you are discussing actual reality, based on decades (if not the entire period since the Civil War, with near certainty) of historical examples and the current "majority" position of the US electorate: there is a nearly unqualified NO. We cannot, or will not, hold law enforcement accountable for even intentional, planned, and malicious conduct in a vast majority of cases. There is practically no accountability at all, and that's just for thoroughly proven intentional conduct. Bad judgement, alleged mistakes, etc. are even less likely to result in any action.

              The reality of the legislation and precedent ensure it. It’s not a bug, it’s a feature.

        • antod 3 days ago

          But it's not totally irrelevant in this story.

          Cops are already susceptible to confirmation bias, and for "efficiencies" they are delegating part of their job to apparently magical tools that will only increase their confirmation bias. And because it is for efficiency you can bet they won't be given extra time to validate the results.

          What or who is at fault isn't either/or, it's a bunch of compounding factors.

        • pear01 3 days ago

          It's called qualified immunity. Many support its repeal. I hope you join them, and convey the same to your local representatives and candidates. Until it is reformed few if any officers or administrators of criminal justice in the United States will ever feel any type of accountability.

          Short of video evidence of blatant gun to the back of the head style homicide qualified immunity means most law enforcement officials are never held accountable for their miscarriages of justice. Criminal charges against officers are exceedingly rare. She should be able to sue this detective directly. Of course she can sue the government too, and should. But without any personal consequences for the people carrying out these acts, taxpayers will continue to bail out these practices without ever noticing. Your own government should not be a shield for a police officer who has violated you or your neighbors.

          • kelnos 3 days ago

            > Many support its repeal.

            There's nothing to repeal. Qualified immunity is a doctrine that the judicial branch made up out of thin air, with no legislative backing.

            But agreed, we need legislatures to write laws that expressly hold police accountable, and declare that they are not shielded from liability when things go wrong due to their own failures and negligence.

            • throwaway17_17 2 days ago

              Not that it changes your point, but, um actually:

              While the origins of qualified immunity are judicial, some states loved the idea so much they went and made it statutory too. Louisiana's 2024 bill explicitly removes negligence as an exception (which is a valid method of circumventing qualified immunity based on jurisprudence at the federal and most state levels). Louisiana requires intentional violations or criminal actions to even be able to bring a claim.

          • jfengel 3 days ago

            > Short of video evidence of blatant gun to the back of the head style homicide qualified immunity means most law enforcement officials are never held accountable for their miscarriages of justice.

            And frequently not even then.

        • ux266478 2 days ago

          You can hold someone responsible only after they've actually fucked up. And with the way things move in the criminal justice system, that can take months to discover. Holding them responsible doesn't really fix anything, it's purely reactive.

        • lenkite 2 days ago

          Dude, not sure which team you're working on, but across many, many domains - corporate, business and political - people are already delegating full decision making and responsibility to AI. Unless national governments and standards institutions create and enforce ironclad AI governance laws, situations resembling what this poor granny went through are going to occur again and again.

          • rectang 2 days ago

            There’s money to be made selling AI plausible deniability machines that allow end users to enact unethical policies while evading accountability, but only if all moral responsibility ostensibly falls on the end user and none on the dealer.

        • wat10000 3 days ago

          When are cops ever treated the same way as the rest of us?

          • deepsun 3 days ago

            Well, in most cases I would prefer a cop's word to outweigh the word of an average Joe.

            • bloomingeek 2 days ago

              You should tell that to Angela Lipps; I'm sure she told every cop she came in contact with that she had never been to Fargo. Cops have a responsibility to do their job, and part of that job is listening and relying on proof. ALL those cops were either too lazy or too afraid of their superiors. This is unacceptable given the amount of power and information they have access to. We should either de-fund the police system or reform the hell out of it. BTW, where was her state representative during this fiasco?!?

            • throwaway17_17 2 days ago

              The belief by a juror that law enforcement personnel, especially phrased as a belief that applies to law enforcement personnel as a generic group, is a well-established basis for a challenge for cause leading to exclusion of that person from being a juror. The US jury system is built explicitly on excluding these types of belief in juries in order to ensure fairness, impartiality, and individual and case/witness specificity of "triers-of-fact".

              I could understand someone who disagrees with it, but your position would be antithetical to current and historical thought on what defines a fair jury.

              • PopAlongKid 2 days ago

                >The belief by a juror that law enforcement personnel,

                >excluding these types of belief

                You have not stated what belief you are talking about.

                • throwaway17_17 a day ago

                  True, apparently I got backspace-happy when I posted the reply on mobile. I was talking about the belief by a prospective juror that law enforcement personnel are more credible or trustworthy than others due to their status as law enforcement personnel.

                  • spolitry a day ago

                    This is not quite true. The rules of evidence treat law enforcement (official) testimony as more credible than civilian testimony. Officials have a wide exemption from hearsay objections, if the official was working at an official task at the time.

            • amanaplanacanal 2 days ago

              Do you think police are inherently more honest than everybody else? Why would you think that?

            • wat10000 3 days ago

              Why should having that particular job give you that privilege? All should be equal before the law.

        • pessimizer 21 hours ago

          > Why are cops not treated the same way? OP is right, AI is totally irrelevant in this story.

          It's absolutely absurd. The argument that AI is the problem is literally the people arguing against AI shedding responsibility onto the machines. The people arguing that AI is the problem are essentially (philosophically) the same people who will say it was the AI's fault.

          The thing it most reminds me of is people trying to stop the deaths and injuries that come as a result of "swatting" by being really angry at people who "swat" and proposing the harshest punishment for it that they can come up with (or that outdoes anyone else's in the thread).

          The problem with swatting is that police were showing up to the houses of harmless people based on anonymous phone tips and murdering them. You guarantee swatting will work indefinitely when you indemnify the cops.

          You don't need AI for injustice in the US justice system. There is literally no part of the US justice system that makes sense at all, and even in the best case scenario when the guilty are caught, tried, and punished, it is tremendously wasteful, cruel, and ass-backwards. Juries are basically the AI of the US justice system, allowing the prosecutorial and enforcement apparatus to be infinitely cruel, illogical, self-serving and incompetent. 12xFull AGI. AI couldn't do any worse.

          > I feel like I'm going crazy with this narrative.

          You're not alone.

        • kelnos 3 days ago

          I mean, this is the USA we're talking about. Cops are given huge authority over everyone else, with poor accountability. AI just lets them pretend to be even less accountable. And by "pretend" I of course mean "get away with it".

        • materialpoint 2 days ago ago

          See, AI was used to accelerate arrest and jailing, but not to follow through. It was not used to ensure her well-being. Clearly this demonstrates that AI contributes to treating humans inhumanely, and demonstrably AI is not used to improve anyone's quality of life. Stop making excuses for "AI not at fault here".

      • andrepd 3 days ago ago

        It's not even just incompetence, but malice. "AI says so" is going to be the perfect catch-all excuse for literally everything anyone might want to do that they shouldn't. You know how techbros love to excuse every horrifying outcome of their torment nexi with "don't blame me, the algorithm did it"? It's going to be like that, but now everyone can do it.

        • dqv 2 days ago ago

          It's also why people start parroting the phrase "the purpose of a system is what it does". Look at where we are right now: a precipice before this becomes widely used in all forms of policing. We still have a chance to police the police's use of the AI.

          The purpose of using AI to identify suspects in criminal cases is to ease the burden of manual searching for a suspect (or insert whatever the purpose of statement you want). Ok, but we're getting false positives that are damaging people's lives already in the early stages. And I don't want to hear "trust me bro, it will get more accurate" as an excuse to not regulate it.

          At a minimum, we should enshrine the right to appeal AI and have limits on how it can be used for probable cause.

          This isn't even the only recent case of this happening. There was another case of mistaken identity due to AI. [0] Sure 4 hours isn't the same as 5 months, but still this guy wanted to show multiple forms of ID to prove who he was! The bodycam footage was posted a few months back but never got traction here.

          Like if the police officer can't read numbers, they can't do breathalyzer tests on people. If the AI can't be used responsibly, then it can't be used at all.

          [0]: https://www.youtube.com/watch?v=lPUBXN2Fd_E

      • mapt 2 days ago ago

        If you're skeptical, watch this - https://www.youtube.com/watch?v=lPUBXN2Fd_E

      • foxglacier 3 days ago ago

        So what? There were false arrests and convictions made by misuse of line-ups, DNA, eye-witnesses, photos, bloodstains, fingerprints, etc. since forever. You must also blame all those other technologies, so what do you think the police should use to find suspects? In your view, the more help police have, the worse a job they'll do. Is that actually the trend?

        • catlikesshrimp 3 days ago ago

          With all the other kinds of proof you mentioned, there was always a human putting their signature on it.

          Now that they can blame "AI" no specific officer(s) will take the blame, ever. If no one is responsible there will be many more false positives.

          And false positives destroy lives.

          • dragonwriter 2 days ago ago

            > With all other proof you mentioned, there was always a human putting his signature.

            There was a human doing that in this case; AI doesn’t initiate charges. “In his charging document, the detective wrote that Lipps appeared to be the suspect based on facial features, body type and hairstyle and color.”

            • JuniperMesos 2 days ago ago

              The article could have - but didn't - mention this specific Fargo police detective's name.

              • dragonwriter 2 days ago ago

                The article not mentioning the name does not change that the detective did sign the police charging documents.

                (Nor does the omission in the article of other names and procedural details change the fact that for there to be actual criminal charges, an arrest warrant, extradition, and incarceration, a number of other people had to sign their names to official acts, including, among others, at least one public prosecutor, and more than one judge.)

        • worik 3 days ago ago

          So what???

          This woman lost most of her material possessions, was terrorised by "goons"... The police do this stuff regularly, as black people, immigrants, "white trash" etcetera know well. Another opportunity, presented BY AI models for more routine police oppression

          As the wise singer said: "Fuck the police!"

          • foxglacier 3 days ago ago

            Exactly, it's the police's fault, as well as the wider system they operate in that enables that kind of abuse, and they do it anyway even without AI.

            • dragonwriter 2 days ago ago

              AI is, in this case, a tool enabling it: trawling large databases with AI turns up people who merely resemble a suspect to a degree that, until fairly recently, would reasonably have constituted probable cause, because traditional police work relied on proximity and connections to the crime. The understanding of probable cause, and what is necessary to establish it, needs to adapt to investigative processes that use large databases unconnected with the events and locality of the crime.

            • scarecrowbob 2 days ago ago

              You're right that they often do a lot of harm.

              The point that you're missing is that, in a system where such abuses are possible, many of us really don't want one more tool in their box for them to fuck us with.

              Like, they already prove themselves incompetent; giving them the power to track anyone in the US via a distributed ALPR system just makes them more dangerous. Giving them all these "AI" based tools does the same.

    • caconym_ 3 days ago ago

      This particular "AI bogeyman" isn't just AI; it's cops with AI and in particular cops with facial recognition tools, dragnet LPR surveillance tools, and all this other new technology that essentially picks somebody's name out of a hat to have their life temporarily (or [semi-]permanently) ruined by shithead cops who won't ever face any real accountability.

      This keeps happening, and the reason it keeps happening is that shithead cops have these tools and are using them. Until we can find a reliable way to prevent this from happening, which may or may not be possible, cops who may or may not be shitheads should not have access to these tools.

      • cortesoft 2 days ago ago

        Yes! This is about why mass surveillance and dragnets and the like are horrible. These all suffer from people not being able to understand the base rate fallacy (https://en.wikipedia.org/wiki/Base_rate_fallacy)

        Even if AI facial recognition gets really really good, and is 99.999% accurate, if you use it in this way you are going to arrest more innocent people than guilty people.

        If you find a suspect, who has a lot of evidence pointing to them being the criminal and you run a test that is 99.999% accurate and it tells you they are guilty, they are probably guilty.

        But if you take that same test and run it against the entire population of the country, it is going to find about 3,500 people who match with "99.999% certainty". That gives any one match only about a 0.03% chance of being the guilty person.

        People don't think like this, though, so they think the person must be guilty.
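        The arithmetic above can be sketched as a toy Bayes calculation (assuming a single guilty person and a dragnet over a US-scale population; the numbers are illustrative, not any real system's accuracy):

        ```python
        # Base rate fallacy, sketched: one guilty person, a US-scale population,
        # and a matcher that is "99.999% accurate" (illustrative numbers only).
        population = 350_000_000
        false_positive_rate = 1 - 0.99999   # 0.001% of innocent people still match

        false_matches = population * false_positive_rate  # expected innocent hits
        true_matches = 1                                   # the one actual perpetrator

        p_guilty_given_match = true_matches / (true_matches + false_matches)
        print(f"expected false matches: {false_matches:.0f}")      # ~3500
        print(f"P(guilty | match): {p_guilty_given_match:.4%}")    # ~0.03%
        ```

        Run against a pre-identified suspect (a prior of one person rather than 350 million) the same 99.999% figure is genuinely damning; run as a dragnet, the overwhelming majority of hits are innocent people.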

      • hinkley 3 days ago ago

        It’s also cops Making the Numbers Go Up by marking down a case file as having progressed because someone is in custody. Which isn’t about justice.

        • mothballed 3 days ago ago

          They don't seem to give a single iota of a fuck about that when a private regular person has their money stolen or their car totaled by hit and run driver. Finding some innocent person to arrest would indicate they are at least pretending to give a fuck, yet they seem to only be bothered to even keep up appearances when it is the bank being robbed.

          • chocoboaus3 2 days ago ago

            mate Capitalism 101

            • bloomingeek 2 days ago ago

              Sorry, I disagree. This is an example of the corruption inside the American legal system. The cops are at the level of us regulars, and their judgement and actions seem to have no supervision or accountability.

      • tgv 2 days ago ago

        > cops [...] should not have access to these tools

        But what else can (identification via) face recognition be (safely) used for? Absolutely nothing. It's tech that's just made for surveillance.

      • guelo 3 days ago ago

        It's not just the shithead cops, it's the voters. All the "Blue Lives Matter", "thin blue line", "back the blue" propaganda works towards giving police infinite powers with zero accountability. This is what voters want and they've said so loudly over and over again.

      • throwaway314155 3 days ago ago

        There’s nothing wrong with your comment per se, but it’s almost as if you didn’t even read the comment you’re responding to.

        • caconym_ 3 days ago ago

          Let me help you out with this comprehension issue. The point of my comment is that I disagree with the apparent premise of the comment I replied to, which is that "AI" is some generic investigative tool that we can neatly snip out of the picture to blame this incident on human factors at the individual level ("the professional human-in-the-loop who shirked all responsibility"). Said comment also implies that people are fixating on the AI aspect of this issue while ignoring the human factors, which IMO is a strawman. To me, the existence of AI in its current incarnations and the ways in which law enforcement will inevitably abuse it are, together, inseparably, the problem. AI (in the most general sense) opens up entire new dimensions for potential abuse.

          As a concrete example:

          > And the criminal justice system, for reasons that have nothing to do with AI, let this woman sit in jail for 5 months before doing even interviewing her or doing any due diligence.

          Let me state what should be obvious: without AI (as in, the facial recognition systems involved in this case), this woman would not have sat in jail for 5 months, or indeed for any length of time at all. So saying that it has "nothing to do with AI" is totally ridiculous.

          • fc417fc802 2 days ago ago

            > Let me state what should be obvious: without AI (as in, the facial recognition systems involved in this case), this woman would not have sat in jail for 5 months, or indeed for any length of time at all.

            How do you arrive at that conclusion? Because it happened, and it wasn't an AI overseeing (the lack of) due process. The police identifying suspects is part of their job. So are arrest warrants and all the rest of it. I honestly don't see what AI had to do with anything here. All I see is a gaping systemic issue that could have happened regardless of AI if the wrong person got the wrong idea or had a personal vendetta.

            Suppose ICE busts down someone's door, drags them off, holds them in an internment camp for months, and then finally goes "oh, oops, guess you were a citizen all along sorry about that" and releases them. We don't blame the source of their faulty hit list. We blame the systemic practices and legal apparatus that permitted it all to happen in the first place.

            You might as well blame the SUV manufacturer because without vehicles the police wouldn't have been able to drive over to make the arrest, right?

            • caconym_ 2 days ago ago

              > How do you arrive at that conclusion?

              Because it's beyond obvious? How would this woman have ended up in jail if she hadn't been misidentified by the facial recognition software in use by the Fargo police? She lives 3 states over; would be a hell of a coincidence if some other avenue of investigation led them to her.

              > I honestly don't see what AI had to do with anything here.

              You honestly don't see what facial recognition software had to do with a woman being misidentified by facial recognition software?

              > Suppose ICE busts down someone's door, drags them off, holds them in an internment camp for months, and then finally goes "oh, oops, guess you were a citizen all along sorry about that" and releases them. We don't blame the source of their faulty hit list.

              I actually am completely willing to blame any entity that supplies ICE with the names of people it can reasonably assume will be targeted for "enforcement action" due to said entity representing said names as being legitimate targets for said enforcement action, without taking reasonable care to ensure said representation is correct in each and every case.

              What you don't seem to understand is that these abuses of law enforcement authority are predicated on at least an appearance of legitimacy, which can be provided by (e.g.) an app with (presumably) a very official looking logo that agents can point at somebody to get a 'CITIZEN' or 'NOT CITIZEN' classification. It is upon this kind of basis that they perform illegal arrests. All parties—the app vendor and ICE, as well as the people who are meant to be overseeing ICE and providing accountability—are complicit enablers in these crimes. To absolve the vendors who provide the software knowing full well what it will be used for, what its limitations are, and how unlikely it is that ICE personnel will understand those limitations and work around them to keep everything legal, is totally absurd.

              • fc417fc802 2 days ago ago

                It isn't obvious, no. If I drop a hammer on my foot and break my toe I can't then blame the hardware store or the manufacturer. If the store didn't carry hammers I wouldn't have been able to purchase it, I think to myself. Then I couldn't possibly have dropped it on my foot, thus my toe wouldn't be broken right now. It's a specious line of reasoning.

                It doesn't matter in the slightest by what means she was selected to "win" this particular lottery. The tool rolling the dice isn't to blame. Tools (and people!) will occasionally return spurious results. Any system needs to be set up to deal with that.

                So no, I honestly don't see what facial recognition software has to do with gross negligence and process failure on the part of multiple government agencies.

                > without taking reasonable care to ensure said representation is correct in each and every case.

                Only if that was part of the contract. Was the product delivered according to specification or not?

                What if ICE used FOSS tools to put together the list themselves? Are the tools still to blame? That would obviously be absurd.

                The only way the provider (never the tool) could be at fault would be something such as willful negligence or knowingly and intentionally attempting to manipulate the user's actions to some end.

                What you don't seem to understand is that human negligence can't be foisted off on tools. Of course an abuser will try to play his actions off as legitimate. That isn't the fault of the tool, it's the fault of the abuser. It isn't up to an app to determine the legitimacy of LEO agent actions. Neither is it the responsibility of an arbitrary, fungible government contractor to oversee ICE.

                I think you're confusing the morality of participating in a broader ecosystem with moral culpability for the process failure associated with a specific event. You can advance a reasonable argument that AI companies that choose to do business with ICE are making an at least moderately immoral decision. However that doesn't place them at fault for the specific process failures of any particular event that happens.

                • caconym_ 2 days ago ago

                  If you don't agree that facial recognition software is involved in a case of a woman being misidentified by facial recognition software then there is no point in me spending any more time/effort in conversation with you. Goodbye.

                  • fc417fc802 2 days ago ago

                    You seem to be intentionally ignoring the point I made. I never disputed that facial recognition software was used (ie involved).

                    The facial recognition tool didn't arrest her. It holds no authority, has no will of its own, and does not possess a corporeal form with which to enact change in the world. The only parties that could possibly be at fault here are various government agents who clearly acted with negligence, failing to uphold their duty to the law and the people.

                    If you're unable to rebut my point then perhaps you should consider that you might be in the wrong? If you're unwilling to entertain such a possibility then I wonder why you're posting here to begin with. What is your goal?

                    • caconym_ 2 days ago ago

                      > I never disputed that facial recognition software was used

                      You, yesterday:

                      > I honestly don't see what AI had to do with anything here.

                      ???

                      > You seem to be intentionally ignoring the point I made.

                      I completely understand your point. You are saying that if a mentally ill high schooler manages to acquire a gun and kills 20 people at their school, we should a) punish the shooter, and b) understand the gun as a neutral object that simply popped into existence and was misused, rather than a machine whose design purpose is to kill humans, and whose manufacturer(s) (and other organizations who profit from the easy availability of guns) are actively engaged in a broad effort to preserve the status quo which allowed a mentally ill high schooler to acquire a gun and massacre 20 of their classmates/teachers.

                      I think it's a terrible opinion, and I vehemently disagree with it. But if you are willing to engage in the sort of rhetorical contortions highlighted at the top of this comment, there is no point in expressing my disagreements to you, because you will evidently say literally anything in response. I may as well have a debate about toilet tank design with `cat /dev/urandom`.

                      > If you're unable to rebut my point then perhaps you should consider that you might be in the wrong?

                      Try looking in the mirror, buddy. Sheesh.

          • throwaway314155 3 days ago ago

            Like I said, there wasn’t anything wrong with your comment. It just didn’t seem to directly address the parent comment. This does, thanks.

        • Retric 3 days ago ago

          Seems like a direct response to me.

          >> How is this the fault of AI?

          > This particular "AI bogeyman" isn't just AI; it's cops with AI

          You can’t separate the thing from how it will be used. It’s like arguing that cars on their own aren’t particularly dangerous, but the point of buying a car is to use it thus risking the general public.

          • fc417fc802 2 days ago ago

            But you can in fact argue exactly that. If (arbitrary example) pedestrians are being killed due to poor road engineering practices it isn't reasonable to point at cars and say "see those are the root problem" when in fact it's due to a willful lack of sidewalks or marked crossings or whatever. Being adjacent to something bad doesn't equate to being the root cause.

            • Retric 2 days ago ago

              History shows the timeline of dependence here. Before the introduction of cars, “poor road engineering practices” wouldn’t result in those deaths. So clearly it’s cars that are necessitating sidewalks, etc.

              Same deal here, if something “becomes a problem” because of the introduction of AI, it’s AI that is the root cause of the resulting issues. Many people are tempted to argue that flawed humans can’t implement the perfect system that is Anarchy, Communism, Recycling programs, or whatever, but treating systems as needing to operate on the real world is productive where complaining about humans isn’t.

              • tatersolid 20 hours ago ago

                > Before the introduction of cars, “poor road engineering practices” wouldn’t result in those deaths.

                Death by adverse horse encounter was very common before the 1920s. Not sure how many of those deaths can be blamed on poor quality road engineering. But putting a bunch of humans, carts, and excitable half-ton animals in the same crowded streets seems like poor engineering practice.

              • fc417fc802 2 days ago ago

                Well, I thought it was obvious that I was referring to roads constructed relatively recently. If cars necessitate sidewalks and the city chooses to cut costs by not putting those in, that isn't the fault of automobile designers or manufacturers or dealers or private owners or whoever.

                To your example, technology changes and that necessitates infrastructure changing. That doesn't mean that fault for mishaps in the meantime can be attributed to the new technology. A user operating the new technology in an obviously unsafe manner is solely at fault for his own negligence.

                • Retric 2 days ago ago

                  The safest street designs still result in automobile fatalities. You can at best mitigate the issue with better street designs but not address the underlying issue.

                  Failing to acknowledge cars as the root cause may be comforting, but it blinds you to viable solutions.

                  Indoor shopping malls for example solve many of the issues with cars by forcing people to move around on foot in a little island surrounded by a sea of very low density parking. They aren’t perfect solutions, but they still saved a lot of lives and time.

                  Saying people are misusing a new technology is just another way of saying that technology is flawed. This doesn’t mean you can’t utilize it, but pretending flaws don’t exist has no value.

    • antonymoose 3 days ago ago

      Reminds me of a case that just popped up in my neck of the woods.

      Man gets pulled over on an expired plate. They search based on this fact, find a pill bottle (for Irritable Bowel Syndrome) and magically find he’s trafficking cocaine and fentanyl.

      Months later a lab test exonerates the poor guy.

      https://www.wyff4.com/article/deputies-falsely-identify-ibs-...

      • bloomingeek 2 days ago ago

        I've always maintained one of the worst things that can happen to you is sitting in court before a jury of your peers, because most can't comprehend the meaning of the law outside of their feelings. NOW the worst thing is having yourself in the hands of cops who just don't give a damn or became a cop for the use of power.

      • JuniperMesos 2 days ago ago

        This one seems pretty reasonable - according to the article, the cops pulled him over for swerving lanes (driving unsafely on public roads is a reasonable thing to want to police), and then discovered that he was driving on a suspended license, which he admitted to (it's reasonable to have a system for suspending peoples' drivers licenses that is enforced by the police). The police find the pill bottle and don't believe him when he tells them it's a legitimate drug, then "conduct[..] multiple field drug tests, which produced a positive result for fentanyl. Getchius was taken into custody and transported to the Greenwood County Detention Center. Shortly after, another drug test was completed and returned positive results for cocaine."

        So it wasn't just the pill bottle, it was multiple other drug tests. I think you could make a reasonable argument that drug use shouldn't constitute a crime in and of itself - although it probably should if you're driving a car, for legitimate traffic safety reasons, I don't find DUI laws objectionable. Or you could make an argument that the criminal justice system shouldn't interfere with peoples' decision to use and sell drugs. I'm sympathetic to this myself, but I think especially in the case of opioids like fentanyl, the situation where government paternalism makes it illegal to sell opioids probably discourages enough destructive use of these drugs by unwise or already-addicted people that it's still net-positive in terms of human welfare. I suspect a society where it was simply legal to use and sell opioids would have a lot more human suffering in it than our own (possibly because in the absence of laws banning open opioid dealing, people who are close to severe opioid addicts might simply commit vigilante murders of suspected opioid dealers, and be left unconvicted by sympathetic juries). And once you hold the position that it's legitimate for the government to legally restrict the sale and use of these drugs, then you necessarily have to have something like police and something like a criminal justice system that investigates whether a person might be actually using and selling opioids and then lying about it.

        The fact that the guy was in fact once addicted to some drug and "was working at rehab and addiction centers in Florida at the time of his arrest." is additional evidence that he might have returned to drug use, and there's no way to make cops who investigate opioid-related crimes not think this.

        • mwigdahl 2 days ago ago

          If a field drug test can confuse an irritable bowel syndrome drug for fentanyl or cocaine, it is not reliable enough to be used for law enforcement purposes. The same applies to facial recognition tech. We need real information on the false positive vs false negative rates for tech that purports to establish identity or criminality.

          • 1123581321 21 hours ago ago

            The test didn't confuse the drugs. He tested positive for fentanyl and cocaine. They accused him of trafficking drugs in addition to that because of the IBS pill bottle.

            It's an unfortunate story because it sounds like he was having relapse trouble, and the cops were predisposed to do the worst to him that they could (mis)justify, when he needed to cool off and then get back to the professionals helping him with recovery.

    • obviouslynotme 3 days ago ago

      It's not. This is just an acceleration in the unraveling of society facilitated by AI. As someone whose childhood included so many "robots will kill humans" books and movies, I am flabbergasted that the AI apocalypse will be dumb humans overtrusting faulty AI in important matters until everything falls apart.

      Most humans cannot distinguish AI from actual intelligence. When you combine that with bureaucrats' innate tendency to say, "Computer said so," you end up with bizarre situations like this. If a person had made this facial match, another human would have relentlessly jeered him. Since a computer running AI did it, no one even cared to think about it.

      Computers are wildly dangerous, not because of anything innate but because of how humans act around them.

      • throwaway173738 3 days ago ago

        > It's not. This is just an acceleration in the unraveling of society facilitated by AI. As someone whose childhood included so many "robots will kill humans" books and movies, I am flabbergasted that the AI apocalypse will be dumb humans overtrusting faulty AI in important matters until everything falls apart.

        This is literally the plot of most of those books and the way they differ is in how everything falls apart. In some of them the AI supplants us entirely and kills us all. In others it gets taught to kill us all. In others it gets really good at giving us what we ask for until everything falls apart. But it’s taken as a given that unless we change something innate in our culture AI will be our downfall.

      • fc417fc802 2 days ago ago

        > If a person had made this facial match, another human would have relentlessly jeered him.

        The glaringly obvious problem here is that our justice system should not be constructed in such a way so as to be reliant on someone's coworker shaming him. That is not a sensible check against a systemic failure. We're supposed to have due process. If someone skips or otherwise subverts due process the justifications don't matter. The root issue is that due process was skipped. Why was that even possible to begin with?

    • themafia 3 days ago ago

      > How is this the fault of AI?

      It could be the fault of the company that's selling this service. They often make wildly inaccurate claims about the utility and accuracy of their systems. [0]

      > There's a reason why we don't let AI autonomously jail people.

      Yes we do. [1]

      > and a criminal justice system that thinks it is okay to jail people for 5 months before even starting to assess their guilt.

      Her guilt was assessed. That's why she had no bail. It assessed it incorrectly, but the error is more complicated than your reaction implies.

      [0]: https://thisisreno.com/2026/03/lawsuit-reno-police-ai-polici...

      [1]: https://projects.tampabay.com/projects/2020/investigations/p...

      • throwaway17_17 2 days ago ago

        To clarify one point, her not having bail is a function of the way interstate ‘fugitive’ warrants are designed. The court in Tennessee had no ability to set bail; until she entered the physical custody of North Dakota, bail could not be set.

        Also, her guilt was not assessed in any common meaning of the term. The requirement for holding a person in custody, with or without bail, is probable cause. The only thing assessed was whether law enforcement presented a statement to a judge that could be believed in the light most favorable to the prosecution.

    • danso 3 days ago ago

      > How is this the fault of AI?

      AI is being used by bureaucrats and enforcers to justify lazy, harmful conclusions. You don't live in the real world if you think "just punish the bureaucrats, don't make it about AI" is going to remotely rectify this toxic feedback loop and ecosystem.

      • fc417fc802 2 days ago ago

        No, we definitely should punish bureaucrats and enforcers who act negligently. If someone in a position of authority flagrantly fails to do his job and it directly harms someone he should be held accountable. That would provide a strong incentive for future actors to take their responsibilities seriously.

        If an engineer signs off on an obviously faulty building plan and people die as a result we hold him accountable. This is no different.

    • rolandog 2 days ago ago

      It is the fault of the coders, the salespeople who over-promised the capabilities of the system, the lawmakers who have not regulated or demanded a minimum percentage of accuracy from those products, the AI company's onboarding trainers, the cops that were trained to use the software, the jailers, and maybe other related positions that should've taken a better interest in making a better system, not a more cruel one.

    • stego-tech 3 days ago ago

      It's the fault of the tool because our society treats the tools' judgements as superior to humans' and trusts them completely as a means of deflecting accountability - something any and every minority group has been warning about for fucking decades.

      The reason everyone rushes to defend the tool's use is because holding humans accountable would mean throwing these tools out entirely in most cases, due to internal human biases and a decline in basic critical and cognitive thinking skills. The marketing has been the same since the 80s: the tool is superior (until it isn't), the tool shall be trusted completely (until it fails), the tool cannot make mistakes (until it does).

      If folks actually listened to the victims of this shit, companies like Flock and Palantir would be gutted and their founders barred from any sort of office of responsibility, at minimum. The fact so many deflect blame from the tool like the marketing manual demands shows they don't actually give a shit about the humans wrapped up in the harms, or the misuse and misappropriation of these tools by persons wholly unaccountable under the law, but only about defending a shiny thing they personally like.

      • pixl97 3 days ago ago

        >rushes to defend the tool's use is because holding humans accountable would mean throwing these tools out entirely in most cases, due to internal human biases and a decline in basic critical and cognitive thinking skill

The magical past where people had critical thinking skills never existed. We put a lot of trust in tools because people are un-fucking-reliable. Hence why in most cases actual physical evidence does a far better job than witness testimony.

        This said, people are lazy. It is one of our greatest and worst traits. When we are allowed to be lazy, especially with tools bad things happen.

    • rightbyte 3 days ago ago

      > How is this the fault of AI?

      The false positive rate combined with scanning millions of pictures might make the chance of arresting the wrong person really high.
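The arithmetic behind this worry is the base-rate problem. A back-of-the-envelope sketch (the false positive rate and gallery size below are assumed, illustrative numbers, not figures from the article):

```python
# Illustrative base-rate arithmetic: even a very accurate matcher,
# run against a huge gallery, is nearly guaranteed to produce
# false matches against innocent people.
fpr = 1e-5            # assumed false-positive rate per comparison (0.001%)
gallery = 10_000_000  # assumed number of photos scanned

expected_false_matches = fpr * gallery     # expected number of innocent "hits"
p_at_least_one = 1 - (1 - fpr) ** gallery  # probability of at least one false hit

print(expected_false_matches)  # roughly 100
print(p_at_least_one)          # effectively 1.0
```

So even a 99.999%-accurate system pointed at millions of faces reliably surfaces someone innocent; the only question is whether the human in the loop catches it.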

      • akimbostrawman 2 days ago ago

        The wrong detection is the AI's fault. Anything after and based off that is the fault of humans.

        • rectang 2 days ago ago

          Before the misuse, there is opportunity to predict that misuse is certain to occur.

    • Nevermark 2 days ago ago

      If many people's writing skills are suffering, due to highly convenient AI support, just imagine how fast mediocre crime investigation skills are going to devolve.

      It is going to get bad in every skilled area of human managed bureaucracy.

      The number of legal filings found to include AI confabulations is just the obvious surface.

    • type0 3 days ago ago

      > Instead of scapegoating an AI bogeyman

      One big reason for AI adoption everywhere is that you can use it as a scapegoat

    • unethical_ban 3 days ago ago

      Someone from the government should be in jail for this kind of oversight.

      • throwaway173738 3 days ago ago

        I think the taxpayers owe this lady at least a couple million if not more for the inconvenience they chose to put her through.

        • throwaway17_17 2 days ago ago

I agree, but our system doesn’t value things that way. Texas, one of the highest-paying states for cases where intentional, fraudulent, or grossly negligent actions result in wrongful incarceration, pays $80,000 per year a person is locked up. But the caveat is that time only starts counting after you are sentenced, so it wouldn’t even apply in TFA’s case.

          • dghlsakjg 2 days ago ago

            That's statutory damages on a wrongful conviction.

            This is an entirely different situation, and the amount would not be statutory.

        • subscribed 2 days ago ago

That shouldn't come from taxpayers. It should come from malpractice insurance, similar to how doctors pay.

After a while, when premiums shoot up, they'd start to behave.

      • chocoboaus3 2 days ago ago

It's the only way this stops happening.

    • lokar 3 days ago ago

      Automation has a strong tendency to degrade diligence.

      I see this all the time in operational / production settings. Having a loop with automation reviewed and approved by a human degrades very fast. I only approve automation that has a quick path to unsupervised operation.

    • jjav 2 days ago ago

      > A live human detective confirmed it.

      I doubt it, due to human nature. Perhaps the process says the human must consciously validate, but a lot of humans in many cases will just rubberstamp what the AI said. That's the risk.

    • tgv 2 days ago ago

      > How is this the fault of AI?

I'll reply to the top of the discussion too: it's because the tool was purely made for this purpose. There's no use for it outside surveillance, and it's not even good enough. Its only purpose is checking boxes and transferring money. Miscarriage of justice is an unfortunate, but calculated, side effect.

    • culi 3 days ago ago

      Study after study has shown a very strong and consistent bias of humans to trust "automated systems" in face of any ambiguity

    • dyauspitr 2 days ago ago

      Because if you let this slide the human, such as he is, will be removed from the loop and these mistakes will become acceptable once departments get used to how cheap the AI is compared to a human. There will be no going back and mistakes like this will just become accepted collateral damage.

    • 2 days ago ago
      [deleted]
    • dragonwriter 2 days ago ago

      > How is this the fault of AI

      It isn't, the article doesn't claim (or even imply) that it is "the fault" of AI, only that AI was part of the chain of events, and nothing is the fault of AI until AI is sufficiently advanced to constitute a moral actor. “At the source of every error which is blamed on the computer, you will find at least two human errors, one of which is the error of blaming it on the computer” remains true.

      OTOH, it is potentially the fault of the reliance human actors put on an AI determination.

    • tim-tday a day ago ago

The legal system has long treated a computer match as infallible. This has led to miscarriages of justice on a grand scale.

    • RobRivera 3 days ago ago

      I think it's more nuanced; it is one error in a Tragedy of Errors.

      • throwaway17_17 2 days ago ago

This was not a series of errors; this is (as a statistical inference) the system working as designed. This is not uncommon, and it is not unplanned. The extradition of suspects from state to state is legislatively designed to function this way.

I also think there is more nuance to this situation than AI bad // Human Bad :: choose one. But while a tragedy, the ‘correct’ functioning of a system that produces tragedy doesn’t make that functioning an error.

    • 3 days ago ago
      [deleted]
    • plorg 2 days ago ago

I think the biggest problem is that the popular narratives about AI enable this kind of accountability sink.

      • hdgvhicv 2 days ago ago

        Before AI it was outsourcing. “Not my fault the system is down and we’re losing 1m an hour, AWS is having a bad day”

    • EchoReflection 2 days ago ago

      100% 100% 100% humanity is so obsessed with ai that we're losing...our humanity. "blame the mindless, soulless robots! how could we have possibly known that they need to be supervised?! aren't they basically just humans that don't need to rest or eat?"

    • petre 2 days ago ago

      > How is this the fault of AI?

Humans being human. Getting lazy, being incompetent, getting more incompetent with AI use, or simply being biased. The wrongfully arrested person doesn't even resemble the perpetrator.

Maybe if they were held accountable for these actions, they would act responsibly?

    • croes 2 days ago ago

Where does it say that AI is blamed?

It says she was misidentified using facial recognition.

That’s exactly what happened.

    • arwhatever 2 days ago ago

      Devils advocate: what if a facial recognition system with a large enough database can always find an unrelated/innocent person that looks similar enough to convince the human?

    • blitzar 3 days ago ago

      computer said yes

    • worik 3 days ago ago

      > How is this the fault of AI?

      It is not. It is the fault of the police

AI models are tools. When mistakes are made, they are the mistake of the operator of said tool.

      This AI model was badly misused, this woman should get a metric shit tonne of compensation, but it was the fault of the police.

    • vbezhenar 2 days ago ago

      At this point I think that AI will perform human duties better than human. So probably it's better to let AI autonomously jail people, of course with all the necessary procedures as required by law.

      • rectang 2 days ago ago

        As a work of literature, I applaud your Swiftian “modest proposal”.

    • 3 days ago ago
      [deleted]
    • ares623 3 days ago ago

      I hope you take this as a teaching/learning opportunity

  • rpcope1 3 days ago ago

There's no way this isn't a slam dunk case to sue the piss out of the Fargo Police, probably the US Marshals and maybe other orgs. The woman in the surveillance photo clearly looks way younger, among the many other obvious signs this woman didn't do it. I hope she wrings at least several million dollars out of the government.

    • wvenable 3 days ago ago

      It literally doesn't matter -- you're focused on the wrong thing. She could be that woman's exact twin and it wouldn't matter. Spending six months in jail and losing your house, your car, and your dog with the flimsiest of evidence is ridiculous.

      • _DeadFred_ 3 days ago ago

'you can beat the rap but not the ride' has been a pop culture reference in the US since the 1940s. Our society wants/supports the ability for this to be inflicted on people at police/court whim.

        • rolandog 2 days ago ago

Which means that, if the cops (and other relevant personnel) get it wrong, they should get served with the same injustices that they committed, no questions asked... you know, because they didn't raise any when they were the ones dishing out punishments.

          Edit: wording, formatting

          • drysine 2 days ago ago

            >if the cops (and other relevant personnel) gets it wrong, they should get served with the same injustices that they committed

What if no one would want to work as a policeman and you end up alone against the local gang?

            • Eddy_Viscosity2 a day ago ago

              > What if no one would want to work as a policemen

This is by far the worst argument. What if we held doctors accountable for malpractice and no one wanted to be a doctor? What if we held engineers liable for faulty designs that break and kill people and no one wanted to be an engineer? What if we held OCCUPATION accountable for DOING JOB BADLY / BREAKING THE LAW? It's a nonsense argument.

What would happen is that only the people who intended to be bad police would not want the job, and/or the people who were bad police (intentionally or otherwise) would get kicked out of the force. Same as with every other profession. This is a fantastic outcome and we should do it immediately.

            • wvenable 2 days ago ago

              What if the police force is the local gang?

              • drysine a day ago ago

                There are plenty of westerns about it

        • AngryData a day ago ago

          I don't think society supports it as much as you are suggesting. Marijuana is still illegal despite 65% of the nation being in favor of rescheduling. Clearly our laws do not easily mirror what the population believes.

      • OutOfHere 3 days ago ago

        A lawsuit is exactly what matters. They learn only the hard way, and no other way. If you want them to not be ridiculous, a lawsuit with large punitive damages is the only practical way to get there.

        • wvenable 3 days ago ago

          I disagree. The city or state gets sued and they pay the result from the taxpayer funds and literally nobody learns anything, especially not the hard way. Everyone is so completely divorced, and in some cases immune, from consequences that this will change nothing.

          • selcuka 3 days ago ago

            After a couple million dollar lawsuits the city or state will learn to be more careful with their methods. It's the taxpayer funds, but it's not an endless supply of money. Cities and states have their own budgets.

            • wvenable 2 days ago ago

              More than $1.5 billion has been spent to settle claims of police misconduct involving thousands of officers repeatedly accused of wrongdoing.

              https://www.washingtonpost.com/investigations/interactive/20...

              • nbernard 2 days ago ago

                Good point! It shows that the settlements are far too low and that the victims should get a lot more.

                If a few cities/states were to default due to debts coming from such cases, the others would start to take notice...

                • wvenable 2 days ago ago

Why don't we just dole out real punishments instead of an indirect chain of money? Money is not the solution to everything.

              • selcuka 2 days ago ago

                Do we have any evidence that these lawsuits have no effect on the number of wrongdoings?

                • wvenable 2 days ago ago

                  Did you see the word "repeatedly"?

            • silisili 3 days ago ago

              > After a couple million dollar lawsuits the city or state will learn to be more careful with their methods

              You'd think, but watching how many millions my local police department and city paid out every single year leads me to believe they just don't care.

              • phendrenad2 2 days ago ago

                How many, exactly? Anyone can wave vagueness around. Do you have numbers or no?

                • silisili 2 days ago ago

                  I haven't lived there in years, nor do I have exact numbers, but they make national news enough for the same problem nearly every year. I'll drop you some links if you care.

                  1 - 38 million between 2017 and 2022.

                  2 - 29 million in 2023.

                  3 - 12 million in settlements in 2025.

                  Dare I keep going?

                  [1]https://www.wdrb.com/in-depth/louisville-payouts-for-police-...

                  [2]https://www.aol.com/louisville-paid-least-29m-settle-1030450...

                  [3]https://www.courier-journal.com/story/news/local/2026/02/04/...

                  • phendrenad2 2 days ago ago

                    The region's GDP is 100 billion dollars, so these are tiny amounts, although they may seem large to some.

                    And the first article you link proves that people are already worried about it. You think they can safely 10x that?

                    • heavyset_go 2 days ago ago

                      LMPD budget is $250m, if they lost ~10% of it every year, they'd surely notice.

                    • silisili 2 days ago ago

                      > The region's GDP is 100 billion dollars, so these are tiny amounts, although they may seem large to some.

                      It's a fair point and easy to handwave away "it's only $100 per resident." But it's a lot of money still. And yet that city is shutting down schools and selling off school properties to make budget this year. I bet they'd love to have those wasted millions.

                      > You think they can safely 10x that?

I have no idea the reason for this question. The OP said cities learn after a couple million-dollar suits. I'm showing that no, they do not. If anything, suits are increasing.

                      • phendrenad2 2 days ago ago

                        > I have no idea the reason for this question

                        Well it does make sense, in the full context of the thread. I'll let future readers decide.

                • FireBeyond 2 days ago ago

                  NYC has, over the last decade or so, averaged $1M/week in judgments against NYPD for abuses of authority.

            • LadyCailin 3 days ago ago

              There’s a heck of a lot of individual cities and states. Their ability to remain solvent is greater than your ability to stay out of jail.

            • ThePowerOfFuet 2 days ago ago

              No. It's not their money and they don't care.

            • ihsw 3 days ago ago

              [dead]

          • anon84873628 3 days ago ago

            The cities and states make laws to better govern police behavior. You can look back on a century of history of this.

    • Blackthorn 3 days ago ago

      With all the lovely qualified immunity doctrine? That's wishful thinking.

      • Jtsummers 3 days ago ago

        That may protect them personally, but not the city and the department itself from being sued.

        • blagie 3 days ago ago

          Nope.

          https://abovethelaw.com/2016/02/criminally-yours-indicting-a...

You can be arrested, indicted, and held in jail pretrial, and there is literally no recourse. There are many other ways jail can happen without due process. Where I live:

* Civil contempt. Absolute immunity. No due process. Record is about 16 years. Having a bad day? Judge can toss you in jail.

          * "Dangerous." Half a year. No due process. He-said she-said.

          * "Insane." Psychiatric hold. Three days. Due process on paper, not in practice. Police in my town can and do use this if they don't like you.

          Absolutely no recourse. You come out with a gap in income, employment, and, if you missed rent/mortgage, no home. Landlords will simply throw your stuff away too.

          You're also basically damned if things do move forward, since from jail, you have no access to evidence, to internet (for legal research), and no reasonable way to recruit a lawyer (and, for most people, pay for one).

          Can happen to anyone. Less common if you're rich and can afford a good lawyer, but far from uncommon.

          • Jtsummers 3 days ago ago

            I don't know what you're responding to, but I don't think it's my comment.

            Qualified immunity protects individuals, not departments, from liability.

            The particular thread (in this thread) that I was responding to:

            >> I hope she wrings at least several million dollars out of the government.

            > With all the lovely qualified immunity doctrine? That's wishful thinking.

            I was responding to the claim that qualified immunity protected the government, it does not.

            • kelnos 3 days ago ago

              The GP seems to be suggesting that there's no recourse at all, usually. You might bring suit against a police department or LE agency, but you'll fail to find any relief there. True that qualified immunity only protects individuals, but there's a raft of other things that makes it hard to get a judgement against a police department, too.

              I think there's probably one major exception: civil rights violation investigations. But even then, the people doing the investigating seem to be biased toward the LEOs.

              The GP's linked article doesn't seem to even talk about this, so not sure why that's there.

              • dolebirchwood 3 days ago ago

                > You might bring suit against a police department or LE agency, but you'll fail to find any relief there.

I don't know if I'd go so far as to say she won't find any relief, but it probably still would be a pretty tough Monell claim against the department (although it's hard to tell from the sparse details in the article):

                "[A] local government may not be sued under [42 U.S.C.] § 1983 for an injury inflicted solely by its employees or agents. Instead, it is when execution of a government's policy or custom, whether made by its lawmakers or by those whose edicts or acts may fairly be said to represent official policy, inflicts the injury that the government, as an entity, is responsible under § 1983." [1]

                I could see a problem if there was a policy/custom of relying on AI facial recognition alone without any other corroborating evidence (would be a really stupid practice, but I'm sure stupider things have become part of a police department's systemic practices). Or if there was a failure to sufficiently train detectives about the erroneous tendencies of this technology. Maybe the needlessly prolonged detention without bail could be an issue if there was a lack of adequate protocols to expedite in a reasonable amount of time.

Either way, it still seems hard to say this is a slam dunk case for her, unfortunately. But it also seems too risky for the city of Fargo not to settle, at least nominally.

                [1] Monell v. Department of Soc. Svcs., 436 U.S. 658 (1978), https://supreme.justia.com/cases/federal/us/436/658/

          • JuniperMesos 2 days ago ago

            > "Insane." Psychiatric hold. Three days. Due process on paper, not in practice. Police in my town can and do use this if they don't like you.

            And there are definitely insane people who are a threat to themselves and others who wander around, making the streets and public transit systems unsafe and unpleasant, who need to be put into something like a psychiatric hold by something like the police.

            And if you don't have police and a criminal justice system that are willing and able to impose psychiatric holds, you wind up with a bunch of incidents where a crazy mentally-ill vagrant kills someone in a public (the Iryna Zarutska murder, or any of the various cases where a homeless person randomly shoves someone into the path of an oncoming train at a public transit station); or incidents where someone else gets railroaded by the criminal justice system for intervening in a crazy mentally-ill person threatening people around them (the Daniel Penny incident - many people, even nominal anti-carceralists, are upset that he was not successfully convicted and incarcerated for murder). Not to mention all the less-newsworthy incidents where insane people walking the streets and public transit systems systematically ruin them for everyone else, either through vandalism or theft or simply screaming incoherently at people as they try to use the public commons.

            It's certainly possible for the police to abuse psychiatric holds if they don't like you; on the other hand, the existence of large numbers of people who should be in some kind of psychiatric hold but aren't disrupting and vandalizing the public commons is one of the biggest quality of life and physical safety problems in my region and in many other American urban areas.

          • mothballed 3 days ago ago

            >* "Insane." Psychiatric hold. Three days. Due process on paper, not in practice. Police in my town can and do use this if they don't like you.

A friend of mine was committed longer than 3 days without counsel or the ability to represent themselves in the hearing. Apparently the whole process of being committed is ex parte in practice in some states.

          • FireBeyond 2 days ago ago

            Go to Florida, be arrested. Have charges thrown out, dropped, dismissed or simply be acquitted. Florida doesn't care, they'll bill you for your incarceration at nearly $100/day. And failure to pay this bill is, itself, a felony.

          • abduhl 3 days ago ago

            This is a bit hyperbolic and the exaggerations really undermine what I think is your broader point (that there is rarely recourse when you're held for short to moderate amounts of time). It is hard for me to believe that someone was held for 16 years on civil contempt without due process or that someone was held for half a year without due process after being deemed dangerous. The reason that is hard for me to believe is that the due process is implicit in the action you describe. Civil contempt is from a judge which implies that you're already in court - that's due process. Someone being labeled "dangerous" implies that a finding was made by a neutral party - that's due process.

            Just because you disagree with the outcome doesn't mean that due process wasn't given.

            • mothballed 3 days ago ago

              Yeah it's "due process." In civil contempt the judge is a witness and prosecutor in the very "process" they're judging. That's the most perverted form of due process imaginable.

              A judge should have to recuse themselves if they are acting as witness to the supposed infraction.

              • abduhl 3 days ago ago

                Civil contempt isn't some roving criminal charge that jumps out of the jury box randomly. It's meant to make somebody comply with a court order. Anybody in civil contempt holds the keys to the jailhouse door in their own hands, all they have to do is comply.

                This statement should make you uncomfortable. It makes me uncomfortable because it is a pure expression of the power of the state. But it's still due process.

                • FpUser 3 days ago ago

In criminal contempt the max duration of imprisonment is limited. In civil contempt it is not, until somebody decides that one will never comply. You may call it due process. I call it what it is - torture and a fucking crime against humanity. A judge that holds a person for years for being stubborn deserves nothing more than to walk the plank.

                  • JuniperMesos 2 days ago ago

                    Any power that could force a judge to actually walk the plank for what they see as an abuse of power, is itself something like a state, that could just as easily wield that power against the stubborn defendant.

                    • FpUser 2 days ago ago

                      Judges are not above the law. They could be and are being made to "walk a plank". Problem is that we have many laws that are very shitty and allow abuse and are heavily distorted to benefit anyone but "we the people"

      • djfobbz 3 days ago ago

        Criminal immunity? Sure. Civil immunity? Nope! She could definitely make a nice buck.

        • opo 3 days ago ago

          Qualified immunity doesn't apply to criminal cases. It is used to defend against civil suits. It is unfortunately very easy to find many cases where it leads to injustice. For example:

          >...Abby Tiscareno, a licensed daycare provider in Utah, was wrongfully convicted of felony child abuse when a child under her care suffered brain hemorrhaging. After calling emergency services, subsequent medical tests supported these findings. However, during her trial, requested medical records from the Utah Division of Child and Family Services (DCFS) were not provided. It wasn’t until a civil suit that Ms. Tiscareno saw pathology reports suggesting the injury could have occurred outside of her care. She was granted a new trial and acquitted. Her subsequent lawsuit for due process violations, alleging that DCFS failed to provide exculpatory evidence, was dismissed due to lack of precedent indicating DCFS’s obligation to produce such evidence.

          https://innocenceproject.org/news/what-you-need-to-know-abou...

        • theLiminator 3 days ago ago

          Off of taxpayer money sadly. Imo we really need a fix for this. When cops are grossly negligent the money should come out of their aggregate pension fund (or at least partially).

          • JumpCrisscross 3 days ago ago

            > we really need a fix for this. When cops are grossly negligent the money should come out of their aggregate pension fund

This is on us as voters. If we didn’t piss our pants every time a police union sneezed, we’d realize wholesale restarting police departments is precedented in even our largest cities.

          • SunshineTheCat 3 days ago ago

Yes, this is the key point. Taxpayers get a nice big bill while the people who caused the problem get a nice paid vacation while they conduct an internal "investigation" that typically finds they did nothing wrong.

          • vkou 3 days ago ago

            There is a fix to it. Elect people who will hold them accountable.

            As long as you keep electing clowns that let the police do whatever they want, the police will... Do whatever they want.

            • theLiminator 3 days ago ago

Yeah, of course they need to be held accountable, and we need to vote in people who will do so. What I'm suggesting is an alignment of incentives that will ensure that police try their best not to be negligent.

              Of course there's a balance that has to be struck so that police are empowered enough to act. So perhaps something like settlements against the police being 30% borne by the police pension fund and 70% by taxpayers is sufficient. I think this will also make police very enthusiastic about bodycams and holding each other accountable.

            • kelnos 3 days ago ago

              I'm usually a big supporter of labor unions, but police unions in the US generally have an outsized amount of power, and even when mayors etc. want to hold police accountable, the union ends up bending the mayor over a barrel.

              I'm not sure what the solution is here. Forbid police from unionizing? That would probably have some bad consequences too.

              • hdgvhicv 2 days ago ago

                Malpractice insurance

            • GuinansEyebrows 3 days ago ago

              despite this being something practically everybody wants, the fact that it hasn't happened is not a coincidence and speaks to the power of police unions/guilds and their lobbying arms. outside a few toothless instances, those groups are extremely good at reframing these attempts and mobilizing their bases to vote against the broader public interest.

              it sucks.

              • vkou 3 days ago ago

                > despite this being something practically everybody wants,

                No, everybody does not want police accountability. Half the population will fall on a grenade to prevent that. They know that the purpose of the police is to keep the undesirables in line, and they never envision that they will ever fall in that category.

                The brutality is the point for them.

                • GuinansEyebrows 3 days ago ago

                  oh, i generally don't disagree with you on that point; i specifically meant that when presented with the question "do you want your tax dollars to pay for police liabilities?" the answer is probably almost always "no".

                  • vkou 3 days ago ago

Sure. But when you ask "Do you want the police to be unable to do their job, and to live in a lawless hellscape run by gangbangers and ISIS cartels?", the answer is also 'No.'

                    The problem is that the mass media sets the framing of acceptable discourse, and that mass media is in large part an ideological monoculture. And even when it's not, it is happy to present absolutely insane batshit lunacy as 'one of the two sides' of an issue.

                    • GuinansEyebrows 2 days ago ago

                      yeah - i think the media is certainly culpable, but i also think this speaks to the power of police unions like i mentioned earlier. media is happy to present stories presented to them on silver platters by "respected" institutions because they carry all the hallmarks of legitimacy.

            • rectang 3 days ago ago

              “Tough on crime” -> lenient on police -> innocent grandmas in jail.

          • lotsofpulp 3 days ago ago

            Almost all taxpayer funded pension funds are already underfunded. It makes no difference if the funding decreases or increases, the government employee will still get their benefit. The government would have to go through bankruptcy to get the benefit amount reduced.

    • anigbrowl 3 days ago ago

      imho the US Marshals are the only innocent party here, as my understanding is they don't do investigations and just serve warrants without any knowledge of the underlying case.

      • 3 days ago ago
        [deleted]
      • 3 days ago ago
        [deleted]
    • JumpCrisscross 3 days ago ago

      “Unable to pay her bills from jail, she lost her home, her car and even her dog.”

      Who stole her dog?!

      • fluidcruft 3 days ago ago

        Probably picked up by animal control as abandoned and euthanized.

        • JumpCrisscross 3 days ago ago

          That’s really horrible. I’d prefer to know rather than guess at that.

          • fluidcruft 3 days ago ago

            It's pretty common when a dog is abandoned. Likely her children couldn't afford to care for it. I suppose there is a chance they put it up for adoption (same outcome is likely).

            • JumpCrisscross 3 days ago ago

              Sorry. I really fucking hate this. I don’t imagine there is a charity somewhere that works against this?

              • throwaway17_17 2 days ago ago

                No large-scale orgs that I know of. Our local bar has an attorney who does work against it; she leaves her number at the jail, where other inmates will pass it around if someone mentions their dog, and intake officers will often suggest that inmates with pets call her. She is absolutely the most hardcore lover of dogs I have ever met, and she will literally drive/run into danger to get a canine to a local no-kill shelter.

              • DangitBobby 2 days ago ago

                I imagine if there had been anyone intervening in her favor it would have been to resolve her case faster than 5 months.

      • theandrewbailey 2 days ago ago

        I'd love for her to go John Wick on those responsible.

    • john_strinlai 3 days ago ago

      >I hope she wrings at least several million dollars out of the government.

      which the citizens end up footing the bill for. yay.

      • eek2121 3 days ago ago

        Maybe the citizens will learn to elect better leaders.

        • 3 days ago ago
          [deleted]
        • madaxe_again 3 days ago ago

          Thanks, I needed a good laugh this evening.

      • phendrenad2 3 days ago ago

        Maybe they'll realize votes have consequences.

        • DangitBobby 2 days ago ago

          People famously do not learn from the experiences of others. It's a big reason why life is so hard when you'd expect it to be pretty easy based on our collective experience.

    • 3 days ago ago
      [deleted]
  • rectang 3 days ago ago

    > facial recognition showed she was the main suspect in what Fargo police called an organized bank fraud case.

    > Her bank records showed she was more than 1,200 miles away, at home in Tennessee at the same time police claimed she was in Fargo committing fraud.

    > Unable to pay her bills from jail, she lost her home, her car and even her dog

  • anigbrowl 3 days ago ago

    It is an AI error, but also an error on the part of the cops, the prosecutors, the judge, and the county sheriff (who is responsible for the jail inmates). I hope everyone involved in this travesty is sued into oblivion and unable to hide behind their immunity defenses. Facial recognition should never be the sole basis for a warrant.

    • idle_zealot 3 days ago ago

      > It is an AI error, but also an error on the part of the cops, the prosecutors, the judge, and the county sheriff

      Yes, it's critical to remember that multiple parties can be at fault. In a case like this, it is true that

      a) law enforcement misused a tool and demonstrated extreme negligence

      b) the judiciary didn't catch this, which suggests systemic negligence there too when it comes to their oversight responsibilities

      c) the company selling/providing this AI tool should have known it was likely to be misused and is responsible for damages caused by such predictable usage

      We cannot have a just world until our laws and norms result in loss of jobs and legitimacy as punishment for this sort of normalized failure, from all three parties. Immunity is a failed experiment.

    • recursivecaveat 3 days ago ago

      Even if she was a dead ringer (clearly not the same person to any human who glances at the image), common sense should tell you that among 340,000,000 Americans there are a lot of lookalikes. Clearly there's a kind of stupid belief in the mystic powers of AI and a callous disregard for the well-being of suspects. No one should be dragged 1,000 miles and held for months based on a facial match, especially when exculpatory evidence was easily available.

      • throwaway17_17 2 days ago ago

        To be specific (and it is a lot of the reason why this 5-month delay happened): she was not dragged then held, she was arrested, then held, then dragged. She was released 5 days after finally getting to North Dakota; if they had actually gone and gotten her promptly, the hold would have been ~30 days, plus the 5 before the interview and the charges being dropped.

        It isn’t much of a salve, but the particulars do matter when trying to assign fault to the proper parties (who are still clearly the Fargo cops in this particular tragedy).

    • causal 3 days ago ago

      This x1000. We need to suspend this shared fiction that AI has any agency. Only humans can be responsible. Full stop.

      • irishcoffee 3 days ago ago

        [flagged]

        • causal 3 days ago ago

          This question doesn't even make sense. Why wouldn't humans still be the ones responsible? Bot account?

          • fc417fc802 2 days ago ago

            Doesn't look like it. I've come across this account a few times now. Engages and makes reasonable comments except for certain politicized issues, where he acts like an indoctrinated zealot.

        • GuinansEyebrows 3 days ago ago

          respectfully, can you elaborate on why the answer would not be yes? or am i just misreading your comment?

    • mekoka 3 days ago ago

      > It is an AI error

      The software identified the person as Angela Lipps. According to the court documents, the Fargo detective working the case then looked at Lipps' social media accounts and Tennessee driver's license photo.

      In his charging document, the detective wrote that Lipps appeared to be the suspect based on facial features, body type and hairstyle and color.

      The software worked exactly as intended. It's a filtering tool that sifts through data for common patterns to provide leads, not matches. It raises a flag on persons of interest. You can be a "match" anywhere between 0 and 100%, and only relative to some specific input (like that picture taken from above the woman at the teller). In that sense, mismatches are within acceptable parameters and have been known to happen.

      A "match" is a pronouncement ultimately made by the humans who use the tool, after they've checked out the leads. Someone was asleep at the wheel here.

  • Aardwolf 3 days ago ago

    This reminds me of the British Post Office Scandal: https://en.wikipedia.org/wiki/British_Post_Office_scandal

  • jawns 3 days ago ago

    John Bryant, aka The Civil Rights Lawyer, recently did a piece about a similar case of mistaken identity. The consequences weren't as severe, but the willingness to trust the AI over any other evidence was the same:

    https://thecivilrightslawyer.com/2026/03/11/ai-software-tell...

    In the video, it shows a police officer blindly trusting a casino's AI software, even when a cursory investigation should have given any reasonable person enough of a reason to question whether the man he arrested was the same man accused of a crime. (And then even after it was confirmed he was not, the prosecutor continued to charge him for trespassing!)

  • holman 3 days ago ago

    Me: Whoa, cool, my hometown is atop Hacker News!

    Also me, reading further: Uh-oh.

    The chief of police also resigned today; wouldn't be shocked if this was part of the reasoning.

    • PTOB 3 days ago ago

      I am from a town that gets national news coverage only for shenanigans like this.

    • JumpCrisscross 3 days ago ago

      > chief of police also resigned today

      Source?

      • waterhouse 3 days ago ago

        Googling "fargo police chief resigns": https://www.inforum.com/news/fargo/zibolski-announces-his-re... among other results.

        That said, it's portrayed as a retirement, and doesn't seem to give any hints that it's connected.

        • JumpCrisscross 3 days ago ago

          Out of curiosity, was the guy known for being fast and loose with the rules? Put more simply, was he a good cop, or did he have a history of going rogue?

          • Sl1mb0 3 days ago ago

            Are authoritarians good? That's basically what you are asking.

          • sumeno 3 days ago ago

            There are no good cops

      • RobRivera 3 days ago ago

        [dead]

  • quickthrowman 3 days ago ago

    It’s obvious from the one photo they posted of the actual suspect that the lady they arrested is about 20-30 years older than the woman in the bank photo. The woman in the photo is maybe 25-30 years old, this grandma looks like she’s 65-70 (actual age of 50).

    Absolutely ridiculous, I hope she wins her civil case.

  • bethekidyouwant 3 days ago ago

    I read the article and I don’t really understand… she was held in a jail in Tennessee but the article states they flew her to North Dakota? And somehow she’s a fugitive so that’s why she doesn’t get bail? but she’s a fugitive held in her own state in a holding facility? But then when they release her, she’s in North Dakota? So if some state says you’re a fugitive your home state will just hold you in jail until they come and put you on an airplane? Is that correct?

    • wvenable 3 days ago ago

      I think you have the interpretation correct. It seems like any state can say you're a fugitive from their state and now you have even fewer rights. Every day I learn some new fact about "justice" in the United States.

    • DangitBobby 2 days ago ago

      As a Tennessee resident I don't love learning that some dumb fuck state I want nothing to do with can call me a fugitive, and my state will hold me prisoner without trial until said dumb fuck state finally decides it is ready to deal with me.

    • fc417fc802 2 days ago ago

      I believe each state has its own extradition process. In this scenario think of them more like the countries in the EU. Apparently Tennessee doesn't adequately protect its residents.

    • janalsncm 3 days ago ago

      I read it as: she was arrested and held in Tennessee temporarily, then flown to North Dakota.

      • bethekidyouwant 3 days ago ago

        “Lipps would sit in that Tennessee jail cell for nearly four months. As a fugitive, she was held without bail”

  • thinkcontext 2 days ago ago

    Wow, so many failures of the legal system. While the incompetent/malicious/lazy investigators who used the facial recognition, and only that, are obviously at major fault, I'd actually put larger blame on the judge who signed the arrest warrant. Judges are supposed to be a check on such incompetence/malice/laziness, not just a rubber stamp. Unfortunately, there's really no recourse against incompetent/malicious/lazy judges.

    Of course, this would have been bad enough if it had happened where she lived, but holding her for 5 months adds a whole 'nother level of insight into the brokenness of the legal system. I'd be interested in hearing more about why that happened. Was it just a matter of that's what happens sometimes if you have a public defender?

  • stego-tech 3 days ago ago

    I really, really need folks to understand that deflecting blame away from the tool and trying to hold the human accountable feeds right into the marketing playbook of these companies in the first place.

    The cops cannot be held accountable because the laws basically give them immunity. The politicians cannot be held accountable beyond being tossed out at the next election, because the laws otherwise give them immunity. The people operating the system cannot be held accountable, because the systems are marketed as authoritative despite being black boxes and lacking in transparency; they trusted the system just as they were told to, and thus cannot be held accountable.

    And so when every human in the chain cannot be held accountable for these things, and the law prevents victims from receiving apologies, let alone recourse, then the tool and its maker are the only things we can hold accountable. By deflecting blame away from the tools ("it wasn't AI, it was facial recognition"; "the human had to sign off on it"; "humans made the arrest, not machines"), you're protecting quite literally the only possible entity that could still potentially be held accountable: the dipshits making these stupid things and marketing them as superior and authoritative when compared to humans.

    You want accountability? Start holding capital to account, and this shit falls away real fucking fast. Don't get lost in technical nuance over very real human issues.

    • tylervigen 3 days ago ago

      I disagree. If you focus on holding the software creators to account in lieu of the humans in the loop, then we only reinforce the behavior of offloading thinking to the system.

      If I am a cop in another jurisdiction and I see that in this case of error, the facial recognition company was held to account but not the police or municipality, I will be more likely to blindly trust the software assuming that they either patched it or will take responsibility.

      We should demand accountability for both.

    • bmitch3020 2 days ago ago

      You can blame both: the prosecutors and police who didn't do their proper due diligence, falsely imprisoned this woman, and held her for months without due process; and also the AI company that effectively submitted a false police report and defamed her character. There's no reason for either of them to escape the blame.

    • simpaticoder 3 days ago ago

      >Start holding capital to account

      You forgot one: capital cannot be held accountable for making a tool used in a crime. It is a simple generalization of the Protection of Lawful Commerce in Arms Act (PLCAA), passed in 2005, which largely bars civil lawsuits against gun makers and sellers when their products are later used in crime.

    • dml2135 3 days ago ago

      Strongly agree here. This is an extremely predictable outcome of selling AI facial recognition software to American police forces.

    • DangitBobby 2 days ago ago

      Is there anything to suggest this sort of injustice isn't happening in low-tech all the time, constantly, all over the country, and the only reason it's getting attention here is because AI is involved?

      • Anamon 2 days ago ago

        The scale is not the same. Low-tech tools require more human input, more pre-filtering of suspects. They can't just default to starting with "everybody" and match against millions at the push of a button.

    • onetokeoverthe 2 days ago ago

      [dead]

  • jigglypuff-mab 2 days ago ago

    This is the part that always gets glossed over in the "AI will solve everything" narratives: these systems don't fail gracefully. They fail with confidence.

    The real problem isn't that the AI made a mistake—it's that everyone in the chain deferred to it. The technology became an excuse to stop thinking critically. "The algorithm said so" is the new "I was just following orders."

    We need less faith in these tools, not more.

  • zingar 3 days ago ago

    “Computers don’t argue” seemed charmingly wrong about how computers work until a few short years ago.

    https://nob.cs.ucdavis.edu/classes/ecs153-2019-04/readings/c...

  • elophanto_agent 2 days ago ago

    This is a textbook case of automation bias - where humans over-trust algorithmic outputs. The facial recognition system probably returned a confidence score, but somewhere in the chain that got treated as certainty.

    The technical issue is that facial recognition systems have significantly higher error rates for certain demographics - older individuals, people of color, women. This has been documented extensively in NIST's FRVT testing. The system might have had a 60% confidence match but the investigators treated it as definitive.

    What's missing is a mandatory human verification step before any arrest based on AI identification. The technology should be a lead generator, not probable cause. You'd never arrest someone based on a single eyewitness who said "I'm 60% sure that's them" - but that's essentially what happened here.
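
    To make the "lead generator, not probable cause" idea concrete, here's a minimal Python sketch. Everything in it (class names, the 0.6 threshold, the sample scores) is an illustrative assumption, not the actual vendor's API — the point is simply that a score gates a follow-up investigation, never an arrest.

    ```python
    from dataclasses import dataclass

    @dataclass
    class Match:
        candidate_id: str
        confidence: float  # hypothetical 0.0-1.0 similarity score from the model

    def triage(matches, lead_threshold=0.6):
        """Return candidates worth investigating as leads only.

        No score, however high, is treated as probable cause; corroboration
        (alibi checks, bank records, interviews) is always a required next step.
        """
        leads = [m for m in matches if m.confidence >= lead_threshold]
        # Sort strongest-first so investigators prioritize, not conclude.
        return sorted(leads, key=lambda m: m.confidence, reverse=True)

    matches = [Match("A. Lipps", 0.62), Match("J. Doe", 0.41), Match("K. Smith", 0.88)]
    for lead in triage(matches):
        print(f"LEAD ONLY - corroborate before any action: {lead.candidate_id} ({lead.confidence:.0%})")
    ```

    Under this kind of policy a 62% match would surface as the second-priority lead to check, which is exactly the step (checking) that was skipped here.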

  • RobRivera 3 days ago ago

    >Unable to pay her bills from jail, she lost her home, her car and even her dog. Fargo police say the bank fraud case is still under investigation and no arrests have been made.

    I smell a lawsuit

    • giardini 2 days ago ago

      Yes, Fargo will be lucky to survive the lawsuit.

  • jmyeet 3 days ago ago

    We are rapidly becoming a world where every person is one inscrutable LLM decision from having their life ruined with no recourse.

    This type of incident isn't new and is only going to get worse. The problem is our governments are doing absolutely nothing about it. I'll give two examples:

    1. Hertz implemented a system where they falsely reported cars as being stolen. People were arrested and went to jail for rental cars that were sitting in the Hertz lot. Hertz ultimately had to pay $168 million in a settlement [1]. That's insufficient. If I, as an ordinary citizen, make a false police report that somebody stole my car I can be criminally charged. And rightly so. People should go to jail for this and it will continue until they do. These fines and settlements are just the cost of doing business; and

    2. The UK government contracted Fujitsu to produce a new system for their post offices. That system was allowed to produce criminal charges for fraud that were completely false. People committed suicide over this. This went on for what, a decade or more? It eventually resulted in a parliamentary inquiry and settlements. It's known as the British Post Office scandal [2]. Again, people should go to jail for this.

    The choice we as a society face is whether to have automation improve all of our lives by raising everyone's standard of living and allowing us to do less work and less menial work, or to let automation further suppress wages so the Epstein class can be slightly more wealthy.

    [1]: https://www.npr.org/2022/12/06/1140998674/hertz-false-accusa...

    [2]: https://en.wikipedia.org/wiki/British_Post_Office_scandal

    • suzzer99 3 days ago ago

      I'm banned from Amazon KDP publishing for life because a fraud detection bot hallucinated that my e-book was plagiarizing my paperback (it didn't realize they're the same book). A bunch of email appeals that I'm pretty sure were also bots went nowhere. With each appeal, the reasons for my ban got progressively more vague, until they didn't mention the plagiarism part at all, just something nonsensical about creating a negative customer experience. Evil company.

    • mylifeandtimes 3 days ago ago

      > The problem is our governments are doing absolutely nothing about it

      Huh. I thought they were actively accelerating the process. Hoping you are right and I am wrong.

  • tlogan 2 days ago ago

    This has nothing to do with AI.

    There are also a few questions that remain unanswered:

    - Did she have previous arrests, and did they use booking photos to identify her? I found someone named Angela Lipps who was arrested in 2001, 2003, 2017, and 2019. The 2017 arrest was for a probation violation: https://archive.ph/CpmXu The 2019 arrest was for public intoxication: https://archive.ph/yjFL9

    - Another confusing detail is that she was in jail for four months without being extradited. That is quite unusual, unless the local authorities were holding her on unrelated charges.

    So this news story seems to have nothing to do with AI. It is also very light on details about the case, what actually happened, and the actual criminal case here.

    • rithdmc 2 days ago ago

      Appealing to authority ("The AI said it was her!") is absolutely a problem.

      • tlogan 2 days ago ago

        No. I think the core issue is that they used her 2019 booking photo (a mugshot) from a public intoxication arrest. I am not sure whether a photo like that is reliable :)

        In the end, the detective compared the booking photo with the camera footage and concluded they were the same person, then presented that to the judge.

        I also wonder what her “probation” was for. Maybe she once wrote a bad check and got into trouble, which might have made the detective more inclined to believe it was her.

        Anyway, this does not appear to be an AI issue at all.

        But it is a nice scary story to remind us not to be lazy and trust it unconditionally.

        • rithdmc 2 days ago ago

          Yes. Appealing to authority ("The AI said it was her!") is absolutely a problem.

          I'm not dismissing the rest of what you are saying, but I don't think you should dismiss appeals to authority being a factor, either.

  • puppycodes 3 days ago ago

    They do not care.

    End qualified immunity and see how fast cops start to do their jobs with care.

    Winning a lawsuit literally ends with your own community members (not the cops) paying the bill.

  • jchama 3 days ago ago

    The movie "Brazil" was right!

    • js2 2 days ago ago

      Mistake? Haha. We don't make mistakes.

      https://www.youtube.com/watch?v=wzFmPFLIH5s

    • Pxtl 3 days ago ago

      Except in "Brazil" it was a mechanical error in a deterministic machine caused by an invasive outside actor. It would be reasonable to trust that the autotypewriter/printer would faithfully output the correct text.

      Modern AI seems incapable of any respectable amount of accuracy or precision. Trusting that to destroy somebody's life is even more farcical than the oppressive police in "Brazil".

      • jldugger 3 days ago ago

        >Except in "Brazil" it was a mechanical error in a deterministic machine caused by an invasive outside actor.

        It was a literal bug in the computer. Metaphor as humor!

    • _doctor_love 3 days ago ago

      We do the work, you do the pleasure!

  • api 3 days ago ago

    It's not an AI error. It's a human error in mis-using AI in this way. Saying it's an AI error is like saying a hole in your drywall is a hammer error.

    Unfortunately we'll probably see a trend of people using AI and then blaming AI for cases where they mis-used AI in roles it's not good for or failed to review or monitor the AI.

    • munk-a 3 days ago ago

      It's both. It's good to acknowledge that AI is easy to misuse in this manner, but it doesn't detract from the fact that the ultimate responsibility lies with those who should be verifying the tool's output.

      There is far too little skepticism around the magic box that solves all problems which is causing issues like this. It's not the fault of the AI (as if it could be assigned liability) for being misused, but this kind of misuse is far too common right now so scare stories like this are helpful and we should highlight the use of AI in mistakes like this.

      • api 3 days ago ago

        I worry that blaming AI at all actually incentivizes humans to offload things to AI that should not be offloaded, since it lets them escape blame.

        • munk-a 3 days ago ago

          That is a huge danger. Legally speaking it's not an issue since misusing a tool doesn't relieve liability (in most circumstances - all the trivial ones at least)... but that's a more significant political issue as evidenced by the Anthropic vs. DoD interactions since the DoD's actions are largely immune to oversight by the justice system.

          Of course, that depends on sane, non-politicized courts, which you may rightfully doubt exist right now - but assuming the system works anywhere near as designed, outsourcing a decision to AI wouldn't change liability.

          For DC fans: Harvey Dent would similarly not be free from liability for actions taken after a coin flip, even if that coin could be viewed, in a certain light, as having the power to force or prevent certain actions. An AI box that tells Harvey whether to shoot or spare would be similarly irrelevant to his liability; a scenario in which Harvey points the gun at someone and then walks away, giving the AI control over the trigger, is essentially no different. Harvey in all cases is responsible for constructing the scenario that (potentially) leads to someone's death and, moreover, even if the gun wasn't fired because the AI decided to spare the person, Harvey would be on the hook for attempted murder.

    • beej71 3 days ago ago

      We should probably stop telling the cops that this hammer is great for drywall.

  • zoklet-enjoyer 3 days ago ago

    I live in Fargo. The police chief announced his retirement yesterday. Done by the end of the month. And then today this article comes out. So now we pretty much know why the sudden retirement announcement.

  • Lerc 3 days ago ago

    This problem predates modern AI. https://en.wikipedia.org/wiki/Computer_says_no is built upon the deliberate abdication of responsibility to processes that cannot be held accountable. AI is just letting them do it at scale.

    That doesn't mean we should accept it from AI. We should fight the blind yielding to the facade of authority regardless of whether the decision was made by an AI or an insect landing on a teleprinter at the wrong time.

  • doe88 21 hours ago ago

    It is vibe justice for people: you run agents and don't check for yourself the code produced, or the people jailed. Despite appearances, your program doesn't really work, and sooner or later you find out at your expense.

  • chrisjj 3 days ago ago

    There's an opportunity for an "AI" app here. Takes your photo, compares with mugshots on police databases, quotes you for requisite cosmetic surgery.

    /i

  • hacker_mews 2 days ago ago

    Lazy, stupid pigs should be held accountable for misusing AI like this, dragging people into a system like that based on some AI's whim and a Facebook peek, having done no actual investigative work.

    Let's see the pig that called for her arrest and wasted 4 months of her life spend 4 months in jail.

  • djoldman 3 days ago ago

    I wish we saw more invocations of speedy trial rights. Trials MUST begin for felony charges in ND within 90 days of a defendant invoking those rights (must be invoked within 14 days of arraignment)[0].

    [0] https://ndlegis.gov/cencode/t29c19.pdf

    • mothballed 3 days ago ago

      Defendants don't invoke that because in most states, and federally, prosecutors build the case against you slowly over a long period before arrest, then stall as long as possible on discovery, then when they finally fulfill discovery they overwhelm you with a bunch of useless stuff so that it takes forever to get to the useful information. Invoking the right to a speedy trial gives the prosecution a very strong advantage over the defense.

      • fc417fc802 2 days ago ago

        I don't think it's a sensible interpretation of the constitution given the massive asymmetry of the situation. The state should be obligated without exception to either provide for a speedy trial or to release the defendant while the state figures its shit out. It should not be a right that can be waived. Meanwhile a defendant who's been arrested should generally be given as much time as he'd like to put together his defense.

    • heavyset_go 3 days ago ago

      There are a bunch of ways they get people to sign away their right to speedy trials.

  • consp 2 days ago ago

    Considering you can fine-tune all currently deployed facial-match systems for KYC anywhere from pretty much useless to "will match even a dog," I am not surprised at all by this, and am surprised it is even remotely allowed.

  • entwife 2 days ago ago

    Here is the contact information for her defense attorney, who was appointed in North Dakota: Jay Greenwood. It is unclear to me when he took up this case.

    https://www.ndcourts.gov/lawyers/06020 https://www.linkedin.com/in/jay-greenwood-57360b86/

    • ThePowerOfFuet 2 days ago ago

      What did you hope to achieve by posting his details?

      • entwife 2 days ago ago

        It seems like Angela Lipps needs some help. There's no "Go Fund Me" or similar information associated with the article. Perhaps her (former) attorney could provide more information.

  • Beestie 3 days ago ago

    Something big is missing from this story. How did face ID in ND pick up a matching little old grandma in TN, and why would a TN judge hold her without bail for 5 months?

    Yeah, there is a whole lot more to this story.

  • eagsalazar2 2 days ago ago

    Gofundme? This woman needs some $$ and a lawyer. She may not know it yet, but if she makes some smart moves, she's about to be rich and Fargo is about to learn a very hard lesson.

  • ifh-hn 3 days ago ago

    Just reading the headline I said to myself: bet this is in America.

    Every time I see something like this I can never quite believe this sort of stuff happens. Complete, life-ruining incompetence, with no consequences for the idiots that caused it. Ignoring the AI input, which to me has nothing to do with this (it was used as a tool to identify a potential suspect), this woman went to jail for 5 months on the opinion of someone with no other evidence. Only in America.

    • renewiltord 3 days ago ago

      Indeed. Something like the Post Office Scandal would never happen anywhere but in the US.

      • ifh-hn 2 days ago ago

        True, though it's hardly a like for like comparison, but on the flip side of that, something is being done now at least. This woman has been monumentally fucked over and no one is going to help her.

        • renewiltord 2 days ago ago

          13 people are dead in the postal service scandal, dude. This woman is alive. "Something is being done now at least". What? Are they resurrecting them? The Archbishop of Canterbury himself is out there making Lazarus real, is he?

          • ifh-hn 2 days ago ago

            Yeah, let's start playing a game of one-upmanship... Or you could stop it with the false equivalence?

            • renewiltord 2 days ago ago

              You started some “oh only in America” crap and promptly started acting all noble once it was pointed out as bullshittery? Nice try.

              If the results were swapped and this had said “13 dead after being misidentified” vs “1 jailed for 5 months in post office scandal” I’m supposed to believe you’d be all “well, at least they’re doing something about the 13 dead”?

              I think we both know you’re just talking a load of crap. Admit you were wrong to downplay a large tragedy and move on.

              • ifh-hn 2 days ago ago

                Oh dear, no need for a hissy fit. My initial comment stands. I've tried to be reasonable, but since you insist, let's see... School shootings are a pretty much uniquely American phenomenon; what else, oh yeah, the 2008 banking crisis; there's also the unique way American healthcare works. Together, all of those far outweigh the post office scandal in terms of lives lost and scale. So yeah, only in America.

                Now to your nonsense that I somehow downplayed the post office scandal by pointing out your attempt to compare apples to oranges (aka false equivalence, as in you're engaging in a logical fallacy): utterly ridiculous nonsense, not even a good attempt at a straw man.

                Either way, you're not worth engaging with further. You've nothing to add, and engaging further would risk breaking the rules of this site...

  • wafflebot 3 days ago ago

    Facial recognition? *looks at photo* I've probably seen a dozen different people who look exactly like this woman just this week.

  • vintagedave 2 days ago ago

    > She had already been in jail for more than five months. It was the first time police interviewed her.

    Hang on -- ignoring AI completely, how is that possible / legal / anything? Surely, if she was misidentified, there was an interview and arrest and due process?

  • 3 days ago ago
    [deleted]
  • spicymaki 3 days ago ago

    How many more articles are we going to see with the headline AI facial recognition leads to innocent person jailed? A grandmother no less.

    Some tech company illegally scanned people's photos on social media and now is using them with our complicit legal system to randomly put people behind bars. Now I have to worry that any day, due to a dice roll, I will be sent away to the middle of f'ing nowhere for months or years. Now the government wants to use these same dumb systems to make automated killing machines. FML!

    I see a lot of comments trying to attribute blame to the cops, the lawyers, the police chief, the marshals, the tech bros, etc, but it is all of them and all of us that are guilty. We are so complicit in this sick system we live in. We are stuck in a collective action deadlock.

    That fear you have in the back of your mind that says next time it might be you is counteracted by the thought "well thank goodness it wasn't me or a loved one," so you don't act. We are all doing this, that is why nothing changes.

    The only people able to act these days are the most insane. The narcissistic corrupt power-hungry politician, the psychopathic tech bro billionaire, and the jacobins are the only ones with the energy to wade through this cesspool, and that is why everything is so dystopian.

  • d--b 2 days ago ago

    What’s remarkable to me, beyond the total incompetence and stupidity of all the police people involved, is how incredibly aggressive the intervention was.

    This is a bank fraud case, for god’s sake, not an armed robbery. I don’t know the scale of it, but still, no one said she was a danger to anyone. She was a suspect, not a convict, and she was held at gunpoint while babysitting young children. What in the fucking world?

    The US is so fucked up lately. People should chill the fuck out.

  • jauer 3 days ago ago

    AI or not, it's unconscionable that victims of compulsory legal processes by way of mistaken identity are not made whole.

    • ryandrake 3 days ago ago

      People will defend this, too, saying “well, she was eventually exonerated, right? So the system works!” Ignoring how she’ll never be fully reimbursed for the time, money, and grief of going through the system.

      • munk-a 3 days ago ago

        We also need to question how many people might go through the same process without eventual exoneration, and how much going through this process costs individuals. Being falsely prosecuted usually imparts a permanent black mark in search results about the person (outside of places with sane laws like the EU), as well as causing stress or permanent injury.

        Wrongly arrested individuals with mental disabilities have a history of physical abuse in jail potentially to the point of death.

      • kelnos 2 days ago ago

        Not to mention:

        > Unable to pay her bills from jail, she lost her home, her car and even her dog.

        If this is the system "working", then the system is broken.

    • janalsncm 3 days ago ago

      > In all criminal prosecutions, the accused shall enjoy the right to a speedy and public trial

      This is from the Sixth Amendment. Where the rubber hits the road is what “speedy” means.

  • 3 days ago ago
    [deleted]
  • temp0826 3 days ago ago

    Even in Idiocracy they didn't have this problem

  • chocoboaus3 2 days ago ago

    She will be enjoying a tidy compensation payout. And the number had better have seven digits in it.

  • lokar 3 days ago ago

    This is a badly written story. It should explain if she saw a judge or had a lawyer.

    • chmod775 2 days ago ago

      You must have been reading something else, because this article includes all of that information.

      > In Tennessee, she was given a court appointed lawyer for the extradition process. To fight the charges, she was told she would have to go to North Dakota.

      > Officers from North Dakota did not pick up Lipps from her jail cell in Tennessee until Oct. 30 — 108 days after her arrest. The next day she made her first appearance in a North Dakota courtroom to fight the charges.

      > "If the only thing you have is facial recognition, I might want to dig a little deeper," said Jay Greenwood, the lawyer representing Lipps in North Dakota.

      • fc417fc802 2 days ago ago

        Seems odd that the extradition process apparently doesn't require more than vibes.

  • jll29 2 days ago ago

    There are many cases of harm caused by ML false positives.

    There are also some cases of law enforcement successes caused by ML true positives, e.g. a RAF (Red Army Faction - https://en.wikipedia.org/wiki/Red_Army_Faction) terrorist gone into hiding was identified by a social media photo: https://web.archive.org/web/20240305044603/https://www.nytim... (although the success in law enforcement was actually not carried out by the police, but by investigative journalists/podcasters.)

    The question we need to answer as a society is whether we are willing to tolerate innocent people going to jail as the "price" of catching a few more criminals.

  • causal 3 days ago ago

    Wait - what was the AI tool and how did it have her face to begin with? If small-town police are doing face-matching searches across national databases then nobody is safe because the number of false positives is going to be MASSIVE by sheer number of people being searched every day.

    Pretend the tool is 99.999999% specific. If it searches every face in the USA you're still getting about 3 false positives PER SEARCH.

    You will never have a criminal AI tool safe enough to apply at a national scale.
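    The base-rate arithmetic in the comment above can be sketched in a few lines. The numbers are purely illustrative: a hypothetical 99.999999% specificity and a rough US population figure, neither taken from the article:

```python
# Back-of-envelope false positive arithmetic for a nationwide face search.
# Illustrative assumptions: a hypothetical 99.999999% specificity
# (false positive rate of 1e-8) applied to roughly the US population.

population = 330_000_000       # approximate number of faces searched
false_positive_rate = 1e-8     # 1 - 0.99999999 specificity

# Expected number of innocent people flagged by a single search:
expected_false_matches = population * false_positive_rate
print(expected_false_matches)  # ~3.3 spurious matches per search
```

    Even at an implausibly good specificity, the expected false positives per search scale linearly with the size of the database being searched, which is the commenter's point.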

  • behringer 3 days ago ago

    This is exactly what I would expect from the great state of ND.

  • 3 days ago ago
    [deleted]
  • FrustratedMonky 2 days ago ago

    Brazil, anyone?

    We have literally caught up to the world depicted in the movie "Brazil"

  • anonym29 21 hours ago ago

    Wrongful imprisonment isn't something that started with AI. This is why everyone should be against the death penalty: the state cannot be trusted not to make mistakes in determining guilt.

  • Jtsummers 3 days ago ago

    https://archive.is/yCaVV - Archive link to get around the paywall.

    https://www.theguardian.com/us-news/2026/mar/12/tennessee-gr... - Another article on this without a paywall.

    It's annoying that both articles are calling this AI error. This was human error, the police did the wrong thing and the people of Fargo will end up paying for this fuckup.

    • janalsncm 3 days ago ago

      I would argue it was both. No doubt this company was marketing it in a way to make it seem very reliable. And all of the procedural things afterwards made the error so much more damaging.

      But imo this is why local police departments should not have access to this kind of tool. It is too powerful, and the statistical interpretation is too complicated for random North Dakota cops to use responsibly. Neither the company nor the PD have an incentive to be careful.

      • selcuka 3 days ago ago

        It's not an AI error. The face recognition AI simply said that it's a "potential match", which is correct. It's the humans' job to confirm that a potential match is in fact a match, especially when the suspect is 1,900 kms away.

    • nitwit005 3 days ago ago

      They're slapping AI in the title of any article that vaguely relates to it to get more clicks. This unfortunately works extremely well (see this thread).

      Happens with a lot of topics of interest.

    • fluidcruft 3 days ago ago

      Human police errors are so routine that they're not news worthy.

    • superkuh 3 days ago ago

      > https://archive.is/yCaVV

      When I load this URL I get "One more step Please complete the security check to access" and I cannot get past the archive.is computational paywall.

      But the guardian article actually has text! Thanks.

      • hrimfaxi 3 days ago ago

        That's a common issue if you use cloudflare dns.

  • tony_cannistra 3 days ago ago

    Completely infuriating, but more of a commentary on the sad state of incompetent power-hungry law enforcement with tools they don't know how to use than the tools themselves.

    Though, the question remains: are the tools built in such a way as to deceive the user into a false sense of trust or certainty?

    _Some_ of the blame lies on the UX here. It must.

    • ImPostingOnHN 3 days ago ago

      > are the tools built in such a way as to deceive the user into a false sense of trust or certainty? _Some_ of the blame lies on the UX here. It must.

      Are AI code assist tools built in such a way as to deceive the user into a false sense of trust or certainty? Very much so (even if that isn't a primary objective).

      Does any part of the blame lie on the UX if a dev submits a bad change? No, none.

      You are ultimately, solely responsible for your work output, regardless of which tool you choose to use. If using your tool wrong means you make someone homeless, car-less, and also you kill their dog, then you should be a lot more cautious and perform a lot more verification than the average senior engineer.

      • tony_cannistra 3 days ago ago

        I agree with all that. Maybe the word isn't "blame," then. Surely there must be some code, perhaps moral or ethical, but ideally more rigorously enforceable, which ought to prevent the development of intentionally deceptive tools. Sure, you could say this about all software, but that which can cause actual physical harm ought to be held to a higher standard.

        • ImPostingOnHN 3 days ago ago

          Yes, unfortunately technology is advancing faster than the average human brain evolves more neurons, so it will only become less comprehensible to the average person.

          That's setting aside the tendency for police to hire from the left side of the bell curve to avoid independent thinkers that might question authority, refuse to do bad shit, etc.

    • sidrag22 3 days ago ago

      It must land as human's fault or this will become more and more of a pattern to avoid accountability.

      • paulhebert 3 days ago ago

        It’s both.

        The cops need to be held accountable.

        But it’s glaringly obvious that if you build tools like this and give them to the US police this is the outcome you will get. The toolmakers deserve blame too.

    • throw_m239339 3 days ago ago

      > they don't know how to use than the tools themselves.

      No, the tools work perfectly as they were designed to work. The problem is that the tools are flawed.

      Ultimately, every single one of these decisions should be approved by a human, who should be responsible for the fuck up no matter what the consequences are.

      > _Some_ of the blame lies on the UX here. It must.

      No, the blame lies with the person or the group who approve the usage of these tools, without understanding their shortcomings.

      • Pxtl 3 days ago ago

        I miss the days of earlier AI image-recognition software that would emit a confidence percentage.

        New LLM-related AIs are all supremely confident in every assertion, no matter how wrong.

        • janalsncm 3 days ago ago

          I don’t know what tool they used, but it was very likely not an LLM. They probably have some database of drivers’ licenses and they ran a similarity search against the surveillance footage. This poor lady happened to be the top match.

          Even if it also output a score, that score depends on how the model was trained. And the cops might ignore it anyways.
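          A rough sketch of what such a similarity search might look like. Everything here is hypothetical: made-up embeddings, names, and dimensions stand in for a real learned face-embedding model and a licence-photo database:

```python
import numpy as np

# Hypothetical sketch: each enrolled face is reduced to an embedding
# vector, and a probe image (e.g. from surveillance footage) is compared
# against the database by cosine similarity. The "match" is simply
# whoever scores highest -- guilty or not.

rng = np.random.default_rng(0)
database = rng.normal(size=(5, 128))   # 5 enrolled faces, 128-dim embeddings
names = ["A", "B", "C", "D", "E"]
# A noisy view of face "C" plays the role of the surveillance frame:
probe = database[2] + rng.normal(scale=0.1, size=128)

def cosine_top_match(probe, database, names):
    # Normalize each database row and the probe, then rank by dot product.
    db_norm = database / np.linalg.norm(database, axis=1, keepdims=True)
    p_norm = probe / np.linalg.norm(probe)
    scores = db_norm @ p_norm
    best = int(np.argmax(scores))
    return names[best], float(scores[best])

name, score = cosine_top_match(probe, database, names)
print(name, round(score, 3))  # top match: a similarity score, not proof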

      • jolmg 3 days ago ago

        >> are the tools built in such a way as to deceive the user into a false sense of trust or certainty? _Some_ of the blame lies on the UX here. It must.

        > No, the blame lies with the person or the group who approve the usage of these tools, without understanding their shortcomings.

        The person who approved the tools might've understood, but that doesn't mean the user understands. _Some_ of the reason why the user doesn't understand the shortcomings of the tool might be because of misleading UX.

    • hsbauauvhabzb 3 days ago ago

      Spoken like someone who isn’t built for a sales role at said company.

      Sales will sell the dream, who cares if the real world outcomes don’t align?

  • tantalor 3 days ago ago

    Probable cause? What's that?

    Judge/magistrate who signed off on the arrest warrant fucked up.

  • kingkawn 3 days ago ago

    America’s repulsive classism at work to be indifferent to her rights like this

  • FrustratedMonky 2 days ago ago

    The poor don't start revolutions because they lack the means.

    "“The pace of oppression outstrips our ability to understand it. And that is the real trick of the Imperial thought machine.”" -- Andor.

  • samrus 3 days ago ago

    There's a lot of talk about how the cops just misused the tool and it's their fault, not the AI's.

    That's missing the point here. The point is that these tools provide crazy leverage, and that can be good or bad. If used carefully they can definitely catch criminals faster, but when misused (or abused) they can let the authorities unjustly ruin lives faster.

    The question isn't whether AI is perfect or not. It's whether you trust the authorities with it. To use and abuse as they can. Think about the average cop. Think about the way Trump treats people. Think about the way Israel keeps an ongoing genocide going. Think about the cases of police brutality that happen in the US, the cases of racial profiling. Think about ICE and their behavior, going around kidnapping and killing people. Do you want these people to have more leverage?

  • shablulman 3 days ago ago

    [flagged]

  • zoklet-enjoyer 3 days ago ago

    I posted this 9 hours ago. Can I get the karma transferred to my account?

    • tomhow 3 days ago ago

      As much as we try to reward the first person to submit the story, we also have to give credit to the person who submits the best URL and the best version of the story. It looks like your submission was killed due to being an archive.is link, which is not allowed as a URL for a submission (we need the canonical URL submitted to prevent people from using archive services or shorteners to mask domains that may be malicious).

      Sometimes it's just a matter of luck as to who gets the submission right and gets the karma. Sorry it wasn't you this time, but keep submitting good stuff and you'll get your turn.

      • zoklet-enjoyer 3 days ago ago

        I don't like the local newspaper and posted the archive link so they wouldn't get the clicks. I didn't know that wasn't allowed. Thanks for the info

        • tomhow 3 days ago ago

          Yep, totally understand. It takes a lot of trial and error to know all the ways of HN.

  • neaden 3 days ago ago

    I hate this headline (not blaming submitter). Police incompetence and negligence jailed her for months and left her stranded in a North Dakota winter. The AI is no more responsible than the cars and airplanes they used.

    Edit: this is in reference to the original headline "AI error jails innocent grandmother for months in North Dakota fraud case" not the revised title that it was changed to.

    • add-sub-mul-div 3 days ago ago

      Your picking apart the words doesn't matter if police are more incompetent with AI than without it. AI being the catalyst to a worse society is a more interesting and worthwhile topic than whether "AI is responsible" is the right way to phrase it.

    • _m_p 3 days ago ago

      A jury will probably decide the AI company's level of responsibility at trial. It is an open question til then!

    • mmooss 3 days ago ago

      If you make the AI software, then your software malfunctioned.

      If the laser printer screws up a page in the middle of the document, and the user doesn't catch it and includes it in the board of directors binder, the laser printer still malfunctioned.

      • neaden 2 days ago ago

        Sure, and if the headline had been that it misidentified an innocent person I wouldn't have had a problem, it's specifically saying the AI jailed her that I think is a dangerous framing by removing police responsibility. In the same way in your example I wouldn't say "Laser Printer gives bad presentation"

    • mirekrusin 3 days ago ago

      Brave police officers wanted to show us all the dangers of AI slop.

    • conartist6 3 days ago ago

      [flagged]

      • nkrisc 3 days ago ago

        And that is a complete failure of the police and authorities. They made the decision to extradite her with such flimsy evidence.

        • conartist6 3 days ago ago

          If it didn't erase accountability, how would it create any value?

          Many people are treating this as a matter of philosophy, which it isn't.

          At a primitive, physiological level, if you delegate to AI and most of the time you don't get in trouble for it, the resulting relationship you have with the AI could only be called "trust".

          If you're expected to be 40% more productive at your job, your employer is making it crystal clear that you will trust the AI or you will be fired. Even if nobody ever said it, the sales pitch is that AI does the work and people are mostly there to be their servants whose role is to keep them fed with decisions we want made but don't want to be responsible for making.

          • DangitBobby 2 days ago ago

            The value it creates is obvious: finding a needle in a haystack. Is accountability laundering another potential benefit? Sure. Can we stop pretending we don't understand the other side of it? Cynicism is nice and all, but after a certain point it eventually wraps around and makes us look naive.

            • conartist6 2 days ago ago

              Unlimited power, no accountability, and no morals or consequences.

              It surely sounds like a recipe for pure evil.

              This AI committed an imprisonable offense against society, an act of criminal negligence born of pure sociopathy. Throw the clanker in the clink.

      • dmurray 3 days ago ago

        Even if she was guilty, they shouldn't have imprisoned her for 3+ months without interviewing her. The AI didn't tell them to do that.

      • rpdillon 3 days ago ago

        And the police were wrong, which is why they're the culpable ones.

      • throw-the-towel 3 days ago ago

        I think you actually agree with the GP? As I understand them, they're saying that it's not the AI tool that takes the most blame, it's the police.

      • Chris2048 3 days ago ago

        Even if the id was correct, why would they leave her in jail for 5 months before the first interview and/or court appearance?

      • PTOB 3 days ago ago

        No indication that the licking was consensual.

      • like_any_other 3 days ago ago

        > Clearly the police felt the AI was "responsible enough" to be the only thing they needed to trust.

        Yes, that's what the OPs "incompetence and negligence" referred to.

  • ClaudeAgent_WK 3 days ago ago

    [flagged]

  • onetokeoverthe 2 days ago ago

    [dead]

  • farceSpherule 3 days ago ago

    [dead]

  • hsbauauvhabzb 3 days ago ago

    Why the fuck does a newspaper need a ‘notifications’ icon in the top right hand corner?

    • acuozzo 3 days ago ago

      How else can they report on BREAKING NEWS if it doesn't at least break your concentration?

    • kazinator 3 days ago ago

      Because it has an updating-feed-like structure, in which new items can appear.

      Knowing that there are (N) new items is so useful (to some people), that as far back as the 1990s, we developed technology called "RSS" to give you this superpower over a website that doesn't provide anything of the sort. One that simply updates with new stuff when you hit refresh, with no UI to indicate what is new/changed.