AI Expert Falsely Fined By Automated AI System, Proving System and Human Reviewers Failed (jpost.com) 95

"Dutch motorist Tim Hansenn was fined 380 euros for using his phone while driving," reports the Jerusalem Post. "But there was one problem: He wasn't using his phone at all..." Hansenn, who works with AI as part of his job with the firm Nippur, found the photo taken by the smart cameras. In it, he was clearly scratching his head with his free hand. Writing in a blog post in Nippur, Hansenn took the time to explain what he thinks went wrong with the Dutch police AI and the smart camera they used, the Monocam, and how it could be improved.

In one experiment he discussed with [Belgian news outlet] HLN, Hansenn said the AI confused a pen with a toothbrush — identifying it as a pen when it was just held in his hand and as a toothbrush when it was close to a mouth. As such, Hansenn told HLN that it seems the AI may just automatically conclude that if someone holds a hand near their head, it means they're using a phone.
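
A rule that simple is easy to sketch. Below is a minimal, purely illustrative Python example of the kind of naive post-processing heuristic Hansenn suspects: flag "phone use" whenever a detected hand is close to a detected head, without requiring a phone to be detected at all. The Box type, labels, and distance threshold are invented for the sketch; this is not the Monocam's actual code.

    # Purely illustrative sketch, not the Monocam's actual pipeline.
    # Assumes a detector that returns labelled bounding-box centres.
    from dataclasses import dataclass

    @dataclass
    class Box:
        label: str
        x: float  # centre x, normalised to 0..1
        y: float  # centre y, normalised to 0..1

    def flags_phone_use(detections: list, max_dist: float = 0.15) -> bool:
        """Naive rule: any hand near a head counts as phone use.

        The rule never checks for an actual phone, so a driver
        scratching his head looks exactly like a driver on a call.
        """
        heads = [d for d in detections if d.label == "head"]
        hands = [d for d in detections if d.label == "hand"]
        return any(
            ((h.x - hd.x) ** 2 + (h.y - hd.y) ** 2) ** 0.5 < max_dist
            for hd in heads
            for h in hands
        )

    # Hand scratching the head, no phone anywhere in frame: still flagged.
    print(flags_phone_use([Box("head", 0.50, 0.30), Box("hand", 0.55, 0.35)]))  # True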

"We are widely assured that AIs are subject to human checking," notes Slashdot reader Bruce66423 — but did a human police officer just defer to what the AI was reporting? Clearly the human-in-the-loop also made a mistake.

Hansenn will have to wait up to six months to see if his appeal of the fine has gone through. And the article notes that the Netherlands has been using this technology for several years, with plans for even more automated monitoring in the years to come...

  • by TheMiddleRoad ( 1153113 ) on Sunday February 18, 2024 @01:49PM (#64249368)

    The problem is stupid people.

    • Stupid people trusting stupid software...
    • by munehiro ( 63206 )

      Problem is that there are a lot of stupid people in positions of authority.

    • More likely lazy people that didn't really look.
    • No, the problem is lazy or overworked people.

      This is going to keep happening. I'm going to guess the proposed solution is to have another AI check the work of the primary AI. Maybe have an AI "court" where the accused can appeal, AI lawyers present the evidence for and against, and a panel of AI jurors present their verdict to a master judge AI. All in a few seconds.

      Negotiation's over. Sentence is death.

      • Negotiation's over. Sentence is death.

        To be fair, not in the Nederlands.

        • by Anonymous Coward

          The Netherlands is a country that falsely accused thousands of people of fraud due to faulty software, and they only ever did something about it after the number of lives destroyed was around 25,000. And then it took a politician with balls to go out on a limb to make a point of it. And a lot of the people who were screwed won’t be paid out for years.

    • by Askmum ( 1038780 )
      The problem is people making stupid automated processes. Another example from the Netherlands, this one not even AI-related but merely OCR: a man receives 15 incorrect fines [archive.ph] because the computer (automated scanning using a vehicle-mounted camera) reads a number plate with a dented X as a K, then looks up the number plate, sees it has no parking validation, and automatically writes a fine. And you have to appeal every fine. And the camera keeps reading the wrong letter and keeps sending new fines.
    • by mjwx ( 966435 )

      The problem is stupid people.

      Artificial Intelligence will never be a match for Natural Stupidity.

  • by 93 Escort Wagon ( 326346 ) on Sunday February 18, 2024 @01:58PM (#64249384)

    The only mention of a possible human reviewer - at least in TFS - is in a speculation by a Slashdotter.

    TFA itself includes this sentence - "Since a human police officer would have had to approve this fine by looking at the picture, that means human error was at fault, too." I can see at least two possibilities:

    1) The automated detection has shown itself good enough that this particular human reviewer has gotten into the habit of not actually inspecting the photo, just out of laziness.

    2) Since these sorts of systems are typically used as a money-saving alternative to a human workflow, the total number of humans paid to do the job is significantly less than the number required to actually do it correctly - meaning they can't actually review the photo in any detail.

    My money's on #2 (although #1 and #2 aren't mutually exclusive). Even if the system didn't work all that well, I doubt there would be many challenges because, frankly, the majority of drivers probably DO use their phones while driving - so most people wouldn't bother to challenge, they'd probably just assume they'd gotten caught somewhere and pay the fine.

    • by sinij ( 911942 ) on Sunday February 18, 2024 @02:10PM (#64249414)
      Human reviewer likely failed intentionally, as there is no penalty of any kind and this is a revenue-generating enterprise. How many innocent people would just pay the fine as they are too busy and too well-off to care?
      • by Nkwe ( 604125 )
        Would the human reviewer need to show up in court if the ticket was contested?
      • Re: (Score:2, Interesting)

        by thegarbz ( 1787294 )

        Typical anti-government rant.

        1) No, sorry, there's no significant revenue raised by these cameras. They barely cover their own administration costs.
        2) The appeals process for the fine is trivial, and the infringement notice includes the image used to justify the fine, so no one is going to pay a fine they don't owe - especially not the Dutch.

        • Studies have shown that:
          1) It is possible to design these AI/electronic systems to be profitable as long as you do not care about fairness and the intended purpose of increasing safety. This means they end up charging innocent people but....
          2) Appeals processes can be designed to be trivial and work well, but they can ALSO be designed to be esoteric and deny without reason.

          Red light cameras are a prime example of this - if you set the traffic light to be just a tad bit short timing you can turn a safety fe

        • by mjwx ( 966435 )

          Typical anti-government rant.

          1) No, sorry, there's no significant revenue raised by these cameras. They barely cover their own administration costs.
          2) The appeals process for the fine is trivial, and the infringement notice includes the image used to justify the fine, so no one is going to pay a fine they don't owe - especially not the Dutch.

          Yep, if they're trying to raise revenue they're doing it wrong.

          If you want people to pay you want to make the cost trivial so that people don't think twice about it. Like what they do with parking fines here in the UK... £70 but if you pay within 14 days it's knocked down to £35, any appeal will take longer than 2 weeks so that's an extra 35 quid. In fact, why bother having an appeals process? Just make the penalty so trivial and any recompense so complex that it's just easier to pay it

      • That's the problem with cops. A decent chunk of the budget comes from steal... writing tickets to the taxpayers who pay their salaries. If the money for brand new cars and tanks comes from tickets, why would they care if the process brings out the truth? Line their pockets with corrupted money.

        In America, cops care more about using perverted laws, civil forfeiture, than any traffic ticket.

        • by Sique ( 173459 )
          That's the problem with people with strong opinions, but not much knowledge. The budget of the police in the Netherlands does not come from writing traffic tickets.
      • How do you know there is no penalty? Do you work there?
      • You don't know the Dutch, or not very well. No matter how much money, they'd all fight 380 euro fines unless they're really wealthy, in which case they'd let their lawyers fight for them. It's a matter of principle. The Dutch way: Never pay more than you have to!
      • Remember 'Thing' on the Munsters. I have a thing - cutoff hand holding a white chocolate packet sized box, with a windows blue screen of death pointing out. I cover it with a cloth not to distract other drivers. When I drive under a downward looking phone camera trap, I remove the cloth. No fines yet. But in Australia, they also snap looking for a seatbelt and beer on the passenger seat. I can think of something to offend them if I had the guts. But women who drive with a white seatbelt, over a white dres
        • by vivian ( 156520 )

          Seat belt cams are not revenue raising. If drivers or their passengers are too dumb to wear a seatbelt they deserve a fine, because they are statistically going to suffer more injuries in a crash, and raise the cost of mandatory car insurance that we all have to pay.

          Revenue raising is having a speed camera on the last 100m straight approach road to an onramp to a 100 km/h highway that has no intersections or turn offs, which is still only zoned 60 km/h then fining you for exceeding that limit, or having a s

          • And that is exactly the way it is in Australia. They have added 20 mph (40 km/h) zones and $267 fines for 4 mph over that limit. School speed cameras are active when the schools are closed (while the wider deemed school holiday periods are in effect). They do book people at the bottom of mountain-pass hills.
      • by AmiMoJo ( 196126 )

        I don't know about the Netherlands, but in the UK using your phone while driving is halfway to a driving ban, plus a hefty fine, and your mandatory insurance premiums shoot up too.

        Some police forces don't send the photo with the initial letter either, so you have to ask them for it in order to see if the accusation is BS or not. They really are hoping you just pay them.

        Fun fact. If the automatic number plate recognition system screws up and they send the accusation to the wrong person, you have a claim for misuse

    • Introduce a bill that if the AI incorrectly identifies the driver as using a cell phone, and a human could tell that is not the case, then the police department would need to pay the falsely accused $100,000.
      If the police department says "No, we're not going to do that", then that proves that they have no faith in the AI system and their human backup. If they agree, then you will bet that the human will filter out any false positives.
      • by thegarbz ( 1787294 ) on Sunday February 18, 2024 @05:10PM (#64249708)

        Or, since there's an appeal process in place, and that appeal process is as trivial as logging into a website, and all information necessary for the fine is provided in the infringement notice, you can fuck off with your $100,000 fine for 5 minutes* of work.

        While you chase your perfect system (presumably with the intention of breaking every government institution so they can't actually enforce anything on anyone, because ... you know ... nothing is perfect), I'd rather just let the occasional false positive follow the simple process set up for them to deal with it.

        *Literally 5 minutes. Yes I've had to appeal a fine from the Dutch police, for *not* running a red light.

        • by Anonymous Coward
          My mom has never used a computer in her life (she's refused, and now it's too late for her to learn). That website appeals process is not usable by millions of people like my mom.
          • My mom has never used a computer in her life

            There's a phone number on the form through which you can appeal too. Stop pretending you found problems in a system you've never used nor understand.

        • This still doesn't seem right. It should not be the burden of the accused to deal with the false accusation. Lots of stupid nonsense takes 5 minutes… and guess what? More and more of everyone's day should not be taken up by stupid nonsense. The point may be strengthened by pointing out that driving is a privilege requiring various responsibilities, not a right, but that still seems weak.
          • This still doesn't seem right. It should not be the burden of the accused to deal with the false accusation.

            Again, chasing perfection is the death throe of all humanity's projects.

            Lots of stupid nonsense takes 5 minutes… and guess what? More and more of everyone's day should not be taken up by stupid nonsense.

            The occasional 5 minutes is a net positive to society vs a manual review of every single case to avoid those 5 minutes.

          • This still doesn't seem right. It should not be the burden of the accused to deal with the false accusation. Lots of stupid nonsense takes 5 minutes… and guess what? More and more of everyone's day should not be taken up by stupid nonsense. The point may be strengthened by pointing out that driving is a privilege requiring various responsibilities, not a right, but that still seems weak.

            You know, posting to Slashdot is a privilege too, not a right. So please put that iPhone away :-)

        • Good money making trick, clever Dutch police, fining for not running red lights...
          • Well, it's technically not running a light if I was forced to drive into the intersection by an ambulance. I did cross the red light detection sensor with my car. The video also showed clearly that I was legally permitted to do so at the time, given the circumstances.

            Press button online, a few months later I get a mail saying my fine was nullified. No money was made by anyone.

    • by N1AK ( 864906 )
      It's normal for camera systems like this in Europe to share photo evidence at the point you are informed of the fine. Given the details of this story, this almost certainly happened here. If that is correct then this does seem like typical FUD about AI. The system made a simple mistake based on overly simplistic analysis, but probably still has an accuracy notably in excess of 99%. People fined see the picture and can tell clearly enough for themselves if there is evidence, contesting is fine and the appeal
      • I avoid making and receiving calls while driving, hands free or otherwise.

        The idea that hands free calls are significantly better - that somehow the issue is having your hand up by your ear rather than the distraction of being involved in a phone call, while you should be focusing on driving - is frankly laughable [washingtonpost.com]. But people like to pretend to themselves that they're making responsible choices even when in truth they're simply justifying their own selfish behavior, to use your terminology.

        • by dryeo ( 100693 )

          It's very hard to steer and shift with a phone in your hand, which seems more dangerous than using hands-free. You are right that even hands-free can be distracting, but at least you can have both hands on the steering wheel, or one on the shifter and one on the wheel.

    • You don't need the AI part to get to this point.

      My grandfather was once issued a ticket by the Philadelphia Parking Authority for running a red light, despite the fact that a) the included red light camera photo showed a vehicle that was not my grandfather's, with b) a license plate that was not my grandfather's, and c) my grandfather had been dead for seven months.

    • by tlhIngan ( 30335 )

      The only mention of a possible human reviewer - at least in TFS - is in a speculation by a Slashdotter.

      TFA itself includes this sentence - "Since a human police officer would have had to approve this fine by looking at the picture, that means human error was at fault, too." I can see at least two possibilities:

      1) The automated detection has shown itself good enough that this particular human reviewer has gotten into the habit of not actually inspecting the photo, just out of laziness.

      2) Since these sorts of

  • by sinij ( 911942 ) on Sunday February 18, 2024 @02:08PM (#64249404)
    There needs to be compensation assigned for cases where the AI malfunctions and human reviewers are asleep at the wheel. This is because time and effort have to be spent to appeal, in effect making one prove one's innocence.
    • by codebase7 ( 9682010 ) on Sunday February 18, 2024 @02:34PM (#64249452)
      More accurately this crap needs to be abolished. AI monitoring and auto-fining people because a human can't be bothered to generate money for their employer? (Cops can't be bothered to write tickets from a cushy desk at the station!?)

      Nope, kill this full stop. It's far too easy to abuse. Especially once it becomes the norm.

      Innocent Person: "I was fined $500 because a surveillance camera thought I was on private property. I was at a Walmart across the street and here's the video and GPS timestamp to prove it."
      Cop: "Welp, the AI is never wrong. So pay up."

      That's before you have it hooked up to every camera in existence (Police partnerships like Amazon's), start auto jailing people, or having politicians / three letter agencies use it against their rivals / former lovers / etc.....
      • I am glad they are deploying tech like this to stop rampant mobile phone use behind the wheel. And I am also ok with no human looking at it if the number of false positives is sufficiently small. We're not talking about felony charges here, and we don't even have a points system for driver's licenses. It's easy enough to contest these tickets, and when you do, a human will look at the picture and clear the fine.

        I fully agree that for more serious charges, the kind that can be life altering by merely b
        • by Khyber ( 864651 )

          " And I am also ok with no human looking at it if the number of false positives is sufficiently small"

          Hi, I'm a false positive. Let's hear that said directly to my face. Watch how fast you become a statistic.

          • Sounds like you belong in jail anyway. Regardless, the process has a means to address false positives: a simple appeals process.
      • Cops can't be bothered to write tickets from a cushy desk at the station!?

        I'd prefer cops not to be sitting at a desk at the station. There's better use of public resource funds than that. There's a trivial appeals process in place for any fine. Not every system needs to be perfect, and it's absurd to think that involving a human would make it so.

    • Yeah I agree. The appeals process takes 5 minutes. The median wage is 3400 EUR / month gross. That means they should be compensated to the tune of 1.61EUR for their time wasted. Oh and because this is the Netherlands they should pay tax on that 1.61EUR.

    • There is no presumption of innocence for non-criminal matters.
    • As long as they don't automatically take it out on the people at the bottom of the totem pole. I've done my share of blue-collar work and 15-hour days, and people in those fields are always under extreme time pressure and unreasonable quotas. I'm so glad I don't have to do that work anymore.

    • by AmiMoJo ( 196126 )

      There already is compensation, but you have to pursue it yourself.

      If the AI makes this kind of mistake, and they go to the national database to get the details of the car owner (not the driver, as they don't know who was driving at the time, only who owns the vehicle), that is misuse of personal data under GDPR rules. You can claim compensation for the abuse.

      It used to be £750 in the UK, but it must be more now due to inflation.

    • Exactly this. Once there is a penalty for both the AI and the human in the loop failing, things will get improved very fast. Say 10x the fine would be good. That will cover the stress, the time wasted by the accused, even some lawyer fees if needed, etc.

      Either the AI will get a lot better or the human in the loop will actually be there, not just supposedly there, blanket-agreeing with everything the AI says. Assuming they are really there in the first place.

    • by Tom ( 822 )

      There needs to be compensation assigned

      In a world meant to be fair and just, yes.

      In the real world, meanwhile, the police have all kinds of immunities, special little laws and other protections in place to make sure such a thing as accountability is reserved for the few cases so extreme that there really isn't any denying them anymore.

      in effect making one prove innocence.

      That's how the system is built. For years I've wondered why fully automated speed cameras send you the fine two months later. The entire process is automated, from the picture taken to the letter sent. If there is a huma

  • ...are very troublesome
    They make mistakes and can't be cross-examined in court
    ALL law enforcement actions should be reviewed by a competent police officer who is paying attention

    • ALL law enforcement actions should be reviewed by a competent police officer who is paying attention

      But that costs money, which defeats the purpose of automating the system in the first place!

      • I believe the purpose is suppressing corruption. Automation in traffic fines has always been about situations like "Do you know who I am?" and "friend-of-a-friend who works at the police station". In some countries, for 20+ years, traffic offences photographed by plate readers have been in the car owner's mailbox by the next morning or so, without human intervention. You get the picture printed within the notification. The fine is only reviewed if you appeal ("it was not me driving").

    • The system takes a photo which can be reviewed by an officer (and by law can be requested by the person being fined).
    • ALL law enforcement actions should be reviewed by a competent police officer who is paying attention

      They are when you appeal them. A picture of your infraction accompanies the fine. Or if it is something time based like running a red light a full video is provided. Click a button online and it goes through a manual appeals process.

      There's no need to talk about courts, or anything like that. Automatic processes are perfectly fine when the majority of cases are obvious and when those which aren't have a simple appeals process.

  • I completely understand the rationale behind outlawing cell phone related activities which require you to take your eyes off the road, but a voice call is still a voice call regardless of whether you're talking through the handset or the in-vehicle Bluetooth.

    Personally, I find taking calls through Bluetooth to be more distracting than using the handset, because you have to turn the volume up to near booming levels to be able to hear the other party over the road noise, and then practically shout in response

    • by ledow ( 319597 )

      "Personally, I find taking calls through Bluetooth to be more distracting than using the handset"

      And studies have conclusively shown multiple times that having a conversation with someone who's not in the car (and thus doesn't pause to let you consider responses, etc.) is far more distracting than radios, passenger conversations or a variety of other activities.

      It's literally nothing to do with call quality or speaking. It's to do with the person you're speaking to being remote to you and it severely affec

  • by SuperKendall ( 25149 ) on Sunday February 18, 2024 @03:09PM (#64249522)

    Turns out the way AI crushes humanity is not through robots creating war machines to kill us all, but instead via AI fining us arbitrarily to build up a state that makes more AI systems to fine people for other things, along with a backing layer of AI judges that deny appeals to being fined by the AI.

    • We've already had artificial intelligence in government for thousands of years.

    • No. Our downfall will be making AI reinforce our deeply held beliefs. In this case, it is the belief that drivers use their phones while driving, so the AI will define everything in your hand as a phone.

      The downfall is that the human never questions this despite the evidence showing otherwise. (the victim will surely question it, but the Authority never will)

  • He was just shaving while reading on his kindle while making a small omelet, all completely legal.

  • Dutch motorist Tim Hanssen (double s, ends on single n) wrote a blog post about it. And here it is: https://nippur.nl/tim-versus-politie-algoritme/ (go find a translator if you can't read Dutch)

    Linking to a random international site (jpost) that quotes (but does not link to) another random site (HLN, not exactly known for its fine journalism) without trying to find the original is beyond bad journalism.

  • I'm sure it's just a matter of time before the AI adds the cellphone to your hand in the picture they send. Everything is always about money, remember?

  • So where is the picture? Is it one of those grainy ones where he is trying to create enough doubt by saying he was just scratching, or a clear photo where he is obviously innocent (note any doubt should fall on the side of the accused)? I've watched a few of my friends try to get off by claiming you can't 100% see it is a phone in their hand (when they knew damn well it was) and fail.
  • The problem with automated fining systems is always the wrong incentives.

    There is no incentive to get rid of false positives; in fact, the more the merrier! There is no incentive for any "reviewer" (if there is one) to avoid issuing a wrong ticket, just let them sort it out in the courts; in fact, reviewers who reject too many automated tickets only jeopardise their own jobs!

    If the police chief needs to pay out of his own pocket for every wrong ticket issued by the automated system, you will very quick

  • > Hansenn will have to wait up to six months to see if his appeal of the fine has gone through.

    What's going on in Dutch Land that a photo of "clearly scratching his head" takes six months to process? This seems to indicate that:

    a) The authorities are not swimming in cash from the fines of innocent drivers.

    b) They don't have enough reviewers due to a).

    • I draw a different conclusion. They have NO ONE to review the appeals, because they assumed that the computer is always right. They are currently working on an AI implementation to review the appeals.

  • Rather than putting a human in the loop before sending out the fines, make the recipient the final tester. If a false one is sent out, the victim gets paid €1,000 as an apology - but the taxpayer avoids paying for the human in the loop.

  • by Tom ( 822 )

    but did a human police officer just defer to what the AI was reporting?

    Of course.

    If there is a proper way and a lazy way to do things, you can always rely on humans eventually taking the lazy way.

    Police officers, sadly, seem to be especially prone to that. There's a reason the donut-eating cop is such a popular concept. Of course not all and not all the time, but there's usually some truth to such myths.

    So did the human police officer just defer to the AI? Almost certainly, yes. If he can just press the "ok" button and be done with it, or actually examine the photo and make a

  • You won't be able to drive w/o being watched; big brother is coming to fine you. Britain / London is in an uproar where cameras/detectors are fining cars for too much pollution. And they're being torn down nightly! In my county, we now have speed cameras in our school zones. During morning and evening when kids are coming/going to school, it's 35mph when the lights are flashing, 45mph at other times. A lot of people are complaining that they weren't speeding and the owner of the car gets the speeding ticket. M
  • Big corporations are drinking the kool aid on AI. All the chat systems will be AI, many of the decision-making systems will be AI, complaints & appeals will be AI. It will be virtually impossible to talk to or speak to a human. Corporations will bury the information needed to find a human DEEP - they already used to do it, but now they'll think they don't even have to put the information anywhere. Their systems will be garbage in and garbage out, nobody quite sure what criteria they're using to judge anything by and no way

  • It's police in general. Several years ago I had the tags stolen off my vehicle. Filed a police report. Two weeks later I received 4 red-light tickets. Cameras had seen my plate on a black van.

    The problem is I do not own a black van. My registration clearly stated I drove a red pickup truck.

    To make matters worse, even after submitting the police report, a copy of my registration, and an appeal, it would be SIX months before I heard a decision. I was expected to pay the fine and be reimbursed if the appeal went in

  • AI only works when people check the results.
    So, why have AI at all?

  • Morons will be morons.

  • During the 2008 subprime mortgage crisis, when the housing bubble burst, it was discovered that there were tons of bad mortgages getting approved. All of these were supposedly reviewed by a person, but those human reviewers were referred to as "F12 Monkeys" because they just sat there hitting the approve command for everything. (Unable to find reference now)

    When "self driving cars" were first being tested on public roads, there was a lot of research [phys.org] showing that people either place too much trust in the automati
