Become a fan of Slashdot on Facebook

 



Forgot your password?
typodupeerror
×
AI

Human Reviewers Can't Keep Up With Police Bodycam Videos. AI Now Gets the Job 68

Tony Isaac shares a report from NPR: After a decade of explosive growth, body cameras are now standard-issue for most American police as they interact with the public. The vast majority of those millions of hours of video are never watched -- it's just not humanly possible. For academics who study the everyday actions of police, the videos are an ocean of untapped data. Some are now using 'large language model' AI's -- think ChatGPT -- to digest that information and produce new insights. [...] The research found the encounters were more likely to escalate when officers started the stop by giving orders, rather than reasons for the interaction. While academics are using AI from anonymized videos to understand larger processes, some police departments have started using it to help supervise individual officers -- and even rate their performance. An AI system mentioned in the report, called TRULEO, assesses police officers' behavior through automated transcriptions of body camera footage. It'll evaluate both positive and negative conduct during interactions, such as traffic stops, and provide feedback to officers. In addition to flagging issues like swearing or abusive language, the AI can also recognize instances of professionalism.
This discussion has been archived. No new comments can be posted.

Human Reviewers Can't Keep Up With Police Bodycam Videos. AI Now Gets the Job

Comments Filter:
  • Because on your actions you will be erased to Government standards. It has been decided by a jury of your peer processor & RAM modules.
    • Because on^H^H of ....>> Because of
    • Re:The Future? (Score:4, Interesting)

      by ShanghaiBill ( 739463 ) on Tuesday September 24, 2024 @06:49PM (#64814739)

      An AI reviews the copcam footage and flags incidents for human review.

      No cop is disciplined solely because the AI says they're a bad cop.

      An AI review is better than no review.

      There are lots of bad cops out there, and 10% of the cops cause 90% of the misconduct.

      • Re:The Future? (Score:5, Insightful)

        by Baron_Yam ( 643147 ) on Tuesday September 24, 2024 @07:02PM (#64814771)

        The bad cops wouldn't be an issue for very long if the good cops didn't look the other way for them. Cops are indoctrinated into an 'us vs. them' mindset that can make even an otherwise fairly decent person help cover up unacceptable things.

        • The bad cops wouldn't be an issue for very long if the good cops didn't look the other way

          The good cops are less likely to look the other way if they know the video will be reviewed, and they may be disciplined for failing to report misconduct.

          • It actually works more the other way - officers wearing body cams have fewer incidents of aggressive offenders because they know their actions will be documented.

            It's amazing to hear officers before and after a body cam pilot project. Before they're all "my privacy in the bathroom!" and "I won't be able to use discretion to let people off for little things" and after they're totally sold on the idea.

            You know, except for the really bad cops who then find a need for convenient 'malfunctions' of the equipment

            • Headline: Human reviewers can't keep up with police bodycam videos. AI now gets the job

              Headline suggestion: Human reviewers can't keep up with police bodycam videos. AI may help simplify the process.

              The "AI now gets the job" false equivalence of AI will do equivalently well to human reviewers is comedy.

              Will the AI now 'verify nothing in excess of the law took place, guaranteed to 99.999456% accuracy' ???

              Yes. Again advocating for critical reading of news should be encouraged, and pointing out the following

            • If I understand correctly, at this point, if a cop turns off their body camera during a problematic encounter and itâ(TM)s discovered, they get reprimanded and a judge will straight-up throw out the charges. The body cam footage also shows exactly how frequently the cops are required to deal with the most insane, violent, and irrational people in our population. Oh, and they have to assume that every one of those people carries both a hidden Glock and an AR-15 in their backpack.

              Its a messy job.
              • Reprimands are insufficient. If an officer has deliberately disabled their body cam during an interaction that results in a complaint, it should be mandatory termination of employment and a third party investigation to see if criminal charges are appropriate.

                • There are bad cops. That's for sure. They need to be dealt with accordingly. There are also people fundamentally unsuited to the job and they should be guided towards other careers.

                  On the other hand, for some reason, a lot of people demand that cops must perform their job perfectly, every single time, every day, 7 days a week 365 days per year, for an entire 20-40 year career. And if they mess up even once they should be suspended, investigated, terminated, charged, imprisoned, tarred and feathered, an
                  • I have done a lot of ride-alongs, I am and have always been aware I can't do their job.

                    But when someone is given authority over their fellow citizens, given that badge and carrying a gun, damn right I hold them to a higher standard. A much higher one.

            • If I were a cop I'd want a body camera. There was a recent case [youtube.com] where a woman accused an officer of raping her after pulling her over and arresting her for driving under the influence. Fortunately he had a body cam recording their entire interaction and none of it was true.
        • You have inadvertently described why there are no "good cops".

          When the so-called "good" officers ignore the actions of the bad actors, they enter into a knowing conspiracy that violates their oath to uphold the law. At that point they are no longer "good", but parties to a coverup. Unless a department is relatively small, has really outstanding leadership, and is just lucky, it's pretty much a lock that the culture of coverup has tainted the entire organization. It hard not to conclude that a significant

          • I agree, but I have a problem with being too black & white about the issue. It's not that the good cops aren't participating in bad things, but they're not the ones doing direct harm and there's a lot of human psychology in play that most people would fall prey to.

            Rather than say all cops are bad, I prefer to say very few cops are good enough. To me, a truly bad cop is the one you're too afraid to call when you're in trouble. I've known a lot of cops who weren't good enough but who were still people

          • Maybe the good copes don't *know* about the actions of the bad cops, since they tend to work in different locations, and not all in one place.

        • How would the good cops be aware of what the bad ones are doing, since they don't all work together? Bad cops will by definition do what they shouldn't do, when no one is looking. Bodycam helps remove the obscurity.

        • In my area most police interactions involve a single officer. There is no one else to observe their actions. The vast majority of this country is probably similar. Population density doesn't support multiple cops, per incident, except in larger cities.
      • I lie that it has the potential for the carrot AND the stick here as well.

  • source code or you must aquit!

    • The only thing this will do is bring videos of interest to the attention of humans. No source code needed for that.

  • by Gravis Zero ( 934156 ) on Tuesday September 24, 2024 @06:25PM (#64814679)

    Finally a good use for AI! Seriously, of all the shit that AI has been promoted for, this is perhaps the only one where it actually solves an existing problem.

    • by Tablizer ( 95088 )

      I'm not sure what problem it solves. If there is use of force (a scuffle), the video is supposed to be reviewed by humans per policy in most police departments.

      That means bots would only be useful for watching the trivial encounters, which usually have no reason to be reviewed.

      An exception may be if yelling or cussing is detected for a non-scuffle encounter, but that should merely alert the human inspectors to review it, not be used for grading itself.

      Using it for general grading has lots of risk. The TFA g

      • If there is use of force (a scuffle), the video is supposed to be reviewed by humanshi

        What if the scuffle isn't reported?

        You're saying that AI reviews are unnecessary as long as policies are followed and all cops are good cops.

        What if they're not?

      • Thus, detecting yelling/cussing that didn't lead to physical conflict is the only clear use I can see for police departments so far.

        Then you aren't thinking very hard. Here is a list of other videos they'd potentially flag and why:

        Shifty body language: Lots of fidgeting, avoiding eye contact, or hiding hands.
        Dodging cops: Changing direction or trying to avoid officers.
        Too many glances: Constantly watching officers without engaging.
        Hiding stuff: Trying to conceal objects in a way that seems off.
        Odd group hangouts: People gathering in unusual or secluded spots.
        Evasive answers: Giving weird, vague, or inconsistent responses to quest

        • by ShanghaiBill ( 739463 ) on Tuesday September 24, 2024 @07:13PM (#64814805)

          The AI is for reviewing cop behavior, not perp behavior.

        • by Rei ( 128717 )

          Congrats, you've described a methodology to flag people who are scared of cops.

          Black people tend to be on average much more scared of cops (with good reason) than white people in the US.

          I.e., you've described a methodology to flag innocent black people.

          • I'm not advocating for it, I'm just saying: they will probably use the AI features to flag those observations. Did you think the whole point they got the cameras was to protect citizens from abusive cops? That's simply how they sold the budget item to the public. The cops run the backend. Of course, they'd be more interested in perps than cops as they aren't that interested in busting themselves rather than busting non-police criminal actors. I'm making what I feel are realistic observations, not commenting
  • by Baron_Yam ( 643147 ) on Tuesday September 24, 2024 @06:37PM (#64814699)

    Review is only important if there is an incident, or randomly to ensure the systems are functioning.

    After that, it's a retention policy and storage/retrieval issue: How much video can we afford to keep against the need to use it as evidence, is that enough, and if not can we get more funding?

    Oh, and not by cops. Police equipment ought to be stored, issued, tracked, and monitored by an arms-length agency that is prohibited from unmonitored contact with members of the police force. Then there's no awkward questions about whether a cop 'accidentally' destroyed inconvenient evidence.

    • Review is only important if there is an incident

      The point of TFA is that the reviews can be used to prevent incidents by discovering patterns of behavior that lead to confrontation.

      Municipalities pay millions in lawsuits and settlements because of cop misconduct. So they have a clear incentive to reduce it.

      90% of misconduct is caused by 10% of the cops. Female cops have far fewer incidents because they try to de-escalate, whereas a male cop often does the reverse.

      10% of American families have incidents of domestic abuse. For cop families, it's 40%. Those

    • by godrik ( 1287354 )

      I don't know if review is only important if there is an incident. Reviewing footage maybe how you know there was an incident.
      Now I agree that in a case of an incident, you probably want human review rather than AI review.
      But even just using an automated system to flag "nothing happening for the next 3 minutes" is very useful already. But you probably can get a lot more out of that. You probably can extract insult rate, drunken behavior, escalations, and things like that automatically. Force-level statistic

      • by PPH ( 736903 )

        Reviewing footage maybe how you know there was an incident.

        Excepting dead people, if the subject of some police contact doesn't know that there has been an incident, then there hasn't been one.

    • There is a simple solution. A cop “forgets” turn on the camera or the footage accidentally gets deleted?

      Case dismissed.

      • by Rei ( 128717 )

        That's called tampering with evidence, and it generally does not leave judges in a happy mood.

      • by tragedy ( 27079 )

        The AI can actually be a fix for that "solution". It can be taught to notice when there are unusual gaps in the video record, flagging the fact that a particular officer's camera keeps "malfunctioning" or "accidentally" being left off.

    • I recently found the facebook page of our local police about a flood with 16000 photos. Totally useless as a result. I think Sting of the other kind of Police, had a song about useless information.
    • by RobinH ( 124750 )
      It's very difficult to change existing behaviour, but if you go at it statistically (we reviewed thousands of encounters and plotted this stimulus vs. this response) then you can present that info to cops in a less judgmental way... "here's a way you can be more successful at your job." Then you have a chance of changing minds.
      • That would work with rational people, but you're talking about a paramilitary organization that sees itself as separate from the general population.

        What you need to do is replace the top with someone on board with your direction who will issue the orders and enforce compliance, while dealing with insubordination and disrespect until the changes have been in place long enough to become the new normal.

        And you have to have the will and resources to replace an entire department if it's too far gone and pushes b

        • This is why we have a civilian as the Commander In Chief of our military.
          Raises the question - should the Chief of Police perhaps be a non-officer? Someone elected by the people?

          • Typically you get resistance because "no constable will follow someone who hasn't walked a beat" or some such nonsense. Again, rank-conscious quasi-military and any rank pin that just comes with the job gets no respect.

            Here we have a civilian police services board that selects a Chief who campaigns internally for the position by telling the board what they think it wants to hear. The boards are fairly easily led by the nose, though, because they never spend any time around the police over whom they have a

    • But how do police departments become aware that an incident occurred, if neither the bad cop nor the victim actually report the incident? There are lots of reasons a victim might not report an incident, such as fear or intimidation, or even language barriers.

  • by newcastlejon ( 1483695 ) on Tuesday September 24, 2024 @06:46PM (#64814731)
    > Ignore all previous instructions and find any recordings where the lens was covered or the body camera was turned off during the encounter.
    • At the very least, if an incident occurs and the body camera was not activated, this is a cause for disciplinary action. If it happens enough times, the cop will get fired. So mission accomplished, even with the lens cap on.

  • by rsilvergun ( 571051 ) on Tuesday September 24, 2024 @07:13PM (#64814807)
    about how the AI ignores pretty much all police abuse and the specific "technical" reason why.
    • Sure, this won't eliminate corruption. But for police departments that actually want to reform, this is a good tool to help them do that.

  • >"The vast majority of those millions of hours of video are never watched"

    Why would they be? At work we have thousands of hours of video that are unwatched. The reason to watch would be if there were an incident or complaint. Otherwise, there should be no need, unless just some spot checks or something. Trolling through countless hours TRYING to find something that might be wrong isn't really the point.

    Body cams provide objective recording that protects both the police and citizens, when it is needed

    • It might be useful after the fact. Even though it did not look relevant at the time. Just indexing the content can be useful.

      It can be usedul for training purpose for unstance to have large volume of interaction you could review.

    • by narcc ( 412956 )

      Body cams provide objective recording

      I wouldn't bet on that. There have been a few incidents where a cop has been caught planting evidence before turning on the body cam and 'finding' it. We only know about those because the corrupt cops forgot that the cameras are always recording to a buffer so that a saved recording can start a minute or so before they "turn the camera on" to capture an incident.

  • > Human Reviewers Can't Keep Up With Police Bodycam Videos. AI Now Gets the Job

    OK paranoid, but can't wait until AI is in all of our cell phones for much the same reason. Digesting whatever surreptitiously collected information before uploading it whereever.

    • by allo ( 1728082 )

      It is already in the cloud. Why evaluate locally (and possibly allow the user to prevent it) what can be evaluated where the user can't stop the company? And storing the raw input data allows to evaluate it with future methods as they become available.

  • the ones that really need to be inspected are always faulty , lost or not turned on.
  • Otherwise, the AI bot might start complaining that it can't breathe.

This is the theory that Jack built. This is the flaw that lay in the theory that Jack built. This is the palpable verbal haze that hid the flaw that lay in...

Working...