Minneapolis Bans Its Police Department From Using Facial Recognition Software (techcrunch.com)

An anonymous reader quotes a report from TechCrunch: Minneapolis voted Friday to ban the use of facial recognition software for its police department, growing the list of major cities that have implemented local restrictions on the controversial technology. After an ordinance on the ban was approved earlier this week, 13 members of the city council voted in favor of the ban, with no opposition. The new ban will block the Minneapolis Police Department from using any facial recognition technology, including software by Clearview AI. That company sells access to a large database of facial images, many scraped from major social networks, to federal law enforcement agencies, private companies and a number of U.S. police departments. The Minneapolis Police Department is known to have a relationship with Clearview AI, as is the Hennepin County Sheriff's Office, which will not be restricted by the new ban.
  • FRAAS (Score:4, Insightful)

    by PPH ( 736903 ) on Friday February 12, 2021 @08:15PM (#61057872)

    Facial Recognition As A Service

    Just send the pics off to Google, some division of the Tata Group, or maybe Experian. They do the recognition and return the identity for a fee. (A rough sketch of what such a call might look like follows this thread.)

    • I'm not sure if you are joking, but I think you are closer to the truth than you know. The number of cameras available for surveillance is steadily increasing. The police departments don't have the people/resources to deal with photos and videos from these cameras. Automation is the key, but that is politically difficult.

      So I think Jane/Joe citizen will take their Ring cam picture of their package thief, run it through the private facial recognition system, and then Jane/Joe will add that list of possibl

      • by PPH ( 736903 )

        I'm not sure if you are joking

        I'm never quite sure myself.

        There are plenty of examples of local police departments "subcontracting" tasks that they can't afford to do or are not allowed to do to private entities or federal TLAs. It isn't a stretch of the imagination to step from "There's an unknown suspicious looking person on my doorstep! Send help!" to "My Ring camera shows a suspicious looking person on my doorstep! Here's the photo and included biometric ID. Send help!" The local police respond to the complaint as submitted. If I r

    • hmm.. yes.. Amazon's one-year moratorium on police using its Rekognition software ends in June or so. They may find other pretexts to keep the data flowing in the meantime.
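
    A minimal sketch of the "recognition as a service" loop described in this thread: a photo goes out over HTTP, a list of candidate identities comes back for a fee. The endpoint, API key, and response fields below are hypothetical assumptions for illustration only, not any real vendor's API.

      # Hypothetical "Facial Recognition As A Service" client sketch (Python).
      # The URL, key, and response shape are invented; no real product API is implied.
      import requests

      API_URL = "https://api.example-frs.invalid/v1/identify"  # hypothetical endpoint
      API_KEY = "REPLACE_ME"                                    # hypothetical credential

      def identify_face(image_path):
          """Send one photo to the hypothetical service and return candidate matches."""
          with open(image_path, "rb") as f:
              resp = requests.post(
                  API_URL,
                  headers={"Authorization": "Bearer " + API_KEY},
                  files={"image": f},
                  timeout=30,
              )
          resp.raise_for_status()
          # Assumed response shape: {"candidates": [{"name": ..., "confidence": ...}, ...]}
          return resp.json().get("candidates", [])

      if __name__ == "__main__":
          for match in identify_face("doorbell_frame.jpg"):
              print(match["name"], round(match["confidence"], 2))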
  • 1..2..3.. (Score:2, Insightful)

    by Vinegar Joe ( 998110 )

    It's wacist!!!

    From the article:
    "Many privacy advocates are concerned that the AI-powered face recognition systems would not only disproportionately target communities of color"

    • by I75BJC ( 4590021 )
      All the reports on facial recognition (that I recall reading) say that minority (brown, black) faces are not recognized by the software as well as they are by human eyes. Doesn't this mean that FR can't "read" minority faces and that FR is not a threat to minority communities?

      Is this another Woke decision, or one with a basis in reality?

      I don't know the answer to my above question and have not read any reports that FR works as effectively with minorities as with majorities.
      Plus, I don't like FR for general governmental use.
      YMMV
      • by XXongo ( 3986865 )

        All the reports on facial recognition (that I recall reading) say that minorities (brown, black) are not recognized as well as human eyes. Doesn't this mean that FR can't "read" minority faces and that FR is not a threat to minority

        Nope. If you read the articles, it means that when the facial recognition system is shown a black face of a criminal in a video, it randomly identifies any face and says "that's the guy!"

  • Pokey man (Score:3, Funny)

    by Tablizer ( 95088 ) on Friday February 12, 2021 @08:23PM (#61057900) Journal

    Don't post your real portrait anywhere. I use Pikachu instead. If I do something bad, then Pikachu goes into the pokey.

  • by xavdeman ( 946931 ) on Friday February 12, 2021 @08:29PM (#61057922)

    "Minneapolis City Council Member Jamal Osman talked about the chilling effects of government surveillance against minorities and immigrants"

      Of course if the crime rate among your subset of the population is higher, you are going to get investigated more often. This technology would help target criminals *more specifically* than the alternative, random police stops in crime-ridden neighborhoods, would. Be careful what you wish for.

  • by superdave80 ( 1226592 ) on Friday February 12, 2021 @08:54PM (#61057970)

    Many privacy advocates are concerned that the AI-powered face recognition systems would not only disproportionately target communities of color,

    OK, I'm not sure why that would be, but for the sake of argument I'll go along with that. However:

    ... but that the tech has been demonstrated to have technical shortcomings in discerning non-white faces.

    Um, if it can't identify minorities as well as whites, wouldn't it be LESS effective against minorities? Someone want to help me out, because this entire statement makes no sense to me.

    • ... but that the tech has been demonstrated to have technical shortcomings in discerning non-white faces.

      Um, if it can't identify minorities as well as whites, wouldn't it be LESS effective against minorities? Someone want to help me out, because this entire statement makes no sense to me.

      No, it means all minorities look the same to it. Same as many police. Does not mean it looks on them favorably.

    • by XXongo ( 3986865 ) on Friday February 12, 2021 @11:08PM (#61058280) Homepage

      ... but that the tech has been demonstrated to have technical shortcomings in discerning non-white faces.

      Um, if it can't identify minorities as well as whites, wouldn't it be LESS effective against minorities?

      It is less effective in correctly identifying minorities. It is more effective in incorrectly identifying a minority as a criminal. (See the back-of-the-envelope numbers at the end of this thread.)

      • But facial recognition is not the final check on if someone is the perpetrator.
        • But facial recognition is not the final check on if someone is the perpetrator.

          Yes, and a 911 call that brings a SWAT team to your house because somebody reported a hostage situation is not final evidence of guilt either.

          Nevertheless people can die when they are misidentified as being a violent criminal and the police are sent out to apprehend them with the mistaken belief that they are armed and dangerous.

          But, yeah, you're right: in the best case, when everything else works right, you just get handcuffed and taken in to the station, and after a night in jail they release you saying
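
    A quick back-of-the-envelope illustration of the point above: in a one-to-many search, a higher per-comparison false-match rate gets multiplied across the whole gallery, so a group the system handles poorly collects more wrongful hits, not fewer. The rates below are made-up numbers chosen only to show the arithmetic, not measurements of any real system.

      # Illustrative arithmetic only: the false-match rates are invented, not benchmark data.
      GALLERY_SIZE = 1_000_000   # faces in the hypothetical search database

      FMR_GROUP_A = 1e-5         # assumed per-comparison false-match rate, better-handled group
      FMR_GROUP_B = 1e-4         # assumed rate for a group the model handles 10x worse

      def expected_false_matches(fmr, gallery_size):
          """Expected number of innocent people flagged in a single one-to-many search."""
          return fmr * gallery_size

      for label, fmr in (("group A", FMR_GROUP_A), ("group B", FMR_GROUP_B)):
          hits = expected_false_matches(fmr, GALLERY_SIZE)
          print(label + ":", "about", round(hits), "false matches per search")
      # group A: about 10 false matches per search
      # group B: about 100 false matches per search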

  • by biggaijin ( 126513 ) on Friday February 12, 2021 @09:32PM (#61058066)

    It is clear that all commonly used methods of identifying criminals are racist, since so many people of color are imprisoned in the US. It's all part of the racist plot dominant cultures have organized against the oppressed, which also now includes weather reporting, mathematics, and history.

  • In fact, why should the MPD even try to catch criminals? The oppression of criminals is white supremacist racism.

  • ...this is the same City Council that was advocating the elimination of the police... while simultaneously paying $4,500 in taxpayer funds PER DAY for their own armed private security.

    https://www.insider.com/minnea... [insider.com]

  • Thinking ahead here: as human/computer interfaces evolve, I can't help but think that this law is all but certain to ultimately be found discriminatory against people who might have otherwise untreatable memory disorders.

    It makes more sense to me, instead of banning the technology from the department, to just set specific limits on how it is allowed to be used.

    Hand guns are almost universally used to kill people, and killing people is bad. Yet we do not abolish hand guns.

  • If you think about it, facial recognition also has the capacity to exonerate a suspect as well as finger one. Would you rather be found guilty by a human witness who may or may not have seen you commit a crime and may finger you out of revenge? Or would you rather have an AI make a decision without emotion?
