
 




Microsoft Researchers Say NLP Bias Studies Must Consider Role of Social Hierarchies Like Racism (venturebeat.com)

As the recently released GPT-3 and several recent studies demonstrate, racial bias, as well as bias based on gender, occupation, and religion, can be found in popular NLP language models. But a team of AI researchers wants the NLP bias research community to more closely examine and explore relationships between language, power, and social hierarchies like racism in their work. That's one of three major recommendations for NLP bias researchers a recent study makes. From a report: Published last week, the work, which includes analysis of 146 NLP bias research papers, also concludes that the research field generally lacks clear descriptions of bias and fails to explain how, why, and to whom that bias is harmful. "Although these papers have laid vital groundwork by illustrating some of the ways that NLP systems can be harmful, the majority of them fail to engage critically with what constitutes 'bias' in the first place," the paper reads. "We argue that such work should examine the relationships between language and social hierarchies; we call on researchers and practitioners conducting such work to articulate their conceptualizations of 'bias' in order to enable conversations about what kinds of system behaviors are harmful, in what ways, to whom, and why; and we recommend deeper engagements between technologists and communities affected by NLP systems."
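The kind of bias measurement the surveyed papers run can be sketched in a few lines: take a corpus and score how strongly occupation words co-occur with gendered pronouns. Everything below (the corpus and the scoring function) is invented purely for illustration; real studies use large corpora and embedding- or model-based association tests.

```python
# Toy illustration of one kind of bias probe the surveyed papers discuss:
# measure how strongly occupation words co-occur with gendered pronouns.
corpus = [
    "the nurse said she would help",
    "the engineer said he fixed it",
    "the doctor said he was busy",
    "the nurse said she was tired",
    "the engineer said he was late",
]

def cooccurrence_score(occupation, pronoun, sentences):
    """Fraction of sentences mentioning the occupation that also use the pronoun."""
    hits = [s for s in sentences if occupation in s.split()]
    if not hits:
        return 0.0
    return sum(1 for s in hits if pronoun in s.split()) / len(hits)

# A skewed corpus yields skewed associations, which is the "bias" such probes report:
print(cooccurrence_score("nurse", "she", corpus))     # 1.0 in this toy corpus
print(cooccurrence_score("engineer", "she", corpus))  # 0.0 in this toy corpus
```

The paper's point is that a score like this only becomes meaningful once the researcher says what the skew harms, and whom.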


  • at Monkeyshit Corp must be exterminated, not appeased. It is fucking disgraceful what all this imported garbage has turned America into.
    • Damn straight, if everyone can come here and there's more competition for limited resources, we might end up with capitalism!

  • by LynnwoodRooster ( 966895 ) on Monday June 01, 2020 @07:18PM (#60133052) Journal

    Is it only because a person believes the results are free of bias? Must results always be exactly equal to distribution of race, creed, gender? Such that immigrants of color would be penalized because they tend to outperform native-born people of color on tests?

    This sounds like some folks are afraid that using a completely dispassionate method of reaching conclusions (smart systems/AI) may, in fact, show that there ARE differences amongst groups of individuals, and that would bring the walls of the SJW castle crashing down...

    • by ShanghaiBill ( 739463 ) on Monday June 01, 2020 @08:00PM (#60133186)

      Must results always be exactly equal to distribution of race, creed, gender?

      It depends on what the system is used for.

      The system that triggered this debate was a ML system that looked at historical data, compared it to a current defendant's case, and then made a recommendation whether the defendant should be released on bail or held in pre-trial confinement.

      It turned out to be very "racist", denying bail far more often to black defendants.

      But it wasn't "wrong" because race is actually very strongly correlated with jumping bail. Blacks are significantly more likely to fail to show up for trial.

      But as a society, we have decided that is unacceptable. Everyone should be treated as an individual, not as a "black guy" or "white guy".

      So they tried and failed to fix the system by tweaking the inputs. If race was excluded, the system was still nearly as racist because it found correlations between the defendant's zip code, type of offense (blacks use crack cocaine, whites prefer powder cocaine), and even the defendant's name (Deshaun is more likely to be black than Travis). No matter what they did, the outputs were still "racist".
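The proxy effect described above, sometimes called "redundant encoding", is easy to reproduce on synthetic data. The zip-code distributions below are fabricated purely to illustrate the mechanism:

```python
# Toy sketch of "redundant encoding": even with race removed from the inputs,
# a correlated proxy (here a synthetic zip code) lets a model recover it.
# All numbers below are fabricated for illustration.
import random

random.seed(0)

def make_record():
    race = random.choice(["A", "B"])
    # In this toy world, residential segregation makes zip code a 90% proxy:
    if random.random() < 0.9:
        zip_code = random.randint(10000, 10499) if race == "A" else random.randint(10500, 10999)
    else:
        zip_code = random.randint(10500, 10999) if race == "A" else random.randint(10000, 10499)
    return race, zip_code

records = [make_record() for _ in range(10000)]

# A rule that never sees race, only zip code, still recovers it most of the time:
def guess_race(zip_code):
    return "A" if zip_code < 10500 else "B"

accuracy = sum(1 for race, z in records if guess_race(z) == race) / len(records)
print(f"race recovered from zip code alone: {accuracy:.0%}")  # ~90% in this toy setup
```

Which is why simply deleting the race column does not make a model blind to race.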

      So if the result is socially unacceptable, and tweaking the inputs doesn't work, the obvious solution is to tweak the outputs.

      Just let the system do its work and then apply a "race filter" on the output to select the same percentage of whites and blacks for bail. This will result in slightly more no-shows, but I think that is an acceptable tradeoff.
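The output-side "race filter" proposed above amounts to what the fairness literature calls demographic-parity post-processing: pick a separate score cutoff per group so each group is released at the same rate. A minimal sketch, with fabricated risk scores (this illustrates the idea, not any deployed system):

```python
# Per-group thresholding so each group is released on bail at the same rate
# (demographic parity). Scores below are fabricated for illustration.

def parity_thresholds(scores_by_group, release_rate):
    """For each group, find the score cutoff that releases `release_rate` of it."""
    thresholds = {}
    for group, scores in scores_by_group.items():
        ranked = sorted(scores)  # lower score = lower predicted flight risk
        cutoff_index = int(len(ranked) * release_rate)
        thresholds[group] = ranked[cutoff_index - 1] if cutoff_index > 0 else min(ranked) - 1
    return thresholds

scores_by_group = {
    "white": [0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1.0],
    "black": [0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1.0, 1.1, 1.2],
}
cutoffs = parity_thresholds(scores_by_group, release_rate=0.5)

for group, scores in scores_by_group.items():
    released = sum(1 for s in scores if s <= cutoffs[group])
    print(group, cutoffs[group], f"{released / len(scores):.0%} released")
```

Note the tradeoff the parent concedes: the groups now face different cutoffs, which is exactly what later replies object to.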

      • by russotto ( 537200 ) on Monday June 01, 2020 @08:24PM (#60133232) Journal

        The original system (COMPAS) never included race as an input. It had 137 features, none of them race (nor name), which makes it hard to analyze. Some researchers [researchgate.net] made a similar predictor with only two features -- age and number of convictions. It performed about as well, and was about as "racist".
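A toy version of such a two-feature predictor; the weights and records below are invented, and the only point is that a model this small can produce group disparities once the feature distributions themselves differ by group:

```python
# Toy two-feature risk score: age and prior convictions only. Weights and
# data are invented for illustration.
import math

def risk(age, priors):
    """Toy logistic risk score: younger defendants and more priors -> higher risk."""
    z = -0.05 * age + 0.6 * priors + 0.5
    return 1 / (1 + math.exp(-z))

# Two synthetic groups whose *feature* distributions differ (race is never an input):
group_x = [(35, 0), (40, 1), (50, 0), (45, 2)]   # older, fewer priors
group_y = [(22, 2), (25, 3), (19, 1), (28, 4)]   # younger, more priors

avg_x = sum(risk(a, p) for a, p in group_x) / len(group_x)
avg_y = sum(risk(a, p) for a, p in group_y) / len(group_y)
print(f"avg risk, group X: {avg_x:.2f}; group Y: {avg_y:.2f}")
```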

        Just let the system do its work and then apply a "race filter" on the output to select the same percentage of whites and blacks for bail. This will result in slightly more no-shows, but I think that is an acceptable tradeoff.

        Yeah, but that gives away the game.

        • Re: (Score:2, Flamebait)

          by AmiMoJo ( 196126 )

          Some researchers made a similar predictor with only two features -- age, and number of convictions. It performed about as well, and was about as "racist".

          Probably because the cops are more likely to give a young white guy a stern talking to and take his weed away, whereas a young black guy will end up getting charged.

          • Totally. That must also be why young black men are convicted of murder many many orders of magnitude more often than old white women. Because cops just give the old white women murderers a stern talking to.

      • What's the point of using the system if you're just going to ignore the results? If it's showing disparities for something that should not matter, then it should be able to identify what other weights factor into the outcomes. I'd wager that if you fed it even more data, particularly socio-economic information, you'd find out that being black has very little to do with it and that the driving factors would be being poor and from a single-parent household, which are the same main predictors for criminality in the first place.
        • What's the point of using the system if you're just going to ignore the results?

          They shouldn't ignore the results. But they shouldn't just use them blindly either.

          If it's showing disparities for something that should not matter

          Often something that shouldn't matter actually matters.

          I'd wager that if you fed it even more data, particularly socio-economic information, you'd find out that being black has very little to do with it

          I would gladly accept that wager because that is what they did, and they didn't get the results you are predicting. Political correctness is not the same as correctness.

        • > Ignoring the results here is like doing an analysis that tells you that CO2 contributes to global warming, but deciding to build more coal power plants just because phasing them out would make people uncomfortable.

          <Cries in Australian>

      • How do you correlate

        race is actually very strongly correlated with jumping bail

        with

        select the same percentage of whites and blacks for bail. This will result in slightly more no-shows

        are blacks slightly more, or strongly more likely to jump bail?

      • But as a society, we have decided that is unacceptable

        We haven't; a vocal subset of society has decided that it is unacceptable, and the people in charge have decided that they really really don't want to be called racist even if it's completely unfounded. The vast majority of people, on the other hand, are perfectly fine with the idea that high risk individuals should be treated as high risk, even if they happen to be an inconvenient colour.

      • by AmiMoJo ( 196126 )

        It turned out to be very "racist", denying bail far more often to black defendants.

        But it wasn't "wrong" because race is actually very strongly correlated with jumping bail. Blacks are significantly more likely to fail to show up for trial.

        So you are saying that the system is biased against black people, and that black people are less likely to participate in the system that is biased against them so it's not "wrong"?

        Seems that if you are white and you know you will probably get off with a slap on the wrist and a fine you are much more likely to not skip bail than if you are black and expect to do significant time, doesn't it?

        • Seems that if you are white and you know you will probably get off with a slap on the wrist and a fine you are much more likely to not skip bail than if you are black and expect to do significant time, doesn't it?

          Sure, if you're a moron. Skipping bail guarantees you won't get off with a slap on the wrist. It's a self-fulfilling prophecy.

          • by AmiMoJo ( 196126 )

            The term you are looking for is "vicious circle."

            • Call it what you will, it's fucking stupid. Either way it doesn't change the fact that since blacks are more likely to skip bail, you can't expect them to be granted bail at the same rate as whites.

              • by AmiMoJo ( 196126 )

                Either way it doesn't change the fact that since blacks are more likely to skip bail, you can't expect them to be granted bail at the same rate as whites.

                That's textbook racism. Someone is black so they can't get bail because other people with a similar skin tone have skipped bail.

                Justice is supposed to be blind, i.e. treat everyone fairly and the same.

                • That's textbook racism.

                  It's not racism at all, let alone textbook. The ever increasing abuse of the word over the last two decades has rendered such accusations completely meaningless.

                  Someone is black so they can't get bail because other people with a similar skin tone have skipped bail.

                  No, that's not what anyone is suggesting. You just made it up.

                  Justice is supposed to be blind, i.e. treat everyone fairly and the same.

                  Yes, and when we make it completely blind (as in the originally referenced AI model) we find that blacks are denied bail at higher rates than whites.

                  You don't want justice to be blind; you want justice to see clearly, and then discriminate in a way that makes you feel like there's no racism.

                  • Dude, you are very, very wrong. If you are treating someone differently on the basis of their skin color, guess what? That's racism.

                    Yes, the word has been abused and people have tried to change it. This isn't it, though.

                    • Dude, you are very, very wrong. If you are treating someone differently on the basis of their skin color, guess what? That's racism.

                      And, again, nobody is suggesting doing that. What you seem to be unable to comprehend is that a disparity in outcomes does not have to be driven by "treating someone differently". Different outcomes for different race/sex/age/culture groups arise even when the system is completely blind to those characteristics. I know that this seems impossible to cultural relativists and biology-deniers, but it is true nonetheless.

                    • You wrote this:

                      Either way it doesn't change the fact that since blacks are more likely to skip bail, you can't expect them to be granted bail at the same rate as whites.

                      That is racism. It doesn't matter if 10% or 90% of black people skip bail; you don't deny bail based on what other people with the same skin tone have or haven't done.

                      There is no controlling how people treat each other; that is where different outcomes arise, and they won't be equal across race/sex/age/culture. That is MUCH different from a legal system that says "others with your skin tone have done this, so bail denied."

                    • That is racism. It doesn't matter if 10% or 90% of black people skip bail; you don't deny bail based on what other people with the same skin tone have or haven't done.

                      No, it isn't; your reading comprehension skills are abysmal. The selection which you quoted in no way states or implies that anyone should be denying bail based on skin tone.

                    • Again, you wrote this:

                      Either way it doesn't change the fact that since blacks are more likely to skip bail, you can't expect them to be granted bail at the same rate as whites.

                    • Again, you wrote this:

                      Either way it doesn't change the fact that since blacks are more likely to skip bail, you can't expect them to be granted bail at the same rate as whites.

                      Yes, I did. And, again, it neither states nor implies the idea that anyone should be denied bail based on their skin tone.

                      Is English your second language? Are you still having trouble understanding how differences in outcome can arise between groups even when factors like race aren't used as inputs? What exactly is the misunderstanding here?

                    • Yes, I did. And, again, it neither states nor implies the idea that anyone should be denied bail based on their skin tone.

                      Holy shit. You don't even understand what you type. I'll try one more time. You wrote:

                      you can't expect them to be granted bail at the same rate as whites.

                      You're trying to say that "neither states nor implies the idea that anyone should be denied bail based on their skin tone", but it DOES.

                    • ESL then?

                      If I tell you that blue people steal 20 times more often than red people, therefore you can't expect them to be arrested at the same rate, does that imply we need to go out and start arresting all the blue people? Or does it mean that they're going to get arrested at a higher rate even if the person doing the arresting is colour blind?

                      If you can answer that question then maybe we can figure out whether your issue is with language or with logic.
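The blue/red hypothetical above is just arithmetic; with invented numbers, a perfectly colour-blind arresting process still produces very different arrest rates:

```python
# Arithmetic behind the blue/red hypothetical: if the offence rate differs 20x,
# a colour-blind process still arrests the groups at very different rates.
# All numbers are invented for illustration.
population = 1000            # per group
blue_offence_rate = 0.20     # 20% of blue people steal
red_offence_rate = 0.01      # 1% of red people steal (20x lower)
catch_rate = 0.5             # colour-blind: same chance of being caught

blue_arrests = population * blue_offence_rate * catch_rate   # 100
red_arrests = population * red_offence_rate * catch_rate     # 5
print(blue_arrests / red_arrests)  # 20.0 -- the disparity survives blindness
```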

      • Just let the system do its work and then apply a "race filter" on the output to select the same percentage of whites and blacks for bail. This will result in slightly more no-shows, but I think that is an acceptable tradeoff.

        I don't find that acceptable. Treating everyone exactly the same regardless of race is the only way to actually avoid racism, whether you like it or not.

    • No one denies there are differences.

      What they are against is treating different groups differently in the law and in common social environments. They are against treating the individual based on the group they are lumped with.

      What possible reason could you have to argue that we should be biased, and jail working black people who use marijuana, and let white college students smoke all they want?

      That there are differences doesn't mean we should act so as to keep perpetuating some of the negative differences. You can't get an ought from an is.
      • Re: (Score:2, Interesting)

        No one denies there are differences. What they are against is treating different groups differently in the law and in common social environments. They are against treating the individual based on the group they are lumped with.

        Is that what is happening here? Or are the results - with ZERO input on race - coming out in a way you don't like, and thus must be racist?

        What possible reason could you have to argue that we should be biased, and jail working black people who use marijuana, and let white college students smoke all they want?

        In this case, the study didn't even include race - but the results said "deny bail" more times to black defendants than white (as a percentage of population) and thus must still be racist. Even though the "AI" system didn't even know about race.

        That there are differences doesn't mean we should act so as to keep perpetuating some of the negative differences. You can't get an ought from an is.

        Likewise, we should not penalize someone for being of a different group - even if that group is "better" somehow. Stop using race a

        • In this case, the study didn't even include race - but the results said "deny bail" more times to black defendants than white (as a percentage of population) and thus must still be racist. Even though the "AI" system didn't even know about race.

          So you train your AI on racist data and get a racist AI...
          Who knew...

          • Fail. Reading comprehension really isn't your strong suit, is it? I guess it's just racism inherent in the system that results in higher crime rates for 3rd+ generation native-born blacks? Or is it racism inherent in the system that shows vastly lower crime rates for 1st and 2nd generation black immigrants relative to those 3rd+ generation native-born blacks?
            • Does America have a racism problem, yes/no ?
              • First: tell me how you judge whether or not there is a racism issue. Then define what you mean by racism.
                • So you can't/won't answer, got it.
                  • I just want to know what rules you plan to twist to get your way.

                    Tell ya what: you first tell me how racism only applies to 3rd+ generation blacks, and then we can talk about if there is racism in the first place.

                    • Tell ya what: you first tell me how racism only applies to 3rd+ generation blacks, and then we can talk about if there is racism in the first place.

                      Easy, it doesn't.

                      Your turn:

                      Does America have a racism problem, yes/no ?

                    • It doesn't? So racism holds back all black people? Is that why 1st and 2nd generation black immigrants succeed so well - much better than native born 3+ generation blacks? Because racism?
                    • GOTO 10 [slashdot.org]
                      Already asked and answered.
                      You're setting up a strawman anyway. Being disadvantaged doesn't mean you can't succeed.

                      Still your turn:

                      Does America have a racism problem, yes/no ?

                      Also already explained to you here [slashdot.org]

                      Selection bias.
                      Immigrants of all kinds outperform the comparative native born people for any group.
                      People motivated enough to move to another country for a better opportunity are more likely to take advantage of that opportunity. They not only benefit financially, making it easier for their kids. But instill that kind of work ethic in their offspring.

                    • Do you get paper cuts? Sure. Do you have a problem with it, where it is a constant issue that hinders your life? No.

                      In the same vein, does racism exist? Sure. Does pedophilia exist? Absolutely. Does the US have a problem with either where it is a constant issue endemic throughout society? No.

                      People motivated enough to move to another country for a better opportunity are more likely to take advantage of that opportunity. They not only benefit financially, making it easier for their kids. But instill that kind of work ethic in their offspring.

                      So the problem isn't that racism exists, but that somehow people lose their drive to succeed and take advantage of the opportunity that this country affords. I believe - if you were actually honest - you'd agree.

                    • 95% of LEO killings are of men. Does America have a sexism problem based on this data?

                      The answer will be the same for your question and mine. Hit it.

                    • Tell a black person racism is just like a paper cut. I'm sure that will go down well.

                      You are claiming Black people 'lose their drive' more than white people.
                      Isn't that racist of you?

            • You really know nothing about this do you Lynnwood.
              This Google search for racist AI [google.com] will help get you started
    • This sounds like some folks are afraid that using a completely dispassionate method of reaching conclusions (smart systems/AI) may, in fact, show that there ARE differences amongst groups of individuals, and that would bring the walls of the SJW castle crashing down...

      This sounds like someone who doesn't understand that dispassionately training (smart systems/AI) on racist data will result in a racist AI.
      That would bring the walls crashing down on everyone pretending there isn't any racism in the original data.

      How do you explain the racism getting in there? What are you planning to do about it?
      Or are you just happy to expand racism into future technology/generations because you pretend you can't see it?

      • How do you explain the racism getting in there?

        How do you know it's racist? What data do you have to draw that conclusion?

        What are you planning to do about it?

        Nothing, if it's not racism - which it does not seem to be.

        Can you explain why 1st and 2nd generation African immigrants vastly out-perform [qz.com] 3rd+ generation native-born blacks in scholastic and financial success? Why do they have dramatically lower incarceration rates [nber.org] - meaning they commit fewer crimes? Are we "racist" only against some blacks - but not others?

        • Are we "racist" only against some blacks - but not others?

          Yes. Not every place in America is equally racist.

          Can some blacks be successful despite the racism? Yes.
          Did you have a point?

          Does America still have a racism problem, or is it all solved? yes/no ?

          • So 1st and 2nd generation blacks only settle into places without racism. That's your answer?
            • So 1st and 2nd generation blacks only settle into places without racism. That's your answer?

              No that's your conclusion.

              Are you claiming they spread out perfectly around the country?
              Is that what your data says? I'd be happy to read it if you can show it.

              Or are you instead claiming all parts of America are exactly as racist as the rest?

              And you still didn't answer this:

              Does America still have a racism problem, or is it all solved? yes/no ?

              • No, that's what the data shows. Look at the academic and economic success of the two groups: one is quite on-par with most immigrants, and indeed very close to whites; indeed, 2nd generation immigrant blacks outperform whites educationally, and are very close economically. Then by the 3rd+ generation, it starts to fall apart.

                If your premise is true - that America is fundamentally a racist nation - then how does it "allow" those 1st and 2nd generation blacks to succeed so well? Perhaps because the US does

                • If your premise is true - that America is fundamentally a racist nation - then how does it "allow" those 1st and 2nd generation blacks to succeed so well?

                  That's your usual strawman.
                  Just because some blacks succeed (as I've already shown you a few times now [slashdot.org]) doesn't mean racism doesn't exist.
                  It's either fundamentally dishonest or profoundly stupid for you to claim that Lynnwood.

                  Selection bias.
                  Immigrants of all kinds outperform the comparative native born people for any group.
                  People motivated enough to move to another country for a better opportunity are more likely to take advantage of that opportunity. They not only benefit financially, making it easier for their kids.

    • by AmiMoJo ( 196126 )

      NLP stands for "Natural Language Processing", i.e. they are trying to make their system understand normal human speech patterns. So there is no need for belief, you can simply test it to see if it scores as well with various different cultural speech patterns and accents.

      Try running this through Siri: https://youtu.be/tFdUmLEJqtY [youtu.be]
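The test described above can be made concrete with the standard word error rate (WER) metric: compare the recognizer's transcript against a reference for speakers from different accent groups. The transcripts below are fabricated, since a real test would feed recorded audio to the recognizer:

```python
# WER: word-level edit distance divided by reference length, the standard
# way to score a speech recognizer's output against a reference transcript.

def word_error_rate(reference, hypothesis):
    ref, hyp = reference.split(), hypothesis.split()
    # Classic dynamic-programming edit distance over words.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1, d[i][j - 1] + 1, d[i - 1][j - 1] + cost)
    return d[len(ref)][len(hyp)] / len(ref)

# Hypothetical recognizer output for the same sentence from two speakers:
reference = "set a timer for ten minutes"
print(word_error_rate(reference, "set a timer for ten minutes"))  # 0.0
print(word_error_rate(reference, "set a time for tin minutes"))   # higher
```

Running this per accent group and comparing the averages is exactly the kind of disparity test the parent suggests.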

    • by jbengt ( 874751 )

      Is it only because a person believes the results are free of bias?

      It is hard to show that all bias is eliminated, especially if you are unaware of your own subjective biases, but it is not impossible to create objective criteria. These studies, though, show that it is harder than previously thought.

      Must results always be exactly equal to distribution of race, creed, gender?

      Obviously not.

      This sounds like some folks are afraid that using a completely dispassionate method of reaching conclusions (smart syst

  • Is there absolutely nothing in this world that these folks don't insist on mucking up and slowing down?

    If you want to improve the world by working on a bias issue that matters, turn on the news this week. Any channel, it doesn't matter.

  • What is NLP? (Score:5, Insightful)

    by Bradmont ( 513167 ) on Monday June 01, 2020 @07:22PM (#60133066) Homepage

    If you're going to use an acronym that much in your post, please define it. After a quick search it seems OP is talking about Natural Language Processing.

    • Re:What is NLP? (Score:5, Informative)

      by gurps_npc ( 621217 ) on Monday June 01, 2020 @07:42PM (#60133150) Homepage

      I agree, but it is also theoretically possible to be:
      Neuro-Linguistic Programming.

      The confusion exists because the original source at Cornell was incompetent: despite including the acronym in the title of the paper, it did not expand that acronym in the synopsis.

  • by sjames ( 1099 ) on Monday June 01, 2020 @07:38PM (#60133134) Homepage Journal

    This is basically just a complicated way of putting words in people's mouths. Suddenly making a neutral statement using the vocabulary you grew up with becomes a "micro-aggression" or even frankly x-ist.

    Followed closely by creeping meanings. New really for-real neutral words get insisted upon. Those words promptly gain all of the connotations of the old 'unacceptable' words. Because in the final analysis, it's not the words themselves but the intended meaning behind them that makes the difference.

    Newspeak was just taking it to the extreme of not actually having words that might have a negative connotation. That is, restricting the ability to express oneself until it becomes impossible to express anything that isn't officially approved. But even then, you can still feel it.

    • by gweihir ( 88907 )

      On the plus-side, attempts to force you to change your vocabulary in this way are just plain aggression, no "micro" about it. It does allow for easy identification of the fuckups that have nothing actually worthwhile to contribute but want to dominate the conversation anyways.

    • by hondo77 ( 324058 )

      ...making a neutral statement using the vocabulary you grew up with...

      My grandparents grew up using the words "colored boys" to refer to black teenagers, which they continued to use in their grandparent years. They didn't mean anything by it (at least I'm pretty sure they didn't), it's just what they were used to using.

      Vocabulary changes. Deal with it.

      • by sjames ( 1099 )

        My grandmother also used "colored". That was the proper term to be used in her day. It was the term used by groups advocating for racial equality in her day.

        • It was the term used by groups advocating for racial equality in her day.

          And still is. Pop quiz: What does the "C" in "NAACP" stand for?

    • by AmiMoJo ( 196126 )

      What does growing up with something have to do with it? My mum had a gollywog growing up, doesn't make them any less racist now and doesn't make her a racist since she discarded it when she came to understand why it was a problem.

      Anyway, this has nothing to do with people, it's about computers understanding what is said to them. Natural Language Processing, i.e. what Siri does to make sense of your request.

      • by sjames ( 1099 )

        TFA wandered well beyond computers understanding speech. TFA did, in fact, include the implicit assumption that NLP doesn't work well for (some) African-Americans because software engineers are racially biased.

  • Interpreted, it means: We must replace socially acceptable biases for socially unacceptable biases.
  • As the recently released GPT-3 and several recent studies demonstrate, racial bias, as well as bias based on gender, occupation, and religion, can be found in popular NLP language models [venturebeat.com]

    And water is wet. If you read a book by a German or South American author then the text has to reflect the background of the writer. What do you expect? Why is this deemed in need of correcting?

    I just wonder if anything like this “research” is being investigated in places like China or Sa
  • by dryo ( 989455 ) on Tuesday June 02, 2020 @02:47AM (#60134424)

    ARGH. God I hate TLAs. Undefined three letter acronyms... what the hell? Who writes this stuff? And who is it for? All of the uber nerds in the world in a giant circle jerk? I thought for sure this was referring to neurolinguistic programming, but that didn't make any sense. Define your terms unless you don't care to communicate with anyone outside your little club of shibboleth-slinging jargon monsters.

  • As the recently released GPT-3 and several recent studies demonstrate, racial bias,
    as well as bias based on gender, occupation, and religion,
    can be found in popular NLP language models.

    Making a summary by cutting and pasting from the first paragraph of a paper, written
    in cryptic abbreviations, is not at all helpful to readers.
