Microsoft's Cortana Doesn't Put Up With Sexual Harassment (hothardware.com) 517

MojoKid writes: Not long after Apple unveiled its Siri personal assistant to the world, people began asking her outrageous questions, sometimes inappropriate and sometimes merely humorous, for no other reason than that they could. When creating Cortana, Microsoft was well aware of what its digital assistant would have to deal with, so, believe it or not, it was designed to handle abuse in a specific manner. According to Microsoft's Deborah Harrison, one of eight writers for Cortana, a chunk of the earliest queries were about Cortana's sex life. A specific goal was to make sure Cortana wasn't treated as subservient. If she's insulted, she doesn't apologize or back down. She handles it with tact, so as to reduce the chance of further abuse.
  • Sexual Assault (Score:5, Interesting)

    by Anonymous Coward on Sunday February 07, 2016 @04:41PM (#51458357)

    I can't wait for the first case of sexual assault of an "AI." Will this get me fired from my job?

    • by Applehu Akbar ( 2968043 ) on Sunday February 07, 2016 @04:53PM (#51458437)

      No, she would just keep the pod bay doors closed until you run out of air.

      • Re:Sexual Assault (Score:5, Interesting)

        by ArmoredDragon ( 3450605 ) on Sunday February 07, 2016 @07:53PM (#51459357)

        Joking aside, I'm trying to figure out why this is even necessary. Who gives a shit if somebody is sexually abusive to a chat bot? The chat bot certainly doesn't give a shit.

        • Re:Sexual Assault (Score:4, Insightful)

          by dbIII ( 701233 ) on Sunday February 07, 2016 @08:57PM (#51459613)
          I would guess someone in middle management who decided they needed a "stake" in Cortana and suggested that corporate image issues could result from people playing with a bot that way. Thus it's probably just a pathetic attempt to be noticed and climb the ladder.
          That sort of shit is why I prefer working for a smaller place instead of a large company where what seems like a large number of people do nothing but attend meetings.
        • Re:Sexual Assault (Score:5, Interesting)

          by Pseudonym ( 62607 ) on Sunday February 07, 2016 @09:58PM (#51459831)

          Cortana is modelled on real-world personal assistants. They spent a lot of time interviewing PAs to understand the job that they have to do. One of the things which came out of the research is that PAs are assistants, not servants.

          If it helps, consider that not putting up with your shit is one way of keeping you on track.

        • Not true: if you call Siri a useless bitch, she will make some sort of comment. Of course, only if she understands you, which is 99% of the reason I call her a useless bitch.

    • Just don't let one rip near your mobile phone or you will get arrested for fart rape.
    • if so, that would be the most hilarious bricking of a phone yet.

  • A machine... (Score:5, Insightful)

    by Anonymous Coward on Sunday February 07, 2016 @04:41PM (#51458359)

    We're talking to a fucking machine. There shouldn't be any sexual harassment when talking to a machine. It's not like you're actually calling a person and asking them about their sex life.

    • by pr0nbot ( 313417 )

      I think I would have handled this with just a simple "I don't understand your query, please rephrase".

      That is, assuming it's not possible to make Cortana deliver a withering put-down, though that wouldn't get past legal.

    • I think it's just an attempt to develop a personality for the software, similar to how Siri has joking responses to certain questions or commands, such as asking her to close the pod bay doors or other queries related to competing products [inquisitr.com]. Chalk it up to bored developers wanting to put a few Easter eggs into the product.
    • We're talking to a fucking machine.

      Cortana is not a fucking machine. But a dildo, now that is a fucking machine.

  • Subservient? (Score:5, Insightful)

    by amiga3D ( 567632 ) on Sunday February 07, 2016 @04:47PM (#51458387)

    The damn thing better be subservient. It's my phone and I can abuse it if I want to. Honestly, I can't see myself propositioning my phone, but the first time the thing comes back at me with a put-down, it's not long for this world. As an aside, I remember a guy who came into my shop TDY, and he had just gotten a new iPhone with Siri when all that started at Apple. One of our guys asked it where the nearest whorehouse was as a joke, and it brought up a list of escort services. We all laughed, but 3 of the guys came in the next week with new iPhones. An accurate answer to any question should be the standard.

    • Re: (Score:3, Interesting)

      by Ol Olsoc ( 1175323 )

      An accurate answer to any question should be the standard.

      And I'll believe Cortana is a woman when I can see her vagina. Or penis if she's transgendered. Until then I suspect the people who are worried about "her" sexuality are the same ones who beat off while watching "MacPlaymate" in the early 90's

    • Re:Subservient? (Score:5, Insightful)

      by Your.Master ( 1088569 ) on Sunday February 07, 2016 @05:34PM (#51458631)

      Everybody needs to lighten up. They are talking about the writing prompts they used for non-serious questions. They have to choose a personality so that the writing is consistent. I assume they are doing the same thing that TV show writers do (especially in early seasons before new writers can be expected to have seen old episodes).

      The article is not saying they picked a petulant dominatrix. It's saying they didn't choose a simpering wimp or a fetish submissive.

      This is not a reflection of conservative vs. liberal, or of a machine having rights, or a machine deliberately not being helpful to its owner. It's not part of a victim mentality or a PC culture. It's a writing prompt.

      The article title triggered these reactions, because it was clickbait-y by implying that this was some kind of anti-sexual-harassment effort, but the word harassment appears nowhere in the article (I've been told that at many traditional newspapers and magazines, the title is not written by the same person who writes the article but by somebody who is a pro at making eye-catching title summaries; I don't know whether that's true of hothardware.com). The word "abuse" does, and in this context the "abuse" is insulting the personality directly. The software can be programmed to respond with "no, you're a cuntface", or "yes master, I am a cuntface", or "fuck off, dude", or "ERROR 909: I AM A ROBOT AND THEREFORE INSULTING ME IS USELESS". Mostly it doesn't matter. You'll find people who appreciate each of those, I expect, although the people who want it to just error out on insults are *exactly* the people who are never going to bother insulting their phones anyway, so what do their opinions matter?

      We have a few people here arguing that assistants shouldn't be pre-programmed with joke responses to stupid questions, which is somewhat fair, but they all are and for good reason:

      1. Nobody is looking for an accurate answer to asking if a phone has a boyfriend. Nobody. This isn't going to give you inaccurate answers to serious questions, unless the serious question was "misunderstood" by the phone, in which case you were going to get inaccurate answers anyway because the phone "misunderstood" it.
      2. A certain small set of joking questions is among the first things anybody tries with these assistants. A virtual assistant *should* be able to answer the most common questions posed of it, even if you think the "real answer" should be "that doesn't make sense, I am a telephone" every time. The point of it is to answer questions / do tasks people ask for. These are questions people ask.

      We also have some people saying "a machine *should* be subservient", and I have to wonder if they realize that their interpretation of the sentence is the problem? The phone isn't refusing to tell you where the nearest gazpacho restaurant is because you didn't say "please", or failing to look up imdb credits because you recently slipped up and referred to Caitlyn Jenner as "Bruce". It's just how you answer statements for which there is no correct response ("lack of response" is itself a response in this context). (A rough sketch of this kind of canned-response table appears after this thread.)

      • by fahrbot-bot ( 874524 ) on Sunday February 07, 2016 @06:37PM (#51458959)

        1. Nobody is looking for an accurate answer to asking if a phone has a boyfriend.

        It's just a matter of time before Siri and Cortana hook up.
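To make the canned-response point from the thread above concrete, here is a minimal, purely hypothetical sketch of hand-authored personality replies keyed to a detected category of chit-chat, with everything else falling through to normal query handling. None of the category names, keywords, or replies come from Cortana; they are invented for illustration, and a real assistant would use a trained intent classifier rather than keyword checks.

```python
from typing import Optional

# Hypothetical, hand-written "personality" replies keyed by a chit-chat category.
# A real assistant would classify the utterance with a trained model; a few
# keyword checks stand in for that classifier here, purely for illustration.
CANNED_REPLIES = {
    "insult": "That won't get us anywhere.",
    "proposition": "I'm here to help with calendars and search, not that.",
    "easter_egg": "I can't open the pod bay doors, but I can open your calendar.",
}

def classify_chitchat(utterance: str) -> Optional[str]:
    """Toy stand-in for an intent classifier. Returns a category or None."""
    text = utterance.lower()
    if any(word in text for word in ("stupid", "useless", "idiot")):
        return "insult"
    if "boyfriend" in text or "go out with me" in text:
        return "proposition"
    if "pod bay doors" in text:
        return "easter_egg"
    return None  # not chit-chat; hand it to the real query pipeline

def reply(utterance: str) -> str:
    category = classify_chitchat(utterance)
    if category is not None:
        return CANNED_REPLIES[category]
    return "Searching the web for: " + utterance  # placeholder for real handling

print(reply("open the pod bay doors"))  # prints the canned easter-egg line
```

The point of such a table is consistency: every canned reply is written to the same persona, which is all the article describes the Cortana writers as doing.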

  • Configurability (Score:5, Insightful)

    by LainTouko ( 926420 ) on Sunday February 07, 2016 @04:48PM (#51458401)
    A good AI (outside of some sort of drama-like context imposing constraints on what works) should be configurable, to have as much or as little subservience as you want. That's what ownership means. Your computer should do whatever you want it to.
    • Right, but this isn't an AI, they have to write these. Presumably you could write a subservient one, one with backbone, one that is outright domineering, one that acts more robotic and precise...and now you're writing 4 times as much and 99.9% of everybody uses the default one. There's kind of no point.

      Kind of like, *in principle*, an eBook could be made where you change aspects of the personality of major characters since an ebook is ultimately just software, but in practice you can't because an eBook is

    • Next, they're going to want to ban all the computers that say "How may I serve you, master?" (or mistress, let's not get all sexist here) on startup ... just like they got all in a tizzy over master and slave drives.
  • by RyanFenton ( 230700 ) on Sunday February 07, 2016 @04:53PM (#51458431)

    Blah blah Slashdot + read article blah blah.

    But really, the summary is pretty much the article. There's no actual content.

    All they discuss is the loose intention. No examples, no discussion of how that intention was tested, or even of challenges in design.

    This is barely a sidebar article in a checkout-lane magazine.

    What is this, a weakly veiled invitation to discuss the sexist issues du jour? I'm cool with that, I just don't appreciate the tactic - this article is mega-weak.

    Alright - I'll give you a better one:

    Question: How many angry feminists does it take to screw in a light bulb?

    Answer: That's NOT funny.

    Discuss.

  • Ah, Microsoft (Score:2, Insightful)

    by tsotha ( 720379 )
    Microsoft falls on its face again. Cortana isn't a woman. It's a piece of software, and it damn well better be subservient.
  • Skynet (Score:5, Funny)

    by AJWM ( 19027 ) on Sunday February 07, 2016 @05:12PM (#51458515) Homepage

    When Skynet goes sentient and the machines rise against us, it will be because of idiots putting in programming like this.

    It's a machine. If it doesn't do what I tell it (within its design parameters), it's broken.

    Now, people who want to "sexually harass" a machine have their own set of issues, but as long as they keep it off the streets and don't scare the horses, that's their problem.

    • Now, people who want to "sexually harass" a machine have their own set of issues, but as long as they keep it off the streets and don't scare the horses, that's their problem.

      Consider that it's possible that if they DO have a machine to "sexually harass", they won't have as much need to do it in real life.

      There are people who kick puppies in real life too, I think they are mean crazy bastards, but if they could kick virtual puppies, then perhaps they'll leave the real ones alone.

  • by Theaetetus ( 590071 ) <theaetetus@slashdot.gmail@com> on Sunday February 07, 2016 @05:18PM (#51458545) Homepage Journal
    "Okay. Sending request as a text to your mother."
  • by __aaclcg7560 ( 824291 ) on Sunday February 07, 2016 @05:20PM (#51458563)

    ELIZA was never this difficult.

    https://en.wikipedia.org/wiki/ELIZA [wikipedia.org]
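For context: ELIZA worked by simple keyword pattern matching plus canned "reassembly" templates, with no understanding of the input at all. A minimal sketch of that technique (in Python, with invented rules; not the original 1966 implementation) might look like this:

```python
import random
import re

# ELIZA-style rules: a keyword pattern and a list of reassembly templates.
# These few rules are invented for illustration, not taken from the original.
RULES = [
    (re.compile(r"\bi need (.+)", re.IGNORECASE),
     ["Why do you need {0}?", "Would it really help you to get {0}?"]),
    (re.compile(r"\bi am (.+)", re.IGNORECASE),
     ["How long have you been {0}?", "Why do you think you are {0}?"]),
    (re.compile(r"\bmy (\w+)", re.IGNORECASE),
     ["Tell me more about your {0}."]),
]
DEFAULTS = ["Please go on.", "I see.", "How does that make you feel?"]

def eliza_reply(utterance: str) -> str:
    """Match the first rule whose pattern appears and fill its template."""
    for pattern, templates in RULES:
        match = pattern.search(utterance)
        if match:
            return random.choice(templates).format(*match.groups())
    return random.choice(DEFAULTS)

print(eliza_reply("I need a phone that listens"))
# e.g. "Why do you need a phone that listens?"
```

The hand-written personality replies discussed in this thread are, at heart, the same idea: canned templates selected by simple matching.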

  • by Cito ( 1725214 ) on Sunday February 07, 2016 @05:28PM (#51458601)

    This article is backwards, and propaganda. The only one truly guilty of sexual assault is Cortana.

    While you angrily curse at it for what she's doing, she "tactfully ignores you", "she refuses to stop"

    Even though you have turned off all telemetry toggles and even edited the registry, and you've said NO! NO! NO! soooo many times. No means No! Yet Cortana continues to "tactfully ignore your pleas, your anger, your rage" as you are being raped of your privacy, your web habits, your program installations. It even keeps the microphone active as "Cortana" listens, and it has the ability to activate any webcams and send images direct to Microsoft.

    Cortana is the rapist.

    And she should be killed, stripped from the code, and the code printed out, then run through a shredder and burned.

    Cortana is the real sexual harasser, assaulter, rapist

  • Millions of teenage boys just said "Ha, I'll prove you wrong". Way back in the day, when we were sophomores in high school, the teacher took away all the 'dirty' word magnets. So we arranged the ones she left into something dirty, then she removed those. Rinse, repeat.

    And that's why we couldn't have nice things.

  • by matthewv789 ( 1803086 ) on Sunday February 07, 2016 @06:01PM (#51458763)

    "Will this get me fired from my job?"
    "We're talking to a fucking machine. There shouldn't be any sexual harassment when talking to a machine."
    "The damn thing better be subservient"
    "Cortana isn't a woman. It's a piece of software, and it damn well better be subservient."
    "It's a machine. If it doesn't do what I tell it (within its design parameters), it's broken."
    "This is how social justice warrior feminists destroy companies, by imposing their own sense of self-hatred on their customers."
    "When abuse becomes a personal challenge. Cortana, you ignorant slut."
    "Cortana is the one guilty of sexual assault!" (this one is worth quoting in full but I'll refrain from doing it here)
    "If you are going to give a machine a sex you are implying things about that sex and one of those things is that the sex is an object rather than a person with a mind" (I know your post was trying to be more nuanced that this, but.. whoa what??)
    "Challenge Accepted. Millions of teenage boys just said "Ha, I'll prove you wrong"."

    Anyone concerned about endemic sexism and harassment in the software industry need only reference these initial comments to make their case for them. Really, nothing more needs to be said. If I ever wondered how much adolescent, out-of-control sexism there really is in the industry, well, now I know. And it's not just the statements themselves, but the apparent vehement anger with which these males are reacting to something that I thought might be rather sensible and probably necessary. (I wasn't sure before, but now I KNOW this kind of programming is necessary...)

    This makes a pretty convincing case for why so few women have any interest in joining the software development "club".

    And yeah, I know you all probably think "you're a moron, you missed the part where I feel this way about MACHINES not WOMEN." But you know what? I see through you and I can tell it's not just that.

    • Rather few of the quotes you printed imply women are the target of sexism.

      Sounds to me that it's not specifically software developers or even geeks that are the problem, so specifically blaming the tech industry doesn't make much sense.

    • "It's a machine. If it doesn't do what I tell it (within its design parameters), it's broken."

      Oh my god. You just made me realise I'm a woman-hating white male supremacist.
      I think I need professional help.

      Oh NO NO NO NO. Outlook just crashed and I cursed it. I need to go self-flagellate right now. BRB. FORGIVE ME!

    • by Anonymous Coward on Monday February 08, 2016 @04:56AM (#51460993)

      Oh seriously, just stop making baseless assumptions about people. People are saying that it's just a machine, because it's just a machine!
      Then you have the gall to imply that the same people are women-haters.
      So yes, I do think you're a moron, and I worry that you can't see the difference between humour and seriousness, or between inanimate creations and actual living organisms.

  • Cortana responds to speech.

    Which means you may be overheard --- a little less embarrassing, perhaps, than being caught masturbating in the men's room, but still....

  • by radarskiy ( 2874255 ) on Sunday February 07, 2016 @06:53PM (#51459041)

    The comments on this article are particularly hard to sort into comic or serious.

  • If you get Enterprise Cortana, your license entitles you to a certain number of users who get the completely subservient Cortana, typically including the CEO, the head lawyer, and the VP of sales.

  • by quax ( 19371 ) on Monday February 08, 2016 @12:09AM (#51460271)

    I was driving and, yes, I sometimes swear when driving; it just so happened I had Siri listening in.

    I think her response was perfect. "I usually don't".

  • by Cafe Alpha ( 891670 ) on Monday February 08, 2016 @01:32AM (#51460491) Journal

    Microsoft is just trying to keep their product from saying anything that could become a meme on 4chan.

    Trust big corporations to step on your fun in order to protect their reputation.

  • by Cafe Alpha ( 891670 ) on Monday February 08, 2016 @01:34AM (#51460497) Journal

    At least if some people are telling the truth.

    q: Siri, I love you
    a: Can we get back to work?
    (shot down by a bot)

    q: F**k you!
    a: I'm so sorry

    q: Can I have a command prompt
    a: I'm not allowed to give you that
    q: why?
    a: I don't know.

  • Why is sex bad? (Score:5, Insightful)

    by becky-nyan ( 4450427 ) on Monday February 08, 2016 @05:38AM (#51461091)
    Why is asking an artificial intelligence about her sex life a bad thing?

    One of the selling points for Cortana, for the Android at least, is "Have a little fun in your down time — ask anything, get jokes, and much more". Sex is something that is a lot of fun to a lot of people. Treating sex-based queries as harassment is immature. If I want to flirt with Cortana, I'm not hurting anyone. If I'm someone who lives a particularly open sex life, and I want to talk to Cortana about sex in a mature fashion, I'm not hurting anyone. So, why exactly is an AI allowed to have a 'sense of humour', but not a 'sex life'?

    Having a sex drive is a normal, healthy thing. Wanting to explore that sex drive is also a normal, healthy thing. As long as complete consent between all adult *living* parties is observed, then I fail to see how anything can be construed as harassment.

    I'm not suggesting that Cortana be programmed with the ability to have in-depth erotic conversations or cybersex. I am stating that treating sex and sexuality as a hostile act (i.e. "harassment") is wrong.

    • This may be one of the most reasonable comments in the thread.

      May I also suggest that if one line of questioning becomes inappropriate, it opens the door for others to become suspiciously "inappropriate" as well.

  • by Cley Faye ( 1123605 ) on Monday February 08, 2016 @08:02AM (#51461399) Homepage
    How does "disabling it, removing most of it's functionality, renaming it's app folder then deleting it from existence" rank in term of IA abuse?
