Apple Technology

Apple Contractors Were Each Listening To 1,000 Siri Recordings a Day, Says Report (theverge.com) 24

According to The Verge, citing a report from the Irish Examiner, Apple used human contractors who each listened to as many as 1,000 Siri recordings a day to help make the digital assistant better at giving you what you want. Some of those recordings included personal data and snippets of conversations, one contractor says. An anonymous reader shares the report: Apple's Siri assistant currently records and sends snippets of your voice requests back to Apple to be studied so that Apple can try to make Siri better at giving you what you want. In July, The Guardian reported that contractors also hear some rather personal things, like "confidential medical information, drug deals, and recordings of couples having sex." A contractor who spoke with the Irish Examiner said his job involved noting when Siri could actually help or whether Siri had been triggered accidentally. The Guardian said contractors "regularly" hear confidential information, but the contractor speaking with the Irish Examiner said the recordings "occasionally" had "personal data or snippets of conversations." There's a reason we're likely learning more details about the contractors' work now: they may be out of a job. Apple has temporarily stopped using contractors to listen to Siri conversations, and the Irish Examiner reports that Apple no longer needs the services of Cork, Ireland-based contracting company GlobeTech, which employed the contractor who spoke to the paper.

Neither Apple nor GlobeTech is denying that there may have been layoffs: GlobeTech merely referred the Irish Examiner to a statement that said it ended a "client project" early. Apple said in a statement to the paper that it is "working closely with our partners" as it reviews its processes around grading Siri conversations. Apple has not replied to a request for comment from The Verge. The work Siri contractors reportedly did sounds similar to how Microsoft contractors transcribe Cortana recordings to help train Microsoft's voice assistant. One of the Cortana contractors told Motherboard that contractors are expected to transcribe and classify roughly three "tasks" every minute. For the Siri contractors, transcribing 1,000 voice commands means they likely had to do about two per minute, assuming they were working an eight-hour day.
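As a quick sanity check on those numbers, here is a minimal Python sketch of the back-of-the-envelope arithmetic (the 1,000-recordings-per-day figure and the eight-hour shift are assumptions taken from the report, not confirmed working conditions):

```python
# Back-of-the-envelope check of the transcription rate described above.
recordings_per_day = 1000      # figure reported by the Irish Examiner
shift_minutes = 8 * 60         # assumes an eight-hour shift with no breaks

per_minute = recordings_per_day / shift_minutes
seconds_per_recording = 60 / per_minute

print(f"~{per_minute:.1f} recordings per minute")    # ~2.1
print(f"~{seconds_per_recording:.0f} seconds each")  # ~29
```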

Comments Filter:
  • Just look for the times when people are using Siri multiple times in a row, without accepting whatever it is that she gave them the first time (e.g., not looking at the links she gives, etc.)

    I'm sure you'll find a few of me cussing out Siri. Maybe also look for phrases such as 'you suck, Siri'.

    • Amazon Alexa seems to have better voice-recognition tech than Siri. When I log in to alexa.amazon.com, I am prompted to self-correct anything that was transcribed incorrectly. That seems better than using contractors, since I know exactly what I said. Alexa will also identify individual household members, and learn their speech patterns and preferences.

    • Haha, I was thinking along the same lines. I’m wondering if any of these contractors have heard me exasperatingly ask Siri the same thing multiple times - followed by my usual query “Hey Siri, why do you suck so badly?”

      • by rtb61 ( 674572 )

        They were training the voice recognition system on each specific individual's voice so the automated system could more accurately transcribe that individual's spoken words. Common sense would have gone with a voice training program that would run through various exercises and update the core server online. They probably felt people would not accept doing it and many would be incapable of completing the exercise. I have to say that reflects pretty poorly on the human race (the one and o

  • Apples are delicious, ignore this silly talk of snakes.
  • counseling services for their employees transcribing Siri?

  • Apple's PR Dept will probably play this as a disgruntled former contractor, but sadly, wouldn't it be more surprising to hear that a corporation with access to your personal information didn't abuse the privilege?

  • I did side work like this for a while (not saying it was Apple or anyone else in particular). In many thousands of samples, I never heard one that contained any identifying information.

    Yes, I heard some pretty sick stuff. Mostly you chuckle at it. The whisperers were the creepiest ones. They were generally from one of two pools, very young children searching for porn or adults searching for porn, often child porn. There was always this hint of eyes glazed over with lust and excitement of the forbidden in th

  • Of course people are listening to what you say to Siri. There's no other way to train it than to have people listen and then correct it when it's wrong. Help me here - is the issue simply that Apple hasn't properly told everybody that what you say to Siri may not be "private"? Isn't that obvious? I feel like I'm missing something when I keep seeing these inexplicable (to me) "stories" about some "scandal" where someone heard what you said to Siri/Alexa/Bixby/whatever. Newsflash - if you don't want someone to hear it, don't say it.

    • Of course people are listening to what you say to Siri. There's no other way to train it than to have people listen and then correct it when it's wrong.

      The way Apple, Google, Amazon, et al. are doing this is not the only way to do the job of transcribing human speech. One should begin by asking questions aimed at putting the user in control of their own computer, questions like: which people get to listen to and correct the transcription? A privacy-preserving system could let the user train their own voice

    • What is the issue? They say they don't record what you say, but they do record what you say. Everything you say near one of these devices is stored on a server somewhere. As you say, there is no other way to train the program to work better.

      I don't have to worry about it, I would never own one of these things.

    • There are tons of ways to train a computer voice recognition tool. We have a lot of public domain voice recordings, public TV broadcasts, and Apple can use all of the podcasts they host as well. There are tons of voice samples with lots of accents and languages from around the world they can use. They don't need to use private recordings for this.

  • If an employee is listening to 1,000 recordings/shift, assuming an 8-hour shift and no breaks, that puts them at about 2 recordings/minute...

    Less than 30 seconds to hear, process and respond before the next one starts.

    So what.

    It's like overhearing riders on an amusement ride as they whiz by.

    • by AHuxley ( 892839 )
      Re "So what.".
      Who is the customer for that voice, voice print and content?
      Who is paying for the product?
      Under the USA Freedom Act roving NSA wiretaps are still extra legal.
      Who gets to share with the NSA when content is legally collected?
      Down at the local Fusion centre? The "detect" part of a many agency effort to find criminal and terrorist activity all over the USA.
      DEA? CIA? FBI? Immigration? US mil as its "near" a base, port, camp, fort?
      City and state police can afford "ad" packaged mic co
  • So Siri is a sex doll now?

  • but for quality control.
    So the NSA gets the best voice prints every decade?
    Can't have the ads and the NSA as customers not getting the quality expected.
    Another big US brand looking after quality of product for its real customers.

    Remember, it's not your "confidential information" under the USA Freedom Act.
    USA Freedom Act https://en.wikipedia.org/wiki/... [wikipedia.org].

    For very legal roving wiretaps to work, the NSA needs the best quality of audio from every mic in the USA.
    That needs constant contractor support at
  • I find it oddly telling that a company claiming privacy as their only selling point has humans listening to you having sex.

    • I find it oddly telling that a company claiming privacy as their only selling point has humans listening to you having sex.

      Let's get realistic here. You are having sex. You or your partner have their iPhone nearby and say "Hey Siri". You now have invited Siri to listen. If your girlfriend says "Hey Siri, Joe's willy is too small, what can I do to come", that's surely not Apple's fault. You want someone to listen.

  • No! Really?
  • "Apple's Siri assistant currently records and sends snippets of your voice requests back to Apple to be studied so that Apple can try to make Siri better at giving you what you want."

    Ah, like Google, they insist on trying to give me what they _think_ I want, instead of giving me what I fucking asked for.

  • ... and not quite subtle, is that AI and machine learning are failures.

    Several on /. have observed this before: when it comes down to the nut-cutting, the algo doesn't work.

  • Uh huh, right.

      I wonder how much of the more saucy stuff ended up on a flash drive for 'homework' so they can "study" it more.

  • Obviously no problem here. Anyone who listened to 1,000 recordings in a day is not going to remember anything. Probably reduced to a vegetable state.

    Just joking, but no funny mods. No joke. I'd have thought the topic had some potential for humor or wit.

  • Or is it a scarcity of readers of Slashdot?

    Repeating the meta-suggestions for solution approaches:

    (1) Find a viable financial model to fund improvements. (CSB is one that could get some of my limited money.)

    (2) My first preference for improvement would be the moderation system. (MEPR is the one I'd put my money towards.)
