Apple responds to Guardian report about contractors hearing private conversations while ‘grading’ Siri

A report today from The Guardian details claims from one of Apple’s contractors about the private conversations heard as Siri interactions are reviewed and analyzed. The report raises privacy and transparency concerns, and Apple has released a statement addressing the matter.

The Guardian’s source for the report is said to be a contractor who “grades” Siri. The whistleblower said that “Apple contractors regularly hear confidential medical information, drug deals, and recordings of couples having sex, as part of their job providing quality control” for Apple’s voice assistant.

The report notes that Apple doesn’t clearly disclose to consumers that a small percentage of recordings are sent to contractors to improve Siri. Apple’s official statement says that less than 1% of Siri activations are analyzed to improve the service.

“A small portion of Siri requests are analysed to improve Siri and dictation. User requests are not associated with the user’s Apple ID. Siri responses are analysed in secure facilities and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements.” The company added that a very small random subset, less than 1% of daily Siri activations, is used for grading, and that the recordings used are typically only a few seconds long.

Just earlier this month, Google came under fire for the same practice; the company said that 0.2% of Google Assistant queries are transcribed by humans to improve the voice assistant. Amazon also uses human reviewers for Alexa interactions.

One of the contractor’s bigger concerns was the user data they say is attached to the Siri recordings.

There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on. These recordings are accompanied by user data showing location, contact details, and app data.

Apple notes in its Siri privacy policy that users can turn off features like Location Services for Siri or turn off Siri altogether.

Finally, the whistleblower said that contractors reviewing Siri interactions are only supposed to report technical problems with the service, not anything based on the content they hear. But the whistleblower raised concerns that contractors could misuse the data:

Apple is subcontracting out, there’s a high turnover. It’s not like people are being encouraged to have consideration for people’s privacy, or even consider it. If there were someone with nefarious intentions, it wouldn’t be hard to identify [people on the recordings].

Here’s Apple’s full privacy policy for Siri and Dictation:
