If you thought Apple was one of the most secure brands when it comes to data and privacy, you should definitely read this.
According to a recent report by The Guardian, Apple’s contractors get to hear whatever Siri listens to, including private and intimate conversations. The audio clips are sent to third-party contractors (that is, humans) who grade the response, noting its accuracy, among other things.
Apple does state on its privacy page that “To help them recognize your pronunciation and provide better responses, certain information such as your name, contacts, music you listen to, and searches is sent to Apple servers using encrypted protocols.” But it does not clearly disclose that humans would be listening to your clips and analysing them.
According to a statement by Apple to the Guardian, “A small portion of Siri requests are analysed to improve Siri and dictation. User requests are not associated with the user’s Apple ID. Siri responses are analysed in secure facilities and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements.”
Siri is triggered not only by the wake phrase, “Hey Siri.” According to the report, something as mundane as the sound of a zipper has also been known to set it off.
What’s worse, on an Apple Watch, when the watch senses that it is being raised (assuming the user might be addressing Siri), the assistant starts recording. This has led to private discussions, criminal dealings, and even sounds during sex being recorded by Siri and sent to the contractors (albeit without any connection to the account it was recorded on), where the audio is graded and analysed.
While it isn’t new that personal recordings are sent to third-party contractors to analyse and improve the responsiveness of smart assistants (Google Assistant and Alexa have been doing it for a while now), it is a shocker coming from a brand like Apple, which has always given utmost importance (or so we thought) to users’ privacy.