Privacy in today’s world should be considered a luxury rather than a basic expectation. That claim makes more sense once you learn about the recent scandal in which companies like Apple, Amazon, and Google, the leading providers of smart home devices and services, have been found listening to users’ private conversations, ranging from confidential medical information to drug deals and even couples having sex.
These recordings are heard when clips are retrieved for quality control and grading of each company’s voice assistant. Apple’s Siri in particular was examined and exposed in an exclusive report by The Guardian.
Apple does not fully disclose how this review process works, but its documentation confirms that a small portion of Siri recordings is passed on to contractors working for the company around the world. These reviewers determine whether a recording was triggered deliberately or accidentally, and whether the audio was clear enough for Siri to interpret the query and respond to it. According to Apple, this process is meant to help Siri understand and recognize requests better.
Apple told the Guardian: “
A small portion of Siri requests are analyzed to improve Siri and dictation. User requests are not associated with the user’s Apple ID. Siri responses are analyzed in secure facilities and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements. A very small random subset, less than 1% of daily Siri activations, are used for grading, and these requests are typically only a few seconds long.”
However, according to a source, the clips contain extremely sensitive information, ranging from medical details to private conversations at home. There have been recordings of business deals and even criminal dealings. Moreover, the clips are accompanied by user data showing location, contact details, and app data.
Describing the details, the source reveals:
“Sometimes, you can definitely hear a doctor and patient, talking about the medical history of the patient. Or you’d hear someone, maybe with car engine background noise – you can’t say definitely, but it’s a drug deal … you can definitely hear it happening. And you’d hear, like, people engaging in sexual acts that are accidentally recorded on the pod or the watch.”
Beyond the discomfort of listening to such private information, the contractor said they were motivated to go public about their job because they feared that such information could be misused. “There’s not much vetting of who works there, and the amount of data that we’re free to look through seems quite broad. It wouldn’t be difficult to identify the person that you’re listening to, especially with accidental triggers – addresses, names and so on.”
With such revelations coming to light, trust in voice assistants seems to be dissolving.