By Tyler Durden
Should it come as any surprise? And yet the details are shocking and outrageous. A whistleblower working for Apple has revealed to The Guardian that Siri, the company's popular voice-activated virtual assistant now in millions of households, “regularly” records people having sex and captures “countless” other invasive moments, which it then sends on to Apple contractors for “quality control”:
Apple contractors regularly hear confidential medical information, drug deals, and recordings of couples having sex, as part of their job providing quality control, or “grading”, the company’s Siri voice assistant, the Guardian has learned.
We’ve long pointed out that according to Amazon’s Alexa terms of use, the company collects and stores most of what you say to Alexa (or perhaps what you groan) – including the geolocation of the product along with your voice instructions.
However, what has not been disclosed, or at least not widely known until now, is that a “small proportion” of Siri recordings, captured in settings consumers assumed were private, is forwarded to Apple contractors around the world, according to the new report. Supposedly this is to ensure Siri is responding properly and to improve its dictation. Apple says, according to The Guardian, the data “is used to help Siri and dictation… understand you better and recognise what you say”.
But an anonymous company insider turned whistleblower told The Guardian:
There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on. These recordings are accompanied by user data showing location, contact details, and app data.
This contradicts Apple’s defense that these sultry samples are “pseudonymised recordings”: contractors can readily work out who is having sex, where, and when the deed was done.
Apple’s formal response to The Guardian investigation was as follows:
A small portion of Siri requests are analysed to improve Siri and dictation. User requests are not associated with the user’s Apple ID. Siri responses are analysed in secure facilities and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements.
Just trust us, Apple appears to be saying. Most of the sensitive data is captured through accidental activations by so-called “trigger words,” according to the report, with the highest rates of such occurrences on the Apple Watch and the HomePod smart speaker.
“The regularity of accidental triggers on the watch is incredibly high,” the company whistleblower explained. “The watch can record some snippets that will be 30 seconds – not that long but you can gather a good idea of what’s going on.”
The insider continued, “you can definitely hear a doctor and patient, talking about the medical history of the patient. Or you’d hear someone, maybe with car engine background noise – you can’t say definitely, but it’s a drug deal… you can definitely hear it happening. And you’d hear, like, people engaging in sexual acts that are accidentally recorded on the pod or the watch.”
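Neither Apple nor The Guardian describes the trigger mechanism in technical detail, but the pattern the whistleblower describes, a wake-word detector that misfires and then ships a short audio buffer plus device metadata off for human grading, can be illustrated with a minimal sketch. Everything below is a hypothetical illustration, not Apple's implementation: the detect_wake_word function, the 30-second clip length, the false-trigger rate, and the metadata fields are all assumptions made for the example.

```python
# Illustrative sketch only: a naive always-listening loop whose keyword spotter
# occasionally misfires, capturing ~30 seconds of "audio" plus device metadata.
# Nothing here reflects Apple's actual code; all names and values are hypothetical.
import random
from dataclasses import dataclass, field

CHUNK_SECONDS = 1          # assumed length of each audio chunk
CLIP_SECONDS = 30          # the ~30-second snippets the whistleblower describes
FALSE_TRIGGER_RATE = 0.02  # made-up probability of an accidental activation

@dataclass
class Clip:
    audio_chunks: list = field(default_factory=list)  # stand-in for raw audio
    metadata: dict = field(default_factory=dict)       # location, contacts, app data

def detect_wake_word(chunk: bytes) -> bool:
    """Hypothetical keyword spotter: sometimes fires on ordinary speech."""
    return random.random() < FALSE_TRIGGER_RATE

def listen_forever(microphone, device_metadata, uploaded):
    """After any trigger, buffer CLIP_SECONDS of audio and 'upload' it for grading."""
    for chunk in microphone:
        if detect_wake_word(chunk):
            clip = Clip(metadata=dict(device_metadata))
            # Capture the next ~30 seconds regardless of what is being said.
            for _ in range(CLIP_SECONDS // CHUNK_SECONDS):
                try:
                    clip.audio_chunks.append(next(microphone))
                except StopIteration:
                    break
            uploaded.append(clip)  # sent off-device for human review

if __name__ == "__main__":
    random.seed(0)
    fake_mic = (b"chunk%d" % i for i in range(600))  # 10 minutes of fake audio
    graded = []
    listen_forever(fake_mic, {"location": "51.5, -0.1", "contacts": "..."}, graded)
    print(f"{len(graded)} accidental clips captured, "
          f"{len(graded[0].audio_chunks) if graded else 0} seconds each")
```

The point of the sketch is the last step: even if the uploaded clip is stripped of an Apple ID, it still travels with location, contact, and app metadata, which is why “pseudonymised” offers so little protection in practice.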
Even less comforting is how many people across the globe have access to these private moments: “There’s not much vetting of who works there, and the amount of data that we’re free to look through seems quite broad,” the contractor continued. “It wouldn’t be difficult to identify the person that you’re listening to, especially with accidental triggers – addresses, names and so on.”
“Apple is subcontracting out, there’s a high turnover. It’s not like people are being encouraged to have consideration for people’s privacy, or even consider it. If there were someone with nefarious intentions, it wouldn’t be hard to identify [people on the recordings].”
The evidence continues to mount: Siri is a blackmailer’s dream come true… or a spy agency’s, or a voyeur’s, or a political adversary’s, or just a plain pervert’s.