Apple contractors 'regularly hear confidential details' on Siri recordings
Workers hear drug deals, medical details and people having sex, says whistleblower
Apple contractors
regularly hear confidential medical information, drug deals, and recordings of
couples having sex, as part of their job providing quality control, or
“grading”, the company’s Siri voice assistant, the Guardian has learned.
Although Apple does not explicitly
disclose it in its consumer-facing privacy documentation, a small proportion of
Siri recordings are passed on to contractors working for the company around the
world. They are tasked with grading the responses on a variety of factors,
including whether the activation of the voice assistant was deliberate or
accidental, whether the query was something Siri could be expected to help with
and whether Siri’s response was appropriate.
Apple says the data “is used to
help Siri and dictation … understand you better and recognise what you say”.
But the company does not
explicitly state that that work is undertaken by humans who listen to the
pseudonymised recordings.
Apple told the Guardian: “A small
portion of Siri requests are analysed to improve Siri and dictation. User
requests are not associated with the user’s Apple ID. Siri responses are
analysed in secure facilities and all reviewers are under the obligation to
adhere to Apple’s strict confidentiality requirements.” The company added that
a very small random subset, less than 1% of daily Siri activations, is used for
grading, and those used are typically only a few seconds long.
A whistleblower working for the
firm, who asked to remain anonymous due to fears over their job, expressed
concerns about this lack of disclosure, particularly given the frequency with
which accidental activations pick up extremely sensitive personal information.
Siri can be accidentally activated
when it mistakenly hears its “wake word”, the phrase “hey Siri”. Those mistakes
can be understandable – a BBC interview about Syria was interrupted by the
assistant last year – or less so. “The sound of a zip, Siri
often hears as a trigger,” the contractor said. The service can also be
activated in other ways. For instance, if an Apple Watch detects it has been
raised and then hears speech, Siri is automatically activated.
The whistleblower said: “There
have been countless instances of recordings featuring private discussions
between doctors and patients, business deals, seemingly criminal dealings,
sexual encounters and so on. These recordings are accompanied by user data
showing location, contact details, and app data.”
That accompanying information may
be used to verify whether a request was successfully dealt with. In its privacy
documents, Apple says the Siri data “is not linked to other data that Apple may
have from your use of other Apple services”. There is no specific name or
identifier attached to a record and no individual recording can be easily
linked to other recordings.
Accidental activations accounted for the most sensitive data sent to Apple.
Although Siri is
included on most Apple devices, the contractor highlighted the Apple Watch and the company’s HomePod
smart speaker as the most frequent sources of mistaken recordings. “The
regularity of accidental triggers on the watch is incredibly high,” they said.
“The watch can record some snippets that will be 30 seconds – not that long but
you can gather a good idea of what’s going on.”
Sometimes, “you can definitely
hear a doctor and patient, talking about the medical history of the patient. Or
you’d hear someone, maybe with car engine background noise – you can’t say
definitely, but it’s a drug deal … you can definitely hear it happening. And you’d
hear, like, people engaging in sexual acts that are accidentally recorded on
the pod or the watch.”
The contractor said staff were
encouraged to report accidental activations “but only as a technical problem”,
with no specific procedures to deal with sensitive recordings. “We’re
encouraged to hit targets, and get through work as fast as possible. The only
function for reporting what you’re listening to seems to be for technical
problems. There’s nothing about reporting the content.”
As well as the discomfort they
felt listening to such private information, the contractor said they were
motivated to go public about their job because of their fears that such
information could be misused. “There’s not much vetting of who works there, and
the amount of data that we’re free to look through seems quite broad. It
wouldn’t be difficult to identify the person that you’re listening to,
especially with accidental triggers – addresses, names and so on.
“Apple is subcontracting out,
there’s a high turnover. It’s not like people are being encouraged to have
consideration for people’s privacy, or even consider it. If there were someone
with nefarious intentions, it wouldn’t be hard to identify [people on the
recordings].”
The contractor argued Apple should
reveal to users this human oversight exists – and, specifically, stop
publishing some of its jokier responses to Siri queries. Ask the personal
assistant “are you always listening”, for instance, and it will respond with:
“I only listen when you’re talking to me.”
That is patently false, the
contractor said. They argued that accidental triggers are too regular for such
a lighthearted response.
Apple is not alone in employing
human oversight of its automatic voice assistants. In April, Amazon was
revealed to employ staff to listen to some Alexa recordings,
and earlier this month, Google workers were found to be doing the same with Google Assistant.
Apple differs from those companies
in some ways, however. For one, Amazon and Google allow users to opt out of
some uses of their recordings; Apple offers no similar choice short of
disabling Siri entirely. The scale of potential exposure is also considerable,
given the Apple Watch's role in accidental triggers: according to Counterpoint
Research, Apple has 35% of the smartwatch market, more than three times its
nearest competitor Samsung, and more than its next six biggest competitors
combined.
The company values its reputation
for user privacy highly, regularly wielding it as a competitive advantage
against Google and Amazon. In January, it bought a billboard at the Consumer
Electronics Show in Las Vegas announcing that “what happens on your
iPhone stays on your iPhone”.