Apple and Amazon curtail human review of voice recordings
Photo: Houston Chronicle Screenshot. Apple Inc. says the company will stop using contractors to listen in on users through Siri to grade the voice assistant's accuracy.
Apple and Amazon announced Friday that they would curtail the use of humans to review conversations captured by their digital voice assistants, a move that gives users more control over the privacy of their communications.
Apple said it would stop using contractors to listen in on users through Siri to grade the voice assistant's accuracy after an Apple whistleblower told The Guardian that the contractors responsible for "grading" the accuracy of the digital assistant regularly overheard conversations about doctors' appointments, drug deals and even couples having sex. Their job was to determine what had triggered Siri into action: whether the user had actually said, "Hey, Siri," or whether it was something else, such as the sound of a zipper.
Apple said it would suspend the
global analysis of those voice recordings while it reviewed the grading system.
Users will be able to opt out of the reviews in a future software update.
"We are committed to delivering
a great Siri experience while protecting user privacy," said Cat Franklin,
an Apple spokeswoman, in an email to The Washington Post.
Later Friday, Amazon updated its privacy policy regarding voice recordings made by its Alexa service. Amazon will now let users opt out of having humans review those recordings by selecting a new option in the settings of the Alexa smartphone app. Amazon employees listen to those recordings to help improve the company's speech-recognition technology.
The company tweaked Alexa privacy
features in May, giving users the ability to delete recordings of their voices.
And users could already opt out of letting Amazon develop new features with
their voice recordings.
Many smart-speaker owners don't realize that Siri, Alexa and, until recently, Google's Assistant keep recordings of everything they hear after their so-called "wake word" to help train their artificial intelligence. (Amazon founder Jeff Bezos owns The Washington Post.) Google quietly changed its defaults last year, and Assistant no longer automatically records what it hears after the prompt "Hey, Google."
Apple said it uses the data "to help Siri and dictation . . . understand you better and recognize what you say." But this wasn't made clear to users in Apple's terms and conditions.
"There have been countless
instances of recordings featuring private discussions between doctors and
patients, business deals, seemingly criminal dealings, sexual encounters and so
on," the Apple whistleblower told The Guardian. "These recordings are
accompanied by user data showing location, contact details, and app data."
In response, Apple said that the
recordings accounted for only 1 percent of Siri activations and lasted just a
few seconds. They also were not linked to users' Apple IDs.
Apple contractors in Ireland told The Guardian that they had been sent home for the weekend and were told it was because the global grading system "was not working." Managers stayed on site but said no one knew how the system's suspension would affect the contractors' employment.
The Apple whistleblower said the
Apple Watch and the HomePod, a smart speaker, were especially prone to
accidental activation.
A 2018 study from investment firm Loup Ventures found that Siri on the HomePod answered standardized questions accurately 52 percent of the time.