90 Policy Groups Call On Apple To "Abandon" '1984'-Style Surveillance Tool
BY TYLER DURDEN FRIDAY, AUG 20, 2021 - 05:20 PM
More than 90 civil society organizations wrote an open letter to Apple, demanding the company abandon its planned surveillance tool, which would be integrated into iPhones, iPads, and other Apple products to scan images for child pornography before they are uploaded to iCloud.
"Though
these capabilities are intended to protect children and to reduce the spread of
child sexual abuse material (CSAM), we are concerned that they will be used to
censor protected speech, threaten the privacy and security of people around the
world, and have disastrous consequences for many children," the open
letter wrote, which was organized by the US-based nonprofit Center for Democracy
& Technology (CDT).
The '1984'-style surveillance tool, called "neuralMatch," is expected to be installed on US iPhones via a software update. The artificial intelligence system can proactively alert a team of human reviewers if it finds CSAM on a user's Apple device. If the reviewers confirm the material, law enforcement will be contacted.
The open letter said the new surveillance tool creates risks for children themselves, could be used to censor speech, and threatens people's privacy and security. It added that governments might force Apple to scan for other images that those in power find objectionable.
"It's so disappointing and upsetting that Apple
is doing this, because they have been a staunch ally in defending encryption in
the past," Sharon Bradford Franklin,
co-director of CDT's Security & Surveillance Project, told Reuters.
As explained by FT, here's how neuralMatch works:
Apple's neuralMatch algorithm will continuously scan photos that are stored on a US user's iPhone and have also been uploaded to its iCloud back-up system. Users' photos, converted into a string of numbers through a process known as "hashing," will be compared with those on a database of known images of child sexual abuse.
[...]
The system has been trained on 200,000 sex abuse images collected by the US nonprofit National Center for Missing and Exploited Children.
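To make the "hashing" step above concrete: Apple's actual neuralMatch algorithm is proprietary, but the general technique of reducing a photo to a string of numbers and comparing it against a database of known-image hashes can be sketched with a simple average perceptual hash. The hash function, the match threshold, and the database below are illustrative assumptions, not Apple's implementation.

```python
# Minimal sketch of hash-based image matching, assuming a simple average
# perceptual hash. Apple's neuralMatch is proprietary; the threshold and
# known-hash database here are hypothetical.
from PIL import Image

HASH_SIZE = 8          # 8x8 grid -> 64-bit hash
MATCH_THRESHOLD = 5    # max differing bits to count as a match (assumed)

def perceptual_hash(path: str) -> int:
    """Convert an image into a 64-bit number ("hashing")."""
    img = Image.open(path).convert("L").resize((HASH_SIZE, HASH_SIZE))
    pixels = list(img.getdata())
    avg = sum(pixels) / len(pixels)
    # Each bit records whether a pixel is brighter than the image average,
    # so visually similar images produce similar bit patterns.
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > avg else 0)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def matches_known_database(photo_path: str, known_hashes: set[int]) -> bool:
    """Compare a photo's hash against a database of known-image hashes."""
    h = perceptual_hash(photo_path)
    return any(hamming(h, known) <= MATCH_THRESHOLD for known in known_hashes)
```

Because a perceptual hash tolerates small pixel differences, near-duplicates of a known image still match, which is also why such systems can misfire on unrelated but visually similar images, a point the letter raises next.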
The open letter describes another issue:
Algorithms designed to detect sexually explicit material are notoriously unreliable. They are prone to mistakenly flag art, health information, educational resources, advocacy messages, and other imagery.
... and further, this means "iMessages will no longer provide confidentiality and privacy to those users through an end-to-end encrypted messaging system in which only the sender and intended recipients have access to the information sent. Once this backdoor feature is built in, governments could compel Apple to extend notification to other accounts, and to detect images that are objectionable for reasons other than being sexually explicit."
The new surveillance tool seems like something an authoritarian government (like the CCP) would use. The problem begins the moment Apple or a government starts abusing it.
The open letter asks Apple to consult more regularly with civil society groups and to "abandon" the new surveillance tool so that users remain protected by end-to-end encryption.
Apple's move is a regressive step for individual privacy and is ushering in a '1984'-style surveillance world.
https://www.zerohedge.com/markets/90-policy-groups-call-apple-abandon-1984-style-surveillance-tool