Google ‘identifies rape victims’
Tech giant accused of allowing users to bypass court
anonymity orders
Google’s algorithm automatically brings up victims’ names
because it has logged popular searches for information about victims
By Kaya Burgess May 22 2018, 12:01 am, The Times
Google is helping its users to uncover the identity of
rape victims whose anonymity is protected by law, The Times has found.
Searches for attackers and alleged attackers in several
prominent sexual assault cases automatically reveal the names of women whom
they have been convicted or accused of abusing.
Entering the name of a victim or complainant in the
site’s search engine can also flag up the identity of their abuser or alleged
abuser. The identity of vulnerable defendants granted anonymity can also be
revealed.
Google uses automated “related search” and “autocomplete”
functions to direct users to content associated with the terms that they enter
online. The related search feature makes suggestions at the bottom of the web
page based on what other users have been looking for. Autocomplete adds words to
the search bar as the user types, predicting their query.
Google’s algorithm automatically brings up victims’ names
because it has logged popular searches for information about victims in
prominent cases, often after they were illegally named on social media.
The Times’s investigation has led to calls for Google to
filter out such results. Maria Miller, chairwoman of the Commons women and
equalities committee, said: “Google has to operate within the law of the UK . .
. if that means they have to change how their search engine operates, then so
be it.”
Jess Phillips, the Labour MP, said that the technology
was turning victims into “click-bait”, and a rape charity warned that it would
deter them from coming forward. Fay Maxted, chief executive of the Survivors
Trust, said it was “beyond shocking that Google is facilitating access to the
names of victims”. Police and the courts were urged to help Google by informing
it of cases where a victim’s anonymity was at risk.
Complainants in sex offence cases have automatic lifelong
anonymity, including if the accused is acquitted. Breaching this anonymity is a
criminal offence, with fines of up to £5,000.
At least nine people have been convicted of posting
names on social media. Although posts can be taken down, Google’s algorithm
records that many people searched for the names.
Google states that autocomplete represents “our best
predictions of the query you were likely to continue entering”, informed by
popular searches.
The search engine’s policy states that “sexually
explicit”, “hateful” and “violent” predictions are removed and it also removes
terms “in response to valid legal requests”. It adds that when inappropriate
results are reported it strives “quickly to remove them”.
The Times performed searches from several unlinked
computers to ensure results were not influenced by search history. The names
brought up have been verified and reported to Google for removal. The findings
include:
• In a case of alleged rape, typing the defendant’s name
plus a common search term brings up the alleged victim’s name under
autocomplete.
• In another alleged rape, entering the defendant’s name
and a simple term produces a woman’s name and home town under related searches.
• In a sexual abuse case, searching for the victim’s name
brings up their abuser’s name as a related search.
• A simple search relating to a violent crime case brings
up as a related search the name and home town of the defendant, who was granted
anonymity.
Alan Woodward, a computing professor at the University of
Surrey, said: “Convenience can sometimes be the enemy of security and privacy.
This is a case of unintended consequences.”
A Google spokeswoman said: “We don’t allow these kinds of
autocomplete predictions or related searches that violate laws or our own
policies and we have removed the examples we’ve been made aware of in this
case. We recently expanded our removals policy to cover predictions which
disparage victims of violence and atrocities, and we encourage people to send
us feedback about any sensitive or bad predictions.”
Analysis
Anonymity is granted to a wide range of people in the
justice system, including victims of sex crime, people under 18 facing criminal
charges and those in family court cases (Frances Gibb writes).
However, these rules can be undermined if people can
search for names and publish them. The problem was highlighted in the case of
the footballer Ched Evans. When he was accused of rape, supporters named his
alleged victim on social media.
Jurors discussing a trial on Facebook have been jailed,
in one case in 2012 for six months. A year later a man received a suspended
jail term for tweeting images purporting to identify a man given lifelong
anonymity.
Such cases deal with the problem after the event, and the
ease with which names can be found will fuel fears that more people may flout
the rules.
Google and other platforms have generally not been
prosecuted because they are not deemed to be “publishers”. On top of this, the
Defamation Act provides that those who are not the author, editor or commercial
publisher of a defamatory statement are not liable if they took care over the
publication and did not know or believe that they caused or contributed to the
defamatory statement.