Facebook’s darkest secret: a platform for paedophiles
Alexi Mostrous Head of Investigations April 13 2017,
12:01am, The Times
Facebook is an admired social network for many of its
users, but there are pages that host disturbing and potentially illegal content
that most people would find alarming
“Welcome,” one forum poster said, “to the darkest corner
of the web.” The writer was not referring to encrypted messaging services or
obscure internet chatrooms. He was talking about Facebook.
The world’s most popular social media network hosts
communities where users exchange images of children so explicit that they are
likely to breach UK laws.
Facebook often allows the content to remain on the site,
an investigation by The Times shows. In some cases, the network’s automatic
algorithms even promote such content to users it thinks might be
interested.
Front Line Lolicons, a group with almost 5,000 members,
is dedicated to posting pictures of “lolicons”, a Japanese term for drawings
depicting prepubescent females. Many of the pictures on the forum depict young
girls or toddlers being violently abused. They remain accessible on Facebook
despite a 2009 British law outlawing any images, including cartoons, that
depict children in a grossly offensive and pornographic manner.
Using a fake profile set up to investigate offensive
content on the site, The Times reported the worst cartoons to Facebook. Its
moderators decided to keep up many of the images, including two that, according
to a leading QC, “plainly” contravene UK law.
A cursory glance at Front Line Lolicons would have raised
concerns. On the group’s Facebook download page, a document entitled “Sexual
life of the child” is freely available.
Many members joke about paedophilic activity, including
one who boasted about seeing “the most beautiful girl today . . . she was 5 or
6, but sadly I did not get to talk to her and no interaction was made except
for eye contact”. When another user questioned his remarks, the member said:
“This is a pedo page.” A third user, one of the most prolific commentators,
asked: “Why are little girls so f***ing sexy dammit?”
Although Front Line Lolicons purportedly shows only
cartoons of abuse, images of real children appear on it and dozens of other
similar forums such as Lolicon Hell and Raep [rape] Me. The Times found images
including a pornographic child-abuse DVD, an adolescent-looking child posing
naked in a wig and a video of a young child apparently being forced to give
oral sex.
All three images were kept online when reported to
Facebook, which considered they were not in breach of its community standards.
Julian Knowles, QC, a criminal law expert, said that the latter video would
“undoubtedly breach UK indecency laws” and appeared to depict a sexual assault
on a child.
Facebook’s decision to keep up such content could expose
it to prosecution itself.
Mr Knowles said that if someone reported an illegal image
to Facebook, and a senior moderator signed off on keeping it online, then
“Facebook is at risk of committing a criminal offence because the company might
be regarded as assisting or encouraging its publication and distribution”.
British politicians have already expressed fury at the
company’s failure to remove terrorist content. Ministers are considering
enforcing a German proposal that would fine internet companies millions of
pounds if they failed to remove illegal or hate-filled content within a
reasonable time.
Front Line Lolicons was not discovered as a result of a
tip-off. The Times accessed the group because Facebook’s own algorithms
suggested it to our fake profile, which we set up last month to investigate
inappropriate content.
Having joined a group called Lolicon Heaven, our
imaginary male Facebook user was recommended to join Front Line Lolicons under
the site’s “suggested groups” section. When he joined, he was recommended other
groups including True Lolicon Hell and Pokegirl Lewds. Many published
potentially illegal content.
Promoting such groups could also land Facebook in legal
trouble, Mr Knowles said.
“If Facebook’s algorithms suggest groups which involve
members distributing or exchanging indecent or prohibited images or terrorist
material then if that fact has been brought to Facebook’s attention and it
continues to do nothing about it, then subject to proving knowledge on the part
of the company, it is at risk of committing an offence by assisting or encouraging
the distribution of illegal material,” he said.
A simple search of Facebook using keywords associated
with paedophilia brings up dozens of groups apparently acting as marketplaces
for the exchange of illegal child-abuse content.
Even here, Facebook appears to be reluctant to act. The
Times reported one group because it openly called for videos of children under
13 to be posted by members before they would be allowed to join. Facebook’s
moderators thanked us for raising it, but said: “We’ve looked over the group
that you reported and . . . it doesn’t go against any of our specific community
standards.”
Most of the groups are “closed” — you have to be invited
by someone to enter — but Facebook allows anyone to see who has already joined.
A group named after an acronym used by paedophiles featured a heavily made-up
child from Spain. The girl, who looks about ten, is recorded on Facebook as
being a “sex professional”. The profile was being reviewed by Facebook.
Mr Knowles said that although the image was not illegal
in itself, if groups facilitated the exchange of illegal images of children
with Facebook’s knowledge, then the company too was at risk of committing a
criminal offence.
After The Times’s fake user applied to join some of the
child-abuse groups, Facebook’s algorithms suggested that he might “like” the
page of a ten-year-old dancer, having identified her as being of
possible interest to him.
Facebook, like Google and Twitter, is under pressure from
politicians and companies to clean out extremist and hateful material.
More than 250 global advertisers boycotted YouTube after
The Times revealed that it had inadvertently funded extremists who posted on
the site. The company apologised and overhauled its brand safety systems to
remove adverts from five times as many videos.
Last month MPs on the home affairs select committee
criticised the companies for failing to remove hate content. Yvette Cooper, the
chairwoman, said that it was disgraceful that antisemitic videos were hosted on
YouTube. She warned the company that it faced financial penalties if it failed
to act. “Google [which owns YouTube] is one of the richest companies on the
planet with sophisticated technology,” she said. “It is quite capable of
sorting this.”
Tim Loughton, a Tory MP, said social networks abided by
“completely different standards”. He said: “If you are an employer and you take
on an illegal immigrant and they are discovered, saying, ‘I had no idea’ is not
a defence, and you’ll be fined. Why should it be any different [with social
networks]?”