CHILD'S PLAY TV investigation finds that Facebook turns a blind eye to underage users and allows extreme content to remain on the platform

Channel 4's Dispatches programme found that Facebook moderators often 'pretend they don't know what under-age looks like' when it comes to users under the age of 13

By Rod McPhee, Bizarre Reporter 17th July 2018, 12:33 am

FACEBOOK ignores underage users and knowingly allows extreme material to remain on its site, according to a new Channel 4 investigation.

An undercover probe by the broadcaster’s Dispatches programme went inside the web giant’s moderation department, which handles complaints, and found that bosses allowed even racist and violent posts to remain.

And many reported posts took days to address, despite the company saying offensive content should be dealt with within 24 hours.

In footage shot by an undercover reporter, one staff member said: “If you start censoring, people lose interest. It’s all about making money at the end of the day.”

Another staff member featured in the documentary, which airs tonight at 9pm, revealed how moderators deal with users below the minimum age of 13.

They said: “We have to have an admission that the person is underage. If not, we just like pretend that we are blind and that we don’t know what underage looks like.”

Dispatches revealed that Facebook’s UK moderation operation, which is outsourced to Dublin-based company Cpl Resources plc, has a backlog of 7,000 daily complaints about posts.

The documentary revealed how a child abuse campaigner asked the company to remove footage of a two-year-old boy being beaten up by a man. The company refused because the footage did not violate its terms and conditions, and within 24 hours the graphic images had been shared more than 44,000 times.

The company even used the footage during training sessions as an example of content that should not be removed.

Moderators also admitted they would not report footage of child abuse to police unless it was live-streamed.

During training to join the moderation team, the undercover reporter was shown a cartoon of a girl appearing to be drowned with the caption: “When your daughter’s first crush is a little negro boy.”

He was told this was a post which should not be removed.

Staff would also allow abusive comments about Muslims to remain on the site, so long as the targets were referred to as “Muslim immigrants”. In some cases images of self-harm were also left up.

Facebook also appeared to favour right-wing groups. The controversial Britain First page was permitted more violations than the company’s rules allow before it was eventually taken down, in a bid to protect the social network’s revenue.


One worker said: “They had eight or nine violations and they’re only allowed five. But obviously they have a lot of followers so they’re generating a lot of revenue for Facebook.”

In the documentary, Roger McNamee, one of Facebook’s earliest investors and a former mentor of the site’s boss Mark Zuckerberg, claimed that extreme content was the company’s money-making “crack cocaine”.

He said: “If you’re going to have an advertising-based business, you need them to see the ads so you want them to spend more time on the site.

“They want as much extreme content as they can get.”
