Facebook has TRUST ratings for users – but it won’t tell you your score


The social network is predicting your trustworthiness in a bid to fight fake news

By Sean Keach, 21st August 2018, 3:20pm

FACEBOOK is rating users based on how "trustworthy" it thinks they are.

Users are given a score on a scale from zero to one that reflects whether they have a good or bad reputation – but the score is completely hidden from them.

The rating system was revealed in a report by the Washington Post, which says it's in place to "help identify malicious actors".

Facebook tracks your behaviour across its site and uses that info to assign you a rating.

Tessa Lyons, who heads up Facebook's fight against fake news, said: "One of the signals we use is how people interact with articles.

"For example, if someone previously gave us feedback that an article was false and the article was confirmed false by a fact-checker, then we might weight that person’s future false news feedback more than someone who indiscriminately provides false news feedback on lots of articles, including ones that end up being rated as true."

Earlier this year, Facebook admitted it was rolling out trust ratings for media outlets.

This involved ranking news websites based on the quality of the news they were reporting.

This rating would then be used to decide which posts should be promoted higher in users' News Feeds.

It's not clear exactly what users' ratings are for, but it's possible they may be used in a similar way.

Facebook hasn't revealed exactly how the ratings are decided, or whether all users have one.

According to Lyons, a user's rating "isn't meant to be an absolute indicator of a person's credibility".

Instead, it's intended as a way of gauging how risky a user's actions may be.

Online commentators are already comparing it to China's creepy "social credit" system.

The Chinese government analyses users' social media habits and online shopping purchases, assigning citizens a score.

Jaywalking or skipping train fares can lower your score.

This score is then used to determine whether people can take out loans, or even travel on public transport.

Some citizens with very low ratings are "blacklisted", making it impossible to book a flight, rent or buy property, or stay in a luxury hotel.

The system is currently being piloted, but will become mandatory in China by 2020.

Facebook's own rating system is the latest step in its drive to tackle fake news, a growing problem for the social network.

The site, which sees 2.23 billion users log on every month, has become a hotbed for falsified news coverage.

Earlier this year, billionaire Facebook boss Mark Zuckerberg vowed to fight fake news.

"The world feels anxious and divided, and Facebook has a lot of work to do," the 34-year-old Harvard drop-out explained.

Facebook has admitted that its site has been the subject of political fakery campaigns from Russia.

After initially denying that it had been complacent, the social network admitted that more than 126 million US users had viewed some form of Russian propaganda.

A congressional hearing followed, with Facebook, Twitter, and Google in the dock.

And Facebook's been grappling with the problem ever since.

Speaking in January, Samidh Chakrabarti, who heads up civic engagement at Facebook, said: "Even a handful of deliberately misleading stories can have dangerous consequences.

"We're committed to this issue of transparency because it goes beyond Russia.

"Without transparency, it can be hard to hold politicians accountable for their own words.

"Democracy then suffers because we don't get the full picture of what our leaders are promising us," he wrote, in what looks like a subtle snipe at US President Donald Trump.

"This is an even more pernicious problem than foreign interference.

"But we hope that by setting a new bar for transparency, we can tackle both of these challenges simultaneously."

Chakrabarti said that the misinformation campaigns targeting Facebook users are "professionalised, and constantly try to game the system".

"We will always have more work to do," he added.

We've asked Facebook for comment and will update this story with any response.
