Facebook to Rank News Sources by Quality to Battle Misinformation

Tech giant will rely on user surveys of trustworthiness to try to preserve objectivity

Facebook CEO Mark Zuckerberg said the change is necessary to address the role of social media in amplifying sensationalism, misinformation and polarization.

By Deepa Seetharaman Updated Jan. 19, 2018 5:07 p.m. ET

Facebook Inc. plans to start ranking news sources in its feed based on user evaluations of credibility, a major step in its effort to fight false and sensationalist information that will also push the company further into a role it has long sought to avoid—content referee.

The social-media giant will begin testing the effort next week by prioritizing news reports in its news feed from publications that users have rated in Facebook surveys as trustworthy, executives said Friday. The most “broadly trusted” publications—those trusted and recognized by a large cross-section of Facebook users—would get a boost in the news feed, while those that users rate low on trust would be penalized. The change applies only to U.S. users, though Facebook plans to roll it out internationally later.

The announcement, which confirms a report last week by The Wall Street Journal, comes after Facebook outlined another major news-feed overhaul that would diminish the presence of news in favor of what it calls “meaningful” interactions on the platform. This shift will result in news accounting for about 4% of the posts that appear in users’ feeds world-wide, down from the current 5%, Facebook Chief Executive Mark Zuckerberg said in a post Friday.

The planned introduction of a trustworthiness score marks an important shift for Facebook, which Mr. Zuckerberg has long said seeks to avoid becoming the “arbiters of truth.” But the company has been under pressure since the 2016 U.S. presidential campaign to stop enabling fabricated news articles and misinformation to spread across its platform. About 45% of U.S. adults get news from Facebook, according to a Pew Research Center survey conducted last summer.

Mr. Zuckerberg said the change—which will be tested leading up to the 2018 U.S. midterm elections—is necessary to address the role of social media in amplifying sensationalism, misinformation and polarization. “That’s why it’s important that News Feed promotes high quality news that helps build a sense of common ground,” he wrote in his post.

In an interview, Adam Mosseri, the Facebook executive who oversees its news feed, acknowledged that the company was wading into “tricky” territory by weighting publishers based on user trust.

“This is an interesting and tricky thing for us to pursue because I don’t think we can decide what sources of news are trusted and what are not trusted, the same way I don’t think we can decide what is true and what is not,” Mr. Mosseri said.

He added, however, that Facebook engineers themselves weren’t taking a stance on credibility because the company relied on its users to provide a value judgment. He compared the approach with Facebook’s reliance on third-party fact-checkers to determine whether an article is fabricated. Facebook uses those evaluations to determine where those posts rank in users’ feeds.

On Friday, some publishers and media observers expressed concern about the ranking change, which, like other Facebook news-feed changes, may have a significant and unpredictable impact on news publishers that rely on the site for traffic, including the Journal.

“For a company that wields this much power to make these kind of decisions with zero transparency really scares me,” said Neil Patel, publisher of the conservative site Daily Caller. He worried that publishers don’t know what questions Facebook is asking and of whom, or how exactly the conclusions will affect his business.

Nicco Mele, director of the Shorenstein Center on Media, Politics and Public Policy at Harvard University, said that while Facebook isn’t taking sides, relying on users’ judgment may not improve the quality of news on the platform. “You may end up with reality television,” Mr. Mele said.

In surveys, Facebook is asking a small percentage of its users whether they recognize a publication and, if so, how much they trust it. The aggregate of those results will inform its news-feed rankings. Mr. Mosseri called the trust score an important weight, but one of many factors.
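Facebook hasn’t published how the survey answers are combined. A minimal sketch of one plausible aggregation, written in Python with invented names and an assumed scoring formula (only the two-question survey structure comes from the article), might look like this:

    from dataclasses import dataclass
    from typing import List

    @dataclass
    class SurveyResponse:
        """One user's answers about one publication (hypothetical structure)."""
        recognizes: bool       # "Do you recognize this publication?"
        trust: float = 0.0     # If recognized: trust rating, normalized to 0..1

    def broad_trust_score(responses: List[SurveyResponse]) -> float:
        """Illustrative 'broadly trusted' score, not Facebook's actual formula.

        Multiplies how widely a publication is recognized by how much the
        users who recognize it say they trust it, so an outlet known and
        trusted across a large cross-section of respondents scores higher
        than one trusted only by a small niche audience.
        """
        if not responses:
            return 0.0
        recognized = [r for r in responses if r.recognizes]
        if not recognized:
            return 0.0
        recognition_rate = len(recognized) / len(responses)
        avg_trust = sum(r.trust for r in recognized) / len(recognized)
        return recognition_rate * avg_trust

Under this assumed scoring, the aggregate would then feed into the news-feed ranking as one signal among many, consistent with Mr. Mosseri’s description of the trust score as an important weight but only one of several factors.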

Facebook doesn’t plan to share the scores with publishers, saying the metric gives an incomplete picture of how specific posts get distribution. Facebook runs tens of thousands of user surveys a day, and the results help shape what its more than two billion monthly users see in their news feeds.

Mr. Mosseri acknowledged the shortcomings of relying on surveys, and said Facebook plans to fine-tune its rankings using other factors such as how informative and locally relevant news sources are. “No one signal that we use is perfect,” he said. “There’s always examples of when [the results] aren’t lining up with what we’re intending.”

Facebook’s trust score would boost the news-feed presence of well-known and widely trusted publications even if users disagree with the content or aren’t avid readers. The change won’t help publishers trusted by a small group of devoted readers but disparaged by everybody else, Mr. Mosseri said. Both posts from Facebook pages and links to news sites shared by users will be affected by the ranking change.
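To make that distinction concrete, here is a second hypothetical sketch, with invented numbers and weights that are not Facebook’s, showing how a trust score might be blended with the other signals the company mentions (informativeness, local relevance) so that a broadly trusted outlet gets a modest boost rather than an automatic top slot:

    def rank_weight(trust: float, informative: float, local: float) -> float:
        """Hypothetical blend of ranking signals; the weights are invented
        and serve only to show trust acting as one factor among several."""
        return 0.3 * trust + 0.5 * informative + 0.2 * local

    # Two posts that are equally informative and equally (non-)local,
    # differing only in their publications' aggregate trust score.
    broadly_trusted_post = rank_weight(trust=0.7, informative=0.6, local=0.1)
    niche_trusted_post = rank_weight(trust=0.15, informative=0.6, local=0.1)
    print(broadly_trusted_post > niche_trusted_post)  # True: a modest, not absolute, boost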

As with other Facebook news-feed changes, the move could weigh heavily on publishers that depend on the site for traffic: publishers got 24% of their online traffic from Facebook as of last month, on average, down from 40% at the end of 2016, according to analytics firm Parse.ly.

Mr. Mosseri said Facebook tried to take steps to avoid hurting small, lesser-known publishers, although those outlets still could be outranked by more prominent publications. He said publishers won’t be punished if they aren’t well-recognized in user surveys.

Many publishers are likely to be concerned about allowing users to decide how news outlets are ranked. Media executives have long been wary of Facebook’s increasing dominance of the ad market and its role as a vital distribution network for news, one with the power to massively magnify or dial down traffic to a site with a simple algorithm tweak.

At the same time, publishers have lobbied Facebook intensively to take a more active role in weeding out low-quality “clickbait,” conspiracy theories and bogus stories and to prioritize news coming from established and respected media outlets.

Many media companies have been critical of Facebook’s longstanding position that it isn’t a media company, but simply a platform.

Over the past year, Facebook has consulted extensively with publishers on many issues including how to prioritize more trustworthy news sources, promote local news sources and accommodate news sites that are behind paywalls.

The company started discussing the possibility of a trust score internally around last fall. Facebook consulted experts about the score but found there was “a massive amount of disagreement” among various media organizations over what makes a publication credible, Mr. Mosseri told the Journal.

In his post, Mr. Zuckerberg noted that he wasn’t comfortable with Facebook making its own decision about what is and isn’t trustworthy and that relying on outside experts “would likely not solve the objectivity problem.”

—Lukas I. Alpert and Benjamin Mullin contributed to this article.

