
How Facebook can influence the news, not just share it

By Callum Borchers May 22 at 8:00 AM

The big media-technology story of the moment is about the news in Facebook’s “trending” box. But what about the news stories that are influenced by what’s in that box or in the other places where Facebook tells journalists what people care about?

Conservatives have been up in arms since Gizmodo reported last week that some Facebook workers who curate the network’s trending news section allow their own (mostly liberal) biases to influence which topics in politics — and everything else — qualify for promotion in a special box on users’ homepages. The concern is that right-leaning views might be suppressed by a technology giant that wields tremendous power over what is presented to some 167 million Americans as buzzy and newsworthy.

Facebook chief executive Mark Zuckerberg quickly denied censoring content but said he and his company “take this report very seriously and are conducting a full investigation to ensure our teams upheld the integrity of this product.”

All of this — the Gizmodo article, the reaction, the investigation — is important. But it’s a bit narrow. Facebook doesn’t merely have the ability to dictate which already-written stories merit inclusion in its own trending news section; in some cases, the social media juggernaut can also influence which stories journalists wind up writing — and the kinds of questions they ask — in the first place.
[Photo caption: Facebook’s trending news box showed what users were talking about on Thursday afternoon.]

Last fall, the social network launched a reporting tool called Signal that is designed to help journalists “monitor what topics are trending on Facebook” and “find stories as they grow in importance.” Poynter described possible applications in an article about the rollout in September:

In advance of today’s launch, Facebook made Signal available to several beta testers. Vox.com Engagement Editor Allison Rockey was among the journalists who got a sneak peek at the tool, and she says Vox will continue to use Signal for both audience engagement and newsgathering.

“It’s very important for us to give context and background to the biggest news stories of the day,” Rockey said. “Our editors are incredibly interested in the questions and conversation that people are having outside of newsrooms and outside of the Beltway. Having insight into what people are discussing and have questions about on Facebook is really helpful.”

It is really helpful — so long as the data is legit. And, to be clear, there is no evidence that it’s not.

Robert D’Onofrio, Facebook’s director of data communications, said in a statement that the trending topics team at the center of the Gizmodo report is not the same unit that tells journalists — such as the presidential debate moderators who have cited Facebook data in their questions — which political subjects are driving conversations on the social network.

“The political conversation data Facebook shares — like we did with broadcast partners during multiple presidential primary debates — is completely separate from Trending Topics,” D’Onofrio said. “A different team works to quantify and analyze which candidates and issues people talk about most on Facebook before, during or after a debate. These data sets are unaltered, aggregated rankings based on the number of unique people liking, posting, commenting on and sharing content about a particular subject or candidate.”
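
To make that description concrete, here is a minimal, hypothetical sketch in Python of what an "aggregated ranking based on the number of unique people" engaging with each topic could look like. The event list, topic names and counts below are invented for illustration; this is not Facebook's actual pipeline, only the kind of tally D’Onofrio describes, where a person counts once per topic no matter how many times they like, post, comment or share.

import collections

# Hypothetical engagement events: (user_id, topic, action).
events = [
    ("u1", "minimum wage", "like"),
    ("u1", "minimum wage", "share"),   # same person again: still counts once
    ("u2", "minimum wage", "comment"),
    ("u2", "gun control", "post"),
    ("u3", "gun control", "like"),
]

# Collect the set of unique people who engaged with each topic.
unique_people = collections.defaultdict(set)
for user_id, topic, _action in events:
    unique_people[topic].add(user_id)

# Rank topics by the number of unique people, highest first.
ranking = sorted(unique_people.items(), key=lambda kv: len(kv[1]), reverse=True)
for topic, users in ranking:
    print(topic, len(users))

The key point of such a tally is that it is purely mechanical: no editor decides what rises to the top, which is what Facebook says distinguishes this data from the human-curated trending box.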

One big challenge for media outlets is that there’s no way to fact-check Facebook data because it’s proprietary. No one but Facebook knows what’s really trending on Facebook.

The inability to authenticate Facebook data means newsrooms have to think hard about how heavily they lean on it. They also have to consider how — amid intense competition for Internet traffic — the desire to produce viral content might color editorial judgments. Many news sites draw large portions of their audiences from Facebook, so the temptation is to try to create content that seems likely to be shared and liked over and over.

Upworthy built a business on this strategy. Then, when Facebook changed its news feed algorithm a couple years ago in what was widely viewed as a crackdown on clickbait, Upworthy’s traffic declined sharply. It was a high-profile illustration of how a media company can become a slave to Facebook — and a reminder that the reputation of the news media is on the line here, too.

So the responsibility is shared, and the potential for abuse is real. Facebook could theoretically nudge journalists toward the topics it wants covered by showing those topics to be trending when, in fact, they are not.

For example: One source in the Gizmodo report claimed that “the Black Lives Matter movement was … injected into Facebook’s trending news module,” even though it actually wasn’t trending at the time, according to numerical metrics. If true, the immediate consequence was that more people were exposed to existing news about Black Lives Matter. The possible long-term consequence was that journalists produced more news about Black Lives Matter because they believed a ton of people were interested.

“Nothing says you shouldn’t inject certain topics,” said Jennifer Grygiel, a communications professor at Syracuse University who specializes in social media. “Journalism certainly isn’t just what people are talking about the most; it’s what people need to know. So, sure, inject topics. But be transparent if you do.”

Zuckerberg said in a statement last week that the social network has “rigorous guidelines that do not permit the prioritization of one viewpoint over another or the suppression of political perspectives.”

“We have found no evidence that [the Gizmodo] report is true,” he added. “If we find anything against our principles, you have my commitment that we will take additional steps to address it.”

The Gizmodo article, based on interviews with unnamed former contractors, noted “there is no evidence that Facebook management mandated or was even aware of any political bias at work.”

Still, facing a PR problem, Zuckerberg addressed the charge of ideological screening by employees in an off-the-record meeting Wednesday. He met with about 20 prominent conservatives, including media figures such as radio host Glenn Beck, CNN commentator S.E. Cupp and Townhall general manager Jonathan Garthwaite.

Facebook’s role in shaping the news agenda has been on display this election season during presidential debates. At a Republican debate last fall, Fox Business moderator Neil Cavuto asked for the candidates’ positions on the minimum wage. At another debate in January, Cavuto’s co-moderator, Maria Bartiromo, brought up gun control. And at a third debate in March, Fox News’s Bret Baier asked why the GOP’s White House contenders hadn’t talked more about the water crisis in Flint, Mich.

All three questions — along with several others posed in the primary debates — had something in common that was mentioned live on the air: Facebook data showing the subjects they addressed were popular on the social network.

A spokeswoman for the Fox cable channels — alone among the GOP debate media sponsors in citing Facebook trending data — said journalists “chose the topics and only used Facebook for supplementary data. We stand by the topics discussed in both debates on Fox News Channel and Fox Business.”

In other words, Facebook didn’t tell Fox to ask about the minimum wage, gun control or the Flint water crisis; it simply provided data confirming those were, indeed, topics that interested many people.

Nevertheless, the inclusion of Facebook trending data in questions posed to presidential candidates — with millions of voters watching on television — is proof of the high stakes. Moderators relied, in part, on Facebook figures to inform those candidates and voters about the priorities of the electorate.

If journalists are going to depend on Facebook to tell them what people care about, then they’ll want to feel assured that these data are accurate or, if subjective, at least openly so. Which is why there is so much on the line.

In other words, the issue is much bigger than which topics do or do not appear in a little box on your phone or computer screen.


