Should Facebook have a “quiet period” of no algorithm changes before a major election?
Several Facebook News Feed updates leading up to the 2016 U.S. election disadvantaged traditional news sources and favored less reliable information shared by your uncle. Should regulation keep the playing field static?
Facebook’s News Feed algorithm determines what users see on its platform, from funny memes to comments from friends. The company regularly updates this algorithm, which can dramatically change what information people consume.
As the 2020 election approaches, there is much public concern that what was dubbed “Russian meddling” in the 2016 presidential election could happen again. But what’s not getting enough attention is the role Facebook’s algorithm changes play, intentionally or not, in that kind of meddling.
A key counterpoint to the Russian misinformation campaign was factual journalism from reputable sources, which reached many of their readers on Facebook and other social media platforms. As a social media researcher and educator, I see evidence that changes to Facebook’s News Feed algorithm suppressed users’ access to credible journalism in the run-up to Trump’s election.
Political operatives know Facebook serves as a gatekeeper of the information diets of more than 200 million Americans and 2 billion users worldwide. Actions and abuse by others on the platform have generated much concern and public discussion, including about how much disinformation and propaganda Americans saw before the election. What has not been talked about enough is the effect that Facebook’s algorithmic shifts have had on access to news, and on democracy.
Changing the system
In mid-2015, Facebook introduced a major algorithm change that pivoted readers away from journalism and news to deliver more updates from their friends and family. The change was couched in friendly language suggesting Facebook was trying to make sure users didn’t miss stories from friends. But social media data shows that one effect of the change was to reduce the number of interactions Facebook users had with credible news outlets.
A few months before the 2016 election, an even bigger algorithm change toward friends and family posts took a second toll on publisher traffic. A wide range of news publishers found that their content was significantly less visible to Facebook users.
In my research, I looked at Facebook engagement for mainstream news outlets surrounding the 2016 election. My findings support others’ conclusions that Facebook’s algorithm greatly suppressed public engagement with these publishers.
Data from CrowdTangle, a social media monitoring company (bought by Facebook shortly after the 2016 election), shows that Facebook traffic dropped noticeably at CNN, ABC, NBC, CBS, Fox News, The New York Times and The Washington Post after the company updated its algorithms to favor friends and family in June 2016.
That shows the algorithm worked the way it was designed to work, but I’m concerned that major U.S. publishers were suppressed in this way. Voter interest in the presidential election was higher in 2016 than in the previous two decades, and misinformation was rampant. Facebook’s changes meant that key news organizations across the political spectrum had a harder time getting the word out about credible election news and reporting.
Facebook was aware of concerns about its algorithm even before the election. One of the company’s own engineers flagged potential effects of the algorithm changes in July 2015. Three months later, Mark Zuckerberg’s mentor, Roger McNamee, also tried to alert Zuckerberg and Facebook executives that the platform was being used to manipulate information about the election.
Just after the election, reporter Craig Silverman’s research at BuzzFeed showed that fake election news had outperformed “real news.” In late 2018, Facebook’s own company statement revealed issues with how its algorithm rewarded “borderline content” that was sensational and provocative, like much of the hyperpartisan news that trended in advance of the election.
More recent research by Harvard’s Shorenstein Center shows that Facebook traffic continued to decrease significantly for publishers after a further Facebook algorithm change in January 2018.
Algorithmic transparency
To date, research on how Facebook’s algorithm works has been limited by the lack of access to its proprietary inner workings. It’s not enough to investigate the effects of changes to Facebook’s News Feed. I believe it’s also important to understand why those changes happened, and to consider more directly how Facebook’s business decisions affect democracy.
Recent insight into the company’s internal processes suggests that Facebook is beginning to understand its power. This month, Bloomberg News revealed that the company had deployed software on its own platform to look out for posts that portrayed Facebook itself in potentially misleading ways, reducing their visibility to safeguard the company’s reputation.
Some international legal scholars have begun to call for laws to protect democracies against the possibility that algorithmic manipulation could deliver electoral gain. There’s no proof that Facebook’s changes had political intentions, but it’s not hard to imagine that the company could tweak its algorithms in the future if it wanted to.
To guard against that potential, new laws could bar changes to the algorithm in the run-up periods before elections. In the financial industry, for instance, “quiet periods” in advance of major corporate announcements seek to prevent marketing and public relations efforts from artificially influencing stock prices.
Similar protections for algorithms against corporate manipulation could help ensure that politically active, power-seeking executives at Facebook, or at any other company with significant control over users’ access to information, can’t use their systems to shape public opinion or voting behavior.
Jennifer Grygiel is an assistant professor of communications at Syracuse University. This article is republished from The Conversation under a Creative Commons license.