INSIDE THE TWO YEARS THAT SHOOK FACEBOOK—AND THE WORLD
How a confused, defensive social media giant steered
itself into a disaster, and how Mark Zuckerberg is trying to fix it all.
By Nicholas Thompson and Fred Vogelstein | 02.12.18, 7:00 AM
ONE DAY IN late February of 2016, Mark Zuckerberg sent a
memo to all of Facebook’s employees to address some troubling behavior in the
ranks. His message pertained to some walls at the company’s Menlo Park
headquarters where staffers are encouraged to scribble notes and signatures. On
at least a couple of occasions, someone had crossed out the words “Black Lives
Matter” and replaced them with “All Lives Matter.” Zuckerberg wanted whoever
was responsible to cut it out.
“ ‘Black Lives Matter’ doesn’t mean other lives don’t,”
he wrote. “We’ve never had rules around what people can write on our walls,”
the memo went on. But “crossing out something means silencing speech, or that
one person’s speech is more important than another’s.” The defacement, he said,
was being investigated.
All around the country at about this time, debates about
race and politics were becoming increasingly raw. Donald Trump had just won the
South Carolina primary, lashed out at the Pope over immigration, and earned the
enthusiastic support of David Duke. Hillary Clinton had just defeated Bernie
Sanders in Nevada, only to have an activist from Black Lives Matter interrupt a
speech of hers to protest racially charged statements she’d made two decades
before. And on Facebook, a popular group called Blacktivist was gaining
traction by blasting out messages like “American economy and power were built
on forced migration and torture.”
So when Zuckerberg’s admonition circulated, a young
contract employee named Benjamin Fearnow decided it might be newsworthy. He
took a screenshot on his personal laptop and sent the image to a friend named
Michael Nuñez, who worked at the tech-news site Gizmodo. Nuñez promptly
published a brief story about Zuckerberg’s memo.
A week later, Fearnow came across something else he
thought Nuñez might like to publish. In another internal communication,
Facebook had invited its employees to submit potential questions to ask
Zuckerberg at an all-hands meeting. One of the most up-voted questions that
week was “What responsibility does Facebook have to help prevent President
Trump in 2017?” Fearnow took another screenshot, this time with his phone.
Fearnow, a recent graduate of the Columbia Journalism
School, worked in Facebook’s New York office on something called Trending
Topics, a feed of popular news subjects that popped up when people opened Facebook.
The feed was generated by an algorithm but moderated by a team of about 25
people with backgrounds in journalism. If the word “Trump” was trending, as it
often was, they used their news judgment to identify which bit of news about
the candidate was most important. If The Onion or a hoax site published a spoof
that went viral, they had to keep that out. If something like a mass shooting
happened, and Facebook’s algorithm was slow to pick up on it, they would inject
a story about it into the feed.
Facebook prides itself on being a place where people love
to work. But Fearnow and his team weren’t the happiest lot. They were contract
employees hired through a company called BCforward, and every day was full of
little reminders that they weren’t really part of Facebook. Plus, the young
journalists knew their jobs were doomed from the start. Tech companies, for the
most part, prefer to have as little as possible done by humans—because, it’s
often said, they don’t scale. You can’t hire a billion of them, and they prove
meddlesome in ways that algorithms don’t. They need bathroom breaks and health
insurance, and the most annoying of them sometimes talk to the press.
Eventually, everyone assumed, Facebook’s algorithms would be good enough to run
the whole project, and the people on Fearnow’s team—who served partly to train
those algorithms—would be expendable.
The day after Fearnow took that second screenshot was a
Friday. When he woke up after sleeping in, he noticed that he had about 30
meeting notifications from Facebook on his phone. When he replied to say it was
his day off, he recalls, he was nonetheless asked to be available in 10
minutes. Soon he was on a videoconference with three Facebook employees,
including Sonya Ahuja, the company’s head of investigations. According to his
recounting of the meeting, she asked him if he had been in touch with Nuñez. He
denied that he had been. Then she told him that she had their messages on
Gchat, which Fearnow had assumed weren’t accessible to Facebook. He was fired.
“Please shut your laptop and don’t reopen it,” she instructed him.
That same day, Ahuja had another conversation with a
second employee at Trending Topics named Ryan Villarreal. Several years before,
he and Fearnow had shared an apartment with Nuñez. Villarreal said he hadn’t
taken any screenshots, and he certainly hadn’t leaked them. But he had clicked
“like” on the story about Black Lives Matter, and he was friends with Nuñez on
Facebook. “Do you think leaks are bad?” Ahuja demanded to know, according to
Villarreal. He was fired too. The last he heard from his employer was in a
letter from BCforward. The company had given him $15 to cover expenses, and it
wanted the money back.
The firing of Fearnow and Villarreal set the Trending
Topics team on edge—and Nuñez kept digging for dirt. He soon published a story
about the internal poll showing Facebookers’ interest in fending off Trump.
Then, in early May, he published an article based on conversations with yet a
third former Trending Topics employee, under the blaring headline “Former
Facebook Workers: We Routinely Suppressed Conservative News.” The piece
suggested that Facebook’s Trending team worked like a Fox News fever dream,
with a bunch of biased curators “injecting” liberal stories and “blacklisting”
conservative ones. Within a few hours the piece popped onto half a dozen highly
trafficked tech and politics websites, including Drudge Report and Breitbart
News.
The post went viral, but the ensuing battle over Trending
Topics did more than just dominate a few news cycles. In ways that are only
fully visible now, it set the stage for the most tumultuous two years of
Facebook’s existence—triggering a chain of events that would distract and
confuse the company while larger disasters began to engulf it.
This is the story of those two years, as they played out
inside and around the company. WIRED spoke with 51 current or former Facebook
employees for this article, many of whom did not want their names used, for
reasons anyone familiar with the story of Fearnow and Villarreal would surely
understand. (One current employee asked that a WIRED reporter turn off his
phone so the company would have a harder time tracking whether it had been near
the phones of anyone from Facebook.)
The stories varied, but most people told the same basic
tale: of a company, and a CEO, whose techno-optimism has been crushed as
they’ve learned the myriad ways their platform can be used for ill. Of an
election that shocked Facebook, even as its fallout put the company under
siege. Of a series of external threats, defensive internal calculations, and
false starts that delayed Facebook’s reckoning with its impact on global
affairs and its users’ minds. And—in the tale’s final chapters—of the company’s
earnest attempt to redeem itself.
In that saga, Fearnow plays one of those obscure but
crucial roles that history occasionally hands out. He’s the Franz Ferdinand of
Facebook—or maybe he’s more like the archduke’s hapless young assassin. Either
way, in the rolling disaster that has enveloped Facebook since early 2016,
Fearnow’s leaks probably ought to go down as the screenshots heard round the
world.
II
BY NOW, THE story of Facebook’s all-consuming growth is
practically the creation myth of our information era. What began as a way to
connect with your friends at Harvard became a way to connect with people at
other elite schools, then at all schools, and then everywhere. After that, your
Facebook login became a way to log on to other internet sites. Its Messenger
app started competing with email and texting. It became the place where you
told people you were safe after an earthquake. In some countries like the
Philippines, it effectively is the internet.
The furious energy of this big bang emanated, in large
part, from a brilliant and simple insight. Humans are social animals. But the
internet is a cesspool. That scares people away from identifying themselves and
putting personal details online. Solve that problem—make people feel safe to post—and
they will share obsessively. Make the resulting database of privately shared
information and personal connections available to advertisers, and that
platform will become one of the most important media technologies of the early
21st century.
But as powerful as that original insight was, Facebook’s
expansion has also been driven by sheer brawn. Zuckerberg has been a
determined, even ruthless, steward of the company’s manifest destiny, with an
uncanny knack for placing the right bets. In the company’s early days, “move
fast and break things” wasn’t just a piece of advice to his developers; it was
a philosophy that served to resolve countless delicate trade-offs—many of them
involving user privacy—in ways that best favored the platform’s growth. And when
it comes to competitors, Zuckerberg has been relentless in either acquiring or
sinking any challengers that seem to have the wind at their backs.
FACEBOOK’S RECKONING
Two years that forced the platform to change
by Blanca Myers
March 2016
Facebook suspends Benjamin Fearnow, a journalist-curator
for the platform’s Trending Topics feed, after he leaks to Gizmodo.
May 2016
Gizmodo reports that Trending Topics “routinely
suppressed conservative news.” The story sends Facebook scrambling.
July 2016
Rupert Murdoch tells Zuckerberg that Facebook is wreaking
havoc on the news industry and threatens to cause trouble.
August 2016
Facebook cuts loose all of its Trending Topics
journalists, ceding authority over the feed to engineers in Seattle.
November 2016
Donald Trump wins. Zuckerberg says it’s “pretty crazy” to
think fake news on Facebook helped tip the election.
December 2016
Facebook declares war on fake news, hires CNN alum
Campbell Brown to shepherd relations with the publishing industry.
September 2017
Facebook announces that a Russian group paid $100,000 for
roughly 3,000 ads aimed at US voters.
October 2017
Researcher Jonathan Albright reveals that posts from six
Russian propaganda accounts were shared 340 million times.
November 2017
Facebook general counsel Colin Stretch gets pummeled
during congressional Intelligence Committee hearings.
January 2018
Facebook begins announcing major changes, aimed to ensure
that time on the platform will be “time well spent.”
In fact, it was in besting just such a rival that
Facebook came to dominate how we discover and consume news. Back in 2012, the
most exciting social network for distributing news online wasn’t Facebook, it
was Twitter. The latter’s 140-character posts accelerated the speed at which
news could spread, allowing its influence in the news industry to grow much
faster than Facebook’s. “Twitter was this massive, massive threat,” says a
former Facebook executive heavily involved in the decisionmaking at the time.
So Zuckerberg pursued a strategy he has often deployed
against competitors he cannot buy: He copied, then crushed. He adjusted
Facebook’s News Feed to fully incorporate news (despite its name, the feed was
originally tilted toward personal news) and adjusted the product so that it
showed author bylines and headlines. Then Facebook’s emissaries fanned out to
talk with journalists and explain how to best reach readers through the
platform. By the end of 2013, Facebook had doubled its share of traffic to news
sites and had started to push Twitter into a decline. By the middle of 2015, it
had surpassed Google as the leader in referring readers to publisher sites and
was now referring 13 times as many readers to news publishers as Twitter. That
year, Facebook launched Instant Articles, offering publishers the chance to
publish directly on the platform. Posts would load faster and look sharper if
they agreed, but the publishers would give up an element of control over the
content. The publishing industry, which had been reeling for years, largely
assented. Facebook now effectively owned the news. “If you could reproduce
Twitter inside of Facebook, why would you go to Twitter?” says the former
executive. “What they are doing to Snapchat now, they did to Twitter back
then.”
It appears that Facebook did not, however, carefully
think through the implications of becoming the dominant force in the news
industry. Everyone in management cared about quality and accuracy, and they had
set up rules, for example, to eliminate pornography and protect copyright. But
Facebook hired few journalists and spent little time discussing the big
questions that bedevil the media industry. What is fair? What is a fact? How do
you signal the difference between news, analysis, satire, and opinion? Facebook
has long seemed to think it has immunity from those debates because it is just
a technology company—one that has built a “platform for all ideas.”
This notion that Facebook is an open, neutral platform is
almost like a religious tenet inside the company. When new recruits come in,
they are treated to an orientation lecture by Chris Cox, the company’s chief
product officer, who tells them Facebook is an entirely new communications
platform for the 21st century, as the telephone was for the 20th. But if anyone
inside Facebook is unconvinced by religion, there is also Section 230 of the
1996 Communications Decency Act to recommend the idea. This is the section of
US law that shelters internet intermediaries from liability for the content
their users post. If Facebook were to start creating or editing content on its
platform, it would risk losing that immunity—and it’s hard to imagine how
Facebook could exist if it were liable for the billions of pieces of content
that users post on its site every day.
And so, because of the company’s self-image, as well as
its fear of regulation, Facebook tried never to favor one kind of news content
over another. But neutrality is a choice in itself. For instance, Facebook
decided to present every piece of content that appeared on News Feed—whether it
was your dog pictures or a news story—in roughly the same way. This meant that
all news stories looked roughly the same as each other, too, whether they were
investigations in The Washington Post, gossip in the New York Post, or flat-out
lies in the Denver Guardian, an entirely bogus newspaper. Facebook argued that
this democratized information. You saw what your friends wanted you to see, not
what some editor in a Times Square tower chose. But it’s hard to argue that this
wasn’t an editorial decision. It may be one of the biggest ever made.
In any case, Facebook’s move into news set off yet
another explosion of ways that people could connect. Now Facebook was the place
where publications could connect with their readers—and also where Macedonian
teenagers could connect with voters in America, and operatives in Saint
Petersburg could connect with audiences of their own choosing in a way that no
one at the company had ever seen before.
III
IN FEBRUARY OF 2016, just as the Trending Topics fiasco
was building up steam, Roger McNamee
became one of the first Facebook insiders to notice strange things happening on
the platform. McNamee was an early investor in Facebook who had mentored
Zuckerberg through two crucial decisions: to turn down Yahoo’s offer of $1
billion to acquire Facebook in 2006; and to hire a Google executive named
Sheryl Sandberg in 2008 to help find a business model. McNamee was no longer in
touch with Zuckerberg much, but he was still an investor, and that month he
started seeing things related to the Bernie Sanders campaign that worried him.
“I’m observing memes ostensibly coming out of a Facebook group associated with
the Sanders campaign that couldn’t possibly have been from the Sanders
campaign,” he recalls, “and yet they were organized and spreading in such a way
that suggested somebody had a budget. And I’m sitting there thinking, ‘That’s
really weird. I mean, that’s not good.’ ”
But McNamee didn’t say anything to anyone at Facebook—at
least not yet. And the company itself was not picking up on any such worrying
signals, save for one blip on its radar: In early 2016, its security team
noticed an uptick in Russian actors attempting to steal the credentials of
journalists and public figures. Facebook reported this to the FBI. But the
company says it never heard back from the government, and that was that.
Instead, Facebook spent the spring of 2016 very busily
fending off accusations that it might influence the elections in a completely
different way. When Gizmodo published its story about political bias on the
Trending Topics team in May, the article went off like a bomb in Menlo Park.
It quickly reached millions of readers and, in a delicious irony, appeared in
the Trending Topics module itself. But the bad press wasn’t what really rattled
Facebook—it was the letter from John Thune, a Republican US senator from South
Dakota, that followed the story’s publication. Thune chairs the Senate Commerce
Committee, which in turn oversees the Federal Trade Commission, an agency that
has been especially active in investigating Facebook. The senator wanted
Facebook’s answers to the allegations of bias, and he wanted them promptly.
The Thune letter put Facebook on high alert. The company
promptly dispatched senior Washington staffers to meet with Thune’s team. Then
it sent him a 12-page single-spaced letter explaining that it had conducted a
thorough review of Trending Topics and determined that the allegations in the
Gizmodo story were largely false.
Facebook decided, too, that it had to extend an olive
branch to the entire American right wing, much of which was raging about the
company’s supposed perfidy. And so, just over a week after the story ran,
Facebook scrambled to invite a group of 17 prominent Republicans out to Menlo
Park. The list included television hosts, radio stars, think tankers, and an
adviser to the Trump campaign. The point was partly to get feedback. But more
than that, the company wanted to make a show of apologizing for its sins,
lifting up the back of its shirt, and asking for the lash.
According to a Facebook employee involved in planning the
meeting, part of the goal was to bring in a group of conservatives who were
certain to fight with one another. They made sure to have libertarians who
wouldn’t want to regulate the platform and partisans who would. Another goal,
according to the employee, was to make sure the attendees were “bored to death”
by a technical presentation after Zuckerberg and Sandberg had addressed the group.
The power went out, and the room got uncomfortably hot.
But otherwise the meeting went according to plan. The guests did indeed fight,
and they failed to unify in a way that was either threatening or coherent. Some
wanted the company to set hiring quotas for conservative employees; others
thought that idea was nuts. As often happens when outsiders meet with Facebook,
people used the time to try to figure out how they could get more followers for
their own pages.
Afterward, Glenn Beck, one of the invitees, wrote an
essay about the meeting, praising Zuckerberg. “I asked him if Facebook, now or
in the future, would be an open platform for the sharing of all ideas or a
curator of content,” Beck wrote. “Without hesitation, with clarity and
boldness, Mark said there is only one Facebook and one path forward: ‘We are an
open platform.’”
Inside Facebook itself, the backlash around Trending
Topics did inspire some genuine soul-searching. But none of it got very far. A
quiet internal project, codenamed Hudson, cropped up around this time to
determine, according to someone who worked on it, whether News Feed should be
modified to better deal with some of the most complex issues facing the
product. Does it favor posts that make people angry? Does it favor simple or
even false ideas over complex and true ones? Those are hard questions, and the
company didn’t have answers to them yet. Ultimately, in late June, Facebook
announced a modest change: The algorithm would be revised to favor posts from
friends and family. At the same time, Adam Mosseri, Facebook’s News Feed boss,
posted a manifesto titled “Building a Better News Feed for You.” People inside
Facebook spoke of it as a document roughly resembling the Magna Carta; the
company had never spoken before about how News Feed really worked. To
outsiders, though, the document came across as boilerplate. It said roughly
what you’d expect: that the company was opposed to clickbait but that it wasn’t
in the business of favoring certain kinds of viewpoints.
The most important consequence of the Trending Topics
controversy, according to nearly a dozen former and current employees, was that
Facebook became wary of doing anything that might look like stifling
conservative news. It had burned its fingers once and didn’t want to do it
again. And so a summer of deeply partisan rancor and calumny began with
Facebook eager to stay out of the fray.
IV
SHORTLY AFTER MOSSERI published his guide to News Feed
values, Zuckerberg traveled to Sun Valley, Idaho, for an annual conference hosted
by billionaire Herb Allen, where moguls in short sleeves and sunglasses cavort
and make plans to buy each other’s companies. But Rupert Murdoch broke the mood
in a meeting that took place inside his villa. According to numerous accounts
of the conversation, Murdoch and Robert Thomson, the CEO of News Corp,
explained to Zuckerberg that they had long been unhappy with Facebook and
Google. The two tech giants had taken nearly the entire digital ad market and
become an existential threat to serious journalism. According to people
familiar with the conversation, the two News Corp leaders accused Facebook of
making dramatic changes to its core algorithm without adequately consulting its
media partners, wreaking havoc according to Zuckerberg’s whims. If Facebook
didn’t start offering a better deal to the publishing industry, Thomson and
Murdoch conveyed in stark terms, Zuckerberg could expect News Corp executives
to become much more public in their denunciations and much more open in their
lobbying. They had helped to make things very hard for Google in Europe. And
they could do the same for Facebook in the US.
Facebook thought that News Corp was threatening to push
for a government antitrust investigation or maybe an inquiry into whether the
company deserved its protection from liability as a neutral platform. Inside
Facebook, executives believed Murdoch might use his papers and TV stations to
amplify critiques of the company. News Corp says that was not at all the case;
the company threatened to deploy executives, but not its journalists.
Zuckerberg had reason to take the meeting especially
seriously, according to a former Facebook executive, because he had firsthand
knowledge of Murdoch’s skill in the dark arts. Back in 2007, Facebook had come
under criticism from 49 state attorneys general for failing to protect young
Facebook users from sexual predators and inappropriate content. Concerned
parents had written to Connecticut attorney general Richard Blumenthal, who
opened an investigation, and to The New York Times, which published a story.
But according to a former Facebook executive in a position to know, the company
believed that many of the Facebook accounts and the predatory behavior the
letters referenced were fakes, traceable to News Corp lawyers or others working
for Murdoch, who owned Facebook’s biggest competitor, MySpace. “We traced the
creation of the Facebook accounts to IP addresses at the Apple store a block
away from the MySpace offices in Santa Monica,” the executive says. “Facebook
then traced interactions with those accounts to News Corp lawyers. When it
comes to Facebook, Murdoch has been playing every angle he can for a long
time.” (Both News Corp and its spinoff 21st Century Fox declined to comment.)
When Zuckerberg returned from Sun Valley, he told his
employees that things had to change. They still weren’t in the news business,
but they had to make sure there would be a news business. And they had to
communicate better. One of those who got a new to-do list was Andrew Anker, a
product manager who’d arrived at Facebook in 2015 after a career in journalism
(including a long stint at WIRED in the ’90s). One of his jobs was to help the
company think through how publishers could make money on the platform. Shortly
after Sun Valley, Anker met with Zuckerberg and asked to hire 60 new people to
work on partnerships with the news industry. Before the meeting ended, the
request was approved.
But having more people out talking to publishers just
drove home how hard it would be to resolve the financial problems Murdoch
wanted fixed. News outfits were spending millions to produce stories that
Facebook was benefiting from, and Facebook, they felt, was giving too little
back in return. Instant Articles, in particular, struck them as a Trojan horse.
Publishers complained that they could make more money from stories that loaded
on their own mobile web pages than on Facebook Instant. (They often did so, it
turned out, in ways that short-changed advertisers, by sneaking in ads that
readers were unlikely to see. Facebook didn’t let them get away with that.)
Another seemingly irreconcilable difference: Outlets like Murdoch’s Wall Street
Journal depended on paywalls to make money, but Instant Articles banned
paywalls; Zuckerberg disapproved of them. After all, he would often ask, how
exactly do walls and toll booths make the world more open and connected?
The conversations often ended at an impasse, but Facebook
was at least becoming more attentive. This newfound appreciation for the
concerns of journalists did not, however, extend to the journalists on
Facebook’s own Trending Topics team. In late August, everyone on the team was
told that their jobs were being eliminated. Simultaneously, authority over the
algorithm shifted to a team of engineers based in Seattle. Very quickly the
module started to surface lies and fiction. A headline days later read, “Fox
News Exposes Traitor Megyn Kelly, Kicks Her Out For Backing Hillary."
V
WHILE FACEBOOK GRAPPLED internally with what it was
becoming—a company that dominated media but didn’t want to be a media
company—Donald Trump’s presidential campaign staff faced no such confusion. To
them Facebook’s use was obvious. Twitter was a tool for communicating directly
with supporters and yelling at the media. Facebook was the way to run the most
effective direct-marketing political operation in history.
In the summer of 2016, at the top of the general election
campaign, Trump’s digital operation might have seemed to be at a major
disadvantage. After all, Hillary Clinton’s team was flush with elite talent and
got advice from Eric Schmidt, known for running Google. Trump’s was run by Brad
Parscale, known for setting up the Eric Trump Foundation’s web page. Trump’s
social media director was his former caddie. But in 2016, it turned out you
didn’t need digital experience running a presidential campaign, you just needed
a knack for Facebook.
Over the course of the summer, Trump’s team turned the
platform into one of its primary vehicles for fund-raising. The campaign
uploaded its voter files—the names, addresses, voting history, and any other
information it had on potential voters—to Facebook. Then, using a tool called
Lookalike Audiences, Facebook identified the broad characteristics of, say,
people who had signed up for Trump newsletters or bought Trump hats. That
allowed the campaign to send ads to people with similar traits. Trump would
post simple messages like “This election is being rigged by the media pushing
false and unsubstantiated charges, and outright lies, in order to elect Crooked
Hillary!” that got hundreds of thousands of likes, comments, and shares. The
money rolled in. Clinton’s wonkier messages, meanwhile, resonated less on the
platform. Inside Facebook, almost everyone on the executive team wanted Clinton
to win; but they knew that Trump was using the platform better. If he was the
candidate for Facebook, she was the candidate for LinkedIn.
Trump’s candidacy also proved to be a wonderful tool for
a new class of scammers pumping out massively viral and entirely fake stories.
Through trial and error, they learned that memes praising the former host of
The Apprentice got many more readers than ones praising the former secretary of
state. A website called Ending the Fed proclaimed that the Pope had endorsed
Trump and got almost a million comments, shares, and reactions on Facebook,
according to an analysis by BuzzFeed. Other stories asserted that the former
first lady had quietly been selling weapons to ISIS, and that an FBI agent
suspected of leaking Clinton’s emails was found dead. Some of the posts came
from hyperpartisan Americans. Some came from overseas content mills that were
in it purely for the ad dollars. By the end of the campaign, the top fake
stories on the platform were generating more engagement than the top real ones.
Even current Facebookers acknowledge now that they missed
what should have been obvious signs of people misusing the platform. And looking
back, it’s easy to put together a long list of possible explanations for the
myopia in Menlo Park about fake news. Management was gun-shy because of the
Trending Topics fiasco; taking action against partisan disinformation—or even
identifying it as such—might have been seen as another act of political
favoritism. Facebook also sold ads against the stories, and sensational garbage
was good at pulling people into the platform. Employees’ bonuses can be based
largely on whether Facebook hits certain growth and revenue targets, which
gives people an extra incentive not to worry too much about things that are
otherwise good for engagement. And then there was the ever-present issue of
Section 230 of the 1996 Communications Decency Act. If the company started taking
responsibility for fake news, it might have to take responsibility for a lot
more. Facebook had plenty of reasons to keep its head in the sand.
Roger McNamee, however, watched carefully as the nonsense
spread. First there were the fake stories pushing Bernie Sanders, then he saw
ones supporting Brexit, and then helping Trump. By the end of the summer, he
had resolved to write an op-ed about the problems on the platform. But he never
ran it. “The idea was, look, these are my friends. I really want to help them.”
And so on a Sunday evening, nine days before the 2016 election, McNamee emailed
a 1,000-word letter to Sandberg and Zuckerberg. “I am really sad about
Facebook,” it began. “I got involved with the company more than a decade ago
and have taken great pride and joy in the company’s success … until the past
few months. Now I am disappointed. I am embarrassed. I am ashamed.”
VI
IT’S NOT EASY to recognize that the machine you’ve built
to bring people together is being used to tear them apart, and Mark
Zuckerberg’s initial reaction to Trump’s victory, and Facebook’s possible role
in it, was one of peevish dismissal. Executives remember panic the first few
days, with the leadership team scurrying back and forth between Zuckerberg’s
conference room (called the Aquarium) and Sandberg’s (called Only Good News),
trying to figure out what had just happened and whether they would be blamed.
Then, at a conference two days after the election, Zuckerberg argued that
filter bubbles are worse offline than on Facebook and that social media hardly
influences how people vote. “The idea that fake news on Facebook—of which, you
know, it’s a very small amount of the content—influenced the election in any
way, I think, is a pretty crazy idea,” he said.
Zuckerberg declined to be interviewed for this article,
but people who know him well say he likes to form his opinions from data. And
in this case he wasn’t without it. Before the interview, his staff had worked
up a back-of-the-envelope calculation showing that fake news was a tiny
percentage of the total amount of election-related content on the platform.
But the analysis was just an aggregate look at the percentage of clearly fake
stories that appeared across all of Facebook. It didn’t measure their influence
or the way fake news affected specific groups. It was a number, but not a
particularly meaningful one.
Zuckerberg’s comments did not go over well, even inside
Facebook. They seemed clueless and self-absorbed. “What he said was incredibly
damaging,” a former executive told WIRED. “We had to really flip him on that.
We realized that if we didn’t, the company was going to start heading down this
pariah path that Uber was on.”
A week after his “pretty crazy” comment, Zuckerberg flew
to Peru to give a talk to world leaders about the ways that connecting more
people to the internet, and to Facebook, could reduce global poverty. Right
after he landed in Lima, he posted something of a mea culpa. He explained that
Facebook did take misinformation seriously, and he presented a vague
seven-point plan to tackle it. When a professor at the New School named David
Carroll saw Zuckerberg’s post, he took a screenshot. Alongside it on Carroll’s
feed ran a headline from a fake CNN with an image of a distressed Donald Trump
and the text “DISQUALIFIED; He’s GONE!”
At the conference in Peru, Zuckerberg met with a man who
knows a few things about politics: Barack Obama. Media reports portrayed the
encounter as one in which the lame-duck president pulled Zuckerberg aside and
gave him a “wake-up call” about fake news. But according to someone who was
with them in Lima, it was Zuckerberg who called the meeting, and his agenda was
merely to convince Obama that, yes, Facebook was serious about dealing with the
problem. He truly wanted to thwart misinformation, he said, but it wasn’t an
easy issue to solve.
Meanwhile, at Facebook, the gears churned. For the first
time, insiders really began to question whether they had too much power. One
employee told WIRED that, watching Zuckerberg, he was reminded of Lennie in Of
Mice and Men, the farm-worker with no understanding of his own strength.
Very soon after the election, a team of employees started
working on something called the News Feed Integrity Task Force, inspired by a
sense, one of them told WIRED, that hyperpartisan misinformation was “a disease
that’s creeping into the entire platform.” The group, which included Mosseri
and Anker, began to meet every day, using whiteboards to outline different ways
they could respond to the fake-news crisis. Within a few weeks the company
announced it would cut off advertising revenue for ad farms and make it easier
for users to flag stories they thought false.
In December the company announced that, for the first
time, it would introduce fact-checking onto the platform. Facebook didn’t want
to check facts itself; instead it would outsource the problem to professionals.
If Facebook received enough signals that a story was false, it would
automatically be sent to partners, like Snopes, for review. Then, in early
January, Facebook announced that it had hired Campbell Brown, a former anchor
at CNN. She immediately became the most prominent journalist hired by the
company.
Soon Brown was put in charge of something called the
Facebook Journalism Project. “We spun it up over the holidays, essentially,”
says one person involved in discussions about the project. The aim was to
demonstrate that Facebook was thinking hard about its role in the future of
journalism—essentially, it was a more public and organized version of the
efforts the company had begun after Murdoch’s tongue-lashing. But sheer anxiety
was also part of the motivation. “After the election, because Trump won, the
media put a ton of attention on fake news and just started hammering us. People
started panicking and getting afraid that regulation was coming. So the team
looked at what Google had been doing for years with News Lab”—a group inside
Alphabet that builds tools for journalists—“and we decided to figure out how we
could put together our own packaged program that shows how seriously we take
the future of news.”
Facebook was reluctant, however, to issue any mea culpas
or action plans with regard to the problem of filter bubbles or Facebook’s
noted propensity to serve as a tool for amplifying outrage. Members of the
leadership team regarded these as issues that couldn’t be solved, and maybe
even shouldn’t be solved. Was Facebook really more at fault for amplifying
outrage during the election than, say, Fox News or MSNBC? Sure, you could put
stories into people’s feeds that contradicted their political viewpoints, but
people would turn away from them, just as surely as they’d flip the dial back
if their TV quietly switched them from Sean Hannity to Joy Reid. The problem,
as Anker puts it, “is not Facebook. It’s humans.”
VII
Zuckerberg’s “pretty crazy” statement about fake news
caught the ear of a lot of people, but one of the most influential was a
security researcher named Renée DiResta. For years, she’d been studying how
misinformation spreads on the platform. If you joined an antivaccine group on
Facebook, she observed, the platform might suggest that you join flat-earth
groups or maybe ones devoted to Pizzagate—putting you on a conveyor belt of
conspiracy thinking. Zuckerberg’s statement struck her as wildly out of touch.
“How can this platform say this thing?” she remembers thinking.
Roger McNamee, meanwhile, was getting steamed at
Facebook’s response to his letter. Zuckerberg and Sandberg had written him back
promptly, but they hadn’t said anything substantial. Instead he ended up having
a months-long, ultimately futile set of email exchanges with Dan Rose,
Facebook’s VP for partnerships. McNamee says Rose’s message was polite but also
very firm: The company was doing a lot of good work that McNamee couldn’t see,
and in any event Facebook was a platform, not a media company.
“And I’m sitting there going, ‘Guys, seriously, I don’t
think that’s how it works,’” McNamee says. “You can assert till you’re blue in
the face that you’re a platform, but if your users take a different point of
view, it doesn’t matter what you assert.”
As the saying goes, heaven has no rage like love to
hatred turned, and McNamee’s concern soon became a cause—and the beginning of
an alliance. In April 2017 he connected with a former Google design ethicist
named Tristan Harris when they appeared together on Bloomberg TV. Harris had by
then gained a national reputation as the conscience of Silicon Valley. He had
been profiled on 60 Minutes and in The Atlantic, and he spoke eloquently about
the subtle tricks that social media companies use to foster an addiction to
their services. “They can amplify the worst aspects of human nature,” Harris
told WIRED this past December. After the TV appearance, McNamee says he called
Harris up and asked, “Dude, do you need a wingman?”
The next month, DiResta published an article comparing
purveyors of disinformation on social media to manipulative high-frequency
traders in financial markets. “Social networks enable malicious actors to
operate at platform scale, because they were designed for fast information
flows and virality,” she wrote. Bots and sock puppets could cheaply “create the
illusion of a mass groundswell of grassroots activity,” in much the same way
that early, now-illegal trading algorithms could spoof demand for a stock.
Harris read the article, was impressed, and emailed her.
The three were soon out talking to anyone who would
listen about Facebook’s poisonous effects on American democracy. And before
long they found receptive audiences in the media and Congress—groups with their
own mounting grievances against the social media giant.
VIII
EVEN AT THE best of times, meetings between Facebook and
media executives can feel like unhappy family gatherings. The two sides are
inextricably bound together, but they don’t like each other all that much. News
executives resent that Facebook and Google have captured roughly three-quarters
of the digital ad business, leaving the media industry and other platforms,
like Twitter, to fight over scraps. Plus they feel like the preferences of
Facebook’s algorithm have pushed the industry to publish ever-dumber stories.
For years, The New York Times resented that Facebook helped elevate BuzzFeed;
now BuzzFeed is angry about being displaced by clickbait.
And then there’s the simple, deep fear and mistrust that
Facebook inspires. Every publisher knows that, at best, they are sharecroppers
on Facebook’s massive industrial farm. The social network is roughly 200 times
more valuable than the Times. And journalists know that the man who owns the
farm has the leverage. If Facebook wanted to, it could quietly turn any number
of dials that would harm a publisher—by manipulating its traffic, its ad
network, or its readers.
Emissaries from Facebook, for their part, find it
tiresome to be lectured by people who can’t tell an algorithm from an API. They
also know that Facebook didn’t win the digital ad market through luck: It built
a better ad product. And in their darkest moments, they wonder: What’s the
point? News makes up only about 5 percent of the total content that people see
on Facebook globally. The company could let it all go and its shareholders
would scarcely notice. And there’s another, deeper problem: Mark Zuckerberg,
according to people who know him, prefers to think about the future. He’s less
interested in the news industry’s problems right now; he’s interested in the
problems five or 20 years from now. The editors of major media companies, on
the other hand, are worried about their next quarter—maybe even their next
phone call. When they bring lunch back to their desks, they know not to buy
green bananas.
This mutual wariness—sharpened almost to enmity in the
wake of the election—did not make life easy for Campbell Brown when she started
her new job running the nascent Facebook Journalism Project. The first item on
her to-do list was to head out on yet another Facebook listening tour with
editors and publishers. One editor describes a fairly typical meeting: Brown
and Chris Cox, Facebook’s chief product officer, invited a group of media
leaders to gather in late January 2017 at Brown’s apartment in Manhattan. Cox,
a quiet, suave man, sometimes referred to as “the Ryan Gosling of Facebook
Product,” took the brunt of the ensuing abuse. “Basically, a bunch of us just
laid into him about how Facebook was destroying journalism, and he graciously
absorbed it,” the editor says. “He didn’t much try to defend them. I think the
point was really to show up and seem to be listening.” Other meetings were even
more tense, with the occasional comment from journalists noting their interest
in digital antitrust issues.
As bruising as all this was, Brown’s team became more
confident that their efforts were valued within the company when Zuckerberg
published a 5,700-word corporate manifesto in February. He had spent the
previous three months, according to people who know him, contemplating whether
he had created something that did more harm than good. “Are we building the
world we all want?” he asked at the beginning of his post, implying that the
answer was an obvious no. Amid sweeping remarks about “building a global
community,” he emphasized the need to keep people informed and to knock out
false news and clickbait. Brown and others at Facebook saw the manifesto as a
sign that Zuckerberg understood the company’s profound civic responsibilities.
Others saw the document as blandly grandiose, showcasing Zuckerberg’s tendency
to suggest that the answer to nearly any problem is for people to use Facebook
more.
Shortly after issuing the manifesto, Zuckerberg set off
on a carefully scripted listening tour of the country. He began popping into
candy shops and dining rooms in red states, camera crew and personal social
media team in tow. He wrote an earnest post about what he was learning, and he
deflected questions about whether his real goal was to become president. It
seemed like a well-meaning effort to win friends for Facebook. But it soon
became clear that Facebook’s biggest problems emanated from places farther away
than Ohio.
IX
ONE OF THE many things Zuckerberg seemed not to grasp
when he wrote his manifesto was that his platform had empowered an enemy far
more sophisticated than Macedonian teenagers and assorted low-rent purveyors of
bull. As 2017 wore on, however, the company began to realize it had been
attacked by a foreign influence operation. “I would draw a real distinction
between fake news and the Russia stuff,” says an executive who worked on the
company’s response to both. “With the latter there was a moment where everyone
said ‘Oh, holy shit, this is like a national security situation.’”
That holy shit moment, though, didn’t come until more
than six months after the election. Early in the campaign season, Facebook was
aware of familiar attacks emanating from known Russian hackers, such as the
group APT28, which is believed to be affiliated with Moscow. They were hacking
into accounts outside of Facebook, stealing documents, then creating fake
Facebook accounts under the banner of DCLeaks, to get people to discuss what
they’d stolen. The company saw no signs of a serious, concerted foreign
propaganda campaign, but it also didn’t think to look for one.
During the spring of 2017, the company’s security team
began preparing a report about how Russian and other foreign intelligence
operations had used the platform. One of its authors was Alex Stamos, head of
Facebook’s security team. Stamos was something of an icon in the tech world for
having reportedly resigned from his previous job at Yahoo after a conflict over
whether to grant a US intelligence agency access to Yahoo servers. According to
two people with direct knowledge of the document, he was eager to publish a
detailed, specific analysis of what the company had found. But members of the
policy and communications team pushed back and cut his report way down. Sources
close to the security team suggest the company didn’t want to get caught up in
the political whirlwind of the moment. (Sources on the policy and
communications teams insist they edited the report down, just because the darn
thing was hard to read.)
On April 27, 2017, the day after the Senate announced it
was calling then FBI director James Comey to testify about the Russia
investigation, Stamos’ report came out. It was titled “Information Operations
and Facebook,” and it gave a careful step-by-step explanation of how a foreign
adversary could use Facebook to manipulate people. But there were few specific
examples or details, and there was no direct mention of Russia. It felt bland
and cautious. As Renée DiResta says, “I remember seeing the report come out and
thinking, ‘Oh, goodness, is this the best they could do in six months?’”
One month later, a story in Time suggested to Stamos’
team that they might have missed something in their analysis. The article
quoted an unnamed senior intelligence official saying that Russian operatives
had bought ads on Facebook to target Americans with propaganda. Around the same
time, the security team also picked up hints from congressional investigators
that made them think an intelligence agency was indeed looking into Russian
Facebook ads. Caught off guard, the team members started to dig into the
company’s archival ads data themselves.
Eventually, by sorting transactions according to a series
of data points—Were ads purchased in rubles? Were they purchased within
browsers whose language was set to Russian?—they were able to find a cluster of
accounts, funded by a shadowy Russian group called the Internet Research
Agency, that had been designed to manipulate political opinion in America.
There was, for example, a page called Heart of Texas, which pushed for the
secession of the Lone Star State. And there was Blacktivist, which pushed
stories about police brutality against black men and women and had more
followers than the verified Black Lives Matter page.
Numerous security researchers express consternation that
it took Facebook so long to realize how the Russian troll farm was exploiting
the platform. After all, the group was well known to Facebook. Executives at
the company say they’re embarrassed by how long it took them to find the fake
accounts, but they point out that they were never given help by US intelligence
agencies. A staffer on the Senate Intelligence Committee likewise voiced
exasperation with the company. “It seemed obvious that it was a tactic the
Russians would exploit,” the staffer says.
When Facebook finally did find the Russian propaganda on
its platform, the discovery set off a crisis, a scramble, and a great deal of
confusion. First, due to a miscalculation, word initially spread through the
company that the Russian group had spent millions of dollars on ads, when the
actual total was in the low six figures. Once that error was resolved, a
disagreement broke out over how much to reveal, and to whom. The company could
release the data about the ads to the public, release everything to Congress,
or release nothing. Much of the argument hinged on questions of user privacy.
Members of the security team worried that the legal process involved in handing
over private user data, even if it belonged to a Russian troll farm, would open
the door for governments to seize data from other Facebook users later on.
“There was a real debate internally,” says one executive. “Should we just say
‘Fuck it’ and not worry?” But eventually the company decided it would be crazy
to throw legal caution to the wind “just because Rachel Maddow wanted us to.”
Ultimately, a blog post appeared under Stamos’ name in
early September announcing that, as far as the company could tell, the Russians
had paid Facebook $100,000 for roughly 3,000 ads aimed at influencing American
politics around the time of the 2016 election. Every sentence in the post
seemed to downplay the substance of these new revelations: The number of ads
was small, the expense was small. And Facebook wasn’t going to release them.
The public wouldn’t know what they looked like or what they were really aimed
at doing.
This didn’t sit at all well with DiResta. She had long
felt that Facebook was insufficiently forthcoming, and now it seemed to be
flat-out stonewalling. “That was when it went from incompetence to malice,” she
says. A couple of weeks later, while waiting at a Walgreens to pick up a
prescription for one of her kids, she got a call from a researcher at the Tow
Center for Digital Journalism named Jonathan Albright. He had been mapping
ecosystems of misinformation since the election, and he had some excellent news.
“I found this thing,” he said. Albright had started digging into CrowdTangle,
one of the analytics platforms that Facebook uses. And he had discovered that
the data from six of the accounts Facebook had shut down were still there,
frozen in a state of suspended animation. There were the posts pushing for
Texas secession and playing on racial antipathy. And then there were political
posts, like one that referred to Clinton as “that murderous anti-American
traitor Killary.” Right before the election, the Blacktivist account urged its
supporters to stay away from Clinton and instead vote for Jill Stein. Albright
downloaded the most recent 500 posts from each of the six groups. He reported
that, in total, their posts had been shared more than 340 million times.
X
TO MCNAMEE, THE way the Russians used the platform was
neither a surprise nor an anomaly. “They find 100 or 1,000 people who are angry
and afraid and then use Facebook’s tools to advertise to get people into
groups,” he says. “That’s exactly how Facebook was designed to be used.”
McNamee and Harris had first traveled to DC for a day in
July to meet with members of Congress. Then, in September, they were joined by
DiResta and began spending all their free time counseling senators, representatives,
and members of their staffs. The House and Senate Intelligence Committees were
about to hold hearings on Russia’s use of social media to interfere in the US
election, and McNamee, Harris, and DiResta were helping them prepare. One of
the early questions they weighed in on was the matter of who should be summoned
to testify. Harris recommended that the CEOs of the big tech companies be
called in, to create a dramatic scene in which they all stood in a neat row
swearing an oath with their right hands in the air, roughly the way tobacco
executives had been forced to do a generation earlier. Ultimately, though, it
was determined that the general counsels of the three companies—Facebook,
Twitter, and Google—should head into the lion’s den.
And so on November 1, Colin Stretch arrived from Facebook
to be pummeled. During the hearings themselves, DiResta was sitting on her bed
in San Francisco, watching them with her headphones on, trying not to wake up
her small children. She listened to the back-and-forth in Washington while
chatting on Slack with other security researchers. She watched as Marco Rubio
smartly asked whether Facebook even had a policy forbidding foreign governments
from running an influence campaign through the platform. The answer was no.
Rhode Island senator Jack Reed then asked whether Facebook felt an obligation
to individually notify all the users who had seen Russian ads that they had
been deceived. The answer again was no. But maybe the most threatening comment
came from Dianne Feinstein, the senior senator from Facebook’s home state.
“You’ve created these platforms, and now they’re being misused, and you have to
be the ones to do something about it,” she declared. “Or we will.”
After the hearings, yet another dam seemed to break, and
former Facebook executives started to go public with their criticisms of the
company too. On November 8, billionaire entrepreneur Sean Parker, Facebook’s
first president, said he now regretted pushing Facebook so hard on the world.
“I don’t know if I really understood the consequences of what I was saying,” he
said. “God only knows what it’s doing to our children’s brains.” Eleven days
later, Facebook’s former privacy manager, Sandy Parakilas, published a New York
Times op-ed calling for the government to regulate Facebook: “The company won’t
protect us by itself, and nothing less than our democracy is at stake.”
XI
THE DAY OF the hearings, Zuckerberg had to give
Facebook’s Q3 earnings call. The numbers were terrific, as always, but his mood
was not. Normally these calls can put someone with 12 cups of coffee in them to
sleep; the executive gets on and says everything is going well, even when it
isn’t. Zuckerberg took a different approach. “I’ve expressed how upset I am
that the Russians tried to use our tools to sow mistrust. We build these tools
to help people connect and to bring us closer together. And they used them to
try to undermine our values. What they did is wrong, and we are not going to
stand for it.” The company would be investing so much in security, he said,
that Facebook would make “significantly” less money for a while. “I want to be
clear about what our priority is: Protecting our community is more important
than maximizing our profits.” What the company really seeks is for users to
find their experience to be “time well spent,” Zuckerberg said—using the three
words that have become Tristan Harris’ calling card, and the name of his
nonprofit.
Other signs emerged, too, that Zuckerberg was beginning
to absorb the criticisms of his company. The Facebook Journalism Project, for
instance, seemed to be making the company take its obligations as a publisher,
and not just a platform, more seriously. In the fall, the company announced
that Zuckerberg had decided—after years of resisting the idea—that publishers
using Facebook Instant Articles could require readers to subscribe. Paying for
serious publications, in the months since the election, had come to seem like
both the path forward for journalism and a way of resisting the post-truth
political landscape. (WIRED recently instituted its own paywall.) Plus,
offering subscriptions arguably helped put in place the kinds of incentives
that Zuckerberg professed to want driving the platform. People like Alex
Hardiman, the head of Facebook news products and an alum of The New York Times,
started to recognize that Facebook had long helped to create an economic system
that rewarded publishers for sensationalism, not accuracy or depth. “If we just
reward content based on raw clicks and engagement, we might actually see
content that is increasingly sensationalist, clickbaity, polarizing, and
divisive,” she says. A social network that rewards only clicks, not
subscriptions, is like a dating service that encourages one-night stands but
not marriages.
XII
A COUPLE OF weeks before Thanksgiving 2017, Zuckerberg
called one of his quarterly all-hands meetings on the Facebook campus, in an
outdoor space known as Hacker Square. He told everyone he hoped they would have
a good holiday. Then he said, “This year, with recent news, a lot of us are
probably going to get asked: ‘What is going on with Facebook?’ This has been a
tough year … but … what I know is that we’re fortunate to play an important
role in billions of people’s lives. That’s a privilege, and it puts an enormous
responsibility on all of us.” According to one attendee, the remarks came
across as blunter and more personal than any they’d ever heard from Zuckerberg.
He seemed humble, even a little chastened. “I don’t think he sleeps well at
night,” the employee says. “I think he has remorse for what has happened.”
During the late fall, criticism continued to mount:
Facebook was accused of becoming a central vector for spreading deadly
propaganda against the Rohingya in Myanmar and for propping up the brutal leadership
of Rodrigo Duterte in the Philippines. And December brought another haymaker
from someone closer by. Early that month, it emerged that Chamath Palihapitiya,
who had been Facebook’s vice president for user growth before leaving in 2011,
had told an audience at Stanford that he thought social media platforms like
Facebook had “created tools that are ripping apart the social fabric” and that
he feels “tremendous guilt” for being part of that. He said he tries to use
Facebook as little as possible and doesn’t permit his children to use such
platforms at all.
The criticism stung in a way that others hadn’t.
Palihapitiya is close to many of the top executives at Facebook, and he has
deep cachet in Silicon Valley and among Facebook engineers as a part-owner of
the Golden State Warriors. Sheryl Sandberg sometimes wears a chain around her
neck that’s welded together from one given to her by Zuckerberg and one given
to her by Palihapitiya after her husband’s death. The company issued a
statement saying it had been a long time since Palihapitiya had worked there.
“Facebook was a very different company back then and as we have grown we have
realized how our responsibilities have grown too.” Asked why the company had
responded to Palihapitiya, and not to others, a senior Facebook executive said,
“Chamath is—was—a friend to a lot of people here.”
Roger McNamee, meanwhile, went on a media tour lambasting
the company. He published an essay in Washington Monthly and then followed up
in The Washington Post and The Guardian. Facebook was less impressed with him.
Executives considered him to be overstating his connection to the company and
dining out on his criticism. Andrew Bosworth, a VP and member of the
management team, tweeted, “I’ve worked at Facebook for 12 years and I have to
ask: Who the fuck is Roger McNamee?”
Zuckerberg did seem to be eager to mend one fence,
though. Around this time, a team of Facebook executives gathered for dinner
with executives from News Corp at the Grill, an upscale restaurant in Manhattan.
Right at the start, Zuckerberg raised a toast to Murdoch. He spoke charmingly
about reading a biography of the older man and of admiring his accomplishments.
Then he described a game of tennis he’d once played against Murdoch. At first
he had thought it would be an easy matter, hitting the ball around with a man
more than 50 years his senior. But he quickly realized, he said, that Murdoch
was there to compete.
XIII
ON JANUARY 4, 2018, Zuckerberg announced that he had a
new personal challenge for the year. For each of the past nine years, he had
committed himself to some kind of self-improvement. His first challenge was
farcical—wear ties—and the others had been a little preening and collegiate. He
wanted to learn Mandarin, read 25 books, run 365 miles. This year, though, he
took a severe tone. “The world feels anxious and divided, and Facebook has a
lot of work to do—whether it’s protecting our community from abuse and hate,
defending against interference by nation-states, or making sure that time spent
on Facebook is time well spent,” Zuckerberg declared. The language wasn’t
original—he had borrowed from Tristan Harris again—but it was, by the accounts
of many people around him, entirely sincere.
That New Year’s challenge, it turned out, was a bit of
carefully considered choreography setting up a series of announcements,
starting with a declaration the following week that the News Feed algorithm
would be rejiggered to favor “meaningful interactions.” Posts and videos of the
sort that make us look or like—but not comment or care—would be deprioritized.
The idea, explained Adam Mosseri, is that, online, “interacting with people is
positively correlated with a lot of measures of well-being, whereas passively
consuming content online is less so.”
To numerous people at the company, the announcement
marked a huge departure. Facebook was putting a car in reverse that had been
driving at full speed in one direction for 14 years. Since the beginning,
Zuckerberg’s ambition had been to create another internet, or perhaps another
world, inside of Facebook, and to get people to use it as much as possible. The
business model was based on advertising, and advertising was insatiably hungry
for people’s time. But now Zuckerberg said he expected that these new changes
to News Feed would make people use Facebook less.
The announcement was hammered by many in the press.
During the rollout, Mosseri explained that Facebook would downgrade stories
shared by businesses, celebrities, and publishers, and prioritize stories
shared by friends and family. Critics surmised that these changes were just a
way of finally giving the publishing industry a middle finger. “Facebook has
essentially told media to kiss off,” Franklin Foer wrote in The Atlantic.
“Facebook will be back primarily in the business of making us feel terrible
about the inferiority of our vacations, the relative mediocrity of our
children, teasing us into sharing more of our private selves.”
But inside Facebook, executives insist this isn’t
remotely the case. According to Anker, who retired from the company in December
but worked on these changes, and who has great affection for the management
team, “It would be a mistake to see this as a retreat from the news industry.
This is a retreat from ‘Anything goes if it works with our algorithm to drive
up engagement.’” According to others still at the company, Zuckerberg didn’t
want to pull back from actual journalism. He just genuinely wanted there to be
less crap on the platform: fewer stories with no substance; fewer videos you
can watch without thinking.
And then, a week after telling the world about
“meaningful interactions,” Zuckerberg announced another change that seemed to
answer these concerns, after a fashion. For the first time in the company’s
history, he said in a note posted to his personal page, Facebook will start to
boost certain publishers—ones whose content is “trustworthy, informative, and
local.” For the past year, Facebook has been developing algorithms to hammer
publishers whose content is fake; now it’s trying to elevate what’s good. For
starters, he explained, the company would use reader surveys to determine which
sources are trustworthy. That system, critics were quick to point out, will
surely be gamed, and many people will say they trust sources just because they
recognize them. But this announcement, at least, went over a little better in
boardrooms and newsrooms. Right after the post went up, the stock price of The
New York Times shot up—as did that of News Corp.
Zuckerberg has hinted—and insiders have confirmed—that we
should expect a year of more announcements like this. The company is
experimenting with giving publishers more control over paywalls and allowing
them to feature their logos more prominently to reestablish the brand
identities that Facebook flattened years ago. One somewhat hostile outside
suggestion has come from Facebook’s old antagonist Murdoch, who said in late
January that if Facebook truly valued “trustworthy” publishers, it should pay
them carriage fees.
The fate that Facebook really cares about, however, is
its own. It was built on the power of network effects: You joined because
everyone else was joining. But network effects can be just as powerful in
driving people off a platform. Zuckerberg understands this viscerally. After
all, he helped create those problems for MySpace a decade ago and is arguably
doing the same to Snap today. Zuckerberg has avoided that fate, in part,
because he has proven brilliant at co-opting his biggest threats. When social
media started becoming driven by images, he bought Instagram. When messaging
took off, he bought WhatsApp. When Snapchat became a threat, he copied it. Now,
with all his talk of “time well spent,” it seems as if he’s trying to co-opt
Tristan Harris too.
But people who know him say that Zuckerberg has truly
been altered in the crucible of the past several months. He has thought deeply;
he has reckoned with what happened; and he truly cares that his company fix the
problems swirling around it. And he’s also worried. “This whole year has
massively changed his personal techno-optimism,” says an executive at the
company. “It has made him much more paranoid about the ways that people could
abuse the thing that he built.”
The past year has also altered Facebook’s fundamental
understanding about whether it’s a publisher or a platform. The company has
always answered that question defiantly—platform, platform, platform—for
regulatory, financial, and maybe even emotional reasons. But now, gradually,
Facebook has evolved. Of course it’s a platform, and always will be. But the
company also realizes now that it bears some of the responsibilities that a
publisher does: for the care of its readers, and for the care of the truth. You
can’t make the world more open and connected if you’re breaking it apart. So
what is it: publisher or platform? Facebook seems to have finally recognized
that it is quite clearly both.