Facebook Experiments Had Few Limits
Data Science Lab Conducted Tests on Users With Little Oversight
By REED ALBERGOTTI
July 2, 2014 7:39 p.m. ET
Facebook operating chief Sheryl Sandberg says the
company's experiment on user emotions was 'poorly communicated.' Bloomberg News
Thousands of Facebook Inc. users received an unsettling
message two years ago: They were being locked out of the social network because
Facebook believed they were robots or using fake names. To get back in, the
users had to prove they were real.
In fact, Facebook knew most of the users were legitimate.
The message was a test designed to help improve Facebook's antifraud measures.
In the end, no users lost access permanently.
The experiment was the work of Facebook's Data Science
team, a group of about three dozen researchers with unique access to one of the
world's richest data troves: the movements, musings and emotions of Facebook's
1.3 billion users.
The little-known group was thrust into the spotlight this
week by reports about a 2012 experiment in which the news feeds of nearly
700,000 Facebook users were manipulated to show more positive or negative
posts. The study found that users who saw more positive content were more
likely to write positive posts, and vice versa.
Facebook Chief Operating Officer Sheryl Sandberg said
Wednesday during a trip to India that the study was "part of ongoing
research companies do to test different products" and was "poorly
communicated."
The company said that after the feedback on the study,
"We are taking a very hard look at this process to make more
improvements."
Until recently, the Data Science group operated with few
boundaries, according to a former member of the team and outside researchers.
At a university, researchers likely would have been required to obtain consent
from participants in such a study. But Facebook relied on users' agreement to
its Terms of Service, which at the time said data could be used to improve
Facebook's products. Those terms now say that user data may be used for
research.
"There's no review process, per se," said
Andrew Ledvina, a Facebook data scientist from February 2012 to July 2013.
"Anyone on that team could run a test," Mr. Ledvina said.
"They're always trying to alter people's behavior."
He recalled a minor experiment in which he and a product
manager ran a test without telling anyone else at the company. Tests were run
so often, he said, that some data scientists worried that the same users, who
were anonymous, might be used in more than one experiment, tainting the
results.
Facebook said that since the study on emotions, it has
implemented stricter guidelines on Data Science team research. Since at least
the beginning of this year, research beyond routine product testing is reviewed
by a panel drawn from a group of 50 internal experts in fields such as privacy
and data security. Facebook declined to name them.
Company research intended to be published in academic
journals receives additional review from in-house experts on academic research.
Some of those experts are also on the Data Science team, Facebook said,
declining to name the members of that panel.
A spokesman said Facebook is considering additional
changes.
Since its creation in 2007, Facebook's Data Science group
has run hundreds of tests. One published study deconstructed how families
communicate; another delved into the causes of loneliness. One test looked at
how social behaviors spread through networks. In 2010, the group measured how
"political mobilization messages" sent to 61 million people caused
people in social networks to vote in the 2010 congressional elections.
Many of Facebook's data scientists hold doctoral degrees
from major universities in fields including computer science, artificial
intelligence and computational biology. Some worked in academic research before
joining Facebook.
Adam Kramer, the lead author of the study about emotions,
said in a 2012 interview on Facebook's website that he joined the company
partly because it is "the largest field study in the history of the
world." Mr. Kramer, who has a doctorate in social psychology from the
University of Oregon, said that in academia he would have had to get papers
published and then hope that someone noticed. At Facebook, "I just message
someone on the right team and my research has an impact within weeks, if not
days."
Much of Facebook's research is less controversial than
the emotions study, testing features that will prompt users to spend more time
on the network and click on more ads. Other Internet companies, including Yahoo
Inc., Microsoft Corp., Twitter Inc. and Google Inc., conduct research on their users and their
data.
The recent ruckus is "a glimpse into a wide-ranging
practice," said Kate Crawford, a visiting professor at the Massachusetts
Institute of Technology's Center for Civic Media and a principal researcher at
Microsoft Research. Companies "really do see users as a willing
experimental test bed" to be used at the companies' discretion.
Facebook's team has drawn particular interest because it
occasionally publishes its work in academic journals that touch on users'
personal lives, including the study about positive and negative posts.
"Facebook deserves a lot of credit for pushing as
much research into the public domain as they do," said Clifford Lampe, an
associate professor at the University of Michigan's School of Information who
has worked on about 10 studies with Facebook researchers. If Facebook stopped
publishing studies, he said, "It would be a real loss for science."
Dr. Lampe said he has been in touch with members of the
Data Science team since the controversy erupted. "They've been listening
to the arguments and they take them very seriously," he said.
Mr. Ledvina, the former Facebook data scientist, said
some researchers debated the merits of a study similar to the one that accused
users of being robots but there was no formal review, and none of the users in
the study were notified that it was an experiment.
"I'm sure some people got very angry
somewhere," he said. "Internally, you get a little desensitized to
it."