Content Moderator Sues Facebook, Says Job Gave Her PTSD
In the class-action lawsuit, the woman says that thousands of content moderators are subjected to “highly toxic” content as part of the job.
By Jason Koebler and Joseph Cox Sep 24 2018, 9:08am
A woman in California sued Facebook on Friday, saying she was
“exposed to highly toxic, unsafe, and injurious content during her employment
as a content moderator at Facebook.”
Selena Scola was a content moderator at Facebook’s Menlo
Park, California headquarters from June 2017 through March of this year,
according to the lawsuit. She worked for a contractor called Pro Unlimited,
Inc., which helps Facebook delete content that violates its Community
Standards. Facebook has roughly 7,500 content moderators worldwide, who are
tasked with deleting hate speech, graphic violence and self-harm images and
video, nudity and sexual content, bullying, and a host of other content that
violates its policies.
Scola’s lawyers say that she developed post-traumatic
stress disorder as a result of “constant and unmitigated exposure to highly
toxic and extremely disturbing images at the workplace,” and allege that
Facebook does not have proper mental health services and monitoring in place
for its content moderators. The case was filed as a class-action lawsuit, but
at the moment Scola is the only named plaintiff; the lawsuit names a potential
class of “thousands” of current and former moderators in California.
The lawsuit does not currently include specific details
about Scola’s job and instead relies on news investigations about how content
moderation works; Scola’s lawyers told Motherboard that she will detail them
later in the legal process. “This complaint does not include these
[specifics] because Ms. Scola fears that Facebook may retaliate against her
using a purported non-disclosure agreement.”
Moderating content is a difficult job—multiple
documentaries, longform investigations, and law articles have noted that
moderators work long hours, are exposed to disturbing and graphic content, and
have the tough task of determining whether a specific piece of content violates
Facebook’s sometimes byzantine and constantly changing rules. The company prides
itself on accuracy, and with more than 2 billion users, its workforce of
moderators is asked to review millions of possibly infringing posts every day.
"An outsider might not totally comprehend, we aren't
just exposed to the graphic videos—you'll have to watch them closely, often
repeatedly, for specific policy signifiers,” one moderation source told
Motherboard. “Someone could be being graphically beaten in a video, and you
could have to watch it a dozen times, sometimes with others present, while you
decide whether the victim's actions would count as self-defense or not, or
whether the aggressor is the same person who posted the video.”
The source said they are “not surprised at all” that
Facebook is now facing a lawsuit. “It’s definitely a thing that some coworkers
and former coworkers speak of.”
Another moderation source at Facebook told Motherboard:
“I’m not surprised.”
The lawsuit alleges that “Facebook does not provide its
content moderators with sufficient training or implement the safety standards
it helped develop … Ms. Scola’s PTSD symptoms may be triggered when she touches
a computer mouse, enters a cold building, watches violence on television, hears
loud noises, or is startled. Her symptoms are also triggered when she recalls
or describes graphic imagery she was exposed to as a content moderator.”
A Facebook spokesperson told Motherboard that it is
"currently reviewing this claim."
"We recognize that this work can often be difficult.
That is why we take the support of our content moderators incredibly seriously,
starting with their training, the benefits they receive, and ensuring that
every person reviewing Facebook content is offered psychological support and
wellness resources," the spokesperson said. "Facebook employees
receive these in house and we also require companies that we partner with for
content review to provide resources and psychological support, including onsite
counseling—available at the location where the plaintiff worked—and other
wellness resources like relaxation areas at many of our larger
facilities."
Earlier this year, when we visited Facebook’s
headquarters, multiple high-level employees told us that the company is working
to make the job less stressful and potentially traumatic for its moderators.
The company does have specific training protocols for content moderators,
though the lawsuit alleges they are insufficient.
"There’s actual physical environments where you can
go into, if you want to just kind of chillax, or if you want to go play a game,
or if you just want to walk away, you know, be by yourself, that support system
is pretty robust"
“This job is not for everyone, candidly, and we recognize
that,” Brian Doegan, Facebook’s director of global training, community
operations, told Motherboard in June. He said that new hires are gradually
exposed to graphic content “so we don't just radically expose you, but
rather we do have a conversation about what it is, and what we're going to be
seeing.”
Doegan said that there are rooms in each office that are
designed to help employees de-stress.
“What I admire is that at any point in this role, you
have access to counsellors, you have access to having conversations with other
people,” he said. “There’s actual physical environments where you can go into,
if you want to just kind of chillax, or if you want to go play a game, or if
you just want to walk away, you know, be by yourself, that support system is
pretty robust, and that is consistent across the board.”
Carolyn Glanville, a Facebook spokesperson, told
Motherboard in June that each office and contractor that does content
moderation has mental health services, but that the types of services offered
vary by country depending on what the company believes are best practices for
that country’s culture.
“Whereas [in some countries] it's fine to just go walk
across the hall to a counsellor, and they don't care, in other cultures, they
don't do that, they would do it off hours, and other people might not know
about it,” she said.
Scola’s lawsuit asks the court to create a “Facebook-funded
medical monitoring program to facilitate the diagnosis and adequate treatment of
Plaintiff and the class for psychological trauma, including but not limited to
PTSD.”
Next, a judge in California will decide whether the case has
enough merit to move forward.