Facebook extends ban on hate speech to ‘white nationalists’
Facebook said Wednesday, March 27, that it is broadening
its definition of hate speech to apply to "white nationalists" and
"white separatists." The company previously allowed posts from those
groups even though it has long banned "white supremacists."
By BARBARA ORTUTAY March 27, 2019
SAN FRANCISCO (AP) — Facebook is extending its ban on
hate speech to prohibit the promotion and support of white nationalism and
white separatism.
The company previously allowed such material even though
it has long banned white supremacists. The social network said Wednesday that
it didn’t apply the ban previously to expressions of white nationalism because
it linked such expressions with broader concepts of nationalism and separatism
— such as American pride or Basque separatism (which are still allowed).
But civil rights groups and academics called this view
“misguided” and have long pressured the company to change its stance. Facebook
said it concluded after months of “conversations” with them that white
nationalism and separatism cannot be meaningfully separated from white
supremacy and organized hate groups.
Critics have “raised these issues to the highest levels
at Facebook (and held) a number of working meetings with their staff as we’ve
tried to get them to the right place,” said Kristen Clarke, president and
executive director of the Lawyers’ Committee for Civil Rights Under Law, a
Washington, D.C.-based legal advocacy group.
“This is long overdue as the country continues to deal
with the grip of hate and the increase in violent white supremacy,” she said.
“We need the tech sector to do its part to combat these efforts.”
Though Facebook Inc. said it has been working on the
change for three months, it comes less than two weeks after the company drew
widespread criticism when the suspect in shootings at two New Zealand mosques
that killed 50 people broadcast the massacre live on
Facebook. Also on Wednesday, a man convicted on state murder charges in a
deadly car attack at a white nationalist rally in Charlottesville, Virginia, pleaded
guilty to federal hate crime charges. The bloodshed in 2017 prompted tech
companies to take a firmer stand against accounts used to promote hate and
violence.
But apparently not enough. Now, Facebook is trying to do
more. As part of Wednesday’s change, people who search for terms associated
with white supremacy on Facebook will be directed to a group called Life After
Hate, which was founded by former extremists who want to help people leave the
violent far-right.
Clarke called the idea that white supremacism is
different than white nationalism or white separatism a misguided “distinction
without a difference.”
She said the New Zealand attack was a “powerful reminder
about why we need the tech sector to do more to stamp out the conduct and
activity of violent white supremacists.”
Rashad Robinson, the president of Color of Change, says
the racial justice group warned Facebook about the growing dangers of white
nationalists on its platform years ago and that he was glad to see Wednesday’s
announcement.
“Facebook’s update should move Twitter, YouTube, and
Amazon to act urgently to stem the growth of white nationalist ideologies,
which find space on platforms to spread the violent ideas and rhetoric that
inspired the tragic attacks witnessed in Charlottesville, Pittsburgh, and now
Christchurch,” he said.
Twitter does not currently ban white nationalists or
white separatists, though its hateful conduct policy forbids the promotion of
violence or threats against people on the basis of race, gender, religion and
other protected categories. It also bans the use of “hateful images or symbols”
in profile or header images. YouTube also bans hate speech and says it removes
content promoting violence or hatred on the basis of these categories. Amazon
has an “offensive products” policy that does not allow the promotion or
glorification of hatred, racial violence or sexual or religious intolerance.
The three companies did not immediately respond to messages seeking comment on
Wednesday.
Madihha Ahussain, a special counsel for anti-Muslim
bigotry at the nonprofit Muslim Advocates, said what’s needed now is more
information on how Facebook will define white nationalist content — and how it
will enforce its new rules.
“Now, the question is: how will Facebook interpret and enforce
this new policy to prevent another tragedy like the Christchurch mosque
attacks?” she said.
Associated Press Writer Michael Kunzelman in College
Park, Maryland, contributed to this story.