Facebook still auto-generating Islamic State, al-Qaida pages
WASHINGTON (AP) — In the face of criticism that Facebook is not
doing enough to combat extremist messaging, the company likes to say that its
automated systems remove the vast majority of prohibited content glorifying the
Islamic State group and al-Qaida before it’s reported.
But a whistleblower’s complaint shows that Facebook itself has
inadvertently provided the two extremist groups with a networking and
recruitment tool by producing dozens of pages in their names.
The social networking company appears to have made little
progress on the issue in the four months since The Associated Press detailed how pages that
Facebook auto-generates for businesses are aiding Middle East extremists and
white supremacists in the United States.
On Wednesday, U.S. senators on the Committee on Commerce,
Science, and Transportation will be questioning representatives from social
media companies, including Monika Bickert, who heads Facebook's efforts to stem
extremist messaging.
The new details come from an update of a complaint to
the Securities and Exchange Commission that the National Whistleblower Center
plans to file this week. The filing, obtained by the AP, identifies almost 200
auto-generated pages — some for businesses, others for schools or other
categories — that directly reference the Islamic State group and dozens more
representing al-Qaida and other known extremist groups. One page listed as a "political
ideology” is titled “I love Islamic state.” It features an IS logo inside the
outlines of Facebook’s famous thumbs-up icon.
In response to a request for comment, a Facebook spokesperson
told the AP: “Our priority is detecting and removing content posted by people
that violates our policy against dangerous individuals and organizations to
stay ahead of bad actors. Auto-generated pages are not like normal Facebook
pages as people can’t comment or post on them and we remove any that violate
our policies. While we cannot catch every one, we remain vigilant in this
effort.”
Facebook has a number of functions that auto-generate pages from
content posted by users. The updated complaint scrutinizes one function that is
meant to help business networking. It scrapes employment information from
users’ pages to create pages for businesses. In this case, it may be helping
the extremist groups because it allows users to like the pages, potentially
providing a list of sympathizers for recruiters.
The new filing also found that users’ pages promoting extremist
groups remain easy to find with simple searches using their names. Researchers
uncovered one page for "Mohammed Atta" with an iconic photo of the al-Qaida
adherent who was one of the hijackers in the Sept. 11 attacks. The page lists
the user's work as "Al Qaidah" and education as "University Master Bin Laden"
and "School Terrorist Afghanistan."
Facebook has been working to limit the spread of extremist
material on its service, so far with mixed success. In March, it expanded its
definition of prohibited content to include U.S. white nationalist and white
separatist material as well as that from international extremist groups. It
says it has banned 200 white supremacist organizations and removed 26 million pieces of
content related to global extremist groups like IS and al-Qaida.
It also expanded its definition of terrorism to include not just
acts of violence intended to achieve a political or ideological aim, but also
attempts at violence, especially when aimed at civilians with the intent to
coerce and intimidate. It’s unclear, though, how well enforcement works if the
company is still having trouble ridding its platform of well-known extremist
organizations’ supporters.
But as the report shows, plenty of material slips through the
cracks and gets auto-generated.
The AP story in May highlighted the auto-generation problem, but
the new content identified in the report suggests that Facebook has not solved
it.
The report also says researchers found that many of the pages
referenced in the AP story were removed more than six weeks later, on
June 25, the day before Bickert was questioned at another congressional
hearing.
The issue was flagged in the initial SEC complaint filed by the
center's executive director, John Kostyack, which alleges the social media
company has exaggerated its success combating extremist messaging.
“Facebook would like us to believe that its magical algorithms
are somehow scrubbing its website of extremist content,” Kostyack said. “Yet
those very same algorithms are auto-generating pages with titles like ‘I Love
Islamic State,’ which are ideal for terrorists to use for networking and
recruiting.”
Ortutay reported from San
Francisco.