Web giants to cooperate on removal of extremist content
By Julia Fioretti December 5, 2016
BRUSSELS (Reuters) - Web giants YouTube, Facebook,
Twitter and Microsoft will step up efforts to remove extremist content from
their websites by creating a common database.
The companies will share 'hashes' - unique digital
fingerprints they automatically assign to videos or photos - of extremist
content they have removed from their websites to enable their peers to identify
the same content on their platforms.
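The mechanics of such a lookup can be sketched roughly as follows. The snippet below is purely illustrative: the companies have not disclosed their hashing method or database design, so the exact-match SHA-256 fingerprint and names such as shared_hash_db and check_upload are assumptions made only for the example.

```python
import hashlib
from pathlib import Path

# Illustrative stand-in for the shared industry database described in the
# article: a set of hashes of content one participant has already removed.
# The real consortium database and its matching rules are not public.
shared_hash_db = set()

def fingerprint(path: Path) -> str:
    """Compute a digital fingerprint for a media file.

    SHA-256 is used here purely for illustration; an exact cryptographic
    hash only matches byte-identical files, whereas a production system
    would need a fingerprint that survives re-encoding or cropping.
    """
    return hashlib.sha256(path.read_bytes()).hexdigest()

def contribute(path: Path) -> None:
    """A participating company adds the hash of content it has removed."""
    shared_hash_db.add(fingerprint(path))

def check_upload(path: Path) -> str:
    """Check a new upload against the shared database.

    Per the article, a match does not trigger automatic removal; each
    company reviews the content against its own policies.
    """
    if fingerprint(path) in shared_hash_db:
        return "flag for human review"
    return "no match"
```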
"We hope this collaboration will lead to greater
efficiency as we continue to enforce our policies to help curb the pressing
global issue of terrorist content online," the companies said in a
statement on Tuesday.
Tech companies have long resisted outside intervention in
how their sites should be policed, but have come under increasing pressure from
Western governments to do more to remove extremist content following a wave of
militant attacks.
YouTube and Facebook have begun to use hashes to
automatically remove extremist content.
But many providers have relied until now mainly on users
to flag content that violates terms of service. Flagged material is then
individually reviewed by human editors who delete postings found to be in
violation.
Twitter suspended 235,000 accounts between February and
August this year and has expanded the teams reviewing reports of extremist
content.
Each company will decide what image and video hashes to
add to the database and matching content will not be automatically removed,
they said.
The database will be up and running in early 2017 and
more companies could be brought into the partnership.
The European Union set up an EU Internet Forum last year
bringing together the internet companies, interior ministers and the EU
Counter-Terrorism Coordinator to find ways of removing extremist content.
The Forum will meet again on Thursday, when ministers are expected to ask the companies about their efforts and about how they can help provide evidence to convict foreign fighters.
(Reporting by Julia Fioretti; editing by John
Stonestreet)