Fake, misleading social media posts exploding globally, Oxford study finds
FILE - This Nov. 1, 2017, file photo shows some of the Facebook and Instagram ads linked to a Russian effort to disrupt the American political process and stir up tensions around divisive social issues, released by members of the U.S. House Intelligence Committee and photographed in Washington. Facebook says it will require political advertisers in the U.S. to label "issue ads" and disclose who paid for them, part of its ongoing efforts to prevent elections-related misuse of its platform. Such ads figured prominently in Russia's efforts to interfere in the 2016 U.S. elections. Jon Elswick, File AP Photo
BY GREG GORDON July 20, 2018 06:00 AM
WASHINGTON - Russia’s social media blitz to influence the
2016 U.S. election was part of a global “phenomenon” in which a broad spectrum
of governments and political parties used Internet platforms to spread junk
news and disinformation in at least 48 countries last year, an Oxford
University study has found.
Counting U.S. government programs aimed at countering extremists such as Islamic fundamentalists, about $500 million has been spent worldwide since 2010 on the research, development or implementation of social media "psychological operations," the authors estimated.
“The manipulation of public opinion over social media
platforms has emerged as a critical threat to public life,” the researchers
wrote. They warned that, at a time when news consumption is increasingly
occurring over the Internet, this trend threatens “to undermine trust in the
media, public institutions and science.”
In an earlier analysis covering 2016, the researchers
found governments and political parties had deployed social media to manipulate
the public in 28 countries.
“Disinformation during elections is the new normal,”
co-author Philip Howard told McClatchy. “In democracies around the world, more
and more political parties are using social media to spread junk information
and propaganda to voters.
“The largest, most complex disinformation campaigns are
managed from Russia and directed at democracies. But increasingly, I’m also
worried about copycat organizations springing up in other authoritarian
regimes.”
In about a fifth of the countries evaluated, the
researchers reported disinformation campaigns are occurring on chat
applications, even encrypted platforms such as WhatsApp, Signal or Telegram.
Howard said young people in poorer nations “develop their political identities”
on those sites, “so that’s where the disinformation campaigns will go.”
Russia’s stealthy 2016 social media campaign was part of
a broad cyber offensive that U.S. intelligence agencies say was aimed at
helping Donald Trump win the White House. It originated at a so-called “troll
farm” in St. Petersburg, where Russian operatives, a number of whom now face
U.S. criminal charges, allegedly placed Facebook and Twitter ads carrying fake
or harshly critical news about Democratic presidential candidate Hillary
Clinton or aimed at sowing divisions among voters on issues such as race, gun
rights and immigration. The impact of some of those ads was amplified via
automated messages, known as “bots,” that reached millions of Americans.
Facebook and Twitter, facing pressure from the House and
Senate intelligence committees, each took significant measures to tighten
monitoring of social media activity and remove fake accounts and bots. Mark
Zuckerberg, Facebook’s chairman and chief executive, ordered the hiring of
thousands of employees to police activity over its platform and announced the
firm would require disclosure in all future political messages of the identity
of advertising sponsors.
But the latest Oxford study suggests that use of social
media to carry propaganda or misleading political messages may still be
expanding faster than the rising numbers of cyber cops.
“Even though Twitter and Facebook have been trying a lot
of things to rein in the use of fake accounts and bots, we actually found 38
countries used bots last year, compared with 17 in the year before,” Howard
said.
Brazil, a country rocked with political turmoil in recent
years, has been the scene of “lots of manipulation over social media accounts,”
Howard said. “The political parties have been locked in lawsuits with each
other over the use of bots.”
In five countries — Brazil, Germany, Mexico, Taiwan and
the United States — the cyber operatives have found ways to complicate tracking
and disabling of bot accounts, the researchers found. Cyber operatives in those
nations have taken to occasionally injecting comments or typographical errors
amid the bot streams to signal human involvement, Howard said.
Based on a canvass of publicly available data, the
researchers estimated that in China, 300,000 to 2 million people were used in
2017 as “cyber troops” engaged in social media campaigns that are largely
directed internally. Similar tasks are performed by at least 10,000 people in
Azerbaijan, Iran, Ukraine and Vietnam, they said.
“Social media manipulation is big business,” the study
said. “We estimate that tens of millions of dollars are being spent on social
media manipulation campaigns, involving tens of thousands of professional
staff.”