Fake news 2.0: personalized, optimized, and even harder to stop

Artificial intelligence will automate and optimize fake news, warns a technology supplier to the CIA.

by Will Knight, March 27, 2018

Fake news may have already influenced politics in the US, but it’s going to get a lot worse, warns an AI consultant to the CIA.

Sean Gourley, founder and CEO of Primer, a company that uses software to mine data sources and automatically generate reports for the CIA and other clients, told a conference in San Francisco that the next generation of fake news would be far more sophisticated thanks to AI.

“The automation of the generation of fake news is going to make it very effective,” Gourley told the audience at EmTech Digital, organized by MIT Technology Review.

The warning should cause concern at Facebook. The social network has been embroiled in a scandal after failing to prevent fake news, some of it created by Russian operatives, from reaching millions of people in the months before the 2016 presidential election. More recently, the company has been hit by the revelation that it let Cambridge Analytica, a company tied to the Trump presidential campaign, mine users’ personal data.

In recent interviews, Facebook’s CEO, Mark Zuckerberg, suggested that the company would use AI to spot fake news. According to Gourley, AI could be used in the service of the opposite goal as well.

Gourley noted that the fake news seen to date has been relatively simple: crude, hand-crafted stories posted to social media at regular intervals. Technology such as Primer’s could easily be used to generate convincing fake stories automatically, he said. That could mean fake reports tailored to an individual’s interests and sympathies, and carefully tested before release to maximize their impact. “I can generate a million stories, see which ones get the most traction, double down on those,” Gourley said.
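To make that “generate, test, double down” loop concrete, here is a minimal sketch in Python. The functions generate_variants and measure_engagement are hypothetical placeholders standing in for a text generator and a live engagement signal; this is only an illustration of the optimization pattern Gourley describes, not anything Primer has published.

```python
import random

# Hypothetical sketch of the "generate, test, double down" loop.
# generate_variants() and measure_engagement() are placeholders, not real APIs.

def generate_variants(n):
    # Placeholder: in practice a text-generation model would produce n story variants.
    return [f"story-variant-{i}" for i in range(n)]

def measure_engagement(variant):
    # Placeholder: in practice this would be clicks/shares from a small test audience.
    return random.random()

def optimize(num_variants=1000, keep_top=10):
    variants = generate_variants(num_variants)
    # Score every variant, then "double down" by keeping only the best performers
    # for wider distribution.
    ranked = sorted(variants, key=measure_engagement, reverse=True)
    return ranked[:keep_top]

if __name__ == "__main__":
    print(optimize(num_variants=20, keep_top=3))
```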

Gourley added that fake news has so far been fed into social-media platforms like Facebook essentially at random. A more sophisticated understanding of network dynamics, as well as the mechanisms used to judge the popularity of content, could amplify a post’s effect.

“Where you inject information is going to have a massive impact on how it spreads and diffuses,” Gourley said. He went on to suggest that a platform like Facebook may be inherently flawed for sharing news. “All we’ve seen at the moment is primitive, and it’s had a profound impact, and more is coming,” he said.
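To illustrate why the injection point matters, here is a rough simulation using the networkx library. It compares how far a message spreads under a simple independent-cascade model when seeded at a highly connected node versus a random one. The toy graph, spread probability, and seed choices are illustrative assumptions, not a description of any real platform.

```python
import random
import networkx as nx

# Toy illustration of Gourley's point: the same message, injected at different
# points in a network, diffuses very differently. Independent-cascade model.

def cascade_size(G, seed, p=0.05, trials=200):
    """Average number of nodes reached when a message starts at `seed`."""
    total = 0
    for _ in range(trials):
        active, frontier = {seed}, [seed]
        while frontier:
            node = frontier.pop()
            for nbr in G.neighbors(node):
                # Each exposed neighbor adopts/shares with probability p.
                if nbr not in active and random.random() < p:
                    active.add(nbr)
                    frontier.append(nbr)
        total += len(active)
    return total / trials

G = nx.barabasi_albert_graph(10_000, 3)   # toy scale-free "social" graph
centrality = nx.degree_centrality(G)
hub = max(centrality, key=centrality.get)  # most-connected node
random_node = random.choice(list(G.nodes))

print("avg reach from a hub:        ", cascade_size(G, hub))
print("avg reach from a random node:", cascade_size(G, random_node))
```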

Gourley did, however, agree that AI would be at least part of the solution. “If machines are going to produce it on one side,” he said, “then you’d better have machines helping you sift through it on the other.”
