Criminals are using deepfakes to impersonate CEOs

BY MICHAEL GROTHAUS 07.19.19
“Deepfakes” are media that have been altered by artificial intelligence to make it appear that a person did or said something they never actually did or said. The technology first surfaced a few years ago, when crude deepfake tools let users make it look like celebrities had been recorded engaging in sexual activities they never took part in.
But deepfakes are now moving past the porn realm and into the criminal world, where bad actors are using the tech to impersonate CEOs, Axios reports. For now, it appears criminals are using deepfake audio rather than video to pull off scams:
- Symantec, a major cybersecurity company, says it has seen three successful audio attacks on private companies. In each, a company’s “CEO” called a senior financial officer to request an urgent money transfer.
- Scammers mimicked the CEOs’ voices with an AI program that had been trained on hours of their speech, culled from earnings calls, YouTube videos, TED talks, and the like.
- Millions of dollars were stolen from each company, whose names were not revealed. The attacks were first reported by the BBC.
The threat deepfake audio poses to businesses cannot be overstated. Using deepfake audio to impersonate a CEO and trick the company’s accounting department into wiring $1 million for an “emergency” is bad enough, but the tech could also be used for sabotage. What if a rival, or even a nation-state, wanted to sink Apple’s stock price? A well-timed deepfake audio clip purporting to capture Tim Cook privately discussing tanking iPhone sales could do just that, wiping billions off the company’s market value in seconds.
And unfortunately, there are currently no reliable tools that can easily and automatically identify deepfake media on the web. By the time a deepfake video or audio recording has been debunked, the damage may already be done.
If you want to see a deepfake in action, check out the one of President Obama, voiced by Jordan Peele, below.
https://www.fastcompany.com/90379001/criminals-are-using-deepfakes-to-impersonate-ceos