Most Deepfakes Are Porn, and They're Multiplying Fast

TOM SIMONITE · October 9, 2019

Researchers worry that doctored videos may disrupt the 2020 election, but a new report finds that 96 percent of deepfakes are pornographic.

In November 2017, a Reddit account called deepfakes posted pornographic clips made with software that pasted the faces of Hollywood actresses over those of the real performers. Nearly two years later, deepfake is a generic noun for video manipulated or fabricated with artificial intelligence software. The technique has drawn laughs on YouTube, along with concern from lawmakers fearful of political disinformation. Yet a new report that tracked the deepfakes circulating online finds they mostly remain true to their salacious roots.
Startup Deeptrace took a kind of deepfake census during June and July to inform its work on detection tools it hopes to sell to news organizations and online platforms. It found almost 15,000 videos openly presented as deepfakes—nearly twice as many as seven months earlier. Some 96 percent of the deepfakes circulating in the wild were pornographic, Deeptrace says.
The count is unlikely to be exhaustive, but the findings are a reminder that despite speculation about deepfakes destabilizing elections, the technology is mostly being used very differently, including as a tool for harassment. One worrying trend: Deeptrace says the tools needed to create deepfakes are becoming more sophisticated and more widely available.
The startup's report describes a niche but thriving ecosystem of websites and forums where people share, discuss, and collaborate on pornographic deepfakes. Some are commercial ventures that run advertising around deepfake videos made by taking a pornographic clip and editing in a person's face without that individual's consent.
All the people edited into the pornographic clips Deeptrace found were women. Clips of the most popular figures—Western actresses and South Korean pop celebrities—had millions of views. Nonprofits have already reported that women journalists and political activists are being attacked or smeared with deepfakes. Henry Ajder, a researcher at Deeptrace who worked on the firm's report, says there are deepfake forums where users discuss or request pornographic deepfakes of women they know, such as ex-girlfriends.
Danielle Citron, a law professor at Boston University, describes pornographic deepfakes made without a person’s consent as an “invasion of sexual privacy.” She spoke at a June hearing by the US House Intelligence Committee about artificial intelligence media manipulation tools.
The porn industry has helped pioneer new media technologies, from VHS and pop-up ads to streaming video. Citron says that the preponderance of pornographic deepfakes is a reminder of another consistent lesson from the history of technology: “At each stage we’ve seen that people use what’s ready and at hand to torment women. Deepfakes are an illustration of that.”
Citron helped spur the recent spread of state legislation on revenge porn, which is now subject to laws in at least 46 states and the District of Columbia. California is among them; last week its governor, Gavin Newsom, signed into law a bill that allows a person edited into sexually explicit material without consent to seek civil damages against the person who created or disclosed it.
The law professor also says she is currently talking with House and Senate lawmakers from both parties about new federal laws to penalize distribution of malicious forgeries and impersonations, including deepfakes. “We’ve been encouraged that the uptake has been swift,” she adds.
Last week, senators Marco Rubio, a Florida Republican, and Mark Warner, a Virginia Democrat, both members of the Senate Intelligence Committee, wrote to Facebook and 10 other social media companies seeking more details on how they plan to detect and respond to malicious deepfakes. The legislators cautioned that fake clips could have a “corrosive impact on our democracy.”
Ajder of Deeptrace plays down fears that a fake clip could significantly affect the 2020 election. But the startup’s report notes that growing awareness of the technology can fuel political deception.
In June, a Malaysian political aide was arrested after a video surfaced purportedly showing him having sex with the country’s minister of economic affairs. (Gay sex is illegal in Malaysia.) The country’s prime minister said the video was a deepfake, but independent experts have been unable to determine if the video was manipulated. “Deepfakes can provide plausible deniability,” Ajder says.
To conduct its analysis, Deeptrace combined manual searching with web scraping tools and data analysis to record known deepfakes from major porn sites, mainstream video services such as YouTube, and deepfake-specific sites and forums.
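The report doesn't publish its tooling, but the counting pass it describes is straightforward to picture. Here is a minimal Python sketch of that kind of census, assuming hypothetical site URLs and page markup; the SITES list, the .video-card selector, and the data-category attribute are all illustrative placeholders, not details from the actual report.

```python
# Minimal sketch of a deepfake "census" pass in the spirit Deeptrace describes:
# crawl a list of known hosting sites, pull video listings, and tally them by
# category. The site URLs, CSS selector, and data-category attribute below are
# hypothetical placeholders, not anything from the actual report.
from collections import Counter

import requests
from bs4 import BeautifulSoup

# Hypothetical index pages for sites known to host deepfake videos.
SITES = [
    "https://example-deepfake-forum.test/videos",
    "https://example-video-host.test/tag/deepfake",
]


def count_videos_by_category(url: str) -> Counter:
    """Fetch one listing page and tally videos by their labeled category."""
    response = requests.get(url, timeout=30)
    response.raise_for_status()
    soup = BeautifulSoup(response.text, "html.parser")
    counts = Counter()
    # Assume each video entry is a card carrying a category label.
    for card in soup.select(".video-card"):
        category = card.get("data-category", "unknown")
        counts[category] += 1
    return counts


def main() -> None:
    totals = Counter()
    for site in SITES:
        try:
            totals += count_videos_by_category(site)
        except requests.RequestException as err:
            print(f"skipping {site}: {err}")
    grand_total = sum(totals.values())
    if grand_total == 0:
        print("no videos found")
        return
    for category, count in totals.most_common():
        share = 100 * count / grand_total
        print(f"{category}: {count} videos ({share:.0f}%)")


if __name__ == "__main__":
    main()
```

Automated counting like this only surfaces candidates; manual review, which Deeptrace says it also used, would still be needed to confirm that each clip is in fact a deepfake.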
That methodology is imperfect. It couldn’t account for deepfakes that successfully passed as real clips, nor probe every hidden online corner. Jack Clark, policy director at the independent AI lab OpenAI, says the Deeptrace report is nonetheless a welcome attempt to gather empirical evidence on deepfakes, which has been lacking.
Clark predicts that fake videos won’t be the last example of unsavory consequences from the spread of artificial intelligence tools through commercialization and open source. “Individuals will mess around with the technology and some of the ways they mess around will be harmful and offensive,” he notes.
