
Scarlett Johansson warns of ‘virtually lawless abyss’ of the internet after her image appears in ‘deepfake’ porn videos

Scarlett Johansson in Los Angeles, on April 12, 2015. (Getty Images)

“If you were the worst misogynist in the world,” Mary Anne Franks, a University of Miami law professor and president of the Cyber Civil Rights Initiative, told the Washington Post, “this technology would allow you to accomplish whatever you wanted.”

Franks is talking about recent technical breakthroughs that have made fake-video creation — once considered extremely difficult — more accessible than ever. All that’s needed is a computer and a broad collection of photos, like those posted every day by millions of people on social media.

The term “deepfakes” comes from the online name of an anonymous creator, who last year began using the software to create face-swapped porn videos of various actresses and posting them to Reddit, inspiring a wave of copycats. (Reddit and Pornhub banned the videos this year, but they simply moved elsewhere.)

Celebrities are favored targets, partly because their fame guarantees a curious audience but also because there are so many publicly available images of them that can be utilized to create the videos. Scarlett Johansson has been superimposed into dozens of graphic sex scenes over the past year that have circulated across the Web: One video has been watched on a major porn site more than 1.5 million times, the Post reports. Johansson said she worries it may already be too late for women and children to protect themselves against the “virtually lawless (online) abyss.”

“Nothing can stop someone from cutting and pasting my image or anyone else’s onto a different body and making it look as eerily realistic as desired,” she said. “The fact is that trying to protect yourself from the Internet and its depravity is basically a lost cause. . . . The Internet is a vast wormhole of darkness that eats itself.”

Celebrities, however, are not the only victims of deepfake videos. Rana Ayyub, an investigative journalist in India who has endured years of harassment, told the Post that a deepfake sex video featuring her image, which circulated widely on social platforms last April, “managed to break me.” She threw up when she saw it, cried for days and was eventually rushed to the hospital, overwhelmed with anxiety. “This is a lot more intimidating than a physical threat,” she said. “This has a lasting impact on your mind. And there’s nothing that could prevent it from happening to me again.”

Legal recourse is, unfortunately, extremely limited. As Johansson observed, “every country has their own legalese regarding the right to your own image.”

Hany Farid, a Dartmouth College computer-science professor who specializes in examining manipulated photos and videos, told the Post that Google and other tech giants need “to get more serious about how weaponized this technology can be.”

“If a biologist said, ‘Here’s a really cool virus; let’s see what happens when the public gets their hands on it,’ that would not be acceptable. And yet it’s what Silicon Valley does all the time,” he said. “It’s indicative of a very immature industry. We have to understand the harm and slow down on how we deploy technology like this.”

Read the full stories about the deepfake phenomenon and Scarlett Johansson’s take on it at The Washington Post.

