The video showed the woman in a pink off-the-shoulder top, sitting on a bed, smiling a convincing smile.
It was her face. But it had been seamlessly grafted, without her knowledge or consent, onto someone else’s body: that of a young pornography actress, just beginning to disrobe for the start of a graphic sex scene. A crowd of unknown users had been passing it around online.
She felt nauseated and mortified: What if her co-workers saw it? Her family, her friends? Would it change how they thought of her? Would they believe it was a fake?
“I feel violated – this icky kind of violation,” said the woman, who is in her 40s and spoke on the condition of anonymity because she worried that the video could hurt her marriage or career. “It’s this weird feeling, like you want to tear everything off the internet. But you know you can’t.”
Airbrushing and Photoshop long ago opened photos to easy manipulation. Now, videos are becoming just as vulnerable to fakes that look deceptively real. Supercharged by powerful and widely available artificial-intelligence software developed by Google, these lifelike “deepfake” videos have quickly multiplied across the internet, blurring the line between truth and lie.
But the videos have also been weaponised disproportionately against women, representing a new and degrading means of humiliation, harassment and abuse. The fakes are explicitly detailed, posted on popular porn sites and increasingly challenging to detect.
Disturbingly realistic fakes have been made with the faces of both celebrities and women who don’t live in the spotlight, and the actress Scarlett Johansson says she worries that “it’s just a matter of time before any one person is targeted” by a lurid forgery.
Johansson has been superimposed into dozens of graphic sex scenes over the past year that have circulated across the web: One video, falsely described as real “leaked” footage, has been watched on a major porn site more than 1.5 million times. She said she worries it may already be too late for women and children to protect themselves against the “virtually lawless [online] abyss”.
“Nothing can stop someone from cutting and pasting my image or anyone else’s onto a different body and making it look as eerily realistic as desired,” she said. “The fact is that trying to protect yourself from the internet and its depravity is basically a lost cause … The internet is a vast wormhole of darkness that eats itself.”
In September, Google added “involuntary synthetic pornographic imagery” to its ban list, allowing anyone to request that the search engine block results falsely depicting them as “nude or in a sexually explicit situation”. But there’s no easy fix to the videos’ creation and spread.
A growing number of deepfakes target women far from the public eye, with anonymous users on deepfake discussion boards and private chats identifying them as co-workers, classmates and friends. Several users who make videos by request said there’s even a going rate: about $US20 ($28.50) per fake.
The requester of the video with the woman’s face atop the body in the pink off-the-shoulder top had included 491 photos of her face, many taken from her Facebook account, and told other members of the deepfake site that he was “willing to pay for good work :-)”. The author of this article later found her by running those portraits through a reverse-image search, an online tool that can locate where a photo was originally shared.
It had taken two days after the request for a team of self-labelled “creators” to deliver. A faceless online audience celebrated the effort. “Nice start!” the requester wrote.
“It’s like an assault: the sense of power, the control,” said Adam Dodge, the legal director of Laura’s House, a domestic-violence shelter in California. Dodge hosted a training session in November for detectives and sheriff’s deputies on how deepfakes could be used by an abusive partner or spouse. “With the ability to manufacture pornography, everybody is a potential target,” Dodge said.
Videos have for decades served as a benchmark for authenticity, offering a clear distinction from photos, which could be easily distorted. Fake video, for everyone except high-level artists and film studios, has always been too technically complicated to get right.
But recent breakthroughs in machine-learning technology, employed by creators racing to refine and perfect their fakes, have made fake-video creation more accessible than ever. All that’s needed to make a persuasive mimicry within a matter of hours is a computer and a robust collection of photos, such as those posted by the millions onto social media every day.
The result is a fearsome new way for faceless strangers to inflict embarrassment, distress or shame. “If you were the worst misogynist in the world,” said Mary Anne Franks, a University of Miami law professor and the president of the Cyber Civil Rights Initiative, “this technology would allow you to accomplish whatever you wanted”.
Men are inserted into the videos almost entirely as a joke: A popular imitation shows the actor Nicolas Cage’s face superimposed onto President Donald Trump’s. But the fake videos of women are predominantly pornographic, exposing how the sexual objectification of women is being emboldened by the same style of AI technology that could underpin the future of the web.
The media critic Anita Sarkeesian, who has been assailed online for her feminist critiques of pop culture and video games, was inserted into a hardcore porn video this year that has been viewed more than 30,000 times on the adult-video site Pornhub.
On deepfake forums, anonymous posters said they were excited to confront her with the video in her Twitter and email accounts, and shared her contact information and suggestions on how they could ensure the video was easily accessible and impossible to remove.
One user on the social-networking site Voat, who goes by “Its-Okay-To-Be-White”, wrote, “Now THIS is the deepfake we need and deserve, if for no other reason than (principle)”. Another user, “Hypercyberpastelgoth”, wrote, “She attacked us first. … She just had to open up her smarmy mouth”.
Sarkeesian said the deepfakes were more proof of “how terrible and awful it is to be a woman on the internet, where there are all these men who feel entitled to women’s bodies”.
– Drew Harwell, The Sydney Morning Herald
Read more: ‘Everybody is a Potential Target’: Fake-Porn Videos Weaponised to Harass Women