Excerpt from an article in The Washington Post
By Drew Harwell
December 30, 2018
The video showed the woman in a pink off-the-shoulder top, seated on a bed, smiling a convincing smile.
It was her face. But it had been seamlessly grafted, without her knowledge or consent, onto someone else’s body: a young pornography actress, just beginning to disrobe for the start of a graphic sex scene. A crowd of unknown users had been passing it around online.
She felt nauseated and mortified: What if her co-workers saw it? Her family, her friends? Would it change how they thought of her? Would they believe it was a fake?
“I feel violated — this icky kind of violation,” said the woman, who is in her 40s and spoke on the condition of anonymity because she worried that the video could hurt her marriage or career. “It’s this weird feeling, like you want to tear everything off the Internet. But you know you can’t.”
Airbrushing and Photoshop long ago opened photos to easy manipulation. Now, videos are becoming just as vulnerable to fakes that look deceptively real. Supercharged by powerful and widely available artificial-intelligence software developed by Google, these lifelike “deepfake” videos have quickly multiplied across the Internet, blurring the line between truth and lie.
But the videos have also been weaponized disproportionately against women, representing a new and degrading means of humiliation, harassment and abuse. The fakes are explicitly detailed, posted on popular porn sites and increasingly challenging to detect. And although their legality hasn’t been tested in court, experts say they may be protected by the First Amendment — even though they might also qualify as defamation, identity theft or fraud.