Disturbing Trend of Deep-fake Porn Rises

A disturbing trend is now affecting many women all over the world. American actress Scarlett Johansson is just one of many women who have become victims of what is being called deep-fake pornography. Deep-fake pornography is a form of image-based sexual assault in which facial images of women are placed onto someone else’s body to create a pornographic film. The result is a video so realistic and detailed that it is very challenging to detect that the original footage has been altered.

To create deep-fake pornography, images of the women are “stolen” from pictures or videos that they have posted online. The creators of deep-fake pornography are anonymous and do not have to work very hard to develop the content. All a creator needs is the correct software, which is publicly available online, and plenty of images of the intended victim. Gathering sufficient images is an easy task, since pictures and selfies abound on social media pages. Some of the same technology is also used in face-swapping smartphone apps. The technology is so accessible that some creators sell deep-fake pornography for as little as $20.

Last year, Scarlett Johansson became a victim of this form of image-based sexual assault dozens of times, and the videos have drawn over 1.5 million views. “Nothing can stop someone from cutting and pasting my image or anyone else’s onto a different body and making it look as eerily realistic as desired,” Johansson said.

What makes deep-fake porn even more disturbing is that it is unclear whether legal action can be taken. The creators could possibly be protected under the First Amendment because they use public images to make “new” material. Another troubling point is that the creators are unknown, making them hard to prosecute even if legal action could be taken. And where legal action is possible, the process is slow and tedious, especially for those who lack financial resources.

Celebrities aren’t the only ones at risk. Anyone can reach out to an anonymous creator in hopes of blackmailing, humiliating, extorting, threatening, or controlling potential victims. This form of image-based sexual assault is a violation of privacy, a form of identity theft, and could be a violation of federal obscenity laws. Furthermore, deep-fake pornography normalizes the objectification and abuse of women’s bodies with no regard for consent.

To learn more about the harms of pornography, visit our porn harms webpages.

The Numbers

300+

NCOSE leads the Coalition to End Sexual Exploitation with over 300 member organizations.

100+

The National Center on Sexual Exploitation has had over 100 policy victories since 2010. Each victory promotes human dignity above exploitation.

93

NCOSE’s activism campaigns and victories have made headlines around the globe, averaging 93 mentions per week by media outlets and shows such as Today, CNN, The New York Times, BBC News, USA Today, Fox News, and more.


Stories

Survivor Lawsuit Against Twitter Moves to Ninth Circuit Court of Appeals

Survivors’ $12.7M Victory Over Explicit Website a Beacon of Hope for Other Survivors

Instagram Makes Positive Safety Changes via Improved Reporting and Direct Message Tools

Sharing experiences may be a restorative and liberating process. This is a place for those who want to express their story.

Support Dignity

There are many ways you can support dignity today: through an online gift, taking action, or joining our team.

Defend Human Dignity. Donate Now.
