A disturbing trend is now affecting women all over the world. American actress Scarlett Johansson is just one of many women who have become victims of what is being called deep-fake pornography. Deep-fake pornography is a form of image-based sexual assault that takes facial images of women and places them onto someone else’s body to create a pornographic film. The result is a video so realistic and detailed that it is very difficult to detect that the original footage has been altered.
To create deep-fake pornography, images of the women are “stolen” from pictures or videos they have posted online. The creators of deep-fake pornography are anonymous and do not have to work very hard to produce the content. All a creator needs is the right software, which is publicly available online, and plenty of images of the intended victim. Gathering sufficient images is easy, since pictures and selfies abound even on private social media pages. Some of the same readily available technology also powers face-swapping smartphone apps. The technology is so accessible that some creators sell deep-fake pornography for as little as $20.
Last year, Scarlett Johansson became a victim of this form of image-based sexual assault dozens of times, and the resulting videos have amassed over 1.5 million views. “Nothing can stop someone from cutting and pasting my image or anyone else’s onto a different body and making it look as eerily realistic as desired,” Johansson said.
What makes deep-fake porn even more disturbing is that it is unclear whether legal action can be taken. The creators could possibly be protected under the First Amendment because they use publicly available images to make “new” material. Another troubling point is that the creators are unknown, making them hard to prosecute even if legal action were possible. And even when legal action can be taken, the process is slow and tedious, especially for those who lack financial resources.
Celebrities aren’t the only ones at risk. Anyone can reach out to an anonymous creator in hopes of blackmailing, humiliating, extorting, threatening, or controlling potential victims. This form of image-based sexual assault is a violation of privacy, a form of identity theft, and potentially a violation of federal obscenity laws. Furthermore, deep-fake pornography normalizes the objectification and abuse of women’s bodies with no regard for consent.
To learn more about the harms of pornography, visit our porn harms webpages.