Not a Fantasy: How the Pornography Industry Exploits Image-Based Sexual Abuse in Real Life

Nine young women, all members of a college field hockey team, suffered tremendous abuse when they were secretly filmed while changing in a visiting team’s locker room. Their lawsuit, Does 1-9 v. Murphy et al., exposes how, allegedly, the intramural/summer conference director of a college in South Carolina placed a hidden “spy camera” in the locker room and secretly filmed the women in all stages of undress before uploading the footage to the notorious pornography “tube” sites Pornhub and XHamster.

Neither website bothered to verify the age or consent of the women in the videos—which were labeled as “spy cam” and “hidden cam” videos—and instead monetized the women’s abuse.

“Spy cam,” “hidden camera,” and “voyeur” are extremely popular genres in the pornography industry, and many mainstream pornography sites have official tags and categories for terms like these, which indicate real-life abuse. So while many dismiss the harms of pornography by claiming the content is fake or “just a fantasy,” such claims are highly misleading.

It’s not a fantasy. It’s real life. Real people are being exploited and abused by the pornography industry.  

NCOSE’s report, “Not a Fantasy: How the Pornography Industry Exploits Image-Based Sexual Abuse in Real Life,” details how pornography distribution websites like Pornhub, XVideos, XNXX, and xHamster facilitate, normalize, and profit from image-based sexual abuse (IBSA), a rapidly growing form of sexual violence. It also calls on the general public to demand that policymakers enact measures to stop the abuse and hold the exploiters accountable.

Not a Fantasy

How the Pornography Industry Exploits Image-Based Sexual Abuse in Real Life

Types of Image-Based Sexual Abuse Common in Mainstream Pornography

Image-based sexual abuse (IBSA) refers to the creation, manipulation, theft, extortion, threatened or actual distribution, or any use of images for sexual purposes without the meaningful consent of the person depicted. This can include nonconsensual distribution of sexually explicit material (popularly though inappropriately called “revenge pornography”), recorded sexual violence, computer-generated sexually explicit material (commonly referred to as “deepfake” pornography), and video voyeurism (when a person is recorded in a sexual context without their knowledge). 

The report provides examples of tags, channels, and search results on mainstream pornography websites that overtly advertise IBSA.

To offer a few real-life stories:

Katelynn Spencer was shocked to learn that videos of her performing sex acts had been uploaded to Pornhub ten years earlier by a former male friend who had groomed and coaxed her into making a sex video with him. The man had also recorded another video of her performing a sex act on him without her knowledge and uploaded it to Pornhub, this one with her name attached. That video garnered millions of views. At the time, her state had no laws against the nonconsensual distribution of sexually explicit material. The fallout cost her relationships with family and friends, led to divorce from her husband and the loss of her job, and caused extreme psychological distress and physical illness.

GirlsDoPorn, a verified content partner on Pornhub, lured women into a sex trafficking operation under false pretenses, such as promising modeling jobs that did not exist. The women were trapped in hotel rooms, manipulated and threatened into signing contracts and making pornography videos, and lied to about how the videos would be distributed and whether their anonymity would be protected. GirlsDoPorn uploaded the exploitative videos to Pornhub, which monetized them, profiting from the women’s victimization. The former producer of GirlsDoPorn, Andre Garcia, was sentenced to 20 years in prison for sex trafficking. Fifty GirlsDoPorn survivors also filed a lawsuit against Pornhub for its complicity in these crimes.

Dominique Pelicot was charged with serially raping his wife and inviting dozens of men to rape her while she was unconscious, after police discovered thousands of videos of these assaults on his computer. This has become the most infamous sexual violence case in recent history. While it is still not confirmed whether Pelicot or the men who raped his wife uploaded these videos to pornography platforms, Gisèle Pelicot’s name yields 331 results on XVideos, showcasing a deplorable fact: one of the world’s largest pornography platforms is luring pornography users who seek material of Gisèle’s abuse by allowing her name to surface in search results.

How the Pornography Industry Fuels IBSA

By “pornography industry” we are referring to “all owners, shareholders, directors, and employees of companies involved in the production, advertising, monetization, or distribution of the depiction and commercialization of human nudity and/or sexual activity for consumption by third parties via visual, audiovisual, or virtually interactive means” (excepting all persons sex trafficked into pornography production).  

Image-based sexual abuse is, unfortunately, a cornerstone of the pornography industry’s success.

The mainstream pornography industry fuels IBSA and harm to the people it depicts in at least nine ways: 

  1. Sexual socialization of pornography users to IBSA: IBSA on pornography platforms is categorized and tagged using related search terms, thus presenting IBSA, and the survivors harmed by it, as sex objects for masturbatory use by pornography consumers. 
  2. Industry profiteering – the “free” pornography profit model: For years, vast quantities of “free” pornography uploaded to tube sites by unknown third parties drew enormous traffic to those sites. This was (and in some cases remains) the basis of the industry’s profitability and incentivizes platforms to ignore blatant IBSA (and CSAM). 
  3. Disregard of an age and consent verification law: For more than a decade, pornography platforms ignored an existing law requiring age and identity verification of those depicted on their platforms. Many major pornography platforms still fail to employ such verification.  
  4. Anonymous user uploads and lack of robust content moderation: Nonexistent or inadequate screening protocols for uploaders of content to pornography platforms allowed IBSA to become public on pornography sites. Lack of vigorous content moderation allows the material to remain on pornography platforms indefinitely. 
  5. Mass publicization and global trafficking of IBSA: Uploading IBSA to pornography platforms publicizes it to a global audience of consumers. Additionally, platforms have allowed, or continue to allow, the downloading of IBSA shared on their platforms, thus facilitating its global distribution. 
  6. Facilitation of serial sexual abuse and exploitation: The hosting, platform-enabled downloading, and harvesting of IBSA from pornography websites fuel survivors’ perpetual mass sexual abuse and exploitation by allowing countless others to possess and/or profit from the material, causing survivors incalculable, lifelong emotional, physical, and social distress. 
  7. Failure to protect pornography performers from pirating of content: Pornography tube sites have depended on copious amounts of pirated pornography (i.e., copied and redistributed without compensation to the creator/s) as a key component of their “freemium” profit model. This exploits pornography performers, not only sexually, but also monetarily. 
  8. Exploitation of AI and normalization of forged pornography: Images from pornography sites—including images of sex trafficked persons—have been nonconsensually harvested and used to create so-called “artificial” or “deepfake” pornography. This constitutes nonconsensual use of the performers’ material and is therefore a form of IBSA. The industry also largely turns a blind eye to the presence of deepfake material on its platforms. 
  9. Lack of meaningful survivor-support services and failure to permanently remove IBSA: Pornography platforms lack robust, survivor-centered processes for the removal of IBSA from their sites, which forces victims to continually monitor pornography platforms to report and request removal of their sexual abuse material. Thus, the pornography industry transfers the onus of responsibility for this task onto survivors and furthers their mental anguish. 

The Impact of IBSA on Survivors 

Image-based sexual abuse is traumatizing not only in the moment victims learn of their abuse; it is often the aftermath that is most difficult to deal with. Survivors must cope with the images circulating perpetually on the Internet, no matter how hard they try to get them taken down. And even if the videos of one’s abuse are eventually removed from one pornography site, they typically resurface on others, and the lifelong trauma often persists.

A study conducted in Australia found that 1 in 5 of its 4,274 participants (2,406 female; 1,868 male) had experienced at least one form of IBSA. Among those who had experienced IBSA, 80.8% of women and 72.9% of men reported feeling annoyed, humiliated, depressed, angry, or fearful as a consequence.1

Further, in a study which interviewed 75 survivors of various forms of IBSA from the UK, Australia, and New Zealand, participants described additional impacts, including social rupture (i.e., disruption of one’s sense of self and relationships with others), the constancy of harm (“endlessness”), and a sense of what researchers termed “existential threat” (i.e., constant apprehension and perpetual vigilance).

One participant described feeling “completely, completely broken,” and others described their experiences as “life-ruining,” “hell on earth,” and “a nightmare . . . which destroyed everything.” One survivor stated, “[It] transcends [everything], it impacts you emotionally, physiologically, professionally, in dating and relationships, in bl**y every single factor of your life.” 

IBSA inflicts irreversible damage on survivors that often follows them for the rest of their lives. It’s time to stop viewing Internet pornography as harmless or fake or just a “fantasy.” These are real people whose lives are being ruined by the pornography industry’s relentless quest for profit.  

We must hold this predatory industry accountable.  

Download the “Not a Fantasy” report for FREE to gain insights into this exploitative industry and discover strategies to address the harms caused by image-based sexual abuse within mainstream pornography. 


1Anastasia Powell, Nicola Henry, and Asher Flynn, “Image-based Sexual Abuse,” in Routledge Handbook of Critical Criminology, 2nd ed., ed. Walter S. DeKeseredy (New York, NY: Taylor & Francis, 2018).
