5 Types of Image-Based Sexual Abuse You Should Know About

All arrangements had been made for the wedding. The reception hall had been booked. Guest lists had been drawn up.

But one thing was missing. Or rather . . . one person.

The bride was gone.

Soo-ah (pseudonym) took her own life before her wedding could take place.

It wasn’t because she got cold feet. It wasn’t because she was being forced to marry a man she didn’t love—or any other Hollywood reason that might precede a bride-to-be’s suicide in a TV drama.

The real reason?

A colleague of Soo-ah’s had secretly filmed her in the changing room of the hospital where they both worked.

When Soo-ah learned about how she had been violated, she fell into depression. She tried to drown her pain with alcohol and anti-depressants. But in the end, the trauma was too much for her to carry.

Soo-ah was only 26 years old when she ended her life.

Sadly, Soo-ah’s story is far from unique. She is one of countless individuals who have suffered profound trauma and even considered or committed suicide after being victimized through image-based sexual abuse.

What is Image-Based Sexual Abuse (IBSA)?

Image-based sexual abuse (IBSA) is the sexual violation of a person committed through the abuse or weaponization of any image of the person depicted. An “image” is any visual depiction or representation of a person—including but not limited to materials such as photographs, videos, edited/altered images, or personal representations in virtual reality or online gaming.

IBSA is an umbrella term encompassing a wide range of abusive activities. It includes the creation, theft, extortion, threatened or actual distribution, or any use of sexually explicit or sexualized materials without the meaningful consent of the person/s depicted and/or for purposes of sexual exploitation. It also includes sexual violence or harassment committed towards a person’s representation (e.g., a person’s avatar) in virtual reality or online gaming. 

Importantly, sexually explicit and sexualized material depicting children ages 0-17 constitutes a distinct class of material known as child sexual abuse material (CSAM) (i.e., “child pornography”). CSAM should not be confused with IBSA and is illegal under US federal statute.

Forms of Image-Based Sexual Abuse (IBSA)

It is important to understand the various types of IBSA, so that we can recognize victimization when it happens and build solutions that successfully address the wide range of ways IBSA can occur. Below is a non-exhaustive explanation of common forms of IBSA.

1. Video Voyeurism (VV)

Soo-ah was victimized through a form of IBSA called “video voyeurism” (VV). Popularly referred to as “spycamming,” video voyeurism involves filming or taking pictures of people engaged in private activities (e.g., changing clothes, using the toilet or showering, having sex in private) without their knowledge, or surreptitiously filming/taking pictures of their private body parts while they are in public (e.g., “downblousing” and “upskirting”).

2. Non-consensual Distribution of Sexually Explicit Material (NCDSEM)

Non-consensual Distribution of Sexually Explicit Material (NCDSEM) is the sharing or online posting of sexually explicit or sexualized images/videos of another person without their meaningful consent.

In many cases of NCDSEM, the images/videos were initially created and exchanged in the context of a romantic relationship, but the partner subsequently shared them with others or posted them online (e.g., to social media sites or pornography sites) without the permission of the person depicted. This particular form of NCDSEM is often referred to as “revenge pornography”; however, NCOSE refrains from using this term as much as possible because of its victim-blaming connotations (“revenge” suggests that the victim did something wrong and deserves to have their images distributed non-consensually).

NCDSEM also encompasses the sharing of recordings of sex trafficking, rape, sexual assault, and/or secretly recorded sexual images and videos. As such, NCDSEM often overlaps with other forms of IBSA (e.g., a person may first be victimized through video voyeurism or recorded sexual violence and then re-victimized through NCDSEM when the images are shared).

NCDSEM is frequently accompanied by “doxing,” i.e., sharing personally identifying information such as a person’s name, address, and workplace. Further, the material is often distributed with the intention of causing mental anguish or reputational harm to the victim, or even of provoking the victim to commit self-harm or suicide.

For example, in a survey of 1,631 survivors of IBSA, one survivor shared:

“[He told me] that he should post the intimate photos I sent him during our relationship . . . to  get back at me since I had ‘hurt him so much.’ Then [he told me] to kill myself, hurt myself . . . how he wished he could hurt me.” 

3. Recording Sexual Violence (RSV)

Recording sexual violence (RSV) involves taking pictures or creating videos of another person’s sexual assault or rape. This includes, but is not limited to, the filming of sex trafficking victims who are subjected to serial rape by their sex traffickers and buyers. Such recordings are typically shared with others and, in some cases, have been distributed via mainstream pornography websites.

In a case of RSV arising in Australia, immigration and education agency manager Frank Hu drugged numerous women by giving them hot drinks laced with sedatives, subsequently raping them and filming the abuse. Hu took thousands of photographs and videos of the unconscious women, and posted some of the recordings on an online pornography fetish website for people interested in sex with “sleeping women” (i.e., rape).

4. Synthetic Sexually Explicit Material (SSEM)

Synthetic Sexually Explicit Material (SSEM) is created by altering images of a person so that someone who is not in pornography appears to be in pornography (popularly referred to as “deepfake” or “cheapfake” pornography), or by “stripping” the person depicted of their clothing via “nudify apps.” Such fabrications have existed for decades; however, current technological advances have produced increasingly realistic misrepresentations of people—quite commonly famous women, but also increasingly women from other walks of life—as nude or engaging in pornography when they have not. Today, synthetic sexually explicit material can be created using relatively accessible, easy-to-use software programs or sophisticated artificial intelligence programs.

For example, in July 2020, it was reported that approximately 104,852 women had their personal images artificially “stripped” and subsequently publicly shared by users of a “stripping” AI bot. Seventy percent of users of the hub where the images were created targeted women in private life, versus only 16 percent who targeted high-profile celebrities. This marked a significant shift away from celebrity women being the primary targets of SSEM.

5. Sexual Extortion

Sexual extortion (popularly referred to as “sextortion”) is the use of sexual images to blackmail the person (or persons) depicted. This is often done to obtain more sexually explicit material or money, to coerce the person into in-person sex or sex trafficking, to pressure the person to stay in a relationship, or to secure other benefits for the perpetrator.

For example, one LGBT survivor shared how when they tried to leave an abusive relationship, their romantic partner threatened to distribute sexual images:

“I was younger and struggling with my gender identity and sexual orientation. I wasn’t comfortable being open with people at that point in my life. I feared for my safety if the wrong people found out that I’m trans. I stupidly got into an abusive relationship with someone older and really controlling. Whenever I tried to leave, he threatened to post pictures and videos taken during our relationship and to tell everyone that I’m trans and bisexual, and basically ruin my life.”

The sexual images used to extort a person may initially have been shared consensually, obtained by theft (e.g., computer hacking), created by altering non-sexual images of the person (i.e., synthetic sexually explicit material), or acquired by other means.

The above are only some of the forms IBSA can take; there are others as well. To learn more about this multi-faceted issue, visit endsexualexploitation.org/issues/image-based-sexual-abuse.

ACTION: Call on Google to Remove IBSA from Search Results!

Survivors of IBSA whose images have been distributed online often report that a particularly devastating part of their trauma is the continual re-victimization, as countless people watch, download, and re-share videos of their abuse that have been posted to the Internet.

Please sign this petition, asking Google to prioritize survivor-centered practices to better prevent IBSA and remove it from search results!


