Imagine you’re going about a normal day when you get a message telling you there’s a naked photo of you going around online. How would you feel?
And how would you feel if it was a photo that you had never even taken?
Gabi Belle, a 26-year-old, said she felt “yucky and violated” when she unexpectedly found herself in a situation like this. The image, which showed her standing naked in a field, had been forged using AI.
In trying to get the picture taken down, she realized there were about a hundred photos on various websites, including several pornography sites. Imagine how violated you would feel, knowing that forged naked photos of you were being viewed by potentially thousands of people on pornography sites.
Even after Belle successfully got the original images taken down, more have continued to pop up.
What if forged images of you harmed your career prospects? What if your family, friends, or coworkers saw them? What if every time you finally got one photo taken down, more popped up in their place?
How overwhelmed would you feel?
Unfortunately, countless individuals—average individuals like you—are being victimized through image-based sexual abuse.
What is Image-Based Sexual Abuse (IBSA)?
IBSA encompasses a wide range of abusive activities including the creation, manipulation, theft, extortion, threatened or actual distribution, or any use of images for sexual purposes without the meaningful consent of the person/s depicted or for purposes of sexual exploitation. It also includes sexual violence or harassment committed towards a person’s representation (e.g., a person’s avatar) in virtual reality or online gaming.
Importantly, sexually explicit and sexualized material depicting children ages 0-17 constitutes a distinct class of material, not to be confused with IBSA, known as child sexual abuse material (CSAM) (i.e., “child pornography”), which is illegal under US federal statute.
Forms of Image-Based Sexual Abuse (IBSA)
It is important to understand the various types of IBSA, so that we can recognize victimization when it happens and build solutions that successfully address the wide range of ways IBSA can occur. Below is a non-exhaustive explanation of common forms of IBSA.
1. AI-Generated Forged Pornography
AI-generated forged pornography depicts people who have not been filmed or photographed for pornography as if they had. Thus, people who have not participated in recordings of pornographic acts are technologically sex trafficked into pornography. Although forged pornography has existed for centuries, it used to be easily detectable as forgery due to technology limitations. Now, modern advances based on the use of “deepfake” technology yield hyper-realistic images that make detection of AI-generated forged pornography exceedingly difficult, if not impossible.
Forged pornography can be created in two ways:
- By superimposing an innocuous image of someone onto pre-existing pornography, so that one or more persons originally depicted in the pornography are “replaced” with the image of another person.
- By the use of “nudifying” apps which strip images of clothed persons of their clothing so that they appear to be partially or fully naked.
Research released in 2023 reported that 98% of AI-generated forged images are of pornography and that 99% of this pornography depicts women.
Many female celebrities, journalists, and politicians have been victimized through AI-generated IBSA, and increasingly women who are not famous are also becoming targets of this form of IBSA. For example, in July 2020, it was reported that 104,852 women had their personal images artificially “stripped” and subsequently publicly shared by users of a “stripping” AI bot. Seventy percent of users of the hub where the images were created targeted women in private life, versus only 16 percent who targeted high-profile celebrities. This marked a significant shift away from celebrity women being the primary targets of AI-generated forged IBSA.
The appropriation of a person’s image or likeness for the unauthorized production of sexual materials is a form of identity theft, fraud, and sexual violence. It constitutes public sexual humiliation and sexual assault.
2. Non-consensual Distribution of Sexually Explicit Material (NCDSEM)
Non-consensual Distribution of Sexually Explicit Material (NCDSEM) is the sharing or online posting of sexually explicit or sexualized images/videos of another person without their meaningful consent.
In many cases of NCDSEM, the images/videos were initially created and exchanged consensually in the context of a romantic relationship, but the partner subsequently shared them by forwarding them to friends or posting them online (e.g., to social media sites or pornography sites) without the permission of the person depicted. This particular form of NCDSEM is often referred to as “revenge pornography”; however, NCOSE refrains from using this term as much as possible because of its victim-blaming connotations (“revenge” suggests that the victim did something wrong and deserves to have their images distributed non-consensually).
NCDSEM also encompasses the distribution of nonconsensually created material, such as AI-generated images, recordings of sex trafficking, rape, or sexual assault, and secretly recorded sexual images and videos. As such, NCDSEM often overlaps with other forms of IBSA (e.g., a person may be victimized first through video voyeurism or recorded sexual violence, both forms of IBSA, and then re-victimized through NCDSEM when the images are shared).
NCDSEM is frequently accompanied by “doxxing”: the sharing of personally identifying information such as a person’s name, address, and workplace. Further, the material is often distributed with the intention of causing mental anguish or reputational harm to the victim, or even to provoke the victim into self-harm or death by suicide.
For example, in a survey of 1,631 survivors of IBSA, one survivor shared:
“[He told me] that he should post the intimate photos I sent him during our relationship…to get back at me since I had ‘hurt him so much.’ Then [he told me] to kill myself, hurt myself…how he wished he could hurt me.”
3. Sexual Extortion (Sextortion)
Sexual extortion (popularly referred to as “sextortion”) is the use of sexual images to blackmail the person (or persons) depicted. This is often done to obtain more sexually explicit material or money, to coerce the person/s into in-person sex or sex trafficking, to pressure the person to stay in a relationship, or to gain other benefits for the perpetrator.
For example, one LGBTQ+ survivor shared how when they tried to leave an abusive relationship, their romantic partner threatened to distribute sexual images:
“I was younger and struggling with my gender identity and sexual orientation. I wasn’t comfortable being open with people at that point in my life. I feared for my safety if the wrong people found out that I’m trans. I stupidly got into an abusive relationship with someone older and really controlling. Whenever I tried to leave, he threatened to post pictures and videos taken during our relationship and to tell everyone that I’m trans and bisexual, and basically ruin my life.”
The sexual images used to extort a person may initially be shared consensually, obtained through theft (e.g., computer hacking), created by altering non-sexual images of the person (i.e., AI-generated forged pornography), or acquired by other means.
4. Recording Sexual Violence (RSV)
Recording sexual violence (RSV) involves taking pictures or creating videos of another person’s sexual assault or rape. Recorded sexual violence often depicts persons who have been drugged or who are incapacitated, as well as sex trafficking victims who are subjected to serial rape by their sex traffickers and buyers. Such recordings are typically shared with others, and in some cases, are distributed on Internet platforms, commonly mainstream pornography websites.
In a case of RSV arising in Australia, immigration and education agency manager Frank Hu drugged numerous women by giving them hot drinks laced with sedatives, subsequently raping them and filming the abuse. Hu took thousands of photographs and videos of the unconscious women and posted some of the recordings on an online pornography fetish website for people interested in sex with “sleeping women” (i.e., rape).
5. Video Voyeurism (VV)
Popularly referred to as “spycamming,” video voyeurism involves filming or taking pictures of people engaged in private activities (e.g., changing clothes, using the toilet or showering, having sex in private) without their knowledge, or surreptitiously filming/taking pictures of their private body parts while they are in public (e.g., “downblousing” and “upskirting”).
Image-based sexual abuse (IBSA) can happen through hidden camera videos, deepfake/edited images, leaked photos, filmed child sexual abuse, rape, sex trafficking, and prostitution.
— National Center on Sexual Exploitation (@NCOSE) October 9, 2022
It only takes an instant for anyone to become a victim of this exploitation.
The above are only some of the forms IBSA can take; there are others as well. To learn more about this multi-faceted issue, download this free resource.
ACTION: Ask Congress To Pass Legislation Combatting IBSA!
There are currently several federal bills before Congress which address the rampant problem of IBSA. We have made great progress on these bills, but need your help to push them to the finish line! Please take 30 SECONDS to contact Congress, using the quick action below.
Header image is a stock photo and does not depict a known victim.