Image-Based Sexual Abuse
There is a form of sexual exploitation that many people have experienced, but that long went unnamed—image-based sexual abuse.
Image-based sexual abuse (IBSA) is a broad term that includes a wide range of harmful experiences involving the weaponization of sexually explicit or sexualized images or videos.
We define IBSA as the creation, theft, extortion, threatened or actual distribution, or any other use of sexualized or sexually explicit materials without the meaningful consent of the person depicted. This can include:
- using such images to blackmail those depicted or to coerce the production of more sexually explicit material (often referred to as "sextortion")
- non-consensual distribution of sexually explicit or sexualized images (sometimes called "revenge porn"; this is sometimes accompanied by doxing)
- non-consensual collecting, swapping, and posting of sexually explicit or sexualized images of persons (such as in text/messaging groups or on third-party platforms)
- non-consensual recording of such images or videos (including so-called "downblousing," "upskirting," or surreptitious recordings in places such as restrooms and locker rooms)
- identity theft involving the non-consensual use of a person's images for the creation of photoshopped/artificial pornography or sexualized materials intended to portray that person (popularly referred to as "cheap fake" or "deepfake" pornography)
- pressuring or harassing someone to self-generate or share sexually explicit or sexualized images
In this free resource from the National Center on Sexual Exploitation, we explore the classifications and definitions of image-based sexual abuse. This includes details on terminology to reject and subcategories that are included under the IBSA umbrella.
In an instant, anyone can become a victim of IBSA: self-created images can be misused, a person can be targeted for "deepfake" creation or hidden-camera recording, abuse can be captured on film, and more. No age, gender, or socioeconomic status is free from the threat of IBSA.
Around the world, people are becoming more aware of this multi-faceted issue, particularly as survivors of non-consensually shared content have spoken out about their trauma and the difficulty of getting such images removed from the Internet.
The National Center on Sexual Exploitation is building solutions to IBSA, including survivor services, stronger state and federal legislation, civil litigation, online platform responsibility to remove non-consensual sexually explicit imagery swiftly, survivor-centered removal forms, and more.
If you’ve been a victim of image-based sexual abuse, you may have a legal claim against the perpetrators or the online platforms that shared and profited from your abuse. Please reach out to our law center at lawcenter@ncose.com or fill out the form you’ll find by following the link below.
Full Definition: Image-based sexual abuse (IBSA) encompasses a range of harmful activities that weaponize sexually explicit or sexualized materials against the persons they depict. IBSA includes the creation, theft, extortion, threatened or actual distribution, or any use of sexually explicit or sexualized materials without the meaningful consent of the person or persons depicted and/or for purposes of sexual exploitation.
Importantly, sexually explicit and sexualized material depicting children ages 0-17 constitutes a distinct class of material, not to be confused with IBSA, known as child sexual abuse material (CSAM) (i.e., "child pornography"), which is illegal under U.S. federal statute. [Under federal law, 18 USC 2256, CSAM is a visual depiction of sexually explicit conduct, including the lascivious exhibition of the genitals or pubic area, involving a person under the age of 18. Nudity is not a required element of the offense.]
Note: NCOSE does not use the term "revenge porn" because it implies that the victim did something to justify retaliation in the form of having their images shared without consent. Moreover, many perpetrators share such materials for motives other than "revenge" within a romantic relationship, for example coercive control, personal gratification, or advertising someone for commercial sex. Instead, we use variations of the more accurate phrase "Non-consensual Distribution of Sexually Explicit Material (NCDSEM)."
Sexting, the creating, sending, receiving, or forwarding of sexually suggestive or explicit materials (texts, photos, videos), can at times be a form of image-based sexual abuse (IBSA). For example, IBSA occurs when sexting is induced under coercion, threats, or relational pressure. Unfortunately, many instances of sexting involve some form of pressure or coercion. Researchers from Northwestern University analyzed 462 accounts from teenage girls who described being pressured by boys to send explicit images; refusal was often met with repeated requests, harassment, or threats.
Sexting, when it involves minors, is legally classified as child sexual abuse material (i.e., child pornography), which is a serious U.S. federal crime. It is illegal to produce, possess, or distribute any visual depiction of sexually explicit conduct involving a minor (someone under 18 years of age). Learn more about this crime on the U.S. Department of Justice's website here.
Sexting can also leave an individual vulnerable to IBSA if the recipient non-consensually shares the content. It should be noted that even in cases of self-generated explicit materials, it is never the victim’s fault when their trust is broken and they are abused.
For more information on sexting and its risks to youth, visit here.
The law varies from country to country, particularly depending on specific manifestations of IBSA. For example, some countries may ban the non-consensual sharing of sexually explicit images but fail to have laws addressing "upskirting," unsolicited explicit imagery, and other forms of IBSA.
In the United States, 48 states, Washington, D.C., and two territories have laws prohibiting the distribution or production of non-consensually recorded/shared sexually explicit materials. However, there is no U.S. federal law regarding IBSA.
In England and Wales, non-consensual explicit image sharing became illegal in April 2015 and carries a maximum jail time of two years. Other countries have strict privacy laws that may be interpreted in ways that help IBSA victims. The Center for Internet and Society has compiled a list of some relevant laws by country, last updated in 2018.
Solutions must include stronger legislation, avenues for victims to sue in civil litigation, online platform responsibility to remove non-consensual explicit imagery swiftly, survivor-centered removal forms, hash-matching to detect and block known IBSA and child sexual abuse materials, and more.
Take Action
Sign the Petition Demanding Google Prevent and Remove Sexual Abuse From Search Results
Demand Reddit Stop Hosting Abuse and Exploitation
Call on AWS to Stop Providing Infrastructure Support to Reddit
Call on Congress to Hold Verisign Accountable for Exploitive Domain Registries
Demand Visa Stop Processing Payments Supporting Sexual Exploitation
NCOSE's Efforts to Combat Image-based Sexual Abuse
NCOSE works to combat image-based sexual abuse in the following ways:
- Survivor Leadership – NCOSE is advised and joined in advocacy by a network of survivors.
- Civil Litigation – NCOSE is filing lawsuits against major companies that allow the sharing of IBSA and often profit from it (Learn more here)
- Survivor Services – NCOSE assists survivors with DMCA requests to remove their IBSA from online platforms
- Corporate Accountability – NCOSE is calling for corporations like Google, Reddit, and Discord to create responsible practices and policies around IBSA.
- National Legislation – NCOSE is pursuing federal legislation in the United States, and comparable legislation abroad, to ensure that meaningful consent is obtained from everyone depicted in sexually explicit materials (Learn more here)
- State Legislation – NCOSE is working to pass legislation in the two remaining states without laws against non-consensually shared/recorded explicit content (sometimes called "revenge porn"): Massachusetts and South Carolina
Survivors of Image-based Sexual Abuse Experience Significant Trauma
The Cyber Civil Rights Initiative conducted an online survey regarding non-consensual distribution of sexually explicit photos in 2017. The survey found that compared to people without IBSA victimization, victims had “significantly worse mental health outcomes and higher levels of physiological problems.” A previous analysis noted that such distress can include high levels of anxiety, PTSD, depression, feelings of shame and humiliation, as well as loss of trust and sexual agency. The risk of suicide is also a very real issue for victims and there are many tragic stories of young people taking their own lives as a result of this type of online abuse. An informal survey showed that 51% of responding victims reported having contemplated suicide as a result of their IBSA experience.
A study that interviewed 75 survivors of various forms of IBSA from the UK, Australia and New Zealand reported survivor statements about the broader impact on their lives, including:
- Participants described feeling “completely, completely broken” and described their experiences as “life-ruining,” “hell on earth,” and “a nightmare . . . which destroyed everything.”
- One survivor stated: "[It] transcends [everything], it impacts you emotionally, physiologically, professionally, in dating and relationships, in bloody every single factor of your life."
- Participants reported self-blame for their abuse, stating they felt “degraded,” “mortified,” “ashamed,” etc.
- Several women participants spoke about the "worst" harm (hypothetical or realized) being their families finding out about the abuse: "The worst thing . . . is the shame of your parents being disappointed. . . ."
- Another survivor stated: “. . . there’s such a level of permanence which affects everything. . . especially if it’s impossible now to take photos down, especially if it’s impossible to stop the dissemination of the images . . . . there will never be a day in my entire lifetime that all of the images of me could ever be deleted.”
- A survivor shared: "There is no end to it, there is no stop, there is no finale. . . . It's like, I'm quite aware that if I was to go on the internet or the porn websites now, I would . . . find the videos of me. . . . It's a crime that doesn't just happen and then that's done. It's something that is continual, and this could continue for I don't know how long. It could go on for bloody ever."
- Another survivor reported: “[It’s] having this continuing threat that the images could be re-shared, or re-emerge online, that new people could see these intimate images. . . . and I think it’s the unknowing; that not knowing aspect that you have to deal with every day.”
In addition to having sexually explicit images posted online, victims of IBSA often also face the ordeal of having their personal information shared alongside these images. An informal survey of survivors of IBSA found that survivors often had their full name published alongside the explicit imagery (59%), or their social media/network information (49%), email (26%), phone number (20%), physical home address (16%), or work address (14%).
Often IBSA is a concurrent form of abuse alongside additional exploitive dynamics, such as intimate partner abuse, grooming, sex trafficking, and more. For example, sexually explicit pictures and videos are often used to advertise victims of both sex trafficking and prostitution (including minors, in which case the imagery is child sexual abuse material, i.e., child pornography, a federal crime in the United States). Further, adult survivors of sex trafficking have recently sued a mainstream pornography producer for fraud for coercing their involvement in the production of pornography. They have also sued MindGeek, the parent company of Pornhub, for allowing the material produced from their sex trafficking to circulate rampantly on its platform. Additionally, federal authorities obtained a criminal conviction for sex trafficking against one of the individuals involved with the production studio; others have been indicted for sex trafficking-related crimes.
When victims report IBSA to law enforcement, they may be dismissed, blamed, or ridiculed. As with other forms of sexual abuse, there is a societal trend toward victim blaming in IBSA. This is a particular issue with adult self-generated sexually explicit images, where the person (usually a female) who was pressured (usually by a male) to take and send intimate sexual images is wrongly deemed partially responsible for the non-consensual sharing or other use of those images.
There is still a long way to go to create strong public policy, reform Internet platform practices, and increase public awareness on this topic. But the good news is that the rising generation is breaking down the stigma around these issues and is increasingly speaking out. Laws are changing, awareness is rising, and victims of this abuse need no longer feel alone.
The Law and Society are Behind on Addressing this Issue
When victims realize that materials depicting them are online, they often have to resort to asking online platforms to remove the content on their own, or to enlisting NGOs or lawyers for help.
The content removal and takedown process for non-consensual, image-based sexual abuse material requires hundreds of hours spent writing, sending, and following up on requests to various web hosting providers and websites. And often, even when websites and web hosting providers are put on notice that certain images portray non-consensual acts, they are reluctant to remove them.
From November 2020 to March 2022, the National Center on Sexual Exploitation Law Center sent more than 150 demand letters to both hosting providers and direct websites on behalf of five women who have been the victims of non-consensual content uploaded to the Internet. These demand letters contained upwards of 200 URLs.
During this time frame, only an estimated 50 of the reported URLs were removed.
Clearly, we need greater corporate accountability, in addition to improved legislation, on IBSA.
Creating, Distributing, or Possessing CSAM is a Federal Crime
Self-produced sexually explicit photos of minors are child sexual abuse material (i.e., child pornography).
Creating, distributing, or possessing CSAM is a serious U.S. federal crime.
Learn more about this crime on the U.S. Department of Justice’s website.
Further Reading
Image-Based Sexual Abuse: A Little-Known Term, but a Pervasive Problem
Helpful Tips for Victims of Revenge Porn
From the Blog
Kids Are Dying Without This Law: Pass Kids Online Safety Act NOW!
How many more children have to die before Congress will take action?
A Tsunami of “Nudifying” Apps Advertised on Meta Platforms
Meta’s platforms, namely Instagram, are advertising a plethora of “nudifying” apps to its users, including to children.
5 Types of Image-Based Sexual Abuse You Should Know About
Image-based sexual abuse is an umbrella term encompassing a wide range of abusive activities. Learn more!
The Movement Ep. 3: Navigating the Benefits and Pitfalls of Emerging Tech
On this episode of The Movement, we’re exploring the ways new technology can aid the fight against exploitation.
VICTORY! Microsoft’s GitHub Takes Concrete Steps to Combat Image-Based Sexual Abuse
One spring day, Jodie* received an anonymous message with a link to a pornography website. When she clicked on it, her world imploded. There, right…
What Diddy and Tate Charges Show About Modern-Day Sex Trafficking
Charges against public figures like Sean "Diddy" Combs and the Tate brothers put the horrific methods of modern-day sex trafficking on full display.