Image Based Sexual Abuse
There is a form of sexual exploitation that many people have experienced, but that long went unnamed—image-based sexual abuse.
Image-based sexual abuse (IBSA) is a broad term that includes a wide range of harmful experiences involving the weaponization of sexually explicit or sexualized images or videos.
We define image-based sexual abuse (IBSA) as the creation, manipulation, theft, extortion, threatened or actual distribution, or any use of sexualized or sexually explicit materials without the meaningful consent of the person/s depicted or for purposes of sexual exploitation. This can include:
- using such images for purposes of blackmail of those depicted or to coerce the production of more sexually explicit material (often referred to as “sextortion”)
- non-consensual distribution of sexually explicit or sexualized images (sometimes called "revenge porn"; this is sometimes accompanied by doxing)
- non-consensual collecting, swapping, and posting of sexually explicit or sexualized images of persons (such as in text/messaging groups or on third party platforms)
- non-consensual recording of such images or videos (including so-called “down blousing,” “upskirting,” or surreptitious recordings in places such as restrooms and locker rooms)
- identity theft involving the non-consensual use of a person’s images for the creation of photoshopped/artificial pornography or sexualized materials intended to portray a person (popularly referred to as “cheap fake” or “deepfake” pornography)
- pressuring or harassing someone to self-generate or share sexually explicit or sexualized images
In this free resource from the National Center on Sexual Exploitation, we explore the classifications and definitions of image-based sexual abuse. This includes details on terminology to reject and subcategories that are included under the IBSA umbrella.
In an instant, anyone can become a victim of IBSA—whether through the misuse of self-created images, being targeted for "deepfake" creation or hidden camera recording, being abused on film, or more. No age, gender, or socioeconomic status is immune to the threat of IBSA.
Around the world, people are becoming more aware of this multi-faceted issue, particularly as survivors of non-consensually shared content have spoken out about their trauma and the difficulty of getting such images removed from the Internet.
The National Center on Sexual Exploitation is building solutions to IBSA, including survivor services, stronger state and federal legislation, civil litigation, online platform responsibility to remove non-consensual sexually explicit imagery swiftly, survivor-centered removal forms, and more.
If you’ve been a victim of image-based sexual abuse, you may have a legal claim against the perpetrators or the online platforms that shared and profited from your abuse. Please reach out to our law center at lawcenter@ncose.com or fill out the form you’ll find by following the link below.

Full Definition: Image-based sexual abuse (IBSA) encompasses a range of harmful activities that weaponize sexually explicit or sexualized materials against the persons they depict. IBSA includes the creation, theft, extortion, threatened or actual distribution, or any use of sexually explicit or sexualized materials without the meaningful consent of the person or persons depicted and/or for purposes of sexual exploitation.
Importantly, sexually explicit and sexualized material depicting children ages 0-17 constitutes a distinct class of material, not to be conflated with IBSA, known as child sexual abuse material (CSAM) (i.e., "child pornography"), which is illegal under US federal statute. [CSAM is a visual depiction of sexually explicit conduct, or of the lascivious exhibition of the genitals or pubic area, of a person under the age of 18. Nudity is not a required element of the offense under federal law, 18 USC 2256.]
Note: NCOSE does not use the term "revenge porn" because it implies that the victim has done something to cause justifiable retaliation through the sharing of their images without consent. Also, many perpetrators share such materials for motives other than "revenge" in a romantic relationship—such as coercive control, personal gratification, or advertising someone for commercial sex. Instead, we use variations of the more accurate phrase "Non-consensual Distribution of Sexually Explicit Material (NCDSEM)."
Sexting—the creating, sending, receiving, or forwarding of sexually suggestive or explicit materials (texts, photos, videos)—can, at times, be a form of image-based sexual abuse (IBSA). For example, IBSA occurs when sexting is induced under coercion, threats, or relational pressure. Unfortunately, many instances of sexting involve some form of pressure or coercion. Researchers from Northwestern University analyzed 462 accounts from teenage girls who described being pressured by boys to send explicit images; refusal was often met with repeated requests, harassment, or threats.
Sexting, when involving minors, is legally classified as child sexual abuse material (i.e., child pornography), which is a serious U.S. federal crime. It is illegal to produce, possess, or distribute any visual depiction of sexually explicit conduct involving a minor (someone under 18 years of age). Learn more about this crime on the U.S. Department of Justice's website here.
Sexting can also leave an individual vulnerable to IBSA if the recipient non-consensually shares the content. It should be noted that even in cases of self-generated explicit materials, it is never the victim’s fault when their trust is broken and they are abused.
For more information on sexting and its risks to youth, visit here.
The law varies from country to country, particularly regarding specific manifestations of IBSA. For example, some countries may ban the non-consensual sharing of sexually explicit images but lack laws addressing "upskirting," unsolicited explicit imagery, and other forms.
In the United States, 48 states, Washington D.C., and two territories have laws prohibiting the distribution or production of non-consensually recorded/shared sexually explicit materials. However, there is no U.S. federal law addressing IBSA.
In England and Wales, non-consensual explicit image sharing became illegal in April 2015 and carries a maximum jail sentence of two years. Other countries have strict privacy laws that may be interpreted in ways that help IBSA victims. The Center for Internet and Society has compiled a list of some relevant laws by country, last updated in 2018.
Solutions must include stronger legislation, avenues for victims to sue in civil litigation, online platform responsibility to remove non-consensual explicit imagery swiftly, survivor-centered removal forms, hashing for IBSA and child sexual abuse materials, and more.
Take Action
Demand Reddit Stop Hosting Abuse and Exploitation
Call on AWS to Stop Providing Infrastructure Support to Reddit
Call on Congress to Hold Verisign Accountable for Exploitive Domain Registries
Demand Visa Stop Processing Payments Supporting Sexual Exploitation
NCOSE's Efforts to Combat Image-based Sexual Abuse
NCOSE works to combat image-based sexual abuse in the following ways:
- Survivor Leadership – NCOSE is advised and joined in advocacy by a network of survivors.
- Civil Litigation – NCOSE is filing lawsuits against major companies that allow the sharing of IBSA and often profit from it (Learn more here)
- Survivor Services – NCOSE assists survivors with DMCA requests to remove their IBSA from online platforms
- Corporate Accountability – NCOSE is calling for corporations like Google, Reddit, and Discord to create responsible practices and policies around IBSA.
- National Legislation – NCOSE is pursuing federal legislation in the United States and abroad ensuring that meaningful consent was collected by all depicted in sexually explicit materials (Learn more here)
- State Legislation – NCOSE is working to pass legislation in the two remaining states without laws against non-consensually shared/recorded explicit content (sometimes called "revenge porn"): Massachusetts and South Carolina
Survivors of Image-based Sexual Abuse Experience Significant Trauma
The Law and Society are Behind on Addressing this Issue
Creating, Distributing, or Possessing CSAM is a Federal Crime
Self-produced photos of minors are child sexual abuse material (i.e., child pornography).
Creating, distributing, or possessing CSAM is a serious U.S. federal crime.
Learn more about this crime on the U.S. Department of Justice’s website.
Further Reading
Image-Based Sexual Abuse: A Little-Known Term, but a Pervasive Problem
Helpful Tips for Victims of Revenge Porn
Sign the Petition Demanding Google Prevent and Remove Sexual Abuse From Search Results
From the Blog

Victory! TAKE IT DOWN Act Unanimously Passes Senate — Again!
The TAKE IT DOWN Act passed in the Senate unanimously, a major victory in the fight against image-based sexual abuse! Now, onto the House!

TAKE IT DOWN Act Reintroduced in Senate to Confront AI-Generated Image-Based Sexual Abuse
WASHINGTON, DC (January 17, 2025) – The National Center on Sexual Exploitation (NCOSE) commends the reintroduction of the TAKE IT DOWN Act in the U.S.

TAKE IT DOWN Act Approved by Senate: Hope for Survivors of Image-Based Sexual Abuse!
The TAKE IT DOWN Act, a key bill that would protect survivors of image-based sexual abuse, unanimously passed the Senate!

Progress with Roblox. Pam Bondi AG Nomination. Pivotal Court Case … And More!
Read the latest progress and updates in the fight against sexual exploitation!

NCOSE Commends Senate Passage of TAKE IT DOWN Act; Urges House to Pass Three Bills to Combat Image-Based Sexual Abuse
WASHINGTON, DC (December 4, 2024) – The National Center on Sexual Exploitation (NCOSE) commended the U.S. Senate for unanimously passing the TAKE IT DOWN Act

Kids Are Dying Without This Law: Pass Kids Online Safety Act NOW!
How many more children have to die before Congress will take action?