Imagine you are on the couch, scrolling through your social media, when your phone vibrates with a text from a friend.
“I’m so sorry, I just saw these and thought you should know…” And then a URL.
Stomach in knots, you click the link to a popular app, where you see images of yourself that you never meant to see the light of day. These sexual images had only ever been sent to one person: your ex romantic partner, who you’d broken up with just a week ago. You two had been together for years before the breakup. You had trusted them completely, and even after it ended you never imagined they would share these private pictures with the world.
Heart racing, you see comments on the photos from total strangers. Your eyes sting with tears as the share count tells you that nearly a hundred people have already shared the images within the app. And there is a download button: who knows how many people have already downloaded them?
Your heart drops as you click to report the images to the app moderators. You start to google: how do I get my nudes taken down, how to delete pictures of me from the internet, help boyfriend uploaded nudes…
You see other people posting in forums asking the same questions and realize that even if this one app takes the images down, there is no guarantee someone won’t upload them to another forum or website.
Tragically, 1 in 12 U.S. adults have had nude or sexually explicit images of themselves distributed without their consent.
This is image-based sexual abuse.
“Image-based sexual abuse (IBSA) can happen through hidden camera videos, deepfake/edited images, leaked photos, filmed child sexual abuse, rape, sex trafficking, and prostitution.” — National Center on Sexual Exploitation (@NCOSE), October 9, 2022
It only takes an instant for anyone to become a victim of this exploitation.
Image-based sexual abuse (IBSA) includes, in part, the following:
- Sharing sexual images of a person without their consent (sometimes called “revenge porn”)
- Using sexual images of a person to blackmail them or coerce them into taking more sexual images (sometimes called “sextortion”)
- Collecting, swapping, and/or posting sexual images of a person without their consent
- Recording sexual images or videos of a person without their consent
- Pressuring or harassing another person to take or share sexual images of themselves
- Creating synthetic sexual images of a person through “deepfake” or “cheap fake” technology
A survey of more than 6,000 people aged 16–24 from the UK, Australia, and New Zealand found 38% had experienced at least one type of IBSA, and that LGBT+ respondents had 61% greater odds of IBSA victimization than heterosexual respondents.
Research by the Cyber Civil Rights Initiative has found that IBSA causes significant psychological trauma. Another study noted that survivors of “revenge porn” experienced trust issues, anxiety, PTSD, depression, and self-esteem issues. One survey found that 51% of IBSA survivors reported having considered suicide as a result of their IBSA experience.
Learn more about image-based sexual abuse here.
If you are a survivor of image-based sexual abuse, you are not alone. There are resources to help.
What do I do if I’m a survivor of IBSA?
Here are a few resources that may be helpful:
Cyber Civil Rights Initiative: CCRI provides services to adult survivors of image-based sexual abuse, including sextortion, non-consensual pornography/“revenge porn,” and deepfake/cheap fake pornography. Their safety center provides a step-by-step guide that walks you through potential courses of action. CCRI only provides services within the United States, but their website has a list of international resources for those outside the United States.
Removing IBSA from Google’s search results: This Google Search Help article explains the process of requesting that non-consensually recorded or distributed sexual images (referred to in the article as “non-consensual explicit or intimate images”) be removed from Google’s search results. It also links to a form for submitting such a request. You will need the URL of the website containing the image, the URL of the Google search results page that links to the website you’re reporting, and a screenshot of the image. The article walks you through how to obtain each of these.
If you have had synthetic sexual images (e.g. “deepfakes” or “cheapfakes”) made of you without your consent, you can request that Google remove these images from search results here. Further, if you have had your personally identifiable information shared alongside the images (i.e. “doxxing”), you can request that Google remove that information here.
DMCA takedowns: It is possible to use copyright law to help remove non-consensually shared sexually explicit images, particularly if you are the one who originally took the image. There are multiple services that may help with this. This website provides an overview of how the process works.
Bark: This webpage on Bark compiles each state’s laws regarding sexting and minors; scroll to the bottom of the page for the links to each state’s laws.
NCOSE Law Center: The NCOSE Law Center is the catalyst for lawsuits against the world’s biggest companies profiting from sexual exploitation. NCOSE’s lawyers have already spurred nine trailblazing lawsuits against online platforms that have facilitated sexual exploitation, and are currently involved in lawsuits against Pornhub and Twitter for the trafficking and sexual abuse of children. If you would like to see if you qualify for a lawsuit, please fill out this short intake form.
What if I’m under the age of 18?
If you are under the age of 18, any sexual image of you is considered child sexual abuse material (sometimes known as “child pornography”). The National Center for Missing and Exploited Children website has instructions detailing what to do if your nudes are put on the internet and how you can get them taken down.
How society can do better
Though resources do exist to help survivors of image-based sexual abuse, there are not enough of them given how common this issue is. State laws are piecemeal: some states regard IBSA as a felony, others as a misdemeanor, and some have no laws regarding IBSA at all. Survivors should not have to repeatedly ask websites to take down their images, only to be ignored, as happens far too often. Online platforms need to be more proactive in requiring age and consent verification for any explicit image they host or facilitate access to.
With the help of survivors, NCOSE is working to change things.
NCOSE works to combat image-based sexual abuse in the following ways:
- Survivor Leadership – NCOSE is advised and joined in advocacy by a network of survivors.
- Civil Litigation – NCOSE is filing lawsuits against major companies that allow the sharing of IBSA and often profit from it (Learn more here).
- Survivor Services – NCOSE assists survivors with DMCA requests to remove their IBSA from online platforms.
- Corporate Accountability – NCOSE is calling for corporations like Google, Reddit, and Discord to create responsible practices and policies around IBSA.
- National Legislation – NCOSE is pursuing federal legislation in the United States and abroad to ensure that meaningful consent is obtained from everyone depicted in sexually explicit materials (Learn more here).
- State Legislation – NCOSE is working to pass laws against non-consensually shared/recorded explicit content (sometimes called “revenge porn”) in the two remaining states without them: Massachusetts and South Carolina.
How can you help?
1. Help survivors find resources!
IBSA is disturbingly common, and chances are someone you know has been victimized. It is crucial to get the word out about resources. So please consider sharing this blog with your networks! You can share by clicking on the social media icons below the title of this article, or by using the click-to-tweet below.

“Have you had sexual images of yourself shared without your consent? Help is available! See what kind of resources and options exist for you here.” Click To Tweet
2. Call on Google to stop surfacing IBSA in search results
Please take 15 seconds to sign this petition, asking Google Search to better prevent and remove IBSA.