How Image-Based Sexual Abuse Flourishes on Reddit

Reddit is one of the top 20 internet platforms in the world. As of 2021, it is valued at $6 billion, double its 2019 valuation. It has 52 million daily active users, who generate 4 million posts a day and some 2 billion comments.

People go on Reddit to exchange ideas, chat about shared interests, and meet new people online. Its users upload their own material—photos, videos, discussion topics, etc.—and can create what are called ‘subreddits.’ These subreddits can be about anything, from cooking and gardening to extreme sports. Some subreddits are funny; some are fan pages dedicated to shows like Ted Lasso. Some are places where strangers seek and offer advice to one another.

But some have a much darker agenda. Subreddits are also places where people share child sexual abuse materials (‘child pornography’), hardcore pornography, or image-based sexual abuse. There are subreddits dedicated to normalizing incest. Some are used to advertise prostitution and potentially sex trafficking. On Reddit, people can just key in a topic or phrase like “NSFW” (not safe for work) and pull up thousands of subreddits. 

Just like that, an internet platform used by 52 million people per day becomes a predator’s paradise—a playground of overt misogyny, sexual abuse, and gender-based violence.

Reddit’s Policies and Practices Allow Image-Based Sexual Abuse to Flourish on Its Platform

What Is Image-Based Sexual Abuse?

Image-based sexual abuse (IBSA) is a broad term that covers a multitude of harmful experiences: the nonconsensual sharing of sexual images (sometimes called “revenge porn”); pornography created using someone’s face without their knowledge (deepfake pornography); the nonconsensual recording or capture of intimate nude images (including so-called “upskirting” and surreptitious recordings in places such as restrooms and locker rooms via “spycams”); the recording of sexual assault; and more.

Often victims do not know that their images have been shared online and find out in different ways. In other instances, the sharing, or the threat of sharing, of intimate images is used to exploit victims, a practice known as ‘sextortion.’

Very often, the victim’s personal details, such as their email address, social media accounts, phone number, and home address, are posted alongside the images. This puts them at increased risk of being harassed and stalked, both online and off.

In the US, 1 in 8 social media users report being targets of image-based sexual abuse, and the UK saw the number of victims double between 2019 and 2021. Unsurprisingly, the majority of victims are women and girls, and the majority of perpetrators are men. Images are collected and traded online like baseball cards once were.

How IBSA Proliferates on Reddit

On Reddit, some subreddits exist with the sole intention of posting sexually explicit images of women and girls without their consent. These are often specific to geographic locations, with users actively seeking images of women in a certain city or town.  

Shockingly, many subreddits are dedicated to men swapping explicit images, videos, and other materials of their wives and girlfriends. One such subreddit, ‘wife pic trading,’ set up in mid-2015, has nearly 250,000 members. While some women may consent to the sharing of their explicit images in such fora, it is impossible to tell how many of these images are being shared nonconsensually—and Reddit has no mechanism to verify the age or consent of those depicted. There are also subreddits dedicated to “leaked” (i.e., stolen or nonconsensually shared) sexually explicit content from OnlyFans, Snapchat, and more. (See the proof section of EndSexualExploitation.org/Reddit.)

Needless to say, image-based sexual abuse has detrimental effects on victims, who report experiencing depression, anxiety, and PTSD as a result. Horrifically, there have been high-profile cases of young people dying by suicide as a direct result of having their intimate images shared online and of the onslaught of further abuse and harassment that followed.

Despite this, Reddit does little to stop this abuse from taking place.

Reddit Refuses to Enforce Policies that Would Stop Abuse

Reddit technically has a policy against the nonconsensual sharing of sexually explicit material, yet it is largely ineffective in practice. Reddit is one of the few major social media platforms that still allows pornography, Twitter being another. This leaves the platform rife with sexual exploitation.

Platforms that allow user-uploaded pornography are replete with nonconsensually shared materials and even videos of child sexual abuse and sex trafficking. For example, Pornhub and its parent company MindGeek are currently being sued by survivors of child sexual abuse for hosting videos of their exploitation. (Learn more about how NCOSE is working on this lawsuit here.)

With 4 million posts a day, 2 billion comments, and a further 2 billion upvotes—all of which grow year on year—one would imagine that Reddit employs an army of content moderators to ensure the safety of its users. That is not the case.

Reddit Does Not Effectively Moderate Content

Reddit relies on volunteer community ‘moderators’ for its subreddits; according to its User Agreement, moderating is ‘an unofficial, voluntary position that may be available to users of the Services.’ The main purpose of these ‘moderators’ is to ensure discussions ‘stay on topic,’ not to prevent abuse.

Beyond employing a handful of administrators who may address issues such as bullying, Reddit does not effectively moderate content. Nowhere does its User Agreement mention explicit or adult content. For a platform that allows users as young as 13—and with no age verification mechanism—this is not just shocking but negligent. Add to this the rampant sexual abuse and exploitation, and it is no surprise that Reddit made NCOSE’s Dirty Dozen List last year and has earned its reputation as the ‘wild west of the internet’ in the worst way.

Reddit Hesitates to Listen to Survivors While Abusive Content Is Recirculated

Even in the most high-profile of cases, Reddit refuses to take appropriate action to remove nonconsensually shared images. In 2019, the owners of Girls Do Porn, a pornography website, were charged with, and subsequently found guilty of, sex trafficking young women to appear in sex videos.

The women at the center of this heinous crime showed great courage and strength in pursuing the case against their exploiters, but the guilty verdict is not where their pain ended. Their images and videos are still online, being shared and spread across the internet.

After the charges were initially filed, Reddit, under pressure given the profile of the case, removed the biggest subreddit dedicated to sharing Girls Do Porn images and videos. However, Reddit did not remove all such subreddits, and even more emerged in the wake of the case.

Almost a year after the owners of Girls Do Porn were found guilty of coercing young women into sex, subreddits still existed containing not just images and videos of these women but also their full names and other identifying information. Reddit allowed these subreddits to remain on its platform for months, not only adding to the trauma experienced by these women but also putting their online and physical safety in jeopardy.

Child Sexual Abuse Material (CSAM aka Child Pornography) on Reddit

In addition to image-based sexual abuse of adults, Reddit has hosted many cases of CSAM (aka child pornography), which is contraband: it is illegal under federal law to house, possess, or distribute it, including on the internet.

Cases have been reported where survivors of child sexual abuse material have actively and repeatedly reached out to Reddit to request that nonconsensually shared sexually explicit images of themselves or others be removed, but to no avail. This constant battle to have images removed from Reddit and similar platforms only adds to the trauma already experienced by victims of IBSA.

Reddit Fails to Confirm Removal of Illegal Content

In 2020, NCOSE assisted a survivor who was a minor at the time of her sexual abuse and who had scores of abuse images uploaded and disseminated on the internet.

One of the websites the CSAM was uploaded to was Reddit: there were at least 28 unique URLs depicting this survivor’s child sexual abuse on the platform. On December 18, 2020, NCOSE sent a takedown request on behalf of the survivor to Reddit’s CEO, Reddit’s Legal Department, and Reddit’s information, contact, and help desk email addresses. Several months later, we still had not received a response or acknowledgement from Reddit, despite having informed it of the very serious federal crimes occurring on its website.

Eventually, we received a notice from Reddit stating that it no longer monitored contact@reddit.com or support@reddit.com, which prompted us to fill out a Customer Support Ticket on its website. After a convoluted process, we submitted the ticket, once again explaining the very serious exploitation and abuse of our client and the federal crimes occurring on Reddit’s website. NCOSE received an automated email stating that Reddit would look into the situation and investigate further if any action was necessary. Fourteen months after the initial takedown request, NCOSE still has no confirmation that the content has been removed.

Reddit Has the Power to Prevent Illegal Content on Its Platform

Reddit needs to be held to account for the harm caused by its failure to prevent and remove both image-based sexual abuse and child sexual abuse materials on its platform. Countless survivors have had their privacy and dignity stripped from them by this abuse, and they live in daily fear of what will come next—where their images will be posted, and how many more anonymous men will trade them in subreddits without giving their humanity a second thought.

All the while, Reddit is doubling its valuation and shrugging its metaphorical shoulders as if there were nothing it could do to prevent this. It is like someone holding a can of water while staring at a fire: they know how to put the fire out, and they have the money and the means to do it.

They just don’t want to.

Reddit can become part of the solution by: 

  • Implementing strong policies against hardcore pornography.
  • Instituting proactive moderation and filtering solutions to enforce such a policy (see the illustrative sketch after this list).
  • Instituting survivor-centered practices and reporting mechanisms.
  • Banning users who upload sexually explicit material, especially material depicting child sexual abuse or nonconsensually shared intimate images, and preventing them from creating another account.
  • Creating prominent, simple, and lockable parental controls, and preventing adult strangers from messaging minors.
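
To make the moderation-and-filtering recommendation above concrete, here is a minimal sketch, in Python, of the kind of upload screening a platform could run against a hash list of known abusive images. This is an illustration under stated assumptions, not Reddit’s actual pipeline: production systems use perceptual hashing (for example, Microsoft’s PhotoDNA, which platforms can license) so that matches survive resizing and re-encoding, whereas this sketch uses exact SHA-256 matching, and KNOWN_ABUSE_HASHES is a hypothetical stand-in for an industry hash list such as those NCMEC maintains.

    import hashlib

    # Hypothetical stand-in for an industry hash list of known abusive
    # images; real deployments sync such lists from organizations like
    # NCMEC and use perceptual rather than cryptographic hashes.
    KNOWN_ABUSE_HASHES = {
        # SHA-256 of an empty file, included purely as an illustrative entry:
        "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
    }

    def should_block_upload(image_bytes: bytes) -> bool:
        """Return True if an uploaded image matches the known-abuse list."""
        digest = hashlib.sha256(image_bytes).hexdigest()
        return digest in KNOWN_ABUSE_HASHES

    # Example: screen an upload before it is published.
    upload = b"raw image bytes from the upload pipeline"
    if should_block_upload(upload):
        print("Upload blocked and escalated for human review.")
    else:
        print("No match; upload proceeds to normal moderation.")

The point of the sketch is that matching new uploads against known illegal material is a solved, inexpensive engineering problem; the barrier is willingness, not capability.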

Take action and learn more at EndSexualExploitation.org/Reddit/.
