Mainstream Contributors To Sexual Exploitation

It takes 5 seconds to find Sexual Exploitation on Reddit

This “front page of the Internet” is a hub of child sexual abuse materials, sex trafficking, and image-based sexual abuse.

Updated 6/22/2023: Reddit, after being placed on the Dirty Dozen List for three consecutive years, has recently announced that the platform will soon limit access to pornography and sexually explicit content (what they term “not safe for work”) through third-party apps: meaning it may soon be impossible to view pornography on Reddit through unofficial apps. Additionally, Reddit banned the use of “save image/video” bots within their API (application programming interface)—another change NCOSE specifically requested, as these bots contribute significantly to the circulation and collection of child sexual abuse material, image-based sexual abuse, and pornography on the platform. 

Reddit—a mainstream discussion platform hosting over two million user-created “communities”—is a hub of image-based sexual abuse, hardcore pornography, prostitution (which very likely includes sex trafficking), and overt cases of child sexual exploitation and child sexual abuse material (CSAM).

Reddit has recently undertaken a PR blitz to try to salvage its image. It publicized its policy against non-consensually shared explicit material (sometimes called “revenge porn”) and has taken new steps to remove reported content. However, these policies on paper are not translating into proactive prevention or removal of abuse in practice.

Review the proof we’ve collected, read our recommendations for improvement, and see our notification letter to Reddit for more details.

Take Action

Our Requests for Improvement


Evidence of Exploitation

WARNING: Any pornographic images have been blurred, but are still suggestive. There may also be graphic text descriptions shown in these sections. POSSIBLE TRIGGER.

Reddit’s most recent update to the platform’s non-consensual intimate media policy (what Reddit calls Rule 3), made in March 2022, was praised for its progress in the 2022 Transparency Report and in the Ars Technica article “Reddit cracked down on revenge porn, creepshots with twofold spike in permabans.”

However, NCOSE researchers and other industry experts have found otherwise.

NCOSE researchers have found that Reddit is filled with non-consensually shared “leaked” explicit content, as well as “deepfake” pornography. In fact, many posts on Reddit even monetize creating sexual “deepfake” images of any person someone submits.

Even the most blatant violations go unchecked by Reddit, such as one dedicated sexual deepfake community of over 48,000 members that is currently active on the platform. Its members have amassed over 15 million “donated” sexually explicit and pornographic images to train their sexual deepfake technology, and they post sexual deepfakes and AI-generated pornography on Reddit on an hourly cadence.

Sharing of image-based sexual abuse often also includes attempts to dox the people depicted. NCOSE researchers identified another subreddit, containing over 1.3 million members, that allowed users to post images of pornography performers, content creators, or other influential women so that members could help identify their real names.



Image-based sexual abuse (IBSA) is a broad term that includes a multitude of harmful experiences, such as non-consensual sharing of sexual images (sometimes called “revenge porn”), pornography created using someone’s face without their knowledge (or sexual deepfakes), non-consensual recording or capture of intimate nude images (to include so-called “upskirting” or surreptitious recordings in places such as restrooms and locker rooms via “spycams”), recording of sexual assault, the use of sexually explicit or sexualized materials to groom or extort a person (also known as “sextortion”) or to advertise commercial sexual abuse, and more.

Note: sexually explicit and sexualized material depicting children ages 0-17 constitutes a distinct class of material, known as child sexual abuse material (CSAM – the more appropriate term for “child pornography”), which is illegal under US federal statute. CSAM is not to be conflated with IBSA.

See more evidence in this easy-to-view, downloadable PDF

In addition to failing to adequately enforce its own policies, Reddit refuses to implement meaningful age or consent verification for people depicted in sexually explicit images. Sexually explicit and pornographic content is actively shared on Reddit without reservation, allowing exploitation, including child sexual abuse material (CSAM), to thrive. NCOSE researchers found indicators of CSAM within a few minutes.

In 2021, Reddit was sued by six victims of CSAM for knowingly profiting from and facilitating their abuse. Although the case was dismissed in 2022, Reddit continues to be implicated in CSAM cases.

News reports, such as those documented below, are examples of Reddit’s problematic “hands-off” approach to regulating non-consensual sexually explicit content, which allows child sexual abuse material (CSAM) to exist on the platform.

The following cases are just a few reports from the past year:  

  • In September 2022, a 47-year-old Kentucky teacher was arrested after the Internet Crimes Against Children Task Force discovered he had posted CSAM on his Reddit account.   
  • In October 2022, a 38-year-old Porter County man was arrested for disseminating CSAM on Reddit and was held on a $25,000 bond.   
  • In December 2022, a 37-year-old Kenosha man was arrested for 10 counts of felony possession of child pornography. Investigators were informed by NCMEC that the offender had uploaded CSAM on two social media apps, Kik and Reddit, both of which reported the man’s behavior to NCMEC’s Cyber TipLine. 
  • In January 2023, a 44-year-old Maryland man was arrested on numerous charges related to rape, sexual solicitation of a minor, and dissemination of CSAM.   
  • On March 14, 2023, a 34-year-old Salina, OK man was charged with numerous counts of disseminating CSAM after investigators found he had been sharing CSAM on Reddit. Investigators estimated the man had over 15,000 images of CSAM depicting teenage girls between the ages of 13 and 16. He reportedly bragged to other Redditors that he had been engaging in sex with a 12-year-old girl while her grandmother and mother watched.
  • On March 27, 2023, a Springfield, IL man was sentenced to 96 months in federal prison and a lifetime of supervised release for the distribution of more than 600 images of CSAM. He also ran a subreddit of approximately 900 members where he shared images of CSAM.
  • On March 28, 2023, a Covington, KY high school teacher was arrested on two charges of pandering obscenities of a minor after investigators received a tip from Reddit regarding the distribution of CSAM. He had been employed as a Spanish teacher and assistant soccer and esports coach at Holmes High School since 2021.
  • On March 29, 2023, another man was charged on five counts related to cyberstalking, hacking, and selling CSAM and IBSA on Reddit, Twitter, and Telegram.  

Until Reddit can implement a robust age and consent verification process for sexually explicit content and create clarity and consistency for policy implementation, pornography must be prohibited to stem the IBSA and CSAM rampant on Reddit. 

See more evidence in this easy-to-view, downloadable PDF

Reddit refuses to implement effective content policies and moderation practices, allowing severe sexual content, sexual violence, and sexualization of minors to exist on the platform.

Although Reddit banned hate speech and sexism/misogyny from its platform, communities like r/jailbait_ll, a reiteration of the now-defunct r/jailbait community that was banned in 2012 for inciting violence against women and hosting CSAM, continue to exist on Reddit.

In addition, NCOSE researchers have found artificial/illustrated pornography depicting bestiality, and even illustrated bestiality involving a child-like figure.

See more evidence in this easy-to-view, downloadable PDF

The absence of industry-standard protections and practices, combined with anemic policy implementation, creates an ecosystem of exploitation on Reddit.

Lack of accountability.  

Reddit lags substantially behind current safety and moderation practices, failing to implement even the most basic safeguards, such as requiring a verified email address, which is now an industry-wide standard. As a result, ban evasion, throw-away accounts, and escaping accountability for posting harmful and illicit content are commonplace on Reddit. Instagram, Facebook, and TikTok all require users to provide a valid email address to create a profile and limit the ability to search content without creating an account, basic safeguards that Reddit fails to require.

Lack of either protection or tools for moderators.  

One study estimated that moderators perform a total of $3.4 million worth of unpaid moderation work for Reddit. Female moderators have also reported experiencing abuse once community members become aware of their gender, contributing yet again to Reddit’s problems with misogyny and sexism. Moderators have experienced abuse at the hands of throw-away accounts for attempting to make their communities safe and enjoyable environments, something Reddit admins fail to achieve.

Lack of reporting mechanisms.  

Reddit also fails to provide effective and easy reporting mechanisms for users and moderators on their platform. Even when one user reported the sale of suspected CSAM, Reddit refused to act.  

This is especially concerning given that moderators do not have the tools or ability to permanently remove content from Reddit. Moderators can “remove” content and ban users from subreddits, but the content continues to exist on the platform, marked as [removed], and remains visible until a Reddit admin actually deletes it.

A 2022 article exposed Reddit’s refusal to allow moderators to fully remove content. In the subreddit r/TikTokThots, a community of 1.3 million users, posts containing potential CSAM that moderators had removed remain searchable on Reddit today.

Failure to implement existing policies.  

As is clear from reviewing the compiled evidence, Reddit fails to implement its own policies, and its newly announced policies are easily proven ineffective.

In addition to the recently updated IBSA policy, which failed NCOSE researcher testing, Reddit announced automatic tagging of NSFW images in March 2022. Despite Reddit’s promise to automatically tag sexually explicit images as NSFW, this failed to occur for many of the images within these deepfake and IBSA communities.

See more evidence in this easy-to-view, downloadable PDF

Fast Facts

Reddit’s minimum age is 13+; the Apple App Store rates the app 17+; Google Play rates it M (Mature)

One of the top 20 websites in the world: More than 57 million daily users and 1.1 billion monthly users

One subreddit dedicated to sharing nonconsensual sexual images had more than 20,000 users and 15,000 images of over 150 women; there are countless subreddits like this

34% of minors ages 9-17 have used Reddit

Seeking to go public in 2023


Stay up-to-date with the latest news and additional resources

Recommended Resources

Is Reddit safe for kids? An honest Reddit review for parents.

International Letter to Reddit signed by 319 survivors and organizations from 31+ countries

Reddit Assessment and Parents' Guide

Read this helpful resource from the Social Media Victims Law Center

Inside the secret world of trading nudes

Women are facing threats and blackmail from a mob of anonymous strangers after their personal details, intimate photos and videos were shared on the social media platform Reddit.


Help educate others and demand change by sharing this on social media or via email:


Share Your Story

Your voice—your story—matters.

It can be painful to share stories of sexual exploitation or harm, and sometimes it’s useful to focus on personal healing first. But for many, sharing their past or current experiences may be a restorative and liberating process.

This is a place for those who want to express their story.