Mainstream Contributors To Sexual Exploitation
This “front page of the Internet” is a hub of child sexual abuse materials, sex trafficking, and image-based sexual abuse.
Updated 6/22/2023: Reddit, after being placed on the Dirty Dozen List for three consecutive years, has recently announced that the platform will soon limit access to pornography and sexually explicit content (what it terms “not safe for work”) through third-party apps, meaning it may soon be impossible to view pornography on Reddit through unofficial apps. Additionally, Reddit banned the use of “save image/video” bots within its API (application programming interface)—another change NCOSE specifically requested, as these bots contribute significantly to the circulation and collection of child sexual abuse material, image-based sexual abuse, and pornography on the platform.
Reddit—a mainstream discussion platform hosting over two million user-created “communities”—is a hub of image-based sexual abuse, hardcore pornography, prostitution – which very likely includes sex trafficking – and overt cases of child sexual exploitation and child sexual abuse material (CSAM).
Reddit has recently undertaken a PR blitz to try to salvage its image. It publicized that it has a policy against non-consensually shared explicit material (sometimes called “revenge porn”) and has taken new steps to remove reported content. However, these policies on paper are not translating into proactive prevention or removal of abuse in practice.
Review the proof we’ve collected, read our recommendations for improvement, and see our notification letter to Reddit for more details.
Reddit’s most recent update to the platform’s non-consensual intimate media policy (what Reddit calls Rule 3), made in March 2022, was praised for its progress in the 2022 Transparency Report and in the Ars Technica article “Reddit cracked down on revenge porn, creepshots with twofold spike in permabans.”
However, NCOSE researchers and other industry experts have found otherwise.
NCOSE researchers have found that Reddit is filled with non-consensually shared “leaked” explicit content, as well as “deepfake” pornography. In fact, many posts on Reddit even monetize the creation of sexual “deepfake” images of any person a user submits.
Even the most apparent violations go unchecked by Reddit, such as one dedicated sexual deepfake community of over 48,000 members that is currently active on the platform. Its members have amassed over 15 million “donated” sexually explicit and pornographic images to train their sexual deepfake technology and post sexual deepfakes and AI-generated pornography on an hourly cadence.
Sharing of image-based sexual abuse also often includes attempts to dox the people depicted. NCOSE researchers identified another subreddit, with over 1.3 million members, that allowed users to post images of pornography performers, content creators, or other influential women so that members could help identify their real names.
———–
Definitions:
Image-based sexual abuse (IBSA) is a broad term that includes a multitude of harmful experiences, such as non-consensual sharing of sexual images (sometimes called “revenge porn”), pornography created using someone’s face without their knowledge (sexual “deepfakes”), non-consensual recording or capture of intimate nude images (including so-called “upskirting” and surreptitious recordings in places such as restrooms and locker rooms via “spycams”), recording of sexual assault, the use of sexually explicit or sexualized materials to groom or extort a person (also known as “sextortion”) or to advertise commercial sexual abuse, and more.
Note: sexually explicit and sexualized material depicting children ages 0-17 constitutes a distinct class of material, known as child sexual abuse material (CSAM – the more appropriate term for “child pornography”), which is illegal under US federal statute. CSAM is not to be conflated with IBSA.
In addition to failing to adequately enforce its own policies, Reddit refuses to implement meaningful age or consent verification for people depicted in sexually explicit images. Sexually explicit and pornographic content is actively shared on Reddit without reservation – allowing exploitation, including child sexual abuse material (CSAM), to thrive. NCOSE researchers found indicators of CSAM within a few minutes.
In 2021, Reddit was sued by six victims of CSAM for knowingly profiting from and facilitating their abuse. Although the case was dismissed in 2022, Reddit continues to be implicated in CSAM cases.
News reports, such as those documented below, exemplify Reddit’s problematic “hands-off” approach to regulating non-consensual sexually explicit content, allowing child sexual abuse material (CSAM) to exist on the platform.
The following cases are just a few reports from the past year:
Until Reddit can implement a robust age and consent verification process for sexually explicit content and create clarity and consistency for policy implementation, pornography must be prohibited to stem the IBSA and CSAM rampant on Reddit.
Reddit refuses to implement effective content policies and moderation practices, allowing severe sexual content, sexual violence, and sexualization of minors to persist on the platform.
Although Reddit banned hate speech and sexism/misogyny from its platform, communities like r/jailbait_ll, a recreation of the now-defunct r/jailbait community that was banned in 2012 for inciting violence against women and hosting CSAM, continue to exist on Reddit.
In addition, NCOSE researchers have found artificial/illustrated pornography depicting bestiality, including illustrated bestiality involving a child-like figure.
The absence of industry-standard protections and practices, combined with anemic policy implementation, creates an ecosystem of exploitation on Reddit.
Lack of accountability.
Reddit is substantially behind current safety and moderation practices, failing to implement even the most basic protections, such as requiring a verified email address – now an industry-wide standard. As a result, ban evasion, throw-away accounts, and escaping accountability for posting harmful and illicit content are commonplace on Reddit. Instagram, Facebook, and TikTok all require users to provide a valid email address to create a profile and limit the ability to search content without creating an account – basic features that Reddit fails to require.
Lack of protections and tools for moderators.
One study estimated that moderators perform a total of $3.4 million worth of unpaid moderation work for Reddit. Female moderators have also reported experiencing abuse once community members become aware of their gender, contributing yet again to Reddit’s problems with misogyny and sexism. Moderators have experienced abuse at the hands of throw-away accounts for attempting to make their communities safe and enjoyable environments – something Reddit admins fail to achieve.
Lack of reporting mechanisms.
Reddit also fails to provide effective and easy reporting mechanisms for users and moderators on their platform. Even when one user reported the sale of suspected CSAM, Reddit refused to act.
This is especially concerning given that moderators do not have the tools or ability to remove content from Reddit permanently. Moderators can “remove” content and ban users from subreddits, but the content continues to exist on the platform marked as [removed] and remains visible until a Reddit admin deletes it.
A 2022 article exposed how Reddit’s refusal to allow moderators to permanently remove content meant that posts containing potential CSAM in the subreddit r/TikTokThots, a community of 1.3 million users, remained searchable on Reddit even after moderators removed them.
Failure to implement existing policies.
As the compiled evidence makes clear, Reddit fails to implement its own policies – with newly announced policies easily proven ineffective.
In addition to the recently updated IBSA policy, which failed NCOSE researcher testing, Reddit announced automatic tagging of NSFW images in March 2022. Despite Reddit’s promise to automatically tag sexually explicit images as NSFW, this failed to occur for many of the images within these deepfake and IBSA communities.
One of the top 20 websites in the world: more than 57 million daily users and 1.1 billion monthly users
One subreddit dedicated to sharing nonconsensual sexual images had more than 20,000 users and 15,000 images of over 150 women; there are countless subreddits like this
34% of minors ages 9-17 have used Reddit
Seeking to go public in 2023