TRIGGER WARNING: The following content contains descriptions of child sexual abuse and may be upsetting for some readers.
The world was on lockdown during the COVID-19 pandemic. Seeking connection, 11-year-old C.H. logged onto Omegle to video chat with other kids. After ending that call, she was randomly placed in another chatroom, but this time she could not see the other person on the line. The screen was black…
Suddenly, text began to appear on the screen.
The stranger Omegle had connected this 11-year-old with began rattling off C.H.’s personal information. He threatened to hack her devices if she did not comply with his demands to remove her clothes and touch herself in a sexual way. She pleaded with him to stop, but he was relentless. She eventually complied as the predator took screenshots.
After the horrific experience, C.H. told her parents what had happened. Heartbroken and disturbed, they promptly called the police and sued Omegle over the lack of safeguards that allowed C.H. to be connected with an unknown adult man, ultimately leading to her exploitation.
But their case never went anywhere. Why? Section 230 of the Communications Decency Act.
Despite the fact that it was Omegle’s dangerous product design that allowed an 11-year-old to be connected with a predator, Section 230 allowed Omegle to escape any accountability.
Section 230 is the greatest enabler of sexual exploitation, which is why ending it is the sole focus of this year’s Dirty Dozen List. With YOUR help, survivors like C.H. can get the justice they deserve.
In 2025, rather than taking our traditional Dirty Dozen List approach of targeting 12 corporations that facilitate sexual exploitation, we are focusing on 12 survivor stories that unmask the real root of the problem: Section 230 of the Communications Decency Act.
What is Section 230?
Section 230 of the Communications Decency Act states that no provider or user of an interactive computer service “shall be treated as the publisher or speaker of any information provided by another information content provider.” So, what does that mean?
Although this was never the law’s original intent, courts have since misinterpreted it as granting tech platforms broad immunity from liability for virtually any harm that users cause on their platforms.
But here’s the problem: online harm is rarely a matter of users’ actions alone. Tech companies take a very active role in facilitating and multiplying this harm, whether through dangerous product design, failure to respond to reported crimes, or other negligent practices.
Without liability, tech platforms have no incentive to make the necessary safety reforms to protect their users.
But wait… Don’t companies have a moral obligation to make their platforms safe, regardless of whether a law requires it? In a perfect world, yes. Sadly, at these companies, profit usually prevails over morality.
Exploiting vulnerable users on social media is actually quite lucrative for tech companies. In fact, a lawyer defending Twitter (now X) in a case involving its failure to remove reported child sexual abuse material (CSAM) went so far as to assert that CSAM was “popular content.”
Why are we changing our approach?
The scale of online sexual exploitation is growing as the Internet becomes an ever more pervasive tool for perpetrators.
The National Center for Missing and Exploited Children (NCMEC) received over 186,000 reports of online enticement of children for sexual abuse in 2023—a 300%+ surge since 2021. Last year, NCMEC received 812 reports of sextortion per week.
According to the 2023 Federal Human Trafficking Report and research conducted by Thorn, most sex trafficking victims are advertised online.
In 2023, more than 1 in 3 minors (35%) reported having had an online sexual interaction, and 1 in 4 (28%) reported having had one with someone they believed to be an adult.
1 in 8 adult Facebook users have been victims of nonconsensual distribution of sexually explicit material or threats to distribute such material. Meanwhile, the online availability of AI-generated forged pornography (i.e., “deepfake” pornography) increased by 464% between 2022 and 2023.
These facts don’t lie. Why is this exploitation so rampant? Why are these crimes increasing rather than decreasing? Why do we have to play whack-a-mole, chasing after corporations one by one, begging them to make safety changes? Section 230 is the answer to all these questions.
Section 230 is the foremost facilitator of online sexual exploitation. While the law was one of 12 targets on the 2024 Dirty Dozen List, it has become the primary focus of this year’s campaign because ending Section 230 would curb the exploitation of users across all tech platforms at once, rather than addressing each platform brick by brick.
It is time to stop accepting incremental change.
That is why the 2025 Dirty Dozen List presents a call to repeal Section 230 of the Communications Decency Act.
We need you to take action—contact your legislator and ask them to repeal Section 230!
Visit dirtydozenlist.com to learn more about this year’s campaign.