A Mainstream Contributor To Sexual Exploitation

The Greatest Enabler of Online Sexual Exploitation

Misinterpretations of Communications Decency Act Section 230 grant Big Tech blanket immunity for any and all types of sexual abuse and exploitation they facilitate. Until we amend CDA 230, corporations can’t be held accountable!

Take Action

The very foundation of our online society, the Communications Decency Act Section 230, was laid in the early days of the Internet back in 1996, but its cracks have now become chasms. What once seemed a necessary legislative underpinning for online business to thrive now stands as the greatest shield protecting technology companies from any and all accountability—especially when it comes to the proliferation of sexual exploitation.

Just to highlight the antiquity of Section 230, here are items to consider:  

  • Enacted in 1996, Section 230 predates social media, smartphones, and the modern internet; it governed just 20 million American internet users, not today’s 300 million. 
  • Established before the advent of Google (1998) and YouTube (2005), Section 230 remains unchanged, outdated for today’s digital giants hosting billions of pieces of content. 
  • Untouched since its inception, it had already been in place for eight years by the time Facebook launched in 2004, connecting billions globally. 
  • By 2012, with Snapchat’s new ephemeral messaging, Section 230 had been static for 16 years amidst vast technological change. 
  • Despite the rise of deepfake technology and the widespread application of AI, the law remains unchanged in an era of exponential internet growth. 

It is clear that Section 230, currently over a quarter-century old, is ill-suited for today’s advanced digital landscape, demonstrating a clear and pressing need for reform.  

The Shield That Became a Weapon 

The genesis of the Communications Decency Act (CDA) was rooted in Congress’ commendable aspiration to shelter children from pernicious elements lurking on the burgeoning Internet. The architects of the CDA sought to incentivize tech companies to actively safeguard younger users from harmful content. Congress added Section 230 to the CDA with the aim of fostering an environment where platforms could grow and flourish without fear of being held liable for content posted by third parties. Fast forward to today, and it is apparent that this measured shield intended to protect innovation has become an impenetrable fortress for social media giants and online platforms, even when their very business models facilitate and profit from horrors—sex trafficking, child sexual abuse material (CSAM), and the scourge of image-based sexual abuse (IBSA).

A Legacy of Exploitation Ignited by 26 Words

"No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider."

The 26 words of Section 230 have inadvertently birthed the darkest realities of our online world. The intent of Congress was clear—not to grant blanket immunity for harmful content. Yet, as the Supreme Court parsed and interpreted the CDA, throwing out sections while leaving these words to stand alone, we have seen a massive expansion of court-granted immunity for Big Tech. Indeed, courts across the land are confused and continue to grant tech companies a golden ticket out of any situation in which their accountability is questioned. 

Paving the Way for Responsible Internet Platforms 

Imagine if online platforms were forced to reckon with the content and harms they have so long enabled and profited from. Imagine if they were brought to task for the sex trafficking they facilitated or for the CSAM that proliferates under their watchful gaze. Envision a world where tech companies allowing children on their platforms are mandated to prioritize their safety at the very core of their platform design. Imagine that tech companies not only listen but also unequivocally stand by victims of sexual abuse, instead of aligning themselves with perpetrators for the vile sake of profit maximization. We need not merely imagine—it’s time to demand it. The current state of Section 230 should not give these platforms a ‘Get out of Jail Free’ card when they hold the keys to prevention in their very hands. 

This Jane Doe Could Be Any of You 

There is no shortage of individual stories of those who have faced the worst the internet has to offer. Take, for example, the tragic story of a teen girl who was brutally raped; the abuse was filmed and then uploaded to Pornhub, Discord, Reddit, and X, and she then became the subject of a viral harassment campaign. Her mental and emotional well-being suffered irreparable damage, all while platforms monetized the content that drove her to the brink. She lives in fear that the abuse videos will resurface at any moment. Each time they do, she begs for them to be taken down, jumping through hoop after hoop that these companies have put in place before they will even consider a removal request. 

In a world where accountability is as nebulous as the digital landscape itself, her story is not only heartbreaking but infuriating. A minor tweak, a word change, a reform—could have changed her story. 

*Survivor is one of our Law Center’s former clients 

Act Now to Avoid Regret Later 

Section 230 is not a sacred text; it is a piece of legislation that needs to evolve with the technology and society it aims to govern. No other industry enjoys such freedom from regulation or from accountability for the harm they cause. It is time for Congress to act, to reframe and revise, to bring clarity and hold accountable the online platforms that have for too long hidden behind its skirts. This is not a call for the censorship of the Internet, but for the preservation of human dignity and the protection of our most vulnerable. We cannot afford to wait for another Jane or John Doe. 

This 6-minute video by Bloomberg Quicktakes provides a broader explanation of how we got this law, its consequences for society, and some of the arguments on both sides of the CDA 230 reform debate. The video covers many points not addressed by NCOSE, which focuses strictly on sexual abuse and exploitation.

This report by the Congressional Research Service provides detailed history, explanation, and challenges around this law. It is a great place to learn more about the issues in general. 

Imagine if online platforms were given the ultimate ‘Get Out of Jail Free’ card. Well, they already have one! CDA 230 grants tech giants like Meta’s Family of Apps (whose user base surpasses the populations of the world’s three most populous countries combined) unchecked power online, allowing them to operate with impunity and evade accountability for content on their platforms. 

A Groundswell for Change

This is not merely one voice calling out in the wilderness. The Department of Justice has sounded alarm bells, arguing that Big Tech’s ‘immunity’ is absurd and antithetical to the core of CDA’s original intentions. Senators on both sides of the aisle, key publications, and even judges in their dissenting opinions have echoed the same. The ground beneath Section 230 has been crumbling, and the demand for action has reached deafening volumes.

Other Voices Calling for CDA 230 Reform:

U.S. Department of Justice

“The Department of Justice has concluded that the time is ripe to realign the scope of Section 230 with the realities of the modern internet. Reform is important now more than ever. Every year, more citizens—including young children—are relying on the internet for everyday activities, while online criminal activity continues to grow. We must ensure that the internet is both an open and safe space for our society. Based on engagement with experts, industry, thought leaders, lawmakers, and the public, the Department has identified a set of concrete reform proposals to provide stronger incentives for online platforms to address illicit material on their services, while continuing to foster innovation and free speech.”

Senator Dick Durbin

“The tech industry alone is not to blame for the situation we're in. Those of us in Congress need to look in the mirror. In 1996, the same year the Motorola StarTAC was flying off shelves and years before social media went mainstream, we passed Section 230 of the Communications Decency Act. This law immunized the then-fledgling internet platforms from liability for user-generated content.…For the past 30 years, Section 230 has remained largely unchanged, allowing Big Tech to grow into the most profitable industry in the history of capitalism without fear of liability for unsafe practices. That has to change.”

Senator Lindsey Graham

“So the bottom line is, you can't be sued. You should be. And these emails would be great for punitive damages, but the courtroom's closed to every American abused by all the companies in front of me. Of all the people in America we could give blanket liability protection to, this would be the last group I would pick. It is now time to repeal Section 230.”

Senator Sheldon Whitehouse

“We are here in this hearing because, as a collective, your platforms really suck at policing themselves. We hear about it here in Congress with fentanyl and other drug dealing facilitated across platforms. We see it and hear about it here in Congress with harassment and bullying that takes place across your platforms. We see it and hear about it here in Congress with respect to child pornography, sexploitation, and blackmail and we are sick of it. It seems to me that there is a problem with accountability because these conditions continue to persist. In my view, Section 230, which provides immunity from lawsuit, is a very significant part of that problem.”

Notable Section 230 References from the Senate Judiciary Hearing

Big Tech and the Online Child Sexual Exploitation Crisis, January 31, 2024

Victims Refused Justice Due to CDA 230

Doe v. Twitter, Inc., 2023 WL 8568911 (N.D. Cal. Dec. 11, 2023)


Two young boys were exploited online. Images of their abuse were uploaded to Twitter, garnering hundreds of thousands of views and countless downloads. Their classmates found out, and one boy became suicidal. One of the boys begged Twitter to take down the content, even uploading a photo of his ID to prove that they were just young teens. Twitter not only kept running ads alongside the child sexual abuse content, but also replied to the desperate young man saying that they had reviewed the material and it didn’t violate their standards. Later, in response to the lawsuit, Twitter argued that even if it had profited from CSAM or sex trafficking, it was immune under CDA Section 230. The 9th Circuit has since agreed with Twitter. 

Court Findings and NCOSE Analysis:

Court’s Analysis/CDA 230: “With respect to Claim Four, the Court found that Section 230(c) of the Communications Decency Act (‘CDA’) precluded Plaintiffs from stating a viable claim for possession and distribution of child pornography because that claim was aimed at Twitter’s failure to remove content from its platform, thus treating Twitter as a traditional publisher.”
NCOSE Analysis: This quote demonstrates a judge’s interpretation of CDA 230, protecting platforms like Twitter from liability for user-generated content.

FOSTA and CDA 230 Immunity: “With respect to the exemption from CDA § 230 immunity adopted under FOSTA (codified at 47 U.S.C. § 230(e)(5)), the Court concluded that that exemption was not limited to claims that meet the stringent criminal law standards applicable to claims asserted under Section 1591…”
NCOSE Analysis: Here, the judge discusses the complexity of applying FOSTA exemptions to CDA 230 immunity, indicating nuances in how platforms are held accountable.

Court’s Conclusion and Dismissal: “The Court finds that Plaintiffs’ beneficiary liability claim fails under the standard set forth in Reddit. Further, Plaintiffs have not pointed to any way they can salvage this claim by amendment. Therefore, the Court GRANTS Twitter’s Motion and dismisses Plaintiffs’ remaining claim with prejudice and without leave to amend.”
NCOSE Analysis: This conclusion highlights the court’s decision to protect the defendant under CDA 230, reinforcing the challenges plaintiffs face in holding platforms accountable for facilitating sex trafficking.

Does 1-6 v. Reddit, Inc., 51 F.4th 1137 (9th Cir. Oct. 24, 2022)


After parents discovered sexually explicit materials of their children posted to Reddit, they immediately contacted subreddit moderators and Reddit employees, reporting the content and asking for it to be removed. Results were inconsistent, as Reddit sometimes removed the content, though not always, and the content was often reposted shortly after removal. Reddit hosts numerous child pornography subreddits that have been repeatedly reported.

Court Findings:

The 9th Circuit held: “We conclude, based on the law as written by Congress, that civil plaintiffs seeking to overcome section 230 immunity for sex trafficking claims must plead and prove that a defendant-website’s own conduct violated 18 U.S.C. § 1591.”

“For claims based on beneficiary liability, this requires that the defendant knowingly benefited from knowingly facilitating sex trafficking. . . . [T]he plaintiffs have not plead that Reddit has done so in this case. . . .” Id. at 1146.

Doe v. Kik Interactive, Inc., 482 F. Supp. 3d 1242 (S.D. Fla. 2020)


A minor was solicited by numerous adult men on Kik, who then convinced her to take and send graphic sexual pictures of herself to them. They also sent her sexually explicit photographs via Kik. Kik has long known that such activity is rampant on its site and has continued to allow adults to interact directly with children. As a host website for third parties to post content, Kik got the case dismissed by claiming immunity from suit under Section 230.

The court reasoned that Congress “only intended to create a narrow exception to the CDA” for “openly malicious actors such as Backpage where it was plausible for a plaintiff to allege actual knowledge and overt participation,” and that a finding of actual knowledge and overt participation in a sex trafficking venture is required to defeat CDA immunity. Pp. 1250–51.

Court Findings and NCOSE Analysis:

CDA Immunity and TVPA Claims: “Owners and operators of mobile messaging service, as providers of interactive computer service, were entitled to immunity under Communications Decency Act (CDA) with regard to minor’s claim for damages under civil remedy provision of Trafficking Victims Protection Act (TVPA), alleging owners and operators knew that sexual predators used its service to contact and solicit sexual activity with minors but had failed to provide any warnings or enact policies to protect minors, where minor did not allege facts indicating that owners and operators knowingly participated in the sex trafficking venture involving minor.”
NCOSE Analysis: This illustrates the court’s rationale for granting CDA immunity to the defendants, emphasizing the lack of direct participation by the service provider in the alleged sex trafficking activities.

Defendants’ Argument for Immunity: “Defendants argue that they are immune from suit under the Communications Decency Act (‘CDA’), 47 U.S.C. § 230. Section 230, entitled ‘Protection for private blocking and screening of offensive material,’… provides that ‘[n]o provider … of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.'”
NCOSE Analysis: This quote shows the defendants leveraging CDA 230 to assert their immunity from liability for user-generated content.

Plaintiff’s Allegations Against Defendants: “Plaintiff alleges that Defendants are liable for damages for knowingly ‘participat[ing] in IDENTIFIED KIK USERS’ venture in violation of 18 U.S.C. § 1591 by benefiting from, and knowingly facilitating (by not implementing policies sufficient to combat a known problem), the venture in which IDENTIFIED KIK USERS used Kik Messenger in the interstate commerce to subject Jane Doe to sex trafficking.'”
NCOSE Analysis: Here, the plaintiff’s allegations attempt to circumvent CDA 230 protections by focusing on the defendants’ failure to implement protective policies.

FOSTA’s & CDA Immunity: “In 2018, Congress enacted the Fight Online Sex Trafficking Act (‘FOSTA’) and removed sex trafficking from CDA immunity. FOSTA provides that ‘[n]othing in this section (other than subsection (c)(2)(A)) shall be construed to impair or limit … any claim in a civil action under section 1595 of Title 18, if the conduct underlying the claim constitutes a violation of section 1591 of that title.'”
NCOSE Analysis: This highlights FOSTA’s attempt to carve out exceptions to CDA 230 immunity for cases involving sex trafficking.

Court’s Conclusion on CDA Immunity Despite FOSTA: “Plaintiff has not alleged facts that would plausibly establish that Defendants knowingly participated in the sex trafficking venture involving her; she alleges that Defendants knew that other sex trafficking incidents occurred on Kik. This does not satisfy FOSTA’s requirement that the conduct underlying the claim violate 18 U.S.C. § 1591.” 
NCOSE Analysis: The court concludes that despite FOSTA, the plaintiff’s allegations do not overcome CDA 230 immunity as they do not sufficiently allege the defendants’ knowing participation in sex trafficking.

L.W. through Doe v. Snap Inc., No. 22cv619-LAB-MDD, 2023 WL 3830365 (S.D. Cal. June 5, 2023)


This product liability case involves three Plaintiffs, girls aged 11 to 12 (representing a class), who were contacted by adult male perpetrators on Instagram, Twitter, and Omegle, and groomed by the men via Snapchat. The perpetrators sent pornographic images to the girls to groom them and demanded CSAM from the girls, which the perpetrators then posted online. The court found that because Snap is an internet service provider and not a content creator, it could not be held liable under the heightened beneficiary liability standard that Section 1591 imposes. The defendants also sought sanctions against the victims in this suit, claiming the case was baseless and frivolous.

CDA 230: “Plaintiffs’ allegation that Snap knew of the trafficking conduct because it regularly collects ‘troves of [user] data and information’ is equally unpersuasive. Even accepting this allegation as true, the Court can’t conclude that Snap’s knowledge of illicit activity is tantamount to participation in the activity.” P. *7. “…[L]awsuits brought against interactive computer service providers based solely on failure to adequately monitor and regulate end-users’ harmful messages fall squarely within protections of Section 230.” P. *8.

Court Findings and NCOSE Analysis:

Broad Immunity Provided by CDA 230: “Immunity under the Communications Decency Act (CDA) applies only if the interactive computer service provider is not also an information content provider, which is defined as someone who is responsible, in whole or in part, for the creation or development of the offending content.”
NCOSE Analysis: This highlights the broad immunity granted to platforms, potentially even when their services are used for harmful activities.

Judicial Interpretation of CDA 230’s Immunity: “When a plaintiff cannot allege enough facts to overcome Communications Decency Act (CDA) immunity, a plaintiff’s claims should be dismissed.”
NCOSE Analysis: This illustrates a judicial stance that often protects service providers under CDA 230, sometimes at the expense of victims’ ability to seek recourse.

Role of Service Providers and Content: “Products liability and false advertising claims brought against instant messaging application developer… treated developer as a ‘publisher or speaker’ of content, for purposes of determining developer’s immunity under the Communications Decency Act (CDA), regardless of how claims were framed.”
NCOSE Analysis: This reflects the challenges plaintiffs face in holding platforms accountable when the law treats them merely as neutral intermediaries.

Difficulty of Overcoming CDA 230 Immunity: “Defendants are entitled to Section 230 immunity on each of Plaintiffs’ claims, and the FAC is DISMISSED in its entirety.”
NCOSE Analysis: Demonstrates a case outcome where the broad immunity under CDA 230 leads to the dismissal of serious allegations against service providers.

Opponents of CDA 230 Reform

Amid the growing call for reform of CDA Section 230, a notable faction of apologists staunchly opposes these changes. Predominantly backed by Big Tech money, these entities are working ardently to halt reform efforts, miring congressional and judicial understanding in confusion and resisting any form of regulation or accountability. This resistance is not merely ideological: many of the most vociferous opponents benefit directly from the broad legal protections Section 230 affords. Companies like Meta, Google, and Snap Inc., under the guise of advocating for free expression and innovation, fund organizations that actively lobby against reform measures. This deep entanglement of tech giants with advocacy groups raises critical questions about the genuine motivations behind their opposition to Section 230 reform, and whether their efforts merely serve to preserve their unfettered control over the digital landscape at the expense of societal welfare.


Help educate others and demand change by sharing this on social media or via email:


Share Your Story

Your voice—your story—matters.

It can be painful to share stories of sexual exploitation or harm, and sometimes it’s useful to focus on personal healing first. But for many, sharing their past or current experiences may be a restorative and liberating process.

This is a place for those who want to express their story.