At the age of 13, John Doe and his friend were groomed by an online predator pretending to be a 16-year-old girl. This predator coerced the two boys into sending sexually explicit images and videos (also known as child sexual abuse material, CSAM).
The images and videos were later posted on Twitter, garnering at least 167,000 views and more than 2,000 retweets, and they circulated around John’s school community. Humiliated and feeling like there was no escape, John contemplated ending his life.
John and his mother reported the content to Twitter. They begged for the CSAM to be removed, even sending photos of John’s ID to prove that he was a minor. Twitter’s response left them heartbroken:
“We’ve reviewed the content, and didn’t find a violation of our policies, so no action will be taken at this time.”
On behalf of John Doe and his friend, the Haba Law Firm, the Matiasic Firm, and the NCOSE Law Center are suing Twitter (now X) for knowingly possessing CSAM and knowingly benefitting from sex trafficking.
From the beginning of the litigation, Twitter/X has maintained that it can never be held to account in civil court for knowing violations of the federal laws protecting child victims of sexual crimes. In earlier decisions in the case, the district judge interpreted Section 230 of the Communications Decency Act (a.k.a. “CDA 230” or “Section 230”) as providing legal immunity to Twitter and dismissed the boys’ claims.
John Doe #1 and John Doe #2 appealed this decision, and arguments in the case were heard on Monday before the Ninth Circuit Court of Appeals.
Peter Gentala, Senior Legal Counsel for the NCOSE Law Center, spoke before the court on behalf of the John Does. Once again, Section 230 became the heart of the argument in this case.
Communications Decency Act Section 230 Lies at the Heart of Twitter Lawsuit
The attorney litigating on behalf of Twitter rested the company’s entire argument on Section 230 of the Communications Decency Act. This law is outdated: it was enacted at the dawn of the Internet in 1996, designed to allow tech companies to prosper and to protect good-faith efforts to moderate content without fear of being held liable on account of that moderation. While not ill-intentioned in its enactment, the law has since been misinterpreted as providing near-blanket immunity to tech companies for harms caused by their platforms, no matter how active a role they may have taken in those harms.
Judges on the panel brought up several other cases where Section 230 was relevant to determine the scope of its protections in this case. While the attorney for Twitter claimed that this is a case of failure to moderate content, which they say is covered by Section 230, Gentala argued the opposite:
“This is not a failure to monitor the platform case… This is that unusual case where the platform itself has confirmed, in writing, it reviewed the child pornography and communicated to the trafficked person that it intended to allow that illegal image to stay on its platform,” said Gentala.
After being notified that CSAM was circulating on its platform, Twitter made the deliberate decision to leave the content online, where it continued to garner thousands of views and shares, putting more dollars into the pockets of this Big Tech corporation.
Gentala: “From the record, we know for certain that Twitter knew that the plaintiffs were children. Twitter knew that child pornography was created through coercion. Twitter knew that the pictures were created through a coercive process. And Twitter reviewed and then wrote an email to John Doe I confirming its review. So armed with that knowledge, Twitter then decided to continue profiting from the child pornography and that constitutes benefitting from sex trafficking.”
Twitter’s attorney argued that the company was not participating in sex trafficking, saying the actual trafficking venture occurred years earlier, when the young teens were coerced into creating and sending CSAM to predators.
“These allegations are that years after heinous actors convinced the Doe plaintiffs to create these images of themselves, someone, not the traffickers, uploaded to Twitter the offending images. That is not trafficking of a person.”
However, the complaint filed by the Does makes note of the ways Twitter benefits from content that is posted on its platform. “It’s constantly monetized and cross-monetized,” said Gentala.
Making money off of sexual content that was generated through force, fraud, or coercion meets the legal definition of sex trafficking. Twitter knew that this content was made through coercion and fraud, and yet it failed to do anything about it. Therefore, the platform was complicit in sex trafficking.

Claims Against Twitter About Dangerous Product Design
The judges also asked questions about the John Doe plaintiffs’ claims based on product design. One example is Twitter’s search function, which suggests “popular” content.
Gentala: “A search comes in for a hashtag that’s very common for child pornography, like ‘megalinks,’ and Twitter uses its platform to say, ‘Would you like to search instead for “megalinks young”?’ Something that would pinpoint even more child pornography. So what’s happening with the way this platform is structured is that it’s not just a meeting place for people to trade child pornography, it’s structured in a way that it’s actually a marketplace.”
Judges pressed Twitter’s attorney on why Twitter should be protected under Section 230 in light of the product design claims. They cited Lemmon v. Snap, a case in which two boys died in a high-speed car accident after the platform, Snapchat, encouraged them to drive at high speeds; there, the court held that Section 230 immunity did not apply because the claims were based on negligent product design.
Twitter’s attorney responded that Snap was not covered by Section 230 because its product encouraged users to engage in harmful acts, rather than merely failing to moderate third-party content. Further, he attempted to dismiss the product design claims by stating that “Popular content being treated as popular content is not a basis for overcoming Section 230,” referring to child sexual abuse material as “popular content.”
The reference to CSAM as “popular content” reveals an even greater incentive for Twitter to leave it on its platform: since the company knows the content is popular and will make it money, why would it want to take it down?
There is growing awareness that a platform’s decisions to arrange and recommend third-party content can play a powerful role in causing harm. Gentala referenced a case against TikTok, where the tech platform was not granted Section 230 immunity based on its platform design.
“What the Third Circuit says in a case against TikTok, is that content curation decisions by a platform do not necessarily get swept up by Section 230 because that is something that the platform itself is speaking. It has responsibility for it.”
Failure to Report CSAM to Law Enforcement
There were also disputes over whether Twitter’s dysfunctional reporting protocols contributed to the harm caused to the plaintiffs in this case.
Federal law requires platforms to report child sexual abuse material to the authorities once they become aware of it; failure to do so is illegal.
“With regard to the child pornography claim, the duty there is once Twitter knows it is in possession of child pornography, to not continue to possess it. And we know that [Twitter] did exactly that,” said Gentala.
This marks another failed safeguard on the Twitter platform: the content was not only left up, but also was never reported to legal authorities.
Twitter’s attorney himself acknowledged that the platform did not remove the content, even after reviewing the report. “[The content] was reported, it was registered, and what’s the problem? It was not taken down,” he said.
He continued to maintain that Twitter should be immune anyway, stating that holding tech companies accountable for illegal third-party content uploaded to a site is “antithetical” to what Congress intended with Section 230.
Twitter’s attorney stated, “This is the prototypical Section 230 case.” And we agree. The mere fact that Twitter can admit to partaking in a despicable crime and still maintain that it should be protected shows that Section 230 urgently needs to be reformed. It is a dangerous law that allows Big Tech companies to exploit users for profit without facing any consequences. This case makes that fact crystal clear.
The three-judge panel took the case under advisement and will likely issue an opinion later this year.