VICTORY! Twitter Case Will Proceed, Overcoming Section 230 Immunity


Jane Doe will always remember the night her son, John Doe, confessed to her that he was suicidal on account of a video that had been posted on Twitter (now X).

“[T]he first kick in the gut is that I think I know my kid and he’s suicidal,” she says. “And the second kick in the gut is that what is this video?”

The video had been created three years earlier. And it was child sexual abuse material (CSAM, or “child pornography” under the law).

When John was thirteen, a sex trafficker had deceived and blackmailed him and his friend into making sexually explicit videos of themselves. A compilation of the CSAM videos was later posted to Twitter for all to see.

Jane Doe rushed to support her son. They reported the video to Twitter multiple times, informing them it was illegal CSAM and begging them to take it down. Twitter replied, asking for John’s ID. It was immediately provided, proving John was the boy in the video and was a minor.

Then came Twitter’s unthinkable decision:

“We’ve reviewed the content, and didn’t find a violation of our policies, so no action will be taken at this time.”

It’s almost impossible to believe. Twitter reviewed the illegal CSAM, confirmed that it was in fact CSAM by verifying John’s age and identity, and then refused to take it down.

This is why the NCOSE Law Center and the Haba Law Firm are representing Jane and John Doe in a lawsuit against Twitter.

But until now, Jane and her son John have faced crushing disappointment. A district court ruled that Twitter was completely immune from any liability, due to Section 230 of the Communications Decency Act, and that the case should be thrown out.

Determined to right this immense injustice, we appealed the decision to the Ninth Circuit Court of Appeals (the level of federal court directly below the Supreme Court). And now, in a tremendous victory, aspects of the district court’s decision have been reversed and John and Jane’s lawsuit against Twitter is allowed to proceed!

This is the first federal lawsuit to proceed against a social media company for its failure to report CSAM to the National Center for Missing & Exploited Children (NCMEC) and for the harm this negligence caused to children. It is also one of only a small handful of cases ever to proceed against a website past a Section 230 defense. Talk about a groundbreaking victory!

This sets a powerful precedent for future cases, paving the way for other survivors of sexual exploitation to sue the social media companies that facilitated their abuse with dangerous product design and negligent behavior.

What is Section 230 of the Communications Decency Act?

John and Jane’s lawsuit against Twitter revolves around the issue of Section 230 of the Communications Decency Act, so it is very important to understand this law.

The Communications Decency Act was penned in 1996 to help protect children from harmful content online. Section 230 was added to the Act to balance the interest of child protection with allowing the growth of a then-nascent tech industry, and to encourage tech platforms to moderate harmful content without fearing that their moderation would be used as a basis for liability. (Court cases at the time had set an odd precedent in which companies that engaged in moderation were held liable for harmful third-party content while companies that did not moderate weren’t, because the act of moderating meant a company “should have known” about the content.)

However, since 1996, the original intention of Section 230 has been completely turned on its head. Rather than promoting online safety and responsible behavior from tech companies, it has been used to provide tech companies with near-blanket immunity for knowingly, recklessly, or negligently facilitating sexual exploitation.

In today’s age, it is nearly impossible for a lawsuit to overcome the barrier of Section 230 immunity. The vast majority of lawsuits seeking to hold a tech company accountable are thrown out before they can ever get started. In other words, victims of all kinds of wide-ranging harms are barred from even bringing their case and getting their day in court because of Section 230.

As such, the fact that the Ninth Circuit Court of Appeals has allowed John and Jane’s lawsuit against Twitter to proceed to discovery is an incredible victory, and one that will have a gargantuan ripple effect.

Breaking Down the Ninth Circuit’s Decision

There are positive and negative aspects of the Ninth Circuit’s ruling, but the bottom line is that the case is allowed to move forward and the plaintiffs have a chance to hold Twitter liable.

What’s the positive?

The Ninth Circuit ruled that Twitter does not have Section 230 immunity for failing to report John’s CSAM to NCMEC. All tech companies are required by law to report child sexual exploitation to NCMEC “as soon as reasonably possible” after being made aware of it. The fact that the district court ruled Section 230 made Twitter immune from negligence claims based on failing to comply with this law was baffling, and it is a great victory that this decision was overturned.

Further, the Ninth Circuit ruled that Twitter does not have Section 230 immunity for product defects pertaining to the difficulty of reporting CSAM. At the time of the lawsuit, someone wishing to report CSAM could not use Twitter’s general report function but instead had to locate a separate form that was harder to find, complicated, and insufficient. Child-protection watchdogs have criticized Twitter’s reporting mechanism, noting that it lags behind its industry peers in this regard. Especially since we know that tech companies often deliberately make certain actions easy and others difficult in order to manipulate user behavior, Twitter’s uniquely complicated reporting mechanism for CSAM is abhorrent.

We expect that this particular aspect of the ruling will have a ripple effect across the tech industry, incentivizing companies to prioritize better reporting functions. As soon as companies know they can be held liable, they will make safety changes—showing, of course, that they could have made those changes all along. The tech industry’s cynical talking point that making their platforms safe isn’t possible or is too hard simply isn’t true. What they really mean is that making these changes isn’t profitable.

What’s the negative?

Sadly, the Ninth Circuit also upheld aspects of the district court’s decision. It ruled that Section 230 did provide immunity for John Doe’s claims that Twitter knowingly possessed and distributed CSAM, that Twitter failed to remove the CSAM pending review of Jane and John’s report, and that its search suggestions and hashtags amplified the CSAM on the platform.

Perhaps most notably, the Ninth Circuit also found that Section 230 immunity barred John Doe’s claim that Twitter knowingly benefitted from sex trafficking. This finding came in spite of the fact that FOSTA-SESTA (the Fight Online Sex Trafficking Act and Stop Enabling Sex Traffickers Act) was passed in 2018 specifically to clarify that Section 230 does not immunize tech companies for knowingly benefitting from a sex trafficking venture.

At the time FOSTA-SESTA was passed, NCOSE celebrated it as a tremendous victory. Yet since then, courts have broadly neglected to apply FOSTA-SESTA as intended. To this day, we are aware of only one company that has been successfully sued under FOSTA-SESTA (Salesforce, for aiding the infamous sex trafficking practices of Backpage.com). Every other case has failed.

One has to wonder: If Twitter reviewing CSAM, verifying its illegal nature, being notified it was created through a sex trafficking scheme, and specifically choosing to continue profiting from this material does not allow it to be sued under FOSTA-SESTA, what does?

Ultimately, the lesson has been learned: narrow reforms to Section 230 have not and will not work. As was the case with FOSTA-SESTA, Big Tech lobbyists will intentionally influence the writing of these reforms to ensure they have no teeth.

This is why we need to repeal Section 230 completely or enact radical reform and allow the tech industry to be sued like any other industry.

This is Still a Huge Win

While aspects of the Ninth Circuit’s decision were frustrating, this remains a groundbreaking victory. Recall: the vast majority of cases against tech companies are given full Section 230 immunity and thrown out entirely. The fact that the plaintiffs are allowed to proceed with their case is huge. They have been given a path toward holding Twitter accountable, and this sets a precedent for other survivors to hold social media companies accountable as well.

Become a Monthly Donor to the NCOSE Law Center to Support this and Other Crucial Lawsuits

The NCOSE Law Center always represents its survivor clients free of charge. As such, these survivors rely on the generosity of donors like you to make their lawsuits possible. Please consider becoming a monthly donor to the NCOSE Law Center to ensure this and other groundbreaking lawsuits have the necessary funding to move forward!

Remember that Twitter has billions of dollars with which to hire the most expensive lawyers in the field; child survivors like John Doe only have you.

Thank you in advance for your generosity to this crucial cause!
