
Fact Check: Twitter’s Response To Lawsuit


Twitter is one of the most prolific distributors of sexual exploitation material, including material soliciting, advancing, and depicting human trafficking and material depicting the sexual abuse of children. Twitter is not a passive agent in the distribution of this harmful material: its own practices, business model, and technology architecture encourage the distribution of sexual exploitation material, and Twitter profits from it. A lawsuit is being brought against Twitter on behalf of survivors by the National Center on Sexual Exploitation Law Center, the Haba Law Firm, and the Matiasic Law Firm.

The lawsuit seeks a measure of justice for survivors and aims to hold Twitter legally accountable, affirming that the company can and must take significant steps forward in the fight against sexual abuse and exploitation online.

Twitter Is Profiting from Child Sexual Abuse Material: Twitter is being sued by the National Center on Sexual Exploitation Law Center, on behalf of John Doe, for violating federal sex trafficking law through its involvement in, and profit from, the sexual exploitation of a minor.

In May 2021, Twitter filed a motion to dismiss the plaintiffs’ first amended complaint; the motion contained many claims about the facts of the case. The purpose of this article is to examine several of those claims in conjunction with the facts of the case. (A copy of Twitter’s motion to dismiss can be accessed via Public Access to Court Electronic Records [PACER].)

Twitter Knew About and Was Provided Evidence of Child Sexual Abuse Material of Plaintiffs on Its Platform

Twitter was informed of child sexual abuse material (CSAM) videos on its platform by one of the victims, who provided a government-issued ID proving he was the individual depicted and a minor. Twitter affirmatively chose to keep the video live and informed the child victim that video of his abuse “does not violate our terms.”

What follows are the supporting facts:

  • John Doe reported the video directly to Twitter using its CSAM reporting form, explaining that he and the other child depicted were minors and that the video had been created only under duress of “harassment and threats”
  • Twitter requested “additional information,” and John Doe uploaded his government-issued ID, which showed that he was 16 years old
  • Twitter responded: “Thank you for reaching out. We’ve reviewed the content, and didn’t find a violation of our policies, so no action will be taken at this time.” Twitter left the video up for further dissemination; it remained live for an additional seven days
  • John Doe’s mother also reported the video, explained that the videos depicted “a sexual abuse situation,” and received no response
  • Twitter removed the child sexual abuse material only at the request of Homeland Security, after John Doe’s mother was connected to an agent through a friend, not because Twitter itself reported the video to law enforcement

Twitter Does Not Enforce Its Own Terms of Service or Policies Regarding Child Sexual Abuse Material (CSAM)

Twitter’s motion to dismiss claims that Twitter has a “zero tolerance policy” when it comes to child sexual abuse material (CSAM), which is sometimes referred to as “child pornography.” However, the first amended complaint (FAC) filed by the plaintiff lays out the following facts:

  • Twitter refused to allow a child protection agency access to its API to scan for known CSAM, even though it sells access to that same API to vendors and advertisers
  • Twitter makes conspicuously few reports of CSAM to the National Center for Missing and Exploited Children (NCMEC), according to NCMEC’s yearly reports
  • Twitter received a very low rating from the Canadian Centre for Child Protection for its CSAM reporting structure; in other words, independent review has verified that it is notably difficult to report CSAM on Twitter’s site
  • Twitter allows hashtags almost exclusively used for trading illegal CSAM to flourish on its platform (while blocking other hashtags such as #savethechildren), and Twitter monetizes this activity through ads; the FAC includes a screenshot of an open solicitation for CSAM on Twitter next to a promoted advertisement
  • John Doe #1 and John Doe #2’s experience is a stunning refutation of Twitter’s claims
  • One of the accounts that shared the illegal CSAM videos on Twitter had previously been reported to Twitter specifically for sharing CSAM videos; a screenshot of this report was included in the complaint

Twitter Profited from Child Sexual Abuse Material of the Plaintiff Being Shared and Distributed on Its Platform

In its motion to dismiss, Twitter claimed that it did not monetize or benefit from the child sexual abuse material on its platform. The following facts from the plaintiff’s first amended complaint refute that claim and show how Twitter did, in fact, monetize and benefit from child sexual abuse material on its platform:

  • Twitter monetizes its platform by selling advertisements, selling access to its API, and licensing data
  • Over 80% of Twitter’s revenue comes from advertising: in the third quarter of 2020, Twitter received an estimated $936 million in revenue, of which $808 million (roughly 86%) came from advertising and $127 million from data licensing
  • The CSAM video depicting John Doe #1 and John Doe #2 had over 167,000 views and over 2,200 shares; this substantial user engagement is itself a benefit to Twitter, providing valuable data, if not also advertising space

Twitter Is Liable for Its Complicity in the Alleged Crimes Laid Out by the Lawsuit

Amid the motion’s myriad claims attempting to dismiss the lawsuit’s allegations is a stunning assertion from Twitter:

“…the law does not punish a defendant [Twitter] for participation in a lawful venture with sex traffickers, or knowingly but passively receiving the financial benefits of sex trafficking. Rather, liability arises only when a defendant makes ‘some overt act that furthers the sex trafficking aspect of the venture.’”

You read that right: Twitter claims that participating with sex traffickers and knowingly receiving the financial benefits of sex trafficking are legal. Here’s an important reminder: the “venture” Twitter references, the one being litigated in this lawsuit, is the distribution of child sexual abuse material.

Twitter goes on to claim that “merely failing to remove third-party content, even if abhorrent, is precisely what CDA § 230 immunizes.”

Again, you read that right: Twitter thinks it should be immune from liability for the broad distribution of child sexual abuse material because of Section 230 of the Communications Decency Act.

This is not the position of a responsible corporate citizen. Twitter has failed these survivors, and until it changes course, it will continue to be a hub for exploitation.


Have You Been Sexually Exploited on Twitter?

If you or someone you know has had child sexual abuse material posted on Twitter, and the platform knew or should have known it was there but failed or refused to act, you may have a case against the platform. If Twitter has failed or refused to remove child sexual abuse material, please contact us here.

Were you exploited on Twitter? Connect with our legal team about the Twitter lawsuit today.

How You Can Support the Lawsuit Against Twitter

By refusing to take down the child sexual abuse material featuring John Doe, even when informed through multiple channels, Twitter enabled the sexual abuse of a minor and profited from this abuse. It’s time for corporations, and for Twitter specifically, to be held accountable under the law for putting profits before people.

Stand up for justice and share the Twitter Lawsuit story on social media to help spread the word.


