
John Does v. Twitter Lawsuit Heard on Appeal in the 9th Circuit

*Jane and John are pseudonyms used in court to protect the identities of the Plaintiffs in the case.  


It was around 11:30 pm when the text message came that would blow Jane’s world apart. 

[Your son] is on the phone with my niece right now and he’s discussing that he doesn’t want to live anymore and is suicidal.

Jane couldn’t believe it. By all appearances, her son seemed to be thriving. Surely, there must be some mistake!  

But that night, Jane spoke to her son, and he told her that it was true. He was suicidal because of a video that was being passed around at his school. 

As Jane listened to her son, the horrifying story unfolded. 

Three years before, when John was thirteen, a sex trafficker posing as a sixteen-year-old girl had groomed John and his friend into sending sexually explicit photos and videos of themselves (i.e., child sexual abuse material, or “child pornography”). A compilation of these videos was subsequently posted to Twitter, where it gathered 167,000 views and was retweeted more than 2,000 times. It was also passed around by people at John’s school, eventually driving John to contemplate ending his life.

After John shared the story with his mother, the two of them contacted Twitter multiple times, begging the company to take the child sexual abuse material down and even sending photos of John’s ID to prove he was a minor. They were met only with this chilling response:

“We’ve reviewed the content, and didn’t find a violation of our policies, so no action will be taken at this time.” 

John Doe and the NCOSE Law Center Continue Fighting to Hold Twitter Accountable!

The NCOSE Law Center and The Haba Law Firm are representing John Doe and his friend in a lawsuit against Twitter. The case, John Doe #1 and John Doe #2 v. Twitter, argues that the company’s conduct described above violated numerous laws, including those prohibiting knowingly benefiting from the sex trafficking of the plaintiffs, possessing child sexual abuse material (“CSAM”), and knowingly distributing CSAM, among other offenses.

The most recent step in this case was the oral argument held on April 20th, in which Lisa Haba, Special Counsel for NCOSE and Partner at The Haba Law Firm, argued before the 9th Circuit Court of Appeals that the plaintiffs’ sex trafficking and CSAM claims against Twitter should move forward, as a key ruling of the federal district court in August 2021 had allowed. Twitter had appealed that ruling to the 9th Circuit, and the plaintiffs cross-appealed.

During Twitter’s segment of the oral argument, one judge stated to the company’s lawyer:

“The facts aren’t good for you in this case. I mean, in this case, Twitter was asked to take [child sexual abuse material] down and Twitter said ‘we’ve looked at it and we’re not taking it down.’” 

You can watch the recording of the oral argument here.

John Doe #1 and John Doe #2 v. Twitter is a watershed case for holding tech companies accountable for knowingly facilitating sexual abuse and exploitation. Learn more about the Twitter lawsuit here.

How Can You Help?

Please consider a donation to the NCOSE Law Center, so we can continue our work representing John Doe and other survivors in lawsuits against the entities that profit from and enable their exploitation!  

If you have not already, please also sign this petition in support of John Doe.

Will Elon Musk Make Good on His Promise to Prioritize Addressing Child Sexual Exploitation?

When Elon Musk purchased Twitter in 2022, he declared that “removing child exploitation is priority #1.” While this was a heartening statement, Musk has since taken many alarming actions that give reason for extreme skepticism. In fact, most experts agree that Musk’s actions since purchasing Twitter have so far made the problem of child sexual exploitation worse.

For example, Musk cut the Twitter team dedicated to handling child sexual exploitation to half its former size. He also stopped paying Thorn, the child protection organization that provided the software Twitter relies on to detect child sexual abuse material, and stopped working with Thorn to improve this technology.

The National Center for Missing and Exploited Children (NCMEC) also reported that, after Musk took over the company, Twitter declined to report the hundreds of thousands of accounts it suspended for claiming to sell CSAM without directly posting it. A Twitter spokesperson stated that the company did this because such accounts did not meet the threshold of “high confidence that the person is knowingly transmitting” CSAM. However, NCMEC disagrees with that interpretation of the rules, asserting that tech companies are also legally required to report users who only claim to sell or solicit CSAM.

In the John Doe #1 and John Doe #2 v. Twitter case, the company continues to claim 100% legal immunity for its actions in facilitating the child sexual exploitation of John Doe and his friend. Is this Musk’s stance? How will he proceed to make good on his promise to address child sexual exploitation on Twitter as “priority #1”?

Were You Exploited on Twitter or on Another Social Media Platform?

If you were sexually exploited on a social media platform like Twitter and are interested in exploring your legal options, we invite you to connect with the NCOSE Law Center Team.  



