
Twitter Profited from Sex Trafficking of Young Boy via Child Sexual Abuse Material


When John Doe was only 13 years old, a tragic series of events unfolded that he was powerless to stop, and Twitter made money from his most traumatic moments. It is on behalf of John Doe and countless other survivors like him that the National Center on Sexual Exploitation Law Center—along with The Haba Law Firm and The Matiasic Law Firm—has brought a lawsuit against Twitter.

When he was just 13 years old, John Doe was contacted by someone on Snapchat with whom he began to exchange messages and develop a relationship. The other person, who John Doe was led to believe was a 16-year-old girl, eventually asked him to send nude pictures of himself. As soon as he did, the nature of their correspondence changed dramatically and Doe found himself instantly trapped in a world of abuse and exploitation he never imagined.

After the first picture, whoever was on the other end of the phone screen began extorting him. The predator wanted more sexually graphic pictures and videos of John Doe and threatened that, if Doe did not provide the demanded material, the nude pictures of Doe would be distributed to his parents and others in his community.

Pimps and sex traffickers often coerce victims into making social media or advertising posts, or create posts in their victims' names. Twitter, #CleanItUp.

Fearing the worst, John Doe complied with these demands and sent more sexually explicit content to the exploiter who was clearly not a 16-year-old girl. The exploitation escalated to include a demand for video of Doe performing sex acts with another person and, burdened and coerced by intense fear and shame at the thought of being exposed, Doe complied with the increasingly extreme demands.

John Doe eventually managed to end correspondence with the exploiter, and he thought the entire nightmare experience was over for good—until three years later. When John Doe was 16 years old, he learned from a classmate at his high school that videos of his abuse were surfacing on Twitter.

Not only did he have to deal with the reality of child sexual abuse material of himself being scattered across the Internet, but Doe also had to endure his peers seeing his violation immortalized on Twitter. He faced vicious bullying at school and suffered severe humiliation and emotional distress.

In the midst of this deep mental anguish, John Doe attempted to contact the users who had posted the child sexual abuse material of him on Twitter. While one user deleted the footage, the other poster ignored Doe’s pleas and left the video online.

For years, John Doe had borne this burden on his own. Feeling helpless, he confessed to having suicidal thoughts as the weight of his situation came crashing down around him. His mother, Jane Doe, was contacted by a family friend, and it was then that she finally heard her son's story of being lured, groomed, and extorted. She immediately contacted Twitter and the authorities to have the criminal content removed.

Despite both her and John Doe contacting Twitter multiple times and even verifying his identity as a minor—per Twitter's reporting process—Twitter claimed that the posted videos did not violate its policies and refused to remove them. It was not until a family contact at the Department of Homeland Security got involved that Twitter finally removed the content, but not before the video depicting graphic sexual content of a minor—literal child sexual abuse material—had accrued over 160,000 views.

Each one of those views was harmful to John Doe and profitable for Twitter.

Had Twitter acted swiftly when John Doe first reported the video depicting his abuse, it could have significantly mitigated the harm he experienced. Had Twitter followed its own policies regarding child sexual abuse material (CSAM) and federal law by removing the videos after John Doe and his mother confirmed the content was CSAM, Twitter could have helped to curb the harm.

Instead, Twitter refused to remove or block the content depicting the sexual exploitation of John Doe—who was clearly and demonstrably a minor—and continued to knowingly profit from its distribution.

Twitter hosts cyber-based sexual harassment, revenge pornography, and even sexually exploitative images of children on its platform. Twitter, #CleanItUp.

The law certainly doesn't protect child sexual abuse material, and it should not and cannot protect companies like Twitter that turn a blind eye to the sexually abusive, degrading, and illegal material scattered throughout their platforms.

John Doe continues to recover from his experience with the help of his family. He is now seeking to hold Twitter legally accountable, alleging that the company violated federal laws against sex trafficking under the Trafficking Victims Protection Reauthorization Act (TVPRA).

Under federal law, sex trafficking occurs when a person is caused to engage in a commercial sex act and is either 1) under the age of 18 or 2) subjected to force, fraud, or coercion. John Doe's situation meets both of these criteria.

By refusing to take down the child sexual abuse material featuring John Doe, even after being informed through multiple channels, Twitter enabled the sexual abuse of a minor and profited from that abuse. It's time for corporations—it's time for Twitter—to be held accountable under the law for putting profits before people.


