
TikTok Faces Federal Scrutiny for Child Sexual Abuse Material (CSAM) 


With more than a billion monthly active users worldwide, TikTok is a popular social networking app for creating and sharing short videos. The platform is particularly popular with young people: 25% of TikTok's U.S. users are between the ages of 10 and 19, and 43% of its global user base is between 18 and 24 years old.

However, with that widespread popularity comes a host of problems that the company is struggling to resolve—especially protecting children from harm on its platform. TikTok has come under fire in the past several years for predators easily accessing and grooming children as well as for hosting sexually explicit and pornographic content. These issues have been documented extensively by National Center on Sexual Exploitation researchers (a sample of the evidence can be found here).

The Dangers of Sexual Exploitation on TikTok

Several major media outlets have investigated and reported on the extensive harms on TikTok, with Forbes calling the app “a magnet to sexual predators” and the Wall Street Journal publishing a series of stories over the past half year about TikTok’s algorithms serving up pornography and drugs to minors, content promoting eating disorders, and dangerous viral trends, as well as the platform’s negative mental health impact on teen girls, especially those posting “suggestive” videos.

TikTok’s reputation as a “hunting ground” for predators has not slowed its growth, and the platform gives predators easy access to groom, abuse, and traffic children. Exploiters use TikTok to view minors’ content, comment on their videos, and message children directly, often requesting or sending sexually explicit videos or pictures.

In the fall of 2021, TikTok executives were called to testify regarding failures to protect young children in online spaces as a part of Congressional hearings that also scrutinized similar issues on Facebook/Instagram, Snapchat, and YouTube. 

In April 2022, the Financial Times reported that the U.S. Department of Homeland Security is investigating how TikTok handles child sexual abuse material (CSAM, sometimes referred to as “child pornography”), while the Department of Justice is also reviewing how a specific privacy feature on TikTok is being exploited by predators on the app. The article states:

 “’It is a perfect place for predators to meet, groom and engage children,’ said Erin Burke, unit chief of the child exploitation investigations unit at Homeland Security’s cyber crime division, calling it the ‘platform of choice’ for the behaviour.” 

Investigations into these issues reveal a larger problem, not just with TikTok but with how social media platforms in general handle moderation. Many companies, like Meta, employ over 15,000 human moderators in addition to AI software that can catch known instances of CSAM by matching digital fingerprints, or “hashes,” of previously identified material. Even then, the rising tide of child sexual abuse and other forms of exploitation is overwhelming these systems.
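To make the hash-matching idea concrete, here is a minimal sketch of how such a check works in general terms: compute a fingerprint of each uploaded file and test it against a database of fingerprints of previously identified material. This is an illustration only, not any platform’s actual system; the names (KNOWN_HASHES, fingerprint, is_known_match) are hypothetical, and real deployments such as Microsoft’s PhotoDNA use perceptual hashes that survive resizing and re-encoding rather than the exact SHA-256 hash used here for simplicity.

```python
import hashlib
from pathlib import Path

# Illustrative stand-in for an industry hash database of previously
# identified material. A real system would use a vetted, shared database
# and a perceptual hash, since an exact byte-level hash is defeated by
# any re-encoding of the file.
KNOWN_HASHES: set[str] = set()

def fingerprint(path: Path) -> str:
    """Compute an exact SHA-256 fingerprint of a file's bytes."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def is_known_match(path: Path) -> bool:
    """Flag an upload whose fingerprint appears in the known-hash set."""
    return fingerprint(path) in KNOWN_HASHES
```

The key limitation, and the reason platforms still depend on human moderators, is that hash matching can only catch material that has already been identified and fingerprinted; newly produced abuse content passes through undetected.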

The Internet Watch Foundation noted that 2021 was the “worst year on record” for online child sexual abuse. Yet the National Center for Missing and Exploited Children (NCMEC), which handles reports of child abuse from these platforms, found that TikTok made only 155,000 reports in the last year compared to Instagram’s 3.4 million reports. 

Clearly, TikTok’s “zero-tolerance” policy for child sexual abuse material and other inappropriate content is not working: live videos featuring nudity and sex acts appear on feeds; clever hashtags and alternate spellings circumvent bans to promote dangerous and exploitative services like OnlyFans; adult predators target and groom young children; and indicators of CSAM trading abound. The Financial Times explored some of the patterns of behavior that allow this to happen on TikTok:

“One pattern that the Financial Times verified with law enforcement and child safety groups was content being procured and traded through private accounts, by sharing the password with victims and other predators. Key code words are used in public videos, user names and biographies, but the illegal content is uploaded using the app’s ‘Only Me’ function where videos are only visible for those logged into the profile.”

Solutions for Combating Sexual Abuse and Exploitation on TikTok

TikTok has implemented several of our recommendations to significantly improve its safety features for minors, such as disabling direct messaging for users under 16 and allowing parents to lock caregiver controls with a PIN code. These features go beyond what any other social media platform has implemented to date. TikTok has also released extensive Community Guidelines that clearly define terms and list the activities and content prohibited on the platform, including content that “depicts, promotes, or glorifies” prostitution or pornography, content that simulates sexual activity (verbally, in text, or even through emojis), and depictions of non-consensual sex.

Despite these changes, it is clear that TikTok needs to do much more to protect minors from predators, exposure to sexual content, and pornographic websites. We (and others, like the National Association of Attorneys General) have been pressing TikTok to give parents more control and to proactively moderate content, because our own research confirms what the Wall Street Journal found: using an account we created for a 13-year-old, we were easily able to find videos promoting OnlyFans, as well as other pornography and prostitution sites, even though this type of material is against TikTok’s Community Guidelines.

TikTok and all social media companies must be held accountable for the environments they create, especially when their insufficient policies and practices leave so much room for exploitation, abuse, and harm.
