TikTok

The Problem

With more than a billion monthly active users worldwide, TikTok is a wildly popular social networking app for creating and sharing short videos. The platform is particularly popular with young people: a quarter of US TikTok users are between the ages of 10 and 19 (surpassing Instagram’s Gen Z user base and catching up quickly to Snapchat), while 43% of the global user base is between 18 and 24 years old.

Unfortunately, as it was rising in popularity in the United States, TikTok was known to be a predator’s hunting ground, and it continues to be the “platform of choice” for predators to access, engage, and groom children for abuse.

Exploiters utilize TikTok to find and view minor users, comment on their videos, and message children, often requesting or sending sexually explicit videos or pictures. Evidence is growing of child sex abuse material (CSAM, also known as child pornography) being traded on the platform, with the US Department of Homeland Security noting that the number of TikTok-related child exploitation investigations increased seven-fold between 2019 and 2021.

After being named to the 2020 Dirty Dozen List and meeting with NCOSE, TikTok implemented several of our recommendations to significantly improve their safety features for minors, such as disabling direct messaging for those under 16 and allowing parents to lock controls with a PIN code. TikTok also released extensive Community Guidelines, clearly defining terms and listing activities and content prohibited on their platform, including content that “depicts, promotes, or glorifies” prostitution or pornography, content that simulates sexual activity (either verbally, in text, or even through emojis), or non-consensual sex.

While these changes are certainly encouraging, it remains to be seen how well new policies will be put into practice. We remain concerned about the extent of harmful content still accessible by young users – including advertising for pornography and prostitution sites – and believe there is still more TikTok could do to protect youth using their platform.

Tell TikTok they must do more to protect kids: take action below!

While the improved controls and defaults to the TikTok platform have likely made the app safer for kids, there is still an abundance of content on TikTok that could be harmful to minors and that normalizes the commercial sexual exploitation industry.  

We trust that TikTok will continue its recent trend of increased responsibility and accountability, creating a safer online platform for all users. Specifically, we ask that TikTok:

  • Enforce policies and Community Guidelines consistently and thoroughly 
  • Default to “Restricted Mode” for all minors 
  • Provide caretakers and minors more resources to manage their accounts and enhance safety, such as prompts upon account-creation and embedded PSAs throughout the app to teach users to identify and report inappropriate or abusive behavior (sextortion, sexual harassment, grooming, etc.) 
  • Develop algorithms that detect age-lying by minors, as well as adults 
  • Improve processes to assess reports and proactively find and block (permanently) abusive and exploitative accounts, content, and hashtags, including those promoting the commercial sex industry 
  • Adjust the Apple App Store rating from 12+ to 17+ and the Google Play rating from “Teen” to “Mature” to more accurately reflect the content on TikTok 

WARNING: Any pornographic images have been blurred, but are still suggestive. There may also be graphic text descriptions shown in these sections.
POSSIBLE TRIGGER.

NCOSE researchers have evidence indicating child sex abuse material trading on TikTok, but given the nature of the evidence we will only make it available to law enforcement, policymakers, and media.

Proof

Sexualized Content on TikTok Discover and Home

Note: Sexualized Content Below.

Since TikTok instituted new Community Guidelines in fall of 2020, there has been less sexual content appearing on Home and Discover. However, it’s still there without even having to look for it – including when Restricted Mode is on.

In addition to sexualized dancing, video trends, and hashtags, a significant number of song lyrics violate the Community Guideline prohibiting “content that contains sexually explicit language for sexual gratification.” As music is such an integral part of the TikTok experience, and given that music is so influential in shaping teens’ identity, emotions, socialization, and tastes,[i] TikTok must age-gate the songs available to minors.

Videos with the following lyrics showed up first in the “Home” section of an account set at 13 years old with Restricted Mode on (the videos were usually accompanied by sexualized dancing).

Videos from 2/13/21:

“Foot Fungus” by Ski Mask the Slump God (multiple videos found using this song)

Skrrrt, uh, uh, drop it on my cock
Skrrrt, I'ma h-hot nigga, no pot
Skrrrt, niggas still gettin' buns, hot dogs
Skrrrt, niggas not fuckin' with ya, coleslaw

“DDLG” by ppcocaine (note: DDLG stands for daddy dom/little girl)

Bet you never had a girl like me
Suck on my clit and smoke good tree (haha)
Do not hesitate to have me on my knees
For you, I might let you hit it for free
For you, haha, I might let you hit it for free
For you (I probably shouldn't though)
I might let you hit it for free

Real renegade shit, daddy, I'm your slut (I'm your slut)
Kissin' on my pussy, put your thumb in my butt

[i] Miranda, Dave, “The role of music in adolescent development: much more than the same old song,” February 2012: https://www.tandfonline.com/doi/full/10.1080/02673843.2011.650182

OnlyFans Promotion on TikTok

Content banned under TikTok’s Community Guidelines is still easily found, even with Restricted Mode turned on. Within 30 seconds of creating a 13-year-old account and searching “OnlyFans,” our researcher was led to countless videos and hashtags promoting OnlyFans, most of which glamorized paid sex and normalized sexual exploitation. And it only took three clicks to get to an OnlyFans page containing sexually graphic photos, including male nudity and an erect (clothed) penis (see screenshots below).

Advertisers and users are creative in bypassing algorithms meant to identify banned or “Restricted Mode” content. For example, users have listed OnlyFans, a site widely known for selling pornography and prostitution, as onlyf@ns. Currently, #accountant is being used to advertise and search for prostitution.[i] Improving AI capabilities and/or increasing human review must be prioritized in order to identify and remove banned content. It is also imperative that banned users’ IP addresses, emails, and phone numbers are blocked to prevent predatory users from setting up new accounts. OnlyFans is named to the 2021 Dirty Dozen List: Learn more here.

[i] Allen, Joseph, “TikTok Accountants Aren't the Same People Who Help You File Your Taxes” Distractify, January 2021: https://www.distractify.com/p/tiktok-accountant-meaning
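
To illustrate why simple character substitutions like “onlyf@ns” defeat naive keyword filters, below is a minimal sketch of a normalization step that maps common substitutions back to their base characters before checking a banned-term list. This is purely illustrative: the language (Python), the banned-term list, and the substitution map are our own assumptions, since TikTok’s actual moderation systems are not public.

```python
# Minimal, illustrative sketch only: TikTok's real moderation pipeline is not
# public. The banned-term list and substitution map below are hypothetical.
import re

# Common character substitutions ("leetspeak") used to evade keyword filters.
SUBSTITUTIONS = str.maketrans({
    "@": "a", "4": "a", "3": "e", "1": "i", "!": "i",
    "0": "o", "$": "s", "5": "s", "7": "t",
})

BANNED_TERMS = {"onlyfans"}  # hypothetical example list


def normalize(text: str) -> str:
    """Lowercase, undo common substitutions, and strip separators."""
    text = text.lower().translate(SUBSTITUTIONS)
    return re.sub(r"[\s._\-]+", "", text)


def contains_banned_term(text: str) -> bool:
    """Return True if the normalized text contains any banned term."""
    cleaned = normalize(text)
    return any(term in cleaned for term in BANNED_TERMS)


if __name__ == "__main__":
    samples = ["check out my onlyf@ns", "link in bio: 0nly.f4ns", "tax season #accountant"]
    for s in samples:
        print(f"{s!r} -> flagged: {contains_banned_term(s)}")
```

Note that the last sample is not flagged: codeword hashtags like #accountant share no characters with the term they stand in for, so lexical normalization alone cannot catch them, which is why increased human review, as recommended above, remains necessary.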

Pictures and accounts were available both with and without Restricted Mode turned on.

News Articles

Fails to suspend accounts of people sending sexual messages to children. https://www.bbc.com/news/blogs-trending-47813350?intlink_from_url=https://www.bbc.com/news/topics/c255806071xt/tiktok&link_location=live-reporting-story

Fined by the FTC for illegally collecting personal information from children; TikTok agreed in the settlement to pay $5.7 million. https://www.komando.com/happening-now/550519/super-popular-tiktok-app-may-have-stolen-your-kids-personal-information

Man arrested in LA for preying upon children on TikTok, sending vulgar messages, and showing up at a 9-year-old girl’s house posing as a delivery man. https://www.newsweek.com/los-angeles-county-tiktok-application-lasd-james-anthony-gonzales-child-abuse-1333043

High failure rate in responding to reported accounts and videos: 2,500 people signed a petition against @IraTheFallen, yet his account was still active at the time this was posted. https://www.buzzfeednews.com/article/ryanhatesthis/tiktok-has-a-predator-problem-young-women-are-fighting-back

Many articles refer to the well-known predator @thebudday. His account remained active until he took it down himself after a visit from the FBI. He tried to make another account but was blocked. TikTok never contacted him or the girl who he had been messaging. TikTok then confirmed that he was banned for breaking community guidelines. https://www.buzzfeednews.com/article/ryanhatesthis/tiktok-has-a-predator-problem-young-women-are-fighting-back

Forbes calls it a “magnet to sexual predators.” (https://www.forbes.com/sites/enriquedans/2019/07/04/tiktok-a-lesson-in-irresponsibility/#70ffd5d52cf8)

Grooming for Sexual Abuse on TikTok

In addition to TikTok’s troubling association with the normalization of sexually explicit images, TikTok has facilitated a space for sexual grooming by abusers and sex traffickers. These abusers and traffickers utilize TikTok to view minor users and comment on their videos and/or message them with sexually explicit content.

An advocacy group accurately called TikTok a “hunting ground” for predators to abuse children and Forbes identified TikTok as a “magnet to sexual predators.”[1]

The National Society for the Prevention of Cruelty to Children (NSPCC) surveyed 40,000 schoolchildren and discovered that 25 percent of the children had livestreamed with a stranger and that one in 20 children were asked, while livestreaming or in the comments of a posted video, to take their clothes off.[2]

A spokesperson from NSPCC commented on the study, linking it to TikTok, stating: "We know that a significant amount of children are being contacted via popular livestreaming apps, such as TikTok, by abusers who are using them as a hunting ground.”[3]

TikTok has become an environment where exploiters pose as younger adolescents and initiate sexually graphic conversations.[4] 


[1] Brown, Shelby. “TikTok, livestreaming apps are 'hunting ground' for abusers, warn kids' advocates.” CNET. (February 2019).

https://www.cnet.com/news/tiktok-live-streaming-apps-are-hunting-ground-for-abusers-warn-childrens-advocates/?utm_source=Triggermail&utm_medium=email&utm_campaign=Post%20Blast%20%28bii-digital-media%29:%20Tiktok%20users%20take%20content%20moderation%20into%20own%20hands%20%7C%20Chrome%20changes%20will%20weaken%20publisher%20paywalls%20%7C%20Snap%20pitches%20brand%20safety&utm_term=BII%20List%20DMedia%20ALL   (accessed July 26, 2019).

Dans, Enrique. “TikTok: A Lesson in Irresponsibility.” Forbes. (July 2019). https://www.forbes.com/sites/enriquedans/2019/07/04/tiktok-a-lesson-in-irresponsibility/#70ffd5d52cf8 (accessed July 26, 2019).

[2] National Society for the Prevention of Cruelty to Children “Livestreaming and video-chatting: Snapshot 2” https://learning.nspcc.org.uk/media/1559/livestreaming-video-chatting-nspcc-snapshot-2.pdf?_ga=2.35661672.916872377.1567544459-2006998223.1567544459

[3] Brown, Shelby. “TikTok, livestreaming apps are 'hunting ground' for abusers, warn kids' advocates.” CNET. (February 2019).

https://www.cnet.com/news/tiktok-live-streaming-apps-are-hunting-ground-for-abusers-warn-childrens-advocates/?utm_source=Triggermail&utm_medium=email&utm_campaign=Post%20Blast%20%28bii-digital-media%29:%20Tiktok%20users%20take%20content%20moderation%20into%20own%20hands%20%7C%20Chrome%20changes%20will%20weaken%20publisher%20paywalls%20%7C%20Snap%20pitches%20brand%20safety&utm_term=BII%20List%20DMedia%20ALL (accessed July 26, 2019).

[4] Murdock, Jason. “Los Angeles Man, 35, Targeted Kids On TikTok App Posing As 13-Year-Old: Police.” Newsweek. (February 2019). https://www.newsweek.com/los-angeles-county-tiktok-application-lasd-james-anthony-gonzales-child-abuse-1333043 (accessed July 2019).

The Impact of Cyber-Based Sexual Abuse

Within the last decade, cyberbullying has emerged as a pernicious new form of bullying that breaks the spirits of our nation’s children. It has been deemed a public health issue[1] and is a matter of serious concern to our organization.

We are especially concerned by evidence which shows that some cyberbullying activity involves sexual harassment and coercion. It is our view that much of the activity referred to under the guise of “sexting” actually represents cyber-based sexual abuse. For instance, offline sexual coercion has been “significantly associated with sending and being asked for a naked image, as well as receiving a naked image without giving permission.”[2] Researchers have also documented “aggravated” forms of “sexting” that may involve adults soliciting sexual images from minors, as well as criminal or abusive behavior by minors such as extortion, or the creation and sending of images without the knowledge of the minors pictured.[3]

Sexting generally has been linked to risky behaviors, as well as sexual abuse and violence. Italian researchers report that of the 536 participants aged 13 to 18 (who were part of a larger study of sexting behaviors), 79.5% reported having sexted at least once, and 8.2% publicly posted a sext at least once.[4] This is terribly disconcerting, as in some instances such sexting could constitute self-produced child pornography. Importantly, extending previous similar findings, the researchers found that of the total 1,334-person sample studied (aged 13 to 30):

  • 13% sexted during substance use at least once;
  • 30% had been forced to sext by a partner at least once;
  • 10% had been forced to sext by friends at least once;
  • 95% had sent sexts to strangers;
  • 59% had sent sexts about someone else [sometimes referred to as “secondary sexting”] without her/his consent at least once.[5]

Further, their results confirmed a relationship between sexting and dating violence: “Specifically, moderate and high users of sexting are more likely to be perpetrators of dating violence, including online, than low users of sexting.”[6] Sexting has also been linked to smoking, substance use, alcohol abuse, and binge drinking.[7]

These findings should deeply concern TikTok. With more than 500 million active users worldwide, rates like these have significant implications, especially given TikTok’s well-known reputation as a social media platform popular with minors.[8] Although TikTok has deleted many accounts assumed to belong to users under the age of 13, younger minors are undoubtedly still using the app in large numbers.[9]

Another case in point pertains to TikTok’s live stream feature. We have observed hypersexualized and explicit sexual content being broadcast by users through this feature. Additionally, many parents and children have reviewed TikTok negatively due to sexually explicit content on the platform.[10] Users of TikTok’s services, who are permitted by its Terms of Service to be as young as 13 years old, are not equipped by TikTok with tools to block exploitative material themselves.

These issues (cyberbullying, cyber-based sexual abuse, and sexting) are impacting an entire generation of American youth, and much more needs to be done to prevent their deepening harms. Thus, it is inexcusable that TikTok attempts to address serious issues such as those outlined above in its Community Guidelines, but then does not follow through in ridding the platform of sexual abusers.


[1] Charisse L. Nixon, “Current Perspectives: The Impact of Cyberbullying on Adolescent Health,” Adolescent Health, Medicine, and Therapeutics 5, (2014): 143–158.

[2] HyeJeong Choi, Joris Van Ouytsel, and Jeff R. Temple, “Association between Sexting and Sexual Coercion among Female Adolescents,” Journal of Adolescence 53, (2016): 164–168.

[3] Janis Wolak and David Finkelhor, “Sexting: A Typology,” (Crimes Against Children Research Center, 2011)

[4] Mara Morelli, personal communication, February 20, 2017.

[5] Mara Morelli, Dora Bianchi, Roberto Baiocco, Lina Pezzuti, and Antonio Chirumbolo, “Sexting, Psychological Distress and Dating Violence among Adolescents and Young Adults,” Psicothema 28, no. 2 (2016): 137–142, http://www.psicothema.com/pdf/4303.pdf, (accessed July 26, 2019).

[6] Ibid.

[7] Ibid.

[8] Sehl, Katie. “What is TikTok, Who Uses it, and Should Brands Care About It?” Hootsuite. (May 2019). https://blog.hootsuite.com/what-is-tiktok/ (accessed July 26, 2019)

[9] Malik, Daniyal. “TikTok Is Deleting All Users Under 13 Years-Old.” Digital Information World. (March 2019). https://www.digitalinformationworld.com/2019/03/tiktok-is-deleting-all-users-under-13.html (accessed July 26, 2019)

[10] Monticello Kievlan, Patricia. “TikTok - Real Short Videos.” Common Sense Media. (n.d.) https://www.commonsensemedia.org/app-reviews/musically-your-video-social-network (accessed July 26, 2019)

TikTok’s Policies

While third-party users certainly bear responsibility for how they use the app, corporations such as TikTok bear great social responsibility to do their utmost to ensure that their services are not used for harmful ends.

TikTok updated their Community Guidelines in 2020, with significant and laudable improvements. TikTok clearly defines several terms such as “grooming,” “sexual exploitation,” and “sexual harassment.” The Guidelines explicitly list the types of activity and content that are prohibited on the platform, including content that “depicts, promotes, or glorifies” prostitution or pornography, content that simulates sexual activity (either verbally, in text, or even through emojis), or non-consensual sex. The minor safety section outlines the types of actions that would constitute grooming (e.g., content that normalizes sexual contact between an adult and a minor) and even goes so far as to prohibit sexualized dancing such as “twerking, breast shaking, pelvic thrusting,” or “fondling” oneself or another.

TikTok Terms of Service.

Of course, what matters most is TikTok’s commitment to the consistent application of these new safety features across its platform. It’s too early to tell just how well TikTok is implementing these new guidelines. As such, we will continue watching how quickly violations are dealt with wherever they arise. 

Take Action

Are You A Survivor of Sex Trafficking?

Visit SexualExploitationLawsuits.com to Find Resources and Options for Legal Recourse.

Parents' Ultimate Guide to TikTok

Read this Great Resource from Common Sense Media

Share Your Story

Have you or someone you know been exposed to pornography and sexual exploitation on TikTok?


Progress

The Dirty Dozen List has fostered incredible change—from legislation drafted to safety policies and features introduced and more.


WASHINGTON, DC (July 13, 2021) – The National Center on Sexual Exploitation (NCOSE) commended TikTok for announcing that it will use enhanced technology to identify and automatically remove “violative content,” such as videos showing nudity or sexual activity or content threatening minor safety, before it is uploaded to the platform.


A slew of parental supervision improvements has been made at Meta’s Messenger, Instagram, and TikTok—is it enough to protect kids?
