TikTok
The Problem
With more than a billion monthly active users worldwide, TikTok is a wildly popular social networking app for creating and sharing short videos. The platform is particularly popular with young people: a quarter of US TikTok users are between the ages of 10 and 19 (surpassing Instagram’s Gen Z user base and catching up quickly to Snapchat’s), while 43% of the global user base is between 18 and 24 years old.
Unfortunately, as it rose in popularity in the United States, TikTok became known as the “platform of choice” for predators seeking to access, engage, and groom children for abuse.
Exploiters use TikTok to find and view minor users, comment on their videos, and message children, often requesting or sending sexually explicit videos or pictures. Evidence is also growing that child sex abuse material (CSAM, also known as child pornography) is being traded on the platform: the US Department of Homeland Security noted that the number of TikTok-related child exploitation investigations increased seven-fold between 2019 and 2021.
After being named to the 2020 Dirty Dozen List and meeting with NCOSE, TikTok implemented several of our recommendations, significantly improving its safety features for minors, such as disabling direct messaging for users under 16 and allowing parents to lock controls with a PIN code. TikTok also released extensive Community Guidelines that clearly define terms and list the activities and content prohibited on the platform, including content that “depicts, promotes, or glorifies” prostitution or pornography, content that simulates sexual activity (whether verbally, in text, or even through emojis), and content depicting non-consensual sex.
While these changes are certainly encouraging, it remains to be seen how well the new policies will be put into practice. We remain concerned about the extent of harmful content still accessible to young users – including advertising for pornography and prostitution sites – and believe there is still more TikTok could do to protect the youth using its platform.
Tell TikTok they must do more to protect kids: take action below!
See Our Requests For Improvement
While the improved controls and defaults on the TikTok platform have likely made the app safer for kids, there is still an abundance of content on TikTok that could be harmful to minors and that normalizes the commercial sexual exploitation industry.
We trust that TikTok will continue its recent trend of increased responsibility and accountability – creating a safer online platform for all users. Specifically, we ask that TikTok:
- Enforce policies and Community Guidelines consistently and thoroughly
- Enable “Restricted Mode” by default for all minors
- Provide caretakers and minors with more resources for managing their accounts and enhancing safety, such as prompts upon account creation and embedded PSAs throughout the app that teach users to identify and report inappropriate or abusive behavior (sextortion, sexual harassment, grooming, etc.)
- Develop algorithms that detect age-lying by both minors and adults
- Improve processes for assessing reports and for proactively finding and permanently blocking abusive and exploitative accounts, content, and hashtags, including those promoting the commercial sex industry
- Adjust the Apple App Store rating from 12+ to 17+ and the Google Play rating from “Teen” to “Mature” to more accurately reflect the content on TikTok