WASHINGTON, DC (July 13, 2021) – The National Center on Sexual Exploitation (NCOSE) commended TikTok for announcing that it will use enhanced technology to identify and automatically remove “violative content” – such as videos depicting nudity or sexual activity, or threatening minor safety – before it is uploaded to the platform.
In what should be standard procedure for all social media platforms, TikTok will err on the side of caution, removing videos that violate its standards and then notifying the uploaders, who can appeal the decision if they choose.
“TikTok’s new changes are setting higher standards for safety for all of its users. This move shows a genuinely proactive effort by TikTok to implement their Community Guidelines for the good of their users, rather than merely having these guidelines on paper to protect their own liability. These changes will dramatically reduce the number of people – including minors – subjected to sexually explicit material,” said Lina Nealon, director of corporate and strategic initiatives for the National Center on Sexual Exploitation.
“We hope to see greater use of technology by TikTok and other tech companies to combat – rather than perpetuate – sexual abuse and exploitation on their platforms. TikTok has made moves that the rest of the tech industry can and should emulate: erring on the side of removing potentially harmful content, prioritizing the well-being of their users and employees, and rejecting the notion that nudity and sexually explicit content is entertainment,” Nealon added.
Featured on NCOSE’s 2020 Dirty Dozen List for rampant grooming and exploitation of children, and on the 2021 Dirty Dozen List for highly sexualized content, TikTok has instituted a series of sweeping and significant changes this past year – so much so that it now serves in many ways as an industry standard for child safety.