Originally Published at Bloomberg
By Shelly Banjo and Shawn Wen
TikTok is known primarily as a launchpad for funny memes, dance routines and lip-syncing videos. The company embraces that reputation with a tagline, “the last sunny corner on the internet.” But there’s a dark side to TikTok that engulfs some of the app’s youngest users.
Beneath the surface, TikTok also hosts videos promoting anorexia, bullying, suicide and sexual exploitation of minors. Highly personalized recommendations, driven by algorithms owned by parent company ByteDance Ltd., make it harder for parents to track what their children see on the app and for regulators to monitor what kids are being exposed to.
Safety advocates said TikTok for years prized expansion over the protection of minors. “Their company exploded in growth across the world, and they just didn’t prioritize child safety as they were growing,” said Dawn Hawkins, who runs an advocacy group called the National Center on Sexual Exploitation.
Hawkins said she spent months helping an 8-year-old relative get inappropriate videos of him in his underwear taken down from TikTok. Hawkins acknowledged that TikTok recently made a number of sought-after improvements but said it’s still not a safe place for very young children to roam unmonitored.