If you haven’t yet heard about Facebook’s latest plans to further monetize our children, let’s get you up to speed…
Just over a month ago, an internal Facebook document was leaked by Buzzfeed revealing plans for an Instagram app targeting children 12 years and younger. This news unleashed a collective fury the likes of which we don’t often witness. Policymakers, psychologists, public health experts, child advocacy groups and parents from around the globe have turned their outrage into action.
CEO Mark Zuckerberg has received letters from 44 US attorneys general, Republican and Democratic legislators alike, mental health professionals and leaders in the movement to protect kids online: all demanding that Facebook—for once—prioritize the health, safety, and well-being of children by abandoning plans for the forthcoming “Instagram for Kids.”
Creating an Instagram app for kids is not only a bad idea, but utterly irresponsible given Facebook’s abysmal track record protecting children on its various platforms.
- Recent data from England’s child protection agency, NSPCC, found that more than half of the online child sex crimes reported to UK police in 2020 took place on Facebook-owned apps, with more than a third of child grooming instances having happened over Instagram.
- Child internet protection organization Thorn released a report in early May, Responding to Online Threats: Minors’ Perspectives on Disclosing, Reporting, and Blocking, in which Instagram and Snapchat tied for the largest share of survey participants (ages 9–17) who reported having had an online sexual interaction on the platform. Such interactions were defined as being asked for a nude image or video, being asked to go “on cam” with a nude or sexually explicit stream, being sent a nude photo or video, or being sent sexually explicit messages.
- For years, survivors and their allies have been telling us (and Instagram) that the platform is widely used by pimps to sell both children and adults for sex.
Kids 12 and under aren’t supposed to be on Instagram, but they are. That same Thorn study found that 40% of respondents ages 9–12 used Instagram “at least once a day.”
Facebook justifies its plans by promising that the new app will be a safer platform for children who want to use the popular photo- and video-sharing service, one with supposedly tighter privacy and safety controls.
We’re not buying it.
No product for children is without risks. In fact, a design flaw in Facebook’s Messenger Kids, which the company touted as completely safe, let thousands of kids join chats their parents had not approved. And even if Facebook could create a platform safe from predators, the body of research documenting the detrimental effects of social media on children’s mental, physical, and socio-emotional health is overwhelming.
By introducing an Instagram product for even younger ages, Facebook is enmeshing younger and younger children in its brand, creating a ready pipeline from an as-yet-untapped demographic it can monetize.
Rather than building a new product to “hook” kids at even younger ages, Instagram should prioritize stemming the rampant sexual abuse and exploitation of the many minors currently on Instagram.
Together with an international coalition of allies, NCOSE has been working with Instagram for quite some time, advocating for concrete measures to make the platform safer for the millions of minors who are using it. In fact, Instagram recently announced several positive changes we’ve been pushing for, specifically around direct messaging, a primary tool predators use to groom minors. Yet it still has a very long way to go and, quite frankly, is not instituting changes with the urgency that the gravity of the abuses requires.
Let’s demand that Facebook, for once, put children’s well-being and safety over its bottom line by scrapping “Instagram for Kids.”
Want to learn more about the harms of social media on kids and how they’re trying to navigate their digital world?