Instagram Harms Kids, Proposed Solutions Insufficient


Washington, DC (December 8, 2021) – On the heels of head of Instagram Adam Mosseri’s testimony before the U.S. Senate, the National Center on Sexual Exploitation (NCOSE) reasserts its position that the platform continually fails to prioritize protecting children, and that the recently announced changes fall far short given the extent of harm on Instagram.

“In addition to the mental health harms Instagram knowingly inflicts on young people, the platform is also consistently noted as a top site used for grooming, child sex trafficking, and child sex abuse material. Yet adults continue to have easy access to minors in ways we would never allow offline. Instagram allows adults to sexualize minors in the comments and doesn’t prevent sexually explicit images from being direct messaged to minors, both of which are common tools used by predators for grooming and even sextortion,” explains Lina Nealon, director of corporate and strategic initiatives at the National Center on Sexual Exploitation.

“Mosseri’s proposed changes are wholly insufficient given the gravity and extent of the harms on Instagram. Rather than making significant alterations to the platform itself to rectify risky design and features, Instagram continues to place a greater burden on parents—and even children themselves—to monitor and manage an inherently dangerous product. Not only does this shift responsibility away from Instagram itself, it leaves children who don’t have the privilege of informed, involved caregivers at greater risk of exploitation.

“While NCOSE advocates for increased caregiver oversight of children’s online experiences in addition to substantive changes to platforms themselves, we are skeptical that Instagram’s announced parenting tools, focused primarily on time limits, will have any significant impact on stemming abuse—especially when there is evidence that children may be groomed online in as little as 18 minutes. In addition to drastically limiting the ability of adults to interact with children on Instagram, caregivers should have the ability to lock safety features in place and monitor their children’s content and connections to a degree graduated by the child’s age. These features should be included from the start, given the extensive dangers on Instagram.

“Given the extent of proven harms and risks on Instagram, the platform should also default all features to the highest safety, privacy, and content-control settings for minors and should remove the most high-risk features altogether, especially for 13–15-year-olds.

“As Instagram has yet to develop the technology and moderation capacity to stem the rampant abuses on its platform, keep kids 12 and under off the app, and drastically adjust algorithms, the company should err on the side of swiftly instituting much broader measures and tools to protect the safety and well-being of its youngest users,” Nealon concluded.

The Numbers

NCOSE leads the Coalition to End Sexual Exploitation, with over 300 member organizations.

The National Center on Sexual Exploitation has had over 100 policy victories since 2010. Each victory promotes human dignity above exploitation.

NCOSE’s activism campaigns and victories have made headlines around the globe, averaging 93 mentions per week by media outlets and shows such as Today, CNN, The New York Times, BBC News, USA Today, Fox News and more.



