Meta to Blur Nudity on Instagram DMs for Minors – A Direct Result of NCOSE’s Advocacy


WASHINGTON, DC (April 12, 2024) – The National Center on Sexual Exploitation (NCOSE) said that Meta’s announcement that it will automatically blur nude images in Instagram direct messages for all minors under age 18 is a direct result of NCOSE’s request for the company to do so. Meta’s announcement came the day after it was placed on the 2024 Dirty Dozen List.
NCOSE has also called on Meta to automatically blur nude images for all minors on Facebook and WhatsApp. 
“Meta announced in January 2024 that it would blur nudity for 13-15-year-olds, and while it was a much-needed fix, we questioned why the company didn’t extend those same protections to 16- and 17-year-olds given the rise in sextortion, including of older teens. NCOSE had been pressing Meta for years to proactively prevent sexually explicit content from being sent to or sent by minors: most recently in a letter to inform Meta of its inclusion on the 2024 Dirty Dozen List. We are glad to hear that Meta finally listened to us and will enact automatic blurring of nudity for ALL minor accounts on Instagram – a common sense move that should be industry standard. But Meta must extend this safety feature to all of its platforms,” said Lina Nealon, Vice President and Director of Corporate Advocacy, National Center on Sexual Exploitation.

Instagram has the second-highest rate of sextortion of any platform, according to the Canadian Centre for Child Protection. There have been multiple cases of 16- and 17-year-olds tragically dying by suicide linked to Instagram sexual extortion schemes. Meta CEO Mark Zuckerberg was forced to apologize to several of the victims’ parents, who stood holding their deceased children’s photos during the January 31, 2024, Congressional hearing on Big Tech’s role in fueling the online child sexual abuse crisis. Meta’s platforms – Facebook, Messenger, Instagram, and WhatsApp – have consistently been ranked for years as the top hotspots for a host of crimes and harms: pedophile networks sharing child sex abuse material, first contact between CSAM offenders and children, exploitative algorithms promoting children to adults, sex trafficking, sextortion, and image-based sexual abuse.

Though NCOSE supports Meta’s new changes, it also warns caregivers that teens can still view the content with a click of a button. “To truly protect children from the rising risks of sextortion, CSAM, sex trafficking, etc. – sexually explicit content should be completely blocked for minors. The cost of potentially life-threatening, lifelong trauma to kids far outweighs any possible benefit Big Tech tries to sell us for why minors should be able to receive or view sexually explicit content (a harmful and often illegal activity),” Nealon said.

“Meta, a billion-dollar company, should and must prioritize implementing substantive changes across all of its platforms to protect children and teens from harm. This includes halting end-to-end encryption – which allows Meta to effectively blind itself to the most egregious harms on its platforms. And Meta should ban and remove all ‘nudifying’ bots, profiles, and ads within the Meta library and across all Meta platforms and products.”

About National Center on Sexual Exploitation (NCOSE)
Founded in 1962, the National Center on Sexual Exploitation (NCOSE) is the leading national non-partisan organization exposing the links between all forms of sexual exploitation, such as child sexual abuse, prostitution, and sex trafficking, and the public health harms of pornography.

The Numbers


NCOSE leads the Coalition to End Sexual Exploitation with over 300 member organizations.


The National Center on Sexual Exploitation has had over 100 policy victories since 2010. Each victory promotes human dignity above exploitation.


NCOSE’s activism campaigns and victories have made headlines around the globe, averaging 93 mentions per week by media outlets and shows such as Today, CNN, The New York Times, BBC News, USA Today, Fox News, and more.


