Apple to Automatically Blur Sexually Explicit Content for Kids 12 and Under, a Change NCOSE had Requested


WASHINGTON, DC (June 8, 2023) – The National Center on Sexual Exploitation (NCOSE) commends Apple for implementing a major child safety change that NCOSE had requested: Apple’s nudity detection and blurring feature will now apply to videos as well as stills and will be on by default for child accounts age 12 and under.

“This is a major step in child protection by Apple and a victory for families. In partnership with our ally Protect Young Eyes, NCOSE has been calling on Apple for years to proactively turn on all child safety features as the default. We are grateful to Apple for enacting this change that will protect children from nudity, sexually explicit images, and pornographic content that can be harmful to minors in and of itself, but that is also often used to groom children for sexual abuse,” said Lina Nealon, Vice President and Director of Corporate Advocacy, National Center on Sexual Exploitation.

Predators send sexually explicit content as a way to build trust with both children and adults, usually asking for images in return. Once sexually explicit content is obtained, the predator may use it for sextortion (to obtain money, additional images, or as leverage for other demands), or may share it with others – possibly even posting the content on social media and pornography sites.

Teens and adults will also be able to opt in to this critically important safety feature. NCOSE hopes Apple will eventually turn this feature on by default for everyone – especially for children ages 13–17 (87% of whom own an iPhone), who are also targeted for sexual exploitation and sextortion at increasing rates and deserve greater protections. More and more adults, too, are falling victim to sextortion, image-based sexual abuse, and “cyberflashing.” Apple’s AirDrop feature in particular has been under fire for years as a means to “drop” sexually explicit content onto other people’s phones.

Apple has also made this technology available to developers through an API – a commendable example of cross-platform collaboration around child safety. Discord said it plans to use it on its platform through iOS.

“We urge all companies – especially those most popular with teens, like Snapchat and TikTok – to integrate this child safety measure for free through the API. Apple has set an industry standard and in effect has issued an invitation to its tech peers to join them: we trust other companies will prove they truly care about protecting their most vulnerable users by accepting Apple’s offer,” Nealon said.

While NCOSE applauds Apple for this significant improvement, the Apple App Store remains on the 2023 Dirty Dozen List for deceptive app age ratings and descriptions that mislead families about the content, risks, and dangers children face on available apps.

About National Center on Sexual Exploitation (NCOSE)
Founded in 1962, the National Center on Sexual Exploitation (NCOSE) is the leading national non-partisan organization exposing the links between all forms of sexual exploitation – such as child sexual abuse, prostitution, and sex trafficking – and the public health harms of pornography.

The Numbers


NCOSE leads the Coalition to End Sexual Exploitation with over 300 member organizations.


The National Center on Sexual Exploitation has had over 100 policy victories since 2010. Each victory promotes human dignity above exploitation.


NCOSE’s activism campaigns and victories have made headlines around the globe, averaging 93 mentions per week by media outlets and shows such as Today, CNN, The New York Times, BBC News, USA Today, Fox News, and more.

