Meta Pledges Action After Hundreds of Nudifying App Ads Exposed 

Fourteen-year-old girls in New Jersey were violated and humiliated when their classmates used AI to create nude images of them and distributed them across the school.

A female politician’s promising career was thrown into disarray when she discovered AI-generated pornography of herself online.  

A young woman’s trust was utterly destroyed when her own best friend posted AI-generated images of her on a pornography website.  

These are just a few real-life examples of how people's lives have been ruined through AI-generated image-based sexual abuse (IBSA), also known as "deepfake pornography." AI-generated IBSA is a rapidly growing form of sexual violence that is impacting countless people worldwide. No one is immune: if a photo of you exists online, you could be victimized.

A common way AI-generated IBSA is created is through “nudifying apps,” which allow a user to take innocuous images of women and “strip” them of clothing. 

With the help of grassroots advocates like you, NCOSE has succeeded in getting many mainstream corporations to remove these apps or advertisements for them. Disturbingly, ads for nudifying apps still abound on Meta platforms like Instagram—even though these platforms are frequented by kids.  

On the 2024 Dirty Dozen List, NCOSE called for Meta to ban and remove all ads like these. Yet while they are banned on paper and some are removed, it remains alarmingly easy to find these ads in droves. Just a few weeks ago, a CBS investigation uncovered hundreds of ads for nudifying apps on Meta platforms. Shortly after, Meta issued a press release detailing their plan to crack down on nudifying app ads.  

While we regard Meta's promises with a healthy dose of skepticism, given their long track record of safety features falling short, we tentatively celebrate this as progress. Thank you for raising your voice through the Dirty Dozen List and pressing Meta to change! We will be keeping a close eye on Meta to evaluate whether their public statements are followed by real improvement in the problem of nudifying app ads.


Meta Promises Improved Detection Technology

In their recent press release, Meta announced they are building new technology to more accurately detect ads for nudifying apps. They claim this technology will work even when the ads don't include nudity and would therefore evade Meta's nudity-detection system. Further, Meta says they are using matching technology to help "find and remove copycat ads more quickly."

Meta rightly states that nudifying apps are proliferating across the Internet, and removing them from one platform is not enough. As such, Meta committed to sharing the URLs to violating apps and websites with other tech companies, so those companies can investigate and hopefully take action as well. Meta says they have provided more than 3,800 unique URLs to other tech companies since March 2025.  

Meta Files Lawsuit Against Developer of Nudifying Apps  

Meta announced that it is suing Joy Timeline HK Limited, the entity behind the "CrushAI" nudifying apps. They state that the lawsuit "follows multiple attempts by Joy Timeline HK Limited to circumvent Meta's ad review process and continue placing these ads, after they were repeatedly removed for breaking our rules."

While there is likely a PR element to this move, as Meta may wish to shift focus to the developers' culpability rather than its own insufficient safeguards, this is still good news. We hope the lawsuit will help deter nudifying app developers who fear being held accountable under the law. At the same time, we must not lose sight of Meta's own responsibility to keep its platforms safe.

Section 230 of the Communications Decency Act allows online platforms like Meta to dodge liability for their dangerous products, placing blame solely on third parties, such as the people requesting ads or uploading content. There is no doubt that these people bear responsibility—but so does Meta.  

Urge Congress to repeal Section 230, so that online platforms like Meta can be held accountable!

The Numbers

300+

NCOSE leads the Coalition to End Sexual Exploitation with over 300 member organizations.

100+

The National Center on Sexual Exploitation has had over 100 policy victories since 2010. Each victory promotes human dignity above exploitation.

93

NCOSE's activism campaigns and victories have made headlines around the globe, averaging 93 mentions per week by media outlets and shows such as Today, CNN, The New York Times, BBC News, USA Today, Fox News, and more.

Stories

Survivor Lawsuit Against Twitter Moves to Ninth Circuit Court of Appeals

Survivors’ $12.7M Victory Over Explicit Website a Beacon of Hope for Other Survivors

Instagram Makes Positive Safety Changes via Improved Reporting and Direct Message Tools

Sharing your experience can be a restorative and liberating process. This is a place for those who want to tell their story.

Support Dignity

There are more ways that you can support dignity today, through an online gift, taking action, or joining our team.

Defend Human Dignity. Donate Now.
