Picture this:
You’re a parent, and it’s your child’s 13th birthday. All they want is an Instagram account, and they’re finally old enough. “All my friends have Instagram,” they say. You’re hesitant, but they do meet Instagram’s minimum age requirement of 13, and the Apple App Store rates Instagram 12+. So it should be okay.
To be extra sure they’re safe, you follow your child on Instagram to monitor their activity.
A few weeks later, as you’re scrolling through your Instagram feed to see what your child is up to, you see something that makes your heart stop: an advertisement for apps that use AI to create nude images of people.
You’re appalled that this type of advertisement exists at all … and how on Earth is it running on an app rated for 12- and 13-year-olds?
An Epidemic of AI-Generated Sexual Abuse Images
Below are examples of ads for “nudifying” apps that ran on Instagram and other Meta platforms:
These ads are part of an ecosystem fueling an enormous explosion in AI-generated sexually explicit images. This is a form of child sexual abuse material (CSAM) when the images depict children, and a form of image-based sexual abuse (IBSA) when they depict adults.
This type of abuse is becoming alarmingly common. Research published by Thorn in 2024 found that 1 in 10 kids say their friends or classmates use AI to generate CSAM of other kids.
In Florida, middle school students created AI-generated CSAM of their peers. Girls at a New Jersey high school were humiliated when their classmates did the same thing. As AI tools become more accessible, this kind of peer-on-peer abuse is growing increasingly common. Because Instagram is so widely used by children, promoting nudifying apps on the platform is a significant enabler of the creation of CSAM.
Nudifying apps enable users to digitally “remove” the clothing from an image of virtually anybody (although many only work on images of women and girls, highlighting the gendered nature of sexual violence). All the user has to do is upload a photo of the person he wants to “nudify,” and artificial intelligence does the rest. These forged images are typically created without the consent of the person depicted and can be shared online for thousands to see, damaging the reputation and mental health of those they depict.
Meta’s Platforms Are Overflowing with Ads for Nudifying Apps
Meta did recently make some historic changes to protect children online by introducing safer “teen accounts” on Instagram. Teen accounts turn on safety settings by default for all users under 18, and users under 16 need a parent’s permission to change those settings. We acknowledge and are very grateful for these changes, but Meta’s platforms still harbor myriad advertisements for nudifying apps.
One user collected hundreds of ads for nudifying apps that appeared in their Instagram feed; even after they reported the ads, more continued to appear.
“No matter how many times I report the ads, and laboriously click through the selections that say I am not interested in them (Instagram makes this unnecessarily hard, and tedious), the ads keep getting served,” the user said.
The Washington Post reported finding “222 ads on Meta’s platforms for five different tools that offer to generate fake nude images of real people.” 404 Media investigated Meta’s Ad Library and found evidence of multiple ads for nudifying apps running on both Instagram and Facebook.
NCOSE was one of the early voices sounding the alarm about nudifying apps on Meta. This was one of the dangers we called out when placing Meta on NCOSE’s 2024 Dirty Dozen List. Thanks to you joining us in this campaign, the issue has since garnered significantly more attention, which we hope will soon lead to real change!
Progress Fighting AI-Generated IBSA & CSAM
Thankfully, we have had some victories in the fight against AI-generated CSAM & IBSA.
Microsoft’s GitHub, one of the foremost artificial intelligence development platforms, made significant policy changes to prohibit the creation and distribution of software code used to create AI-generated sexually explicit images. GitHub is arguably the platform with the most power to combat AI-generated IBSA and CSAM; the importance of this victory cannot be overstated!
Further, just weeks after Apple and LinkedIn were named to the 2024 Dirty Dozen List, Apple removed four nudifying apps from the App Store and LinkedIn removed nudifying bot promotions and articles. Google also banned all ads for deepfake pornography apps, bots, and sites; removed 27 nudifying ads; and removed at least three nudifying apps. YouTube took down 11 channels and over 120 videos associated with nudifying apps.
With 2.2 billion active Apple devices across the globe, 1 billion LinkedIn members, Google ads reaching approximately 4.77 billion people, and YouTube reaching almost 2.5 billion monthly active users, it’s safe to say these changes have an enormous impact!
Legislation Must Require ALL Tech Companies to Fight This Epidemic
While other big tech companies are joining the fight against nudifying apps, Meta is lagging behind, with its platforms still showing a plethora of these advertisements.
We must pass legislation that requires ALL tech companies to combat the epidemic of AI-generated IBSA & CSAM!
Three key bills are currently before Congress:
- The DEFIANCE Act (S. 3696/H.R. 7569) would establish a civil cause of action, allowing victims to hold perpetrators accountable for distributing or threatening to distribute AI-generated sexual abuse images.
- The TAKE IT DOWN Act (S. 4569/H.R. 8989) would criminalize publishing or threatening to publish AI-generated sexual abuse images and other forms of IBSA.
- The SHIELD Act (S. 412/H.R. 3686) would criminalize the distribution of non-consensually shared explicit images, as well as sexually exploitative images of children that fall outside the current legal definition of CSAM.
We have made great progress on all three of these bills! But we need your help pushing them across the finish line. Please TAKE ACTION NOW, urging Congress to pass this crucial legislation!