A Mainstream Contributor To Sexual Exploitation

Capture and Share The World's Moments

For countless children and adults, that includes their worst moments. Grooming, child sexual abuse material, sex trafficking, and many other harms continue to fester on Instagram.

Update, December 7, 2023: In a decision endangering millions of children and celebrated by predators worldwide, Meta announced that all messages on Facebook and Messenger will now be end-to-end encrypted (E2EE) by default. E2EE will soon be implemented across all Meta platforms, including Instagram Direct. This “see no evil” policy in effect ends child sexual abuse investigations on Meta, which in 2022 alone reported 20 million instances of CSAM on Facebook and Messenger to NCMEC. This choice is a devastating blow to child safety on a platform that is one of the greatest (if not the greatest) facilitators of sexual abuse and exploitation. Read our press release here.

Given this recent unethical choice by Meta, coupled with the ever-growing body of evidence of grave harms to children across all its platforms, the National Center on Sexual Exploitation calls on Meta to no longer allow minors on its platforms. The risks are too great, and Meta has proven itself unwilling and unable to protect kids.

Read more from leading child safety expert John Carr about the consequences of Meta’s decision.

Take Action

Like so many other 12-year-old girls, Maya* just wanted to feel special.

And that’s how Robert* made her feel. He may have been just a stranger messaging her on Instagram, but he was kind to her, told her she was pretty . . . So, when Robert asked for naked photos, Maya thought it couldn’t hurt.

She was wrong.

The interactions escalated to meeting in person, and before she knew it, Maya was being sex trafficked. Robert advertised her on her own Instagram profile (which he had taken control of), and soon Maya was receiving direct messages from other men intending to pay to sexually abuse the 12-year-old girl.

Even after Maya escaped her sex trafficking situation, the exploitation did not end. Her trafficker had taken explicit photos and videos of her being sexually abused and continued to sell these on Instagram. The direct messages from sex buyers kept coming as well.

Overwhelmed and helpless, Maya fell into depression . . . At age fifteen, she died at the hands of a 43-year-old man who had been contacting her on Instagram.

Instagram failed Maya. Its tools to detect grooming did not prevent an underage girl from being contacted by sexual exploiters and older men. The child sexual abuse material circulated by Maya’s trafficker was reported, but it was still on the platform at the time of Maya’s death.

Tragically, Maya is only one of countless children who are exploited and harmed on Instagram.

A 2022 study published by Thorn, a leading resource on online child exploitation, found that Instagram tied with Kik and Tumblr as the platform where minors reported the second highest rates of online sexual interactions with people they thought were adults. And a recent survey of 1,000 parents across the US by ParentsTogether found that Instagram correlated with higher rates of parent-reported children sharing sexual images of themselves – a form of child sexual abuse material (CSAM).

Children are also regularly exposed to pornography and harmful content on Instagram. A report by the UK Children’s Commissioner found that 33% of children who had seen pornography saw it on Instagram. Further, Instagram was the only platform to rank in the “top 5 worst” for every category of harm in the 2022 Bark Report: severe sexual content, severe suicidal ideation, depression, body image concerns, severe bullying, hate speech, and severe violence.

How is it possible that, after all the scrutiny and backlash Instagram has rightly faced, it still has not made meaningful improvements? Why does it remain at the top of every list of the most harmful platforms? NCOSE is demanding answers. More than that, we are demanding real change!

Review the proof we’ve collected, read our recommendations for improvement, and see our notification letter to Instagram for more details.

See our Notification Letter to Instagram here.

*Pseudonyms are used to respect the privacy of the victimized person.

Our Requests for Improvement

Proof

Evidence of Exploitation

WARNING: Any pornographic images have been blurred, but are still suggestive. There may also be graphic text descriptions shown in these sections. POSSIBLE TRIGGER.

Multiple news stories, survivor and whistleblower testimony, lawsuits, and external reports support our assessment that Instagram’s changes aren’t actually doing much to protect children. For years, Instagram has been at the top of nearly every list of the most dangerous apps for youth, and recent reports are no different:

See the evidence we’ve compiled in this easy-to-view and downloadable PDF

Articles about Instagram’s inaction and poor policy implementation abound; here are some from the past year:

Further, as NCOSE and other advocates like Collective Shout and Defend Dignity have long discussed and brought to Instagram’s attention, there is a rampant problem of parasitic accounts that collect photos of young children for fetishization and sexualization. In these cases, commenters often overtly sexualize the children and network to discuss trading more explicit content (child sexual abuse material). NCOSE and allied advocates have brought examples of these accounts to Instagram for years.

See the evidence we’ve compiled in this easy-to-view and downloadable PDF

Unfortunately, NCOSE researchers quickly came across dozens of accounts on Instagram with strong markers of likely trading or selling of child sexual abuse materials (i.e., child pornography). Why isn’t Instagram proactively removing such networking and sales accounts?

See the evidence we’ve compiled in this easy-to-view and downloadable PDF

In addition to the many harms perpetrated against children, we know children are not the only ones at risk on the platform. Instagram is increasingly being called out as a hub for image-based sexual abuse – the nonconsensual capture, posting, and sharing of sexually explicit images. The Center for Countering Digital Hate found that Instagram failed to act on 90% of abuse sent via direct message to high-profile women.

See the evidence we’ve compiled in this easy-to-view and downloadable PDF

As noted above, NCOSE and allied advocates have brought examples of these parasitic accounts to Instagram for years (see this memo sent to Instagram in November 2021).

The Wall Street Journal’s investigative piece “Instagram Connects Vast Pedophile Network” elevated what NCOSE has been pressing Instagram on for years, and the Stanford Internet Observatory echoed our frustration that small groups of researchers are able to find instances of abuse almost instantly: why can’t Instagram?

“That a team of three academics with limited access could find such a huge network should set off alarms at Meta,” he said, noting that the company has far more effective tools to map its pedophile network than outsiders do. “I hope the company reinvests in human investigators,” he added. 

Test accounts set up by researchers that viewed a single account in the network were immediately hit with “suggested for you” recommendations of purported child-sex-content sellers and buyers, as well as accounts linking to off-platform content trading sites. Following just a handful of these recommendations was enough to flood a test account with content that sexualizes children. 

The Stanford Internet Observatory used hashtags associated with underage sex to find 405 sellers of what researchers labeled “self-generated” child-sex material—or accounts purportedly run by children themselves, some saying they were as young as 12. According to data gathered via Maltego, a network mapping software, 112 of those seller accounts collectively had 22,000 unique followers. 

Underage-sex-content creators and buyers are just a corner of a larger ecosystem devoted to sexualized child content. Other accounts in the pedophile community on Instagram aggregate pro-pedophilia memes, or discuss their access to children. Current and former Meta employees who have worked on Instagram child-safety initiatives estimate the number of accounts that exist primarily to follow such content is in the high hundreds of thousands, if not millions. 

Minors, including pre-teens, are still being recommended via Explore, Suggested for You, and (particularly concerningly) Reels, where pre-teens are essentially providing sexual entertainment for men. Some girls have hundreds of thousands of followers and millions of views on their Reels.

Pre-teen girls followed by, or targeted with approaches from, predators.

Accounts operating as pedophile networks.

Collective Shout frequently documents incidents where men make sexualized, abusive, or predatory comments to underage girls and the account owner ‘likes’ the comment. Instagram repeatedly comes back with “no violation.”

Hashtags also remain a serious concern, given the ease with which they connect predators with children. Repeat offenders posting sexualized comments should be barred from the platform, and “suspicious adults” should be prevented from commenting on minors’ content, as well as on managed accounts for children 12 and under. We’ve also asked Instagram to prohibit re-sharing or re-posting minors’ content, or content portraying minors, without permission.

See the evidence we’ve compiled in this easy-to-view and downloadable PDF

Fast Facts

#1 platform for sextortion (Learn More)

#2 platform where minors reported having a sexual experience with an adult, and #2 for minors reporting any sexual experience

Apple App Store rating: 12+, Google Play Store rating: T

#2 platform for parent-reported sexually explicit requests to children

Similar to Instagram

TikTok

Especially since Instagram introduced Reels, NCOSE has focused heavily on both platforms. However, this year we felt TikTok was already getting a lot of (deserved) attention from policymakers and the press. We wanted to elevate Instagram on the Dirty Dozen List (DDL) because it has never actually been on the List before (Meta and Facebook have been), and recent data shows that child online exploitation seems to be much more rampant on Instagram than on TikTok.

Share

Help educate others and demand change by sharing this on social media or via email.

Share Your Story

Your voice—your story—matters.

It can be painful to share stories of sexual exploitation or harm, and sometimes it’s useful to focus on personal healing first. But for many, sharing their past or current experiences may be a restorative and liberating process.

This is a place for those who want to express their story.