Mainstream Contributors To Sexual Exploitation
For countless children and adults, those moments include their worst. Grooming, child sexual abuse material, sex trafficking, and many other harms continue to fester on Instagram.
Like so many other 12-year-old girls, Maya* just wanted to feel special.
And that’s how Robert* made her feel. He may just be a stranger messaging her on Instagram, but he was kind to her, told her she was pretty . . . So, when Robert asked for naked photos, Maya thought it couldn’t hurt.
She was wrong.
The interactions escalated to meeting in person, and before she knew it, Maya was being sex trafficked. Robert advertised her on her own Instagram profile, which he had taken control of, and soon Maya was receiving direct messages from other men intending to pay to sexually abuse the 12-year-old girl.
Even after Maya escaped her sex trafficking situation, the exploitation did not end. Her trafficker had taken explicit photos and videos of her being sexually abused and continued to sell these on Instagram. The direct messages from sex buyers kept coming as well.
Overwhelmed and feeling helpless, Maya fell into depression . . . At age fifteen, she died at the hands of a 43-year-old man who had been contacting her on Instagram.
Instagram failed Maya. Its tools to detect grooming did not prevent an underage girl from being contacted by sexual exploiters and older men. The child sexual abuse material circulated by Maya’s trafficker was reported, yet it remained on the platform at the time of Maya’s death.
Tragically, Maya is only one of countless children who are exploited and harmed on Instagram.
A 2022 study published by Thorn, a leading resource on online child exploitation, found that Instagram tied with Kik and Tumblr as the platform where minors reported the second highest rates of online sexual interactions with people they thought were adults. And a recent survey of 1,000 parents across the US by ParentsTogether found that Instagram correlated with higher rates of parent-reported children sharing sexual images of themselves – a form of child sexual abuse material (CSAM).
Children are also regularly exposed to pornography and harmful content on Instagram. The UK Children’s Commissioner report found that 33% of children who had seen pornography saw it on Instagram. Further, Instagram was the only platform to rank in the “top 5 worst” for every category of harm in the 2022 Bark Report: severe sexual content, severe suicidal ideation, depression, body image concerns, severe bullying, hate speech, and severe violence.
How is it possible that, after all the scrutiny and backlash Instagram has rightly faced, the company still has not made meaningful improvements? Why does it remain at the top of every list of the most harmful platforms? NCOSE is demanding answers—and more than that, we are demanding real change!
Review the proof we’ve collected, read our recommendations for improvement, and see our notification letter to Instagram for more details.
*Pseudonyms are used to protect the privacy of the victimized persons.
Multiple news stories, survivor and whistleblower testimony, lawsuits, and external reports support our assessment that Instagram’s changes are not doing much to actually protect children. For years, Instagram has sat at the top of nearly every list of the most dangerous apps for youth…and recent reports are no different:
See the evidence we’ve compiled in this easy-to-view, downloadable PDF
Articles about Instagram’s inaction and poor policy implementation abound; here are some from the past year:
Further, as NCOSE and other advocates like Collective Shout and Defend Dignity have long discussed and brought to Instagram’s attention, there is a rampant problem of parasitic accounts that harvest photos of young children for fetishization and sexualization. In these cases, commenters overtly sexualize children and often network to arrange the trading of more explicit content (child sexual abuse material). NCOSE and allied advocates have raised examples of these accounts with Instagram for years.
See the evidence we’ve compiled in this easy-to-view, downloadable PDF
Unfortunately, NCOSE researchers quickly came across dozens of accounts on Instagram bearing strong markers of likely trading or selling of child sexual abuse material (i.e., child pornography). Why isn’t Instagram proactively removing such networking and sales accounts?
See the evidence we’ve compiled in this easy-to-view, downloadable PDF
In addition to the many harms perpetrated against children, we know they are not the only ones at risk on the platform. Instagram is increasingly being called out as a hub for image-based sexual abuse: the nonconsensual capture, posting, and sharing of sexually explicit images. The Center for Countering Digital Hate found that Instagram failed to act on 90% of abuse sent via direct message to high-profile women.
See the evidence we’ve compiled in this easy-to-view, downloadable PDF
#1 platform for reported rates of sextortion (Learn More)
#2 platform where minors have had a sexual experience with an adult and also #2 for minors reporting any sexual experience
#2 social media platform where children were most likely to have seen pornography
#2 parent-reported platform for sexually explicit requests to children