
How many children have to die before Instagram will do something?

“I have to tell you, our son is gone.”

These are words no parent should ever have to hear or speak. But this is what Martin* had to say to his wife, Chantel,* over the phone one evening. 

Martin and Chantel’s teenage son Jason* died by suicide after he was sexually exploited on Instagram. The exploiters used surreptitiously obtained child sexual abuse material of Jason to threaten him, demanding money. They also threatened to hurt or kill his parents. 

“You might as well end it now,” Jason’s exploiters told him at one point.

And that’s what Jason did. 

*Names changed to protect the privacy of the victim and his family

Instagram Tops Charts for Sexual Extortion and Other Harms to Children

Jason was a victim of sexual extortion (popularly referred to as “sextortion”). Sexual extortion is the use of sexual images to blackmail the person or persons depicted. The blackmail may be for the purpose of obtaining money or more sexually explicit material, coercion into in-person sex or sex trafficking, pressuring the person to stay in a relationship, or other benefits to the perpetrator.

Sexual extortion frequently occurs through contact on social media platforms—and on Instagram most of all. Recent data from Cybertip.ca found that in 88% of child sexual extortion cases, the perpetrator initially contacted the child on Instagram or Snapchat. The numbers were roughly split between the two platforms.

Instagram is also the #2 parent-reported platform for sexually explicit requests to children, and the #2 platform where minors have had a sexual experience with an adult (tied with Kik and Tumblr).

In fact, Instagram regularly tops the charts in analyses of harms to minors. It is the #1 platform where minors reported potentially harmful experiences, and the only platform ranked among the top 5 worst in every single category of harm in Bark’s Annual Report (those categories being severe suicidal ideation, severe sexual content, depression, body image concerns, severe bullying, hate speech, and severe violence).

These are among the many reasons NCOSE named Instagram to the 2023 Dirty Dozen List.

While Instagram has made a couple of positive changes since being named to the 2023 Dirty Dozen List—including updating parental controls and instituting a taskforce to investigate the facilitation of child sexual abuse material—there is still a long way to go before it ceases to be one of the most dangerous platforms for kids.

Instagram’s Long History of Prioritizing Profits over Safety

Recent scrutiny from Congress and the press has reaffirmed Instagram’s long history of prioritizing profits over safety. Time and again, Instagram and its parent company Meta have been shown to know about the severe harms the platform poses to children, yet fail to take proportionate action.

A former Meta security consultant, Arturo Bejar, has become the most recent whistleblower. Bejar collected extensive data on Instagram and Meta’s harms to children and raised this data directly with the company leadership, urging them to take action. Disillusioned with Meta’s inadequate response, Bejar took the issue to the Wall Street Journal and to Congress, testifying in a hearing on Tuesday, November 7th.

The data collected by Bejar and his team during his time as a consultant at Meta showed that one in three girls aged 13-15 had experienced unwanted sexual advances on Instagram, with one in eight experiencing such advances in the past seven days alone. Bejar shared how his own 14-year-old daughter and “virtually all [her] friends” had experienced sexual harassment and/or sexual advances from strangers on Instagram. When these underage girls reported such conduct to Instagram, they were typically ignored or even told the conduct didn’t violate Instagram’s community guidelines. In fact, it was revealed at the Tuesday hearing that Meta takes action on only 2% of user reports.

As revealed by Jason’s tragic story, the consequences of children’s negative experiences on Instagram are truly dire—even, in some cases, a matter of life and death. As Senator Blumenthal stated during Tuesday’s hearing: “How many children have to die before [Instagram] will do something?”

All of this is reminiscent of another whistleblower incident from two years ago, in which former Meta employee Frances Haugen shared with Congress how the company covered up internal data on their platforms’ severe harms to teens, rather than taking action to address the issues.

Time and time again, Instagram/Meta have been proven to care more about profit and PR than the safety of their users. It’s time to hold them to account.

What Changes Does Instagram Need to Make?

Instagram must make meaningful changes to PREVENT sexual exploitation and other risks to children on its platform. It is not enough to merely respond retroactively to harm that has already occurred. “Policy enforcement is analogous to the police,” Bejar wrote in his letter to Meta leadership—arguing that reactive responses to crime are necessary, but they are not what makes a community safe.

We are calling on Instagram to:

  • Block the sending and receiving of sexually explicit images for minors.
  • Disable direct messages for minors 15 and under, and restrict them for 16- and 17-year-olds. Direct messages are the primary way sexual exploiters contact, groom, and extort children.
  • Prevent the creation and distribution of child sexual abuse material, including self-generated imagery.
  • Prohibit accounts and all content sexualizing minors, including hashtags, emojis, and comments, both on minors’ accounts and on adult-managed accounts for children 12 and under.

You can read more of our requests to Instagram here.

Please join us in urging Instagram executives to make these and other changes! Take 30 seconds to fill out the quick action form below:
