Instagram is a free photo and video sharing app that has long reigned as one of the most popular social media platforms in the world. Owned by Meta (formerly Facebook), Instagram boasts more than 2 billion active users, and a recent study noted that 81% of 10,000 US teens surveyed said they use Instagram. Instagram is known as the “lifestyle” app – compelling users to present a perfect life (often using Instagram-created filters) and fueling endless comparisons about bodies that decrease self-esteem and increase vulnerability. Predators troll Instagram to exploit young people’s need to be liked, accepted, and praised – making it one of the most dangerous online spaces for kids.

In November 2019, the National Center on Sexual Exploitation partnered with DC-based, survivor-led service organization Courtney’s House, Australia’s Collective Shout, and Canada’s Defend Dignity to launch #WakeUpInstagram, a campaign calling out Instagram for facilitating child sexual abuse, sex trafficking, and the grooming of young children on its platform. In addition to galvanizing the general public to directly press Instagram to act, our coalition has continued to meet consistently with Instagram directly to advocate for concrete changes, provide expertise and evidence, and elevate survivor experiences on the platform.  

Many of the actions our coalition has urged Instagram to institute have been adopted and we applaud the platform for these improvements.  

At the time of the campaign’s launch, we urged Instagram to take three key actions:   

  1. Change its settings so that strangers cannot direct message minors. 
  2. Fix its algorithm to proactively remove sexualizing or sexually graphic comments on minors’ photos. 
  3. Update its reporting system so that if someone is reporting a sexual comment on a minor’s post it can be reported as such. The “harassment/bullying” selection does not capture the fact that these comments come from adults who are grooming/sexualizing/harassing a child. 

Happily, since 2020, Instagram has instituted many important safety changes in response to these three requests.

In October 2020, Instagram expanded its reporting system to account for sexual exploitation or solicitation, non-consensually shared images, and sexually exploitative actions involving a child. It also implemented a function allowing any user to opt out of receiving direct messages from strangers.

Later, in March 2021, Instagram went further and made it the default setting that adult users who have not been “friended” by a minor cannot send them a DM at all. If the adult user tries to do so, they receive a notification that DM-ing a minor is not an option. Furthermore, Instagram incorporated in-app prompts to encourage teens to be cautious in conversations with adults they’re already connected to. Instagram developed the ability to monitor suspicious behavior by adults and alert the teen; for example, if an adult is sending a large amount of friend or message requests to teens, Instagram would alert the teen about this behavior within their DMs and point them to an option to end the conversation or block or restrict the adult.  

In addition to encouraging teens to be more cautious, Instagram also began exploring how to make it more difficult for “suspicious adults” to find and follow teens. According to Instagram, this may include taking actions against adults who have been exhibiting potentially suspicious behavior (e.g. sending friend requests to several teens at once or being repeatedly blocked by many minors). Preventative actions could include restricting those adults from seeing teen accounts in ‘Suggested Users’, preventing them from discovering teen content in Reels or Explore, and automatically hiding their comments on public posts by teens.

In March 2022, Instagram launched a new Family Center with teen safety tools, parental monitoring, and education resources. The parental controls allow guardians to monitor a child’s time on Instagram, see with whom their children are interacting, and receive alerts if a child reports a user or post. While any step toward child protection is good, these changes do little to make the platform safer (more on this below).

These are all laudable changes, and we are grateful to Instagram for adopting them. However, it is imperative that the change does not stop here – especially in light of recent research that sheds light on the extent of the harm Instagram causes children.

NCOSE's Memo to Instagram

Read NCOSE's email memo to Instagram in November of 2021

However, the #WakeUpInstagram campaign continues for two main reasons.  

First: Despite the incremental improvements, many more changes must be made in order to make the platform safe, as evidenced by the fact that Instagram continues to be cited by researchers, survivors, and law enforcement as one of the most dangerous platforms for children and vulnerable adults.

Second: Meta (formerly Facebook, the owner of Instagram), was exposed by whistleblower Frances Haugen in 2021 to have acted in bad faith by hiding evidence of Instagram’s harms to minors – especially girls. Instead of making substantive improvements to stem those harms, Meta directed resources to the development of Instagram for Kids to try to expand its user base and hook kids to its brand at even younger ages.   

Instagram as a primary social media platform through which minors experience harm  

Instagram is consistently noted as a top platform used for grooming and child sex trafficking.  

The 2020 Federal Human Trafficking Report released by the Human Trafficking Institute found that 65% of child sex trafficking victims recruited on social media were recruited on Facebook, with Instagram cited as the second most frequently used platform (14%). 

The latest data from the UK’s National Society for the Prevention of Cruelty to Children noted Instagram as the most commonly used site for grooming, flagged by police in 32% of instances in 2020 where the platform was known.

In May 2021, the child protection non-profit Thorn published some highly informative quantitative research, based on data collected in 2020. According to this research, Instagram ranked second as the social media platform most widely used by minors. 76% of the 9-17 year-olds surveyed by Thorn reported using Instagram (50% at least once a day); this level of use was second only to YouTube, at 97%.

Instagram ranked at the top among platforms for various harms caused to minors. Thorn found the following regarding harm on Instagram: 

  • 26% of surveyed minors reported having had a potentially harmful online experience through Instagram (tied with Snapchat for the highest percentage).
  • Instagram tied with Snapchat again as the most popular platform where survey participants said they have had an online sexual interaction (16% of all respondents). Sexually explicit interactions could include being asked to send a nude photo or video, being asked to go ‘on cam’ with a sexually explicit stream, being sent a sexually explicit photo (of themselves or another child), or being sent sexually explicit messages, etc.
  • Of those who use Instagram at least once a day, 22% reported experiencing a sexually explicit interaction on the platform (second only to Snapchat at 23%). 
  • Most disturbing, Thorn notes that among the most used platforms, Instagram (together with Snapchat) appears to host the highest concentration of sexually explicit interactions between minors and adults (13% of users).
  • Teenage girls, who are particularly vulnerable to online sexual interactions, have the majority of these experiences on Instagram (21%) and Snapchat (21%) —mirroring the experience of minors overall.  

Given Instagram’s top-ranking rates with respect to both use by minors and dangers posed to minors, it is absolutely incumbent upon Instagram to change this: they must pursue all possible measures to ensure the safety of the children who use their platform.

Instagram hiding and ignoring research demonstrating the platform’s harms to minors   

Thorn is not the only entity that has conducted research on Instagram’s harms to minors. Facebook, the owner of Instagram, conducted such research as well, and found Instagram to be harmful to minors (especially young girls) in numerous ways. Very disappointingly, however, Facebook failed to act honestly and responsibly in the face of its own research, and instead concealed the findings from the public. It took a whistleblower’s revelations and subsequent hearings in the US Senate to bring to light what many had long suspected: that Facebook consistently chose profit over child protection and well-being.


Yet even while knowing all these facts about how Instagram harms children, Facebook still strove to attract more children to its brand at an increasingly young age. In March 2021, an internal Facebook document was leaked by Buzzfeed revealing plans for “Instagram Kids”, which would target children 12 years and younger. While pressure from the National Center on Sexual Exploitation and other child safety advocates, legislators, and 44 US attorneys general successfully moved Facebook to pause its plans for Instagram for Kids, the fact that these plans were adopted at all while Facebook was fully aware of the harms Instagram causes to minors is indicative of the company’s callous prioritization of its bottom line over the safety of our children.  

What Instagram Still Needs To Improve:

Invest in and expand use of existing technology to scan for and block sexually explicit images in posts and messages.

Survivors of sex trafficking share that sexually explicit images of them taken by their traffickers continue to be posted and reposted across Instagram – even when the child is able to leave the trafficker. Pimps will often leave their victims’ pages up, using them as marketing and grooming tools. Given the exponential rise in child sex abuse material and non-consensually shared images, Instagram should prioritize expanding its use of existing technology, including AI, to scan for and remove sexually explicit images/nudes in posts and direct messages and to remove and block the offending users.

Prioritize using and/or creating tools to prevent, identify, and remove content, hashtags, and comments that sexualize children on minors’ accounts, as well as on managed accounts for children 12 and under. Remove and block offenders.

Sexualization and sexual exploitation of children continues to be a serious problem on Instagram. While some children are themselves posting sexualized content, hashtags, or comments, we know that adults are using the platform to normalize the idea that children are sex objects, grooming children through comments (and still through DMs, even with the new policy), and using Instagram to sell children’s images or children themselves. Greater restriction by Instagram on the type of content allowed, as well as expanded use of technology to identify and remove material, could significantly reduce predators’ access to children – including pedophiles, pimps, and sex buyers.

Repeat offenders posting sexualized comments should be barred from the platform and “suspicious adults” should be prevented from commenting on minors’ content, as well as on managed accounts for children 12 and under.

Minors including pre-teens are still being recommended via Explore, Suggested for You, and (particularly concerning) Reels – where pre-teens are essentially providing sexual entertainment for men. Some girls have hundreds of thousands of followers and millions of views on their Reels.

Related to the point above, Instagram should expand its Community Guidelines to prohibit hashtags, emojis, usernames, etc. that sexualize minors. While we appreciate the warnings that now appear on searches such as “teen pornstar” or “teen porn,” allowing those terms at all diminishes the gravity of a criminal act and normalizes a growing crisis. That’s the “best case” scenario. At worst, that content is being used to signal to predators the availability of CSAM, trafficking victims, etc.

We recommend reviewing TikTok’s expanded Community Guidelines as a bare minimum comparison.

Remove or limit risk-accelerating features for minors, especially 13–15-year-olds.

The industry is moving toward limiting or removing altogether some features for minors that have proven to be high risk. We appreciate the steps taken to default some features – like “privacy” for new users aged 13–15 – and encourage Instagram to continue assessing each available feature and whether adjustments need to be made for minors, including implementation of a graduated approach, which could significantly improve the safety of young teens who are new on their digital journey and lack the brain development to fully understand risks and consequences.

  1. Stop recommending unconnected adults to minors in “Discover People.” As many young people are in a social-media arms race with their peers to have as many followers as possible, this feature encourages them to befriend strangers. Even adults connected to friends are not friends. They are strangers.
  2. Restrict all adults from seeing minors in “Suggested Users” or “Discover People.” We thank Instagram for making it harder for “suspicious adults” to see teen accounts. We recommend expanding this to prevent all adults from seeing minors’ accounts. If a minor knows an adult in real life, they can very easily obtain their user information and connect that way – and vice versa. 
  3. Disable Vanish Mode which is often used to send sexually explicit images and/or hide dangerous interactions.
  4. Remove DM ability for 13–15-year-olds altogether (as TikTok has done) given the high risk and reality of grooming and exploitation through this feature. If Instagram continues to move toward end-to-end encryption, considering this option becomes even more crucial for child safety.
  5. Remove the ability for any unconnected adult to comment on minors’ posts, even if the account is public.

Instagram has proven to be dangerous for children. And while we advocate for greater responsibility by Instagram itself in amending problematic features, we also advocate for the critical need for parents and caregivers to have more options in protecting their children and managing their online experiences. We find the “parental controls” (initially announced days before Mr. Mosseri was to testify in front of Congress and rolled out in March 2022) utterly insufficient to stem the many harms on the platform. Children have to approve parental supervision and can turn it off at any time. And while monitoring screen time and who children are interacting with are definite improvements, they are insufficient. 

It’s past due for Instagram to develop tools that allow caregivers greater oversight. Tools could include a graduated approach, with greater oversight available for 13–15-year-olds who are early in navigating the digital landscape. At the very least, caregivers should have the option of being alerted when settings (such as privacy) are changed and the ability to block certain interactions with adults. 

We’ve heard Instagram argue against pin-protected controls saying this option may be used to assert greater dominance by domestic abusers, traffickers, and other perpetrators over their victims. While this may be the case, it is a small percentage of people whose abusers already likely have all of their account information (passwords, etc.), and are, unfortunately, already monitoring their every move. We recommend creating features that could protect a large majority of young users from finding themselves in risky or dangerous situations in the first place. 

The current process to report an account of someone aged 12 or under is overly burdensome and should be rectified given Instagram’s own admission that there are countless young children on the platform.

Instagram should remove all accounts promoting or hosting prostitution and pornography – either individual users or corporate accounts, such as Pornhub and OnlyFans. At the very least, they should be blocked for minors.

While we are grateful and relieved that Facebook has paused the plans for Instagram for Kids, such plans must be cancelled completely. Rather than building a new product to “hook” kids at even younger ages, Instagram should prioritize stemming the rampant sexual abuse and exploitation of the many minors currently on Instagram. 

Instagram needs to clearly warn parents and youth of the risks of using their platform and provide information and resources about grooming, sex trafficking, sextortion and other threats known to exist on Instagram. 

Instagram most recently unveiled the Family Center, a 67-page Parent Guide, and an Education hub, which include information and tips for parents and youth to make their Instagram experience safer and healthier – with resources on bullying, suicide prevention, misinformation, etc. (all important and good!). However, glaringly absent is any information, resource, or reference about how to prevent, recognize, or report grooming, sex trafficking, child sex abuse material, or pornography – major safety risks which we know are prevalent on Instagram. In fact, the section in the Safety Center on reporting content and accounts inexplicably excludes reporting for “nudity or sexual activity” (which thankfully IS a reporting option on the app that we have worked with Instagram to improve). If this is the primary resource for parents and kids, it is deceptive and dangerous to exclude the known harms on the platform.


Given the number of minors using Instagram and the many proven dangers to their well-being, coupled with the fact that Instagram has the resources to institute sweeping changes – it is truly negligent of Instagram to continue reactively making small, incremental changes rather than proactively transforming their product.

It is long past time that Instagram embrace responsibility for rectifying the extensive harms it perpetuates on its inherently harmful platform.

WARNING: There are graphic images and text descriptions shown in these sections.


The Failures of Instagram Privacy Settings

Turning on privacy settings takes four clicks, and the settings are not intuitive. Please move the privacy settings to the first page of the menu, potentially under “Your Activity,” in order to promote awareness of this safety feature.

To improve its systems, Instagram should:

  • Make privacy settings more visible in order to increase awareness of safety tools;
  • When an account is private, remove the ability for strangers to send unsolicited direct messages to that account and remove the ability for that person’s account to be visible in the Likes or Comments on other posts—similarly to how Twitter hides the names of accounts when they are set to private;
  • Create an alert that appears in the direct messages area whenever a new person sends a user a direct message, informing the user that they can report accounts. Many teenagers and adults are unsure how to define sexual harassment, so we encourage Instagram to also include a link within that safety alert to the following definition: Sexual harassment includes, but is not limited to, unwanted sexual advances or attention (including physical actions), speech that is sexually provocative, and unsolicited sending of or requests for pornography or nude images/videos. This kind of notification could mirror Instagram’s efforts on self-harm, as outlined at https://wellbeing.instagram.com/safety, by also giving users the option to connect with services.

Grooming and Sex Trafficking on Instagram

Examples of Coercion in Posting Pornographic Videos or Ads for Sex Acts on Instagram

Several 15-year-old girls who survived sex trafficking told NCOSE that girls were often forced to perform sex acts, or nude dances, on Instagram Live in a manner that made it look as if they were consenting during the act even though their trafficker stood in the corner the whole time.

Another young girl shared her story online about how being direct messaged by a pimp on Instagram as a minor eventually led to her being sold for sex on the streets. https://youtu.be/itqJIr3Osaw

Another anonymous girl shared with NCOSE that she was solicited on Instagram to send nude images, and after she did, that person used it to blackmail her into sex trafficking.

A California man has been charged with sex trafficking after forcing a San Jose woman he met on Instagram into a life of prostitution all over the western United States, authorities say.

A Texas pimp/trafficker created a dedicated Instagram page for selling 15-20 women for sex. “Get a glimpse of the life of these Macknificent ladies!” he wrote. He also created a Facebook page for the women with a “book now” option. Police referred to the women in his Instagram posts as being “controlled” by him. According to Felicia Grantham, human trafficking coordinator for the Fort Worth Police Department, there have been “53 at least semi-identified victims.” https://www.star-telegram.com/news/local/crime/article219689045.html

A Nashville man was sentenced to 15 years in federal prison in a sex trafficking case involving a 13-year-old girl, whom he met and recruited on Instagram in 2016. https://www.newschannel5.com/news/nashville-man-gets-15-years-in-child-sex-trafficking-case

Examples of Grooming/Soliciting Direct Messages

Domestic sex trafficking survivor Courtney Litvak has spoken openly about how her grooming started through Instagram direct messaging and Snapchat private messaging. An Epoch Times article about her story reported data from Tammy Toney-Butler, an anti-human trafficking consultant for Path2Freedom, which shows that 55% of domestic sex trafficking survivors who entered "the life" in 2015 or later met their trafficker for the first time using a mobile app, website, or text. https://www.theepochtimes.com/trafficking_3152248.html

Two Californian youths aged 15 and 17 narrowly avoided becoming victims of a predator they met on Instagram thanks to an American Airlines agent. https://www.independent.co.uk/news/world/americas/instagram-sex-trafficking-plot-stopped-airline-agent-denice-miracle-a8216726.html

A child advocate ally created an account posing as a 15-year-old girl, which liked a few celebrity and “cute boys” accounts. Within two hours, the account received a message from a man soliciting nude pictures, who also sent her some.

One father uncovered a sex-trafficking scheme where an Instagram account posing as a young boy would contact young girls, including his daughters: "Bruce was a “friend” to quite a few of the girls in my daughter’s circle of friends and they would chat daily. Bruce also had many friends that were being introduced to the circle and they all began to chat through Instagram...he unknowingly was luring young girls into his circle as prey for the men to pick and choose from. The circle of Bruce’s friend list reached the globe and his over 2k followers were nothing more than a smorgasbord of young unaware children these men were chatting with." https://faithit.com/dad-sex-trafficking-scheme-daughter-instagram/

A young college student told NCOSE there are accounts on Instagram targeting her sorority friends, attempting to groom them for commercial sexual exploitation in a “sugar baby” relationship. Given the insurmountable student debt facing many college students, they are particularly economically vulnerable to this kind of sexual exploitation.

Police: Alleged Gwinnett sex trafficker met runaways on Instagram https://www.ajc.com/news/local/police-alleged-gwinnett-sex-trafficker-met-runaways-instagram/YJtpHdkRvR0Ust7395tO3O/

Rhode Island parents' daughter went missing, and they "learned from her friends a few hours later that she had been communicating with a man on Instagram. At once, Hewartson said he realized they were dealing with a sex trafficker or pedophile. The sheriffs elevated the case from that of a missing person to a suspected wilful abduction. A neighbor and a family member reported a bearded man in a black car slowly passing back and forth along the country road at the time the teen went missing....They couldn’t reach anyone at Instagram to ask for assistance...On Tuesday morning, they tracked the teenager to Warwick." https://www.providencejournal.com/news/20190808/father-of-abducted-teen-girl-grateful-for-second-chance--audio

This father in the video below discusses his 13-year-old’s experience on Instagram. A stranger who looked similar to another teenage girl started talking to her in an innocent way. But then later they started asking if she or any of her friends wanted to go “on a date” particularly “if they need money.” The person also tried to find out the location of where the girl lived. This example of grooming for likely sex trafficking was documented by Defend Dignity in Canada.

Report by BBC: "Slave markets found on Instagram and other apps" https://www.bbc.com/news/technology-50228549

Pedophile-Like Comments on Young Girls' Photos

Due to the graphic and disturbing nature of the images and text, we've removed them. Please contact us for more information.

Take Action

Tell Instagram: Stop Targeting Children

How Social Media Preys on Children

Frances Haugen testifies about Facebook's harm to children.

Share Your Story

Sharing experiences may be a restorative and liberating process. This is a place for those who want to express their story.
