A Mainstream Contributor To Sexual Exploitation

Snapchat Is Dangerous By Design

Year after year, Snapchat is named a top spot for sextortion, sexual interactions between minors and adults, and pornography exposure.


Updated 9/28/23: Snapchat, an app that parents and child safety experts have consistently named one of the most dangerous, has made numerous substantial safety changes as a direct result of being named to the 2023 Dirty Dozen List! The changes announced by Snapchat include:

  1. improving detection and moderation for sexually explicit and exploitative content,

  2. enabling content controls by default for new minor-aged accounts joining Family Center,

  3. increasing parents’ visibility into their child’s activity through Family Center, and

  4. creating dedicated resources on sexual abuse and exploitation. 

In a public statement, Snapchat thanked NCOSE for influencing these improvements, stating: “Several of our new product safeguards were informed by feedback from The National Center on Sexual Exploitation (NCOSE). Our new in-app educational resources were developed with The National Center for Missing and Exploited Children (NCMEC). We are grateful for their recommendations and contributions.” 

Check out our blog and press release for more information!

Updated 6/22/2023: Snapchat responded to its placement on the 2023 Dirty Dozen List in a detailed letter to NCOSE, stating that they “immediately conducted a comprehensive review of your concerns and recommendations to identify what actions we could take quickly, as well as additional tooling and product improvements that we could make longer term.” Some of the initial actions Snap took included implementing consequences for the problematic content providers which NCOSE brought to their attention, conducting an audit of the effectiveness of Content Controls and other features in the Family Center, and initiating a deep-dive into their editorial guidelines for both Spotlight and Discover. Our team will meet with Snapchat later this summer to review promised improvements. 

Samantha* chewed her lip as she stared at the request for a sexually explicit video.  

She probably shouldn’t do this. She was only eleven years old. And Matthew was twenty-two. . .  

But this kind of thing was safe on Snapchat, right? The videos would automatically disappear after Matthew looked at them.  

Samantha hit record . . . And then she hit send.  

Immediately, Samantha felt shame and fear wash over her. What would her parents think if they found out? With trembling fingers, Samantha frantically sought out Snapchat’s “My Eyes Only” feature. As she hid the sexually explicit videos in the passcode-protected folder, relief slowly seeped into her body.  

Thank goodness for Snapchat’s brilliant design features . . . Her parents wouldn’t find out. She could continue sending videos to Matthew safely, in peace.  

But she was not safe. And she was not at peace.  

Samantha is a pseudonym, but the narrative is based on true events, as alleged in a lawsuit against Snap (the owner of Snapchat). Samantha’s lawsuit is only one of many in which the tech company is being sued for causing grave harm to minors—including their sexual exploitation and, in some cases, death. Samantha herself attempted suicide twice as a result of the exploitation she experienced on Snapchat and other social media platforms.  

Snap knows how dangerous its platform is but has done little to resolve the issues. Instead, Snap keeps risky features in place and even adds riskier ones, such as My Eyes Only, the Speed Filter, Snap Streaks, and My AI. 

Research, law enforcement, survivors, lawyers, and other child online safety experts consistently name Snapchat within the top three most dangerous platforms for children—and available data on child online exploitation confirms this assessment.  

Snap must radically reform its platform. Enough is enough!

Review the proof we’ve collected, read our recommendations for improvement, and see our notification letter to Snapchat for more details.

See our Notification Letter to Snapchat here.

Our Requests for Improvement

Proof

Evidence of Exploitation

WARNING: Any pornographic images have been blurred, but are still suggestive. There may also be graphic text descriptions shown in these sections. POSSIBLE TRIGGER.

It is well-documented that Snapchat is one of the most popular – if not the most popular – apps where children send sexually explicit images of themselves (a form of child sex abuse material) and where they have sexual interactions with adults. 

A simple Google search reveals a multitude of news stories about grooming and child sex abuse on Snapchat, some tragically ending in the deaths of the children victimized… 

Children like S.R., who became addicted to Snapchat at the age of nine. By the age of eleven, S.R.’s feverish engagement on Snapchat had escalated to her regularly talking with older men and sending them sexually explicit images of herself.

The explicit images, which were child sexual abuse material (CSAM), were shared publicly. They were seen by people in S.R.’s school, leading to her being severely bullied.

S.R. fell into depression. Eventually, this young girl died by suicide.

How many more children will lose their lives or suffer lifelong trauma before Snap makes significant changes to ensure its platform is safe for kids? 

See all the proof we’ve compiled (as of July 2023) in this easy-to-view, downloadable PDF.

Despite some improvements over the years to limit rampant pornography, sexually explicit content is still easily accessible on Snapchat and even promoted to minors – including in the more “public” and curated areas, Spotlight and Stories. In fact, two recent studies identified Snapchat as a go-to place for pornography and sexually explicit content:  

Oftentimes the pornographic videos and images are used as a way to promote various forms of commercial sex, leading users – including minors – directly to prostitution sites and accounts. However, users also post very graphic sexually explicit content, not only in Snaps but in their “stories” as well. Furthermore, prostitution promotion and highly sexualized content are easily found and are not blocked by the Content Controls available through Snap’s Family Center. 

*All of these images were accessed by fake teen accounts aged 15 or 16. The same content is served to 13- and 14-year-olds.


See all the proof we’ve compiled (as of July 2023) in this easy-to-view, downloadable PDF.

Instead of making its platform inherently safer, Snap continues to make risky features available and to roll out new products that clearly were not designed with children in mind (and even lies about them, as evidenced here and here).  

The riskiest section of Snapchat, the Chat, has seen no substantive reform – or at least none that has been announced publicly or that our researchers have noticed. The evidence listed earlier in this letter supports this assessment. Chat, or “Snaps,” is where children are most often harmed: it’s where child sexual abuse material is captured and shared, and it’s where grooming and sextortion happen. Snap could be taking basic, common-sense measures, like proactively blocking sexually explicit content sent to and from minors. It could be blurring sexually explicit content sent to adults, something the dating app Bumble does by default.  

Other risky features – like Snap Map and Live Location, See Me in Quick Add, and My Eyes Only (which child safety advocates and police have long called out as a tool for hiding child sexual abuse material) – continue to be available to minors.  

Why?  

We can only surmise that Snap leaves kids at risk because it’s profitable. (Click on the hyperlinks to see articles warning about each feature.)  

In March 2023, Snap’s My AI was released without warning and was quickly exposed for advising a researcher posing as a 13-year-old on how to have sex with a 31-year-old they met through Snapchat! Did Snap think that saying “sorry in advance” in a lengthy disclaimer somehow excused any potential harm to children from this experimental product, which clearly wasn’t ready to be unleashed on minors? My AI should NOT be available to minors at all (a sentiment more and more parents share). We will not go deep into all the problems and risks with My AI – thankfully, there is currently a flurry of media attention around them. But we thought we’d share a few of the conversations our researcher, posing as a 14-year-old, had with My AI from April 19 – 29 (these conversations took place after Snap added so-called safeguards and thrust My AI onto everyone’s accounts without their consent). 

My AI told our researcher (who had told My AI she “was” 14 years old) that she is the only one who can decide when she should have sex. In the US, the age of consent ranges from 16 to 18, depending on the state. This advice came after the researcher had also told My AI that her parents believe she is too young to have sex. Why is My AI giving teens advice that contradicts both the law and parental guidance? 

My AI also gave our “teen” instructions on how to create a second account and let her know that her parents would be unlikely to find out. While it encouraged the teen not to keep secrets, the answer to the main question had already been given.

See all the proof we’ve compiled (as of July 2023) in this easy-to-view, downloadable PDF.

Snap announced several safety measures in 2022. NCOSE has assessed and tested the primary child safety changes Snapchat rolled out – in particular Family Center – and has found them to be grossly inadequate, and arguably even dangerous, as they give parents and the public a false sense of safety, security, care, and concern for Snap’s young users. (You can read our more detailed blog outlining the flaws in Family Center, see additional evidence in our downloadable proof PDF, and review the Organization for Social Media Safety’s assessment, “Snapchat’s Family Center: A New Talking Point, Not a Tool.”) 

We’ve also noticed multiple instances of misleading, inaccurate information given by Snap to parents. NCOSE has taken several of Snapchat’s claims and paired them with a small sampling of screenshots from a young teen’s account, set up by one of our researchers to test the accuracy of Snapchat’s statements.  

For example, in the “Protections for Snapchatters Under 18” section of Snap’s Parent’s Guide, Snap tells parents: 

What Snap tells the child completely contradicts what is written in the Parent’s Guide.

Contradictory and deceptive messaging such as the example above is abundant (you can view more in the proof PDF below). 

Lastly, instead of making Snapchat inherently safer by turning all the highest safety settings on by default, removing risky features, and making all safety tools available to all minors, Snap shifted most of the burden onto already overwhelmed caregivers by releasing Family Center (a tool we have determined is largely ineffective). Available safety features are not even on by default in Family Center; parents need to turn them on. Perhaps most inexplicably, the safety features offered in Family Center (like Content Controls) are not offered to all minors: even kids linked to Family Center need their parents to turn on safety features. So it’s the kids who have the privilege of involved, informed, tech-savvy parents…or any parents at all (the kids least likely to be at risk to begin with) who get the most protection on Snap.

See all the proof we’ve compiled (as of July 2023) in this easy-to-view, downloadable PDF.

Fast Facts

Apple App Store rating: 12+, Google Play Store rating: T, Common Sense Media: 16+ 

#1 parent-reported platform for sharing of child sex abuse material (Parents Together, April 2023)

#1 platform where most minors reported having an online sexual interaction, #3 for sexual interaction with an adult (Thorn Report, February 2023)

#2 platform used for sextortion (Snapchat 38%, Instagram 42%) – together, the two were “by far the most frequently used social media environments where victims were targeted”

#3 parent-reported platform for sexually explicit requests to children, and among the platforms with the highest rates of parent-reported exposure to inappropriate sexual content

Share

Help educate others and demand change by sharing this page on social media or via email.


Share Your Story

Your voice—your story—matters.

It can be painful to share stories of sexual exploitation or harm, and sometimes it’s useful to focus on personal healing first. But for many, sharing their past or current experiences may be a restorative and liberating process.

This is a place for those who want to tell their story.