VICTORY! Snapchat Makes Numerous Safety Changes in Response to Dirty Dozen List


90% of young people in 20+ countries protected!

Snapchat, an app that parents and child safety experts have consistently named as one of the most dangerous, has made numerous substantial safety changes as a direct result of being named to the 2023 Dirty Dozen List! NCOSE and our supporters have been pressing Snapchat since 2016, and we are very grateful that the company has now listened to our call and is taking significant steps to prioritize safety. 

The changes announced by Snapchat include improving detection and moderation of sexually explicit and exploitative content, defaulting content controls to on for new minor-aged accounts joining Family Center, increasing parents’ visibility into their child’s activity through Family Center, and creating dedicated resources on sexual abuse and exploitation. 

In a public statement, Snapchat thanked NCOSE for influencing these improvements, stating: 

“Several of our new product safeguards were informed by feedback from The National Center on Sexual Exploitation (NCOSE). Our new in-app educational resources were developed with The National Center for Missing and Exploited Children (NCMEC). We are grateful for their recommendations and contributions.” 

But this isn’t just NCOSE’s win—YOU made this happen by signing our actions and participating in the Dirty Dozen List campaigns every year!  

The impact of this victory is truly astronomical. Snapchat is immensely popular among youth, with 90% of 13–24 year olds in 20+ countries using the app. Thanks to you raising your voice and taking action, all of these children and youth will be much better protected! 

Read on for a detailed breakdown of what exactly Snapchat has changed, and why it is so important.  

Improved Detection and Moderation for Sexually Explicit and Exploitative Content 

As painful as it may be, let’s refresh our memory on the egregious content Snapchat was openly serving to its minor-aged users. In our letter and proof to Snapchat, NCOSE researchers outlined the extensive sexually explicit and harmful content they were exposed to while posing as teenagers on Snapchat. Commonplace were videos of people having intercourse on Public Stories, profile links leading to hardcore pornography used to advertise prostitution, promotions for topless maid services, interviews with pornography performers, sex act simulation videos on Discover … and more.

Throughout the 2023 Dirty Dozen List campaign, you joined us in asking Snapchat to proactively block, detect, and remove sexually explicit content. We are thrilled to report that Snap has instituted some major policy changes to help prevent access to harmful content, has improved processes for detecting and removing violative content, and has raised the threshold for what type of content is acceptable.  

In our letter to Snapchat, we showed how their detection system was easily evaded through the use of special characters; for example, profiles like “nu-des6” surfaced in searches for “nudes” and “nudes leaked.” Such profiles would lead to prostitution sites advertising with hardcore pornography. Now, Snapchat has improved their Abusive Language Detection system to handle character variants, so that bad actors can no longer as easily trick the system.  

You also joined us in asking Snapchat to proactively detect, remove, and block accounts or bots promoting pornography or prostitution. In response, Snapchat has begun proactively detecting and removing sexually explicit content in Public Stories and is only allowing official partners and advertisers to have links in their public profiles. That means that only a select few vetted Snapchat partners can have links in their profiles; this feature is no longer available to just any Snapchatter. These links were one of the main ways users, including minors, could easily access external prostitution or pornography sites.  

Further, Snapchat removed a quarter of the publishers they found violating their rules, including most of the accounts we brought to their attention as violating their policies, and took steps to better ensure the removal of other bad actors. For example, they created a “strike system” to enforce against users who repeatedly publish violative content. 

Improved Visibility and Content Controls for Parents in Family Center  

Another request you joined us in making through the Dirty Dozen List campaign was for Snapchat to expand Family Center’s functions, allowing parents more visibility into their child’s activity on the app. Starting in 2024, Snapchat will provide parents with view-only insight into their kids’ most important safety settings and will allow parents to see who messages their child (currently, parents can only see whom their child messages). Further, parents will have increased visibility into their child’s group membership and activity (though they will not be able to see the content).  

Additionally, Snapchat’s Content Controls will be ON by default for all new minor-aged accounts joining the Family Center. The controls will also restrict even more content that may be considered “suggestive,” and will hide unverified accounts (which helps prevent access to bad actors and bots attempting to drive users to prostitution and pornography sites).  

Dedicated Resources on Sexual Exploitation and Abuse 

For years, we have been pushing Snapchat to provide its users with in-app resources about sexual exploitation and abuse, as these harms are a very real risk on the platform. Such resources are especially important for minors because data shows that Gen Z users are more likely to search for resources on social media than on search engines. Not only did Snapchat formerly fail to provide users with information on these issues, but some search terms that might be used by those seeking help (e.g. “sex trafficking” or “nudes leaked”) actually led to sexually exploitative content! 

Now, Snapchat is providing in-app resources that will appear when users make the same searches. NCOSE is grateful that Snapchat is proactively adopting a more survivor-centered approach by offering resources for search terms associated with exploitation rather than surfacing results of actual exploitation.

Further, in response to our requests, Snapchat will soon release multiple Safety Snapshot episodes about sexual abuse and exploitation and will create a dedicated page with resources and support regarding sexual risks and harms.  

NCOSE Moves Snapchat from the Dirty Dozen List to the Watch List  

NCOSE warmly commends Snapchat for their prompt, receptive response to the concerns we raised through the Dirty Dozen List campaign. In recognition of the substantial improvements they’ve already implemented across multiple areas of the platform and the forthcoming changes they’ve announced, NCOSE is officially moving Snapchat from the Dirty Dozen List to the Watch List for 2023. The Watch List is intended for corporations that have taken significant steps forward, but about which we have some lingering concerns and which we will continue to monitor closely.  

A key remaining concern we have is that Snapchat has yet to address the most dangerous feature of the platform, through which the most egregious abuses happen: the chats. NCOSE and other allies have been advocating that Snapchat automatically block sexually explicit content sent or received by minors in chats and blur sexually explicit content sent to users 18+. Despite policies prohibiting the sharing of sexually explicit content, this practice remains prevalent on the platform, which is especially alarming when it involves minors, who may be engaged in grooming conversations with predators. For example, the most recent report by the child online safety organization Thorn found Snapchat to be the #1 platform on which minors reported having an online sexual interaction, and the #3 platform for sexual interactions with an adult.  

Google just started proactively blurring all sexually explicit images, Bumble also blurs sexually explicit images for its users, and Apple will soon offer such a feature to all users and default it for kids 12 and under through Family Sharing. We are hopeful that Snapchat will be the next major corporation to adopt such a common-sense policy. 

We’re optimistic that Snapchat will continue to build on their admirable progress in prioritizing safety by making this and other critical changes to keep kids safe.

