PROGRESS! Discord Makes Important Updates to NSFW Policies

Discord, a popular communication service with more than 100 million monthly active users, recently announced key changes to the way they handle explicit content on their platform, just months after the company was named to the National Center on Sexual Exploitation’s annual Dirty Dozen List for a distinct lack of safety features. Discord’s previous policies fostered an environment of online exploitation: lax moderation and content enforcement allowed exploiters to groom children for sexual abuse or sex trafficking and to trade pornography, including child sexual abuse materials, non-consensually recorded and/or shared pornography, and more.

Discord’s new policies surrounding “Not Safe For Work” (NSFW) content help alleviate concerns about child safety and about exposure to the graphic, exploitative pornography scattered across Discord’s thousands of servers, and the National Center on Sexual Exploitation commends Discord for taking a step in the right direction toward protecting ALL the users on their platform. The new update delivers one of the exact improvements we asked for back in February: automatically barring minor-aged accounts from joining servers that contain NSFW content.


Here is a breakdown of the latest changes:

  • Discord added the ability to designate entire servers as NSFW, instead of just individual channels within a server. Servers marked as NSFW now automatically block minor users from joining them, closing a loophole our Dirty Dozen List research uncovered that allowed accounts as young as 13 years old to join pornography servers dedicated to trading non-consensual images, even if some of the individual channels were age-gated. This measure protects children from joining inappropriate servers, as long as Discord’s content moderation can properly flag these communities and users choose to proactively self-designate.
  • To access any NSFW server on the Discord iOS app, all users 18+ must opt in to seeing such content in their personal settings. Discord representatives said that, in order to comply with Apple App Store rules, users must toggle on the option to view NSFW servers while on desktop or in a browser. This means that minors, and any adult users who do not opt in, no longer have access to the thousands of NSFW servers while using the iOS Discord app.
  • Discord blocked all “servers that are focused exclusively on pornographic content” from the iOS app, regardless of whether a user opted in to view NSFW content. It remains to be seen whether Discord can keep up with the tens of thousands of NSFW servers on the platform, and how Discord will enact and enforce this designation.

While we celebrate Discord’s steps to block minors from exposure to pornography and other graphic content, this move does nothing to seriously address the exploitation and abuse happening on the “exclusively pornographic” servers, which include child sexual abuse and objectification, non-consensually shared intimate material, revenge pornography, and more alarming behavior. By continuing to allow this content to thrive on its platform, Discord is condoning, and potentially profiting from, such use: if the rumored sale to Microsoft goes through, Discord stands to make billions of dollars while the men, women, and children being exploited simply move behind an arbitrary age-gate.


As one Discord representative admitted on Reddit, the company sees this more as a move to keep its 17+ rating on the App Store than as a proactive decision to crack down on the exploitative and abusive content proliferating across many servers; all the doors are still there, just with a few more locks. And that assumes users are on board with self-moderation and properly tag their NSFW servers, a problem Discord has yet to solve.

If Discord is serious about protecting the millions of children using their platform and about removing child sexual abuse materials, hardcore pornography, and non-consensually shared pornography from it, we encourage the following additional improvements:

  1. Develop and implement parental controls, so parents can monitor and manage their child’s experience on Discord and ensure basic safety measures are in place.
  2. Automatically default minor-aged accounts to the highest level of safety and privacy available on the Discord platform.
  3. Develop and implement moderation strategies that proactively detect and remove pornography—especially in regard to servers dedicated to trading hardcore and non-consensual material.
  4. Provide education to all users and parents on the potential harms and risks associated with exploitation and abuse on Discord, and feature prominent reporting tools across every part of Discord’s interface.

The millions of vulnerable youths using Discord are at risk of being harassed, groomed, abused, and exposed to harmful content and experiences on the platform. Companies can no longer claim ignorance or avoid accountability: corporations have a responsibility to ensure their technology is not used for sexual exploitation. Discord has taken a step in the right direction, but it still has a long way to go before its platform is “a safe and friendly place for everyone.”

Email Discord Executives
