
The Problem

Discord, a widely popular communication platform for adults and children alike, consistently fails to address the extensive sexually graphic, violent, and exploitative content on its thousands of private and public channels. Discord’s inadequate age verification and moderation procedures mean millions of child and teen users are left with near-unmitigated exposure to child sexual abuse material, nonconsensual pornography trading, and the predatory grooming that is rampant on its platform.

Demand that Discord do more to protect kids by taking action below!

What is Discord? 

Discord is a popular communication service which has grown exponentially since the COVID-19 pandemic channeled the world’s energy into digital spaces, going from just 56 million active monthly users in 2019 to more than 150 million users in 2021.  

Started in 2015 as a haven for gamers looking to connect with each other while playing video games together, Discord has since pivoted into a widely used text, video, and audio chat app that can be accessed via computers, tablets, and mobile phones. In 2021, over 30% of users said they use Discord for activities other than gaming. Discord has capitalized on this expansion of its brand by advertising itself as a way for teachers to reach their classrooms and for virtual book clubs to discuss their latest read, and even by positioning itself as a viable workplace alternative to Slack or Microsoft Teams. 

The main feature of Discord is its servers: chat rooms based around a particular interest or activity. Users can join servers to connect with others over shared interests in real time. Servers can be open to the public, though users more commonly set them to private, requiring invites and special passwords to join. Moderation within these servers has historically been left mostly to the members themselves, with Discord simply relying on user reports and private server owners to catch bad behavior. Discord’s own Safety Center confirms as much: “We do not monitor every server or every conversation.”  

This lack of clarity around moderation creates confusion, and the confusion masks a larger safety problem on Discord: age verification. Discord’s age verification procedures are minimal at best. Users were not even asked for their date of birth upon signup until mid-2020, and a self-reported birth date remains the only form of age verification Discord provides for its users. 

The laissez-faire moderation attitude from Discord has created an environment that fosters sexual exploitation and abuse on a massive scale.

Discord has had problems monitoring and removing bad actors in the past. The pandemic and Discord’s subsequent brand expansion have driven massive growth for the platform, which has in turn opened doors to predators and abusers by giving them a form of cover to settle comfortably within Discord’s servers. While entire servers and individual channels within a server can be age-gated with a “NSFW” tag, this feature is easily circumvented due to Discord’s lack of true age verification and moderation features. And finding these channels is not a difficult task: over 43,000 servers contain tagged NSFW content. 

Worse yet, the problem is not just the sheer amount of graphic content embedded throughout Discord. The exploitative and harmful nature of said content is also deeply troubling. Pornography trading is popular on the platform, as users can share links and images of themselves and others. Entire servers on Discord are dedicated to users finding and sharing nonconsensual pornographic images of girls and women—material sometimes referred to as “revenge porn” or image-based sexual abuse. Discord made international news in 2020 when one server revealed that over 140,000 images of women and minors had been widely shared and distributed. 

While no official data is available to break down the age demographics of Discord users, the platform is extremely popular with one of the most vulnerable populations in the world: children and teenagers. With Discord boasting over 150 million monthly active users, and given the popularity of games like Fortnite, Minecraft, and Roblox, it is fair to assume that millions of children are regularly using Discord.  

Although Discord does have some safety settings, explained in its Safety Center, NCOSE researchers found these to be far from adequate for properly protecting minors who use Discord. In its guidelines for “adult content,” Discord claims the highest levels of safety are on by default: 

“If you do not want to be exposed to adult content on Discord, or if you are under 18 years old, we recommend turning on the explicit media filter in your privacy settings. In your User Settings, under Privacy & Safety, choose “Keep me safe”. This setting is on by default and will ensure that images and videos in all direct messages are scanned by Discord and explicit content is blocked.” 

However, when NCOSE researchers made a Discord account with a birthday set to 13 years old, they found these safety settings were not actually on by default and that, instead, they still needed to be tracked down and toggled on within the account’s settings.  

Discord also provides no real way for parents to control their child’s experience on the platform via common-sense parental controls or strategies. In its official “Parent Guide,” Discord fails to provide proper warnings regarding the dangers waiting for children on its platform. Furthermore, Discord has made no announcements or moves to develop and implement meaningful parental controls to combat these dangers. Instead, it provides toothless advice such as:  

“Sharing personal images online can have long-term consequences and it’s important for teens to understand these consequences. Help them think about what it might feel like to have intimate photos of themselves forwarded to any number of peers by someone they thought they liked or trusted. Make them aware of the risk of sharing intimate pictures with anyone.  

Explicit content exists on Discord, as it does on many other online services. On Discord, users have to opt-in to seeing this content. Have a conversation with your teen about explicit content, what they may or may not be comfortable looking at, and whether they feel pressured to look at this content.” 

Nowhere does the guide include advice on how to recognize and resist grooming tactics, identify predatory behavior, or guard against the harms of pornography. The guide also completely fails to mention that when teens “share personal images online” of a sexual nature, the material is legally considered child sexual abuse material (sometimes referred to as “child pornography”), which can carry severe legal consequences.  

Discord tries to hold itself up as a company and platform that treats its community with respect, claiming to maintain “a safe and friendly place for everyone.” However, in addition to hosting and normalizing exploitative explicit images, Discord has facilitated a space for sexual grooming by abusers and sex traffickers, which makes it anything but “a safe and friendly place” for minors.  

The evidence makes it clear that Discord has struggled to protect the children who use its platform from being targeted and contacted directly by adult predators.  

Predators are notorious for using and abusing online gaming systems and social media platforms like Discord to befriend and then prey upon vulnerable young people, often through mutual servers and direct messaging. One extreme case ended with two teenage boys being trafficked and held hostage for over a year after being groomed on Discord. One concerned parent’s review of the platform reads: 

“Seriously poor protection for children. Firstly lets dispel the myth that your child on here is safe as they can block people and only be found by people they choose to link with. They meet these ‘people’ on games who befriend them there and from there send them to discord groups and invites therefore bypassing the invite system. My child has gotten involved with a group who are far beyond the realms of any decency and the conversations people were having with my 14 year old have made me sick. I am just glad I have found out in time to try and prevent any further psychological problems. He has been slowly groomed and I guess it could have been only a matter of time before something dangerous happened. I would seriously not recommend this to any child or parent until it gets some sort of filters and security in place to stop people creating and running these sort of perverted and deviant groups.” 

Grooming is more than adults “simply” talking with children online. Research shows just how dangerous this open contact from predators to vulnerable minors really is: one study found that predators who made contact with child victims were more likely to use the Internet to locate potential sexual abuse victims and engage in grooming behavior. Groomers use a variety of tactics to gain a victim’s trust, including using pornography as a tool to manipulate the child into believing the sexual abuse is normal. One study noted: 

“Indeed, part of the grooming process is the normalisation of sexual activity with children and breaking down inhibitions. Offenders use child pornography to teach children how to masturbate, perform oral sex and engage in sexual intercourse. Sometimes, blackmail is also involved, usually at the later stage, after the child has been exposed to some sort of pornography, or after the child has performed sexual favours. The saturation of the Internet with such material may serve to ‘normalise’ this behaviour and probably makes it easier to objectify children as sexual artifacts.” 

  • A 12 year old girl was groomed for over two months and manipulated into leaving her California home in the middle of the night by a 40-year-old predator. 
  • Two teenage boys were found in a sex trafficking ring where the traffickers used Discord to contact the victims. 
  • A 22-year-old man was arrested when a 12-year-old girl he had been talking to on Discord disappeared with him in a car. 
  • Several predators groomed a 12-year-old boy by sending him explicit messages and even calling him over a period of six weeks before his mother found out and deleted Discord. 
  • A 25-year-old man extorted a 12-year-old girl into sending him explicit photos of herself, then threatened to publicly post the photos or report her if she did not send more. Authorities found this Discord predator was talking to other children aged 7 to 15 using the app.  
  • Posing as a teenager on Discord, one man blackmailed girls as young as 12 across the country into sending him explicit material of themselves.  
  • In Australia, a 20-year-old man solicited intimate images from a 12-year-old girl after grooming her through Discord with police reporting that his messages expressed a desire to engage in “violent and abhorrent behaviour.” 
To protect the children on its platform, NCOSE calls on Discord to: 

  • Develop and implement parental controls so parents can monitor and streamline their children’s experience on Discord and ensure basic safety standards are being met.  
  • Automatically default minor-aged accounts to the highest level of safety and privacy available on the Discord platform. While Discord claims to already do this, NCOSE research revealed this is still not standard across all Discord accounts.  
  • Automatically block any and all minor-aged accounts from joining servers that contain NSFW content on both the mobile app and desktop versions of Discord. Discord currently has age-gating for individual channels within a server, but the fake 13-year-old Discord account used by the NCOSE research team was able to join a voice chat on the official Pornhub Discord channel.  
  • Develop and implement moderation strategies that proactively detect and remove pornography—especially for servers that are dedicated to trading hardcore and non-consensual material.  
  • Provide meaningful education to all users and parents on the potential harms and risks associated with exploitation and abuse on Discord, and prominently feature and highlight reporting processes on all forms of Discord’s interface. 

The millions of vulnerable youth using Discord are at risk of being harassed, groomed, abused, and exposed to harmful content and experiences on the platform. 

While Discord has some perfunctory safety settings and filtering options, the filter settings can be easily altered or bypassed, and Discord has failed to take sufficient proactive steps to protect its minor users, as evidenced by the lack of available parental controls. This is an egregious betrayal of parents’ trust: they should be able to reasonably expect that safety settings will reliably protect their children. That Discord relies so heavily on user reports and third-party bots to moderate content within its servers is laughable given the exploitative and abusive nature of many of those servers and the individuals who frequent them. 

It is abundantly clear that Discord is among the top technology platforms enabling and enhancing predators’ access to minors. This fact is greatly concerning as sexual harassment and assault continue to become more rampant in society. Even more disconcerting is the potential role Discord is playing in the facilitation of child abuse and sex trafficking. Given its lack of robust safety settings and moderation strategies, and the absence of honest guidance for parents and users about the real dangers on its platform, it is evident that Discord is still not willing to prioritize child safety.  

Companies like Discord can no longer claim ignorance or avoid accountability—corporations have a responsibility to ensure their technology is not used for sexual abuse or exploitation.

Proof

WARNING: Any pornographic images have been blurred, but are still suggestive. There may also be graphic text descriptions shown in these sections.
POSSIBLE TRIGGER.

Pornography on Discord's Servers

Due to the graphic nature of the descriptions & images found in this proof, we've compiled them into a PDF document.

Sexual Grooming

Due to the graphic nature of the descriptions & images found in this proof, we've compiled them into a PDF document.

Reviews & Testimonies

Due to the graphic nature of the descriptions & images found in this proof, we've compiled them into a PDF document.

Unenforced Policies

Due to the graphic nature of the descriptions & images found in this proof, we've compiled them into a PDF document.

Progress

As progress is made in the efforts to protect families, help survivors, and expose the truth, you'll be able to find those details here.

Share Your Story

Your voice—your story—matters.

It can be painful to share stories of sexual exploitation or harm, and sometimes it’s useful to focus on personal healing first. But for many, sharing their past or current experiences may be a restorative and liberating process.

This is a place for those who want to express their story.