Mainstream Contributors To Sexual Exploitation

Discord is a Haven For Sexual Exploiters

This messaging platform is popular with predators who groom children, find and trade child sexual abuse materials, and perpetrate image-based sexual abuse against adults.

Updated 6/22/2023: A few weeks after being named to the Dirty Dozen List for the third year in a row, Discord confirmed that it is testing parental controls—something NCOSE has been pressing on them to implement since 2021. These controls would purportedly allow for increased parental oversight of Discord’s youngest users through a “Family Center,” where parents would be able to see the names and avatars of recently added friends, which users their child has directly messaged or engaged with in group chats, and which servers they have recently joined or participated in. We will be pressure testing these controls once they are officially released!  

Predators don’t have to go to the dark web to find and trade child sexual abuse materials. They can simply go to Discord.

One day a young girl turned on her computer—and her life changed forever.

She logged into her Discord account, the popular gaming chat platform, and was contacted by an adult man. They began talking, even having video chats, and for months he groomed her into sending him child sexual abuse materials (child pornography) of herself. And she wasn’t his only victim.

Tragically, cases like this are not uncommon. And often explicit abuse images like those of this young girl are further shared and traded on Discord.

Thanks to Discord, it’s easier than ever to view and trade child sexual abuse materials (child pornography). And not only that: the platform also enables exploiters to directly contact and groom children and to share deepfakes and other forms of image-based sexual abuse, and its parental controls are shockingly ineffective.

Given how unsafe it is, NCOSE recommends that Discord ban minors from using the platform until it is radically transformed. Discord should also consider banning pornography until substantive age and consent verification for sexually explicit material can be implemented – otherwise, image-based sexual abuse (IBSA) and child sexual abuse material (CSAM) will continue to plague the platform.

At the very least, Discord must prioritize and expedite the improvements listed below to ensure all users are safe and free from sexual abuse and exploitation.

See our notification letter to Discord for more information.

Take Action

Our Requests for Improvement


Evidence of Exploitation

WARNING: Any pornographic images have been blurred, but are still suggestive. There may also be graphic text descriptions shown in these sections. POSSIBLE TRIGGER.

Countless Discord servers are dedicated to non-consensually sharing “leaked” sexually explicit images. There is also a growing trend of “deepfake” pornography on Discord. NCOSE researchers have found servers with sexual deepfake bots, allowing users to generate synthetic pornography by sending images of any woman to the bot via a chat function. The resulting images were primarily hyper-realistic or anime-style. One server claimed its bot had been trained on over 15 million sexually explicit images, received via member “donations,” since the beginning of 2023. Both minors and adults can join this server, as there is no age or identity verification process.

Some channels were dedicated to photorealistic deepfake and generated pornography images. Many of the photos included young women who looked like minors.


Image-based sexual abuse (IBSA) is a broad term that includes a multitude of harmful experiences, such as non-consensual sharing of sexual images (sometimes called “revenge porn”), pornography created using someone’s face without their knowledge (or sexual deepfakes), non-consensual recording or capture of intimate nude images (to include so-called “upskirting” or surreptitious recordings in places such as restrooms and locker rooms via “spycams”), recording of sexual assault, the use of sexually explicit or sexualized materials to groom or extort a person (also known as “sextortion”) or to advertise commercial sexual abuse, and more.  

Note: sexually explicit and sexualized material depicting children ages 0-17 constitutes a distinct class of material, known as child sexual abuse material (CSAM – the more appropriate term for “child pornography”), which is illegal under US federal statute. CSAM is not to be conflated with IBSA.

See the full document of screenshot evidence here.

There is a significant amount of child sexual abuse material (child pornography) on Discord. In fact, upon entering what seemed to be a server for troubled teens, NCOSE researchers were immediately and inadvertently exposed to dozens of images and links to what appeared to be child sexual abuse material, accompanied by disturbing titles and comments from other users. The blatant abuse was alarming; some of the material had been on the server for over two months.

Our researchers immediately reported the server to the National Center for Missing & Exploited Children (NCMEC), which processed our report and made it available to FBI liaisons in less than 24 hours. Discord has yet to provide any correspondence beyond an automated reply.

Evidence suggesting the buying, selling, and trading of CSAM on Discord is also prevalent on other social media platforms. Reddit actively hosts links to Discord servers dedicated to this illicit activity.

Some news articles covering this phenomenon include:

See the full document of screenshot evidence here.

Discord is a haven for sexual grooming by abusers and traffickers. The platform provides predatory adults ample opportunities for unmitigated interaction with minors through public servers, direct messages, and video/voice chat channels. Here are just a few examples of children being groomed and abused through Discord in the news from March 2022 to March 2023:

Sexual Exploitation and Grooming Cases that Escalated to Off-Platform Contact:

Discord is slow to respond when users report exploitative and predatory behavior.

One review referred to a server as a “groom hub” because of the number of minors and adults interacting with one another. This review was made over a year ago, and the server is still active as of March 2023.

See the full document of screenshot evidence here.

1. Discord does not default minor accounts to the highest safety settings available upon account creation.

Last year, NCOSE staff found contradictory language in Discord’s adult content guidelines. Discord claimed that the highest safety setting regarding direct messages, called “keep me safe,” is on by default, which it was not.

2. Discord does not require age-verification for many of its age-restricted features.

NCOSE recognizes Discord made important changes regarding minors accessing age-restricted content on their iOS devices. However, effective implementation is still lacking. While minor-aged accounts ostensibly cannot access age-restricted content on their iOS devices, Discord’s lack of robust moderation means that plenty of adult content is still available to minor-aged accounts. Because there is no age verification required to make any Discord account, minors can still easily pretend to be adults and access all of Discord’s content. With such significant loopholes, Discord’s age restrictions promise safety but fail to follow through.

Discord’s age verification policy is confusing and inconsistent. When an adult account is locked out from an age-restricted channel, Discord’s policies direct the user to demonstrate that they are an adult using a photo ID. However, using an adult account, NCOSE staff were able to access age-restricted content on iOS without having to go through any meaningful age verification.

3. Discord claims that verified and partner servers cannot include NSFW content – yet it has verified Pornhub’s server, which is dedicated to sexually explicit content and belongs to a company that has come under fire for hosting and profiting from child sexual abuse material and other non-consensual material.

4. Discord relies on user moderation and reports to monitor exploitative behavior – despite claiming a zero-tolerance policy regarding non-consensual sharing of sexual material and CSAM.

Discord has no in-app or browser-based reporting feature, and it does not allow users to report entire servers. Instead, users must submit reports to server moderators or file a lengthy report with Discord’s Trust and Safety team. This is problematic considering the recent uptick in Discord moderators participating in predatory behavior, leaving users little recourse for safe reporting.

5. Minor-aged accounts can still access servers containing age-restricted content even if designated age-restricted channels are blocked.

See the full document of screenshot evidence here.

Below are just a few reviews about the dangers of Discord.

Testimony from a parent of a Discord user:

Testimony of a 13-year-old:

Testimony of a 14-year-old:

See the full document of screenshot evidence here.

Fast Facts

Discord’s policy: 13+; Apple App Store rating: 17+; Google Play Store rating: T; Bark recommends 15+

One of Bark's top 5 apps for most extreme sexual content two years in a row

Children ages 9–17 experienced above-average rates of sexual interactions on Discord – including with adults

Recommended Reading

A 13-year-old boy was groomed publicly on Twitter and kidnapped, despite numerous chances to stop it

A Discord App Review for Parents

A creator offered on Discord to make a 5-minute deepfake of a “personal girl” for $65

The dark side of Discord for teens

Similar to Discord


Telegram is often cited as a Discord alternative. However, the 17+ instant messaging service is also known for extensive harms to both children and adults. Read the Social Media Victims Law Center assessment of Telegram and this BBC investigation into Telegram as a hub for sharing women’s sexually explicit content without their consent.


Stay up-to-date with the latest news and additional resources

Bark Review of Discord

What Is Discord and Is It Safe? A Discord App Review for Parents - from Bark

This panel of digital safety experts will elevate some of the current and emerging digital dangers facing youth, share promising practices to prepare children on their digital journey, discuss the role and responsibilities of tech companies to their young users, and give us hope through evidence that change – at the individual and systemic level – is possible.

In this YouTube video, this content creator discusses “IP Grabbing, doxing, swatting, blackmail…and even some specific (illegal) servers. We also talk about how to stay safe and prevent these horrible things from happening to you.”

CyberFareedah says NO in this informative video on Discord’s risks, settings, and details on how to protect yourself and your kids online.


Help educate others and demand change by sharing this on social media or via email:


Share Your Story

Your voice—your story—matters.

It can be painful to share stories of sexual exploitation or harm, and sometimes it’s useful to focus on personal healing first. But for many, sharing their past or current experiences may be a restorative and liberating process.

This is a place for those who want to express their story.