Discord is an online destination for exploitation, and even organized criminal sextortion networks have noticed. Discord’s design choices—private channels, limited default safeguards, and reactive enforcement—undermine its own “safety rules,” allowing serious harms like sextortion, grooming, CSAM distribution, and image-based sexual abuse to flourish. It’s time for them to take prevention seriously.
Discord is a pipeline for grooming → coercion → sextortion
What started as a communication platform for gamers has become a vastly popular app for direct messaging, small group chats, and larger server communities, complete with text, voice, video, and screen sharing.
Discord is no stranger to the Dirty Dozen List; in fact, this year marks its fifth time being named a mainstream contributor to sexual exploitation.
Sexual abusers return to Discord again and again, thanks to the company's reputation for lax rule enforcement and dangerous design. Even registered sex offenders have been charged with targeting kids on Discord. Abusers use its DMs, video calls, or small servers to gradually escalate conversations, often requesting sexual content that can then be used to blackmail children into further sexual abuse (a practice known as sextortion).
Often, abusers first meet minors on other social media or video game platforms (like Roblox or Snapchat) and then direct them to connect on Discord, where further grooming and escalation take place.
Why is Discord an online destination for exploitation? Because by its very design, Discord provides high-risk on-ramps for exploitation, and its policy enforcement is often reactive and lax. Several Discord users have reported to NCOSE that the platform places too much responsibility on its users to moderate and report harmful activity. This approach can create an environment where exploiters easily connect and share abusive material, knowing they are unlikely to be reported by others with similar intentions.
In fact, Discord is SUCH a reliable platform for abusers that organized criminal sextortion networks like 764 have been documented using it to share tactics, recruit victims, and coordinate sextortion at scale.
Predators use Discord not only to obtain CSAM from children themselves, but also to share and trade CSAM with each other, whether directly, via external links, or in invite-only community groups. Discord has also become a popular platform for posting deepfakes, AI-generated images, and other forms of image-based sexual abuse.
The disturbing truth is that none of this is news to Discord.
Despite constant reports and lawsuits about Discord being used for sexual exploitation, and even its CEO being called before lawmakers to testify about the problem, the harms persist.
Discord still does not default safety features to the highest possible setting for teens. In fact, despite promising a global rollout of "teen-by-default" safety settings that would give all users age-appropriate protections and limit access to risky spaces, Discord postponed this launch. Discord now claims that a global rollout is coming in late 2026, that it will provide teen safeguards, and that age verification will be conducted with high privacy standards.
But will Discord actually ensure these protections are effective? Or will they water down the protections and settle for a PR stunt?
We are calling on Discord to make good on its promise, and to launch teen-by-default safety settings at best-in-class standards.
WARNING: Any pornographic images have been blurred, but are still suggestive. There may also be graphic text descriptions shown in these sections. POSSIBLE TRIGGER.
An NCOSE investigator set up an account as a 10-year-old child in December 2025. See the report below:
Discord does a very poor job of performing age verification and obtaining parental consent for new accounts that children will use on the platform. This is a significant safety concern given that many children use Discord to communicate with one another while playing video games or for other social interactions.
Upon attempting to create a new account on Discord, I discovered that there was no parental consent or age verification check for a 10-year-old child.
After account creation, the child is unable to discover NSFW servers within Discord's native Discover page. However, if a child simply locates an NSFW Discord server through Google, they can accept a public invitation to join the server in real time.
Discord’s own policies acknowledge that adult content is supposed to be restricted to users 18+, but in practice, [as of March 2026] access has long relied on self-attested age and simple click-through warnings. Users are merely asked to confirm they are over 18 before entering explicit channels.
Below is a censored screenshot of pornographic imagery on Discord available to a 10-year-old. More evidence is available for press or policymakers.
Discord’s Community Guidelines explicitly prohibit sexual interactions involving minors and grooming behavior. Yet the platform’s structure (public servers that funnel into private messages and smaller invite-only spaces) and its lack of sufficient prevention and intervention measures create an environment where grooming can predictably unfold out of sight.
Below is a mere sample of grooming, child sexual abuse, and CSAM sharing cases involving Discord.
March 2026: Florida Launches Civil Investigation into Discord over Child Safety Concerns
In March 2026, Florida Attorney General James Uthmeier announced that his office had launched a civil investigation into Discord, alleging that the platform has become a place where child predators increasingly contact and exploit minors.
“We’ve brought investigations into Snapchat, into Roblox, and others,” AG Uthmeier said. “What we’ve learned today is all roads lead to Discord.”
The investigation includes subpoenas seeking internal records about Discord’s marketing to children, age‑verification practices, content moderation, and how the company responds to complaints and exploitation reports, reflecting claims that current safety measures are inadequate. Florida’s AG described Discord as potentially acting as a “safe haven” for predators.
April 2025: New Jersey Sues Discord for Allegedly Failing to Protect Children and Misleading Parents
In April 2025, New Jersey Attorney General Matthew J. Platkin filed a lawsuit against Discord, Inc., in the Superior Court of New Jersey, alleging that it engaged in deceptive, unconscionable, and unlawful business practices that endangered children. The complaint claims Discord misled parents and users about the effectiveness of its safety features, failed to enforce its minimum age restrictions, and exposed minors to violent and sexual content and predators, including by allowing easy access through default settings and weak age verification. The AG’s office asserts these practices violate New Jersey’s consumer fraud laws and seeks civil penalties and injunctions requiring improved safety measures.
“Discord claims that safety is at the core of everything it does, but the truth is, the application is not safe for children. Discord’s deliberate misrepresentation of the application’s safety settings has harmed—and continues to harm—New Jersey’s children, and must stop,” said Cari Fais, Director of the Division of Consumer Affairs.
Discord’s rules allow pornography. Yet the company fails to provide any meaningful age or consent verification of people depicted in sexual content to prevent image-based sexual abuse.
We define image-based sexual abuse (IBSA) as the creation, manipulation, theft, extortion, threatened or actual distribution, or any use of sexualized or sexually explicit materials without the meaningful consent of the person/s depicted or for purposes of sexual exploitation. Learn more here.
Discord technically prohibits non-consensual sharing of sexual content. But its enforcement is primarily reactive, depending heavily on users or victims reporting the images after the fact, which means Discord is essentially waiting until after the harm is done.
It could stop IBSA by either banning pornographic content and enforcing that ban, or by implementing strong age and consent verification systems for each person depicted in sexual content. Instead, it does neither. By permitting largely unrestricted sexual content without meaningful safeguards, Discord enables image-based sexual abuse to flourish without consequence.
Discord’s core features include hidden or private servers, private DMs, end-to-end encrypted voice and video calls, and livestreaming, all with limited proactive oversight. These create isolated spaces where predators can build trust, isolate victims, coerce explicit content, and make threats (sextortion) without easy detection. Moderation is largely reactive (user-reported) and relies on server owners and moderators, leaving many spaces unmonitored or poorly managed. And Discord’s current DM practices still allow sextortion: strangers can reach minors through shared servers or accepted message requests, and parents are left unaware.
Poorly moderated video calls and livestreams create particular vulnerability to sextortion. Many young people post online asking for advice after they’ve experienced sextortion on Discord.
Below is a small sample of sextortion cases involving Discord.
Discord’s parental controls have significant gaps that make them largely ineffective in practice.
Parents can only see who their teen is chatting with or which servers they join, but they cannot read the actual messages or view shared images, and they can’t remove friends or kick teens out of risky servers. Teens must actively opt in to link a parent via Family Center, and they can disconnect at any time, meaning parents often have no oversight at all.
Combined with easy-to-fake age settings and the platform’s lack of traditional supervision tools like chat logs or screen-time limits, these loopholes mean that, despite the promise of teen safety features, parents often cannot meaningfully protect their children on Discord.
Prioritize image-based sexual abuse (IBSA) and child sexual abuse material (CSAM) prevention and removal.
How?
By instituting robust age and consent verification for every person depicted in sexually explicit content. If Discord doesn’t want to invest in this safeguarding approach, it should ban pornographic content and prevent it from being uploaded or shared.
Develop or improve proactive internal flagging and review processes for high-risk activity related to sextortion and grooming, such as high-volume friend requests, image sharing, grooming language indicators, and more. Collaborate with law enforcement, survivors, and subject matter experts to develop a robust methodology.
Roll out “teen-by-default” safety settings with strict age verification standards so that minors’ accounts are consistently set to the highest level of safety and privacy available, including no access to livestreams, video calls, or one-to-one direct messaging.
Report suspected child sexual exploitation to the National Center for Missing & Exploited Children (NCMEC) CyberTipline
NCMEC’s Take It Down service: Resource for minors to remove their sexually explicit content from online platforms
Thorn’s Guide to Identify Sextortion: What to do if someone is blackmailing you with nudes
Stop Non-Consensual Intimate Image Abuse (StopNCII) – Resource for adults to remove image-based sexual abuse from online platforms
The Bark Blog: What is Discord and is it safe?
Protect Young Eyes: Discord App Review
Spread the word to hold Big Tech accountable. Use these free resources to post on social media or share via email. Your voice can create change!