Telegram prioritizes privacy for sexual exploiters. Telegram knows that its messaging app is used by sexual abusers for child sexual abuse material, sex trafficking, sextortion, deepfake image-based sexual abuse, and more. Yet it continues to operate its encrypted platform without real safeguards—all in the name of ‘privacy’—disregarding victims’ privacy and human rights. Telegram has chosen a side.
Telegram, a messaging platform with optional end-to-end encryption for “secret chats,” has become infamous for child sexual abuse material (CSAM), sextortion, sex trafficking, and image-based sexual abuse (IBSA)—including “deepfake” or AI-generated pornography and “revenge porn,” the non-consensual sharing of sexual imagery.
Investigations and enforcement actions have found that networks involved in CSAM, sex trafficking, and other forms of exploitation routinely use Telegram’s channels, bots, and integrated payment systems to operate at scale—advertising victims and illegal services, coordinating distribution, and monetizing abuse through mechanisms like cryptocurrency and bot-driven marketplaces.
Facing mounting global pressure, Telegram has claimed it’s taking steps to strengthen moderation—expanding AI detection tools, increasing large-scale takedowns, and beginning to cooperate more with law enforcement following the 2024 arrest of its CEO. But these changes have not fundamentally disrupted abusive ecosystems, which rapidly re-form and adapt even after mass removals.
Telegram shields criminal activities from detection and ignores the privacy rights and human rights of those victimized by sexual abuse through its platform.
Telegram still performs negligible proactive CSAM or IBSA detection in private messages and groups. Its historically hands-off approach, combined with end-to-end encryption, continues to allow sexual exploitation networks to persist. Without systemic reforms, Telegram remains a significant vector in the spread and monetization of online sexual abuse.
WARNING: Any pornographic images have been blurred, but are still suggestive. There may also be graphic text descriptions shown in these sections. POSSIBLE TRIGGER.
A recent study provides some of the strongest empirical evidence to date that Telegram’s bot ecosystem supports nonconsensual image-based sexual abuse and AI-enabled sexual exploitation. It identifies entire categories of bots dedicated to “nudification” and illicit content access, and shows that these tools are embedded in a broader system of payments, referrals, and large-scale communities.
Most importantly, the research demonstrates that these bots are not on Telegram by mere happenstance; they plug into Telegram’s core infrastructure. By enabling in-app content generation, monetization, and viral distribution, Telegram’s design allows nonconsensual sexual content and exploitation tools to be produced, sold, and scaled within a single platform ecosystem.
The study noted:
“Many bots facilitate the search, distribution, or generation of adult content. More alarmingly, the majority of those bots offer services to create deepfake images, undressing pictures, or swapping faces, which aligns with the surge of nudification websites (Han et al. 2025; Gibson et al. 2025) and non-consensual image sharing channels on Telegram…”
The paper goes on to discuss…
“While the majority of bots are benign, we uncovered bots which are used for illegal or concerning purposes, such as carrying out fraud (n=1,331, 4%), commercializing illicit goods and services, or providing access to questionable services, such as AI non-consensual deepfakes (n=1,539, 5%). Unlike traditional darknet marketplaces or forums on the Tor network that require desktop environments, Telegram offers a mobile-first, easier-to-use (for both developers and end users) platform that lowers the barrier to provide questionable or outright illicit services. We theorize that Telegram fills a gap between clearnet websites and hidden services. The growth of ‘gray’ AI services and the abundance of cryptocurrency/blockchain-related bots support this hypothesis.”
AI-CSAM
Nucleo, with support from the Pulitzer Center’s AI Accountability Network, conducted an investigation in 2025 that “identified 83 bots on Telegram using keywords associated with deepnudes—the practice of generating sexual or pornographic images using AI without consent. Among all these groups found, 33 were active and functional, and 23 (70%) were capable of creating child sexual exploitation material (21 with children and adolescents and two only with adolescents).”
IBSA
“Men are leaking explicit photos of women on Telegram group chats with thousands of members.”
Read on Dazed Digital
AI-CSAM on Telegram
AI-generated CSAM is skyrocketing on Telegram, where lax moderation, encrypted private groups, and accessible bots have turned the platform into a dangerous hub for perpetrators to create, share, and monetize hyper-realistic child sexual abuse images and videos. Investigative reports and law enforcement findings show dedicated Telegram bots and channels advertising AI “nudify” tools and custom abuse content, often routing payments via crypto while evading detection.
Law enforcement sources have described entire chatrooms dedicated to “nudify” or “fakes” where users post non-exploitative images of children for others to process into CSAM. Telegram “nudifier” bots are explicitly called out as common vectors.
CSAM on Telegram
Telegram is frequently cited as a key distribution and coordination platform for CSAM networks. Below are just a few recent examples:
55 men arrested in France in major operation to bust online pedophile ring
Date: May 22, 2025
Description: French authorities arrested 55 men across the country for exchanging CSAM imagery (including content involving children under 10) via a Telegram group linked to an incarcerated pedophile. The 10-month investigation involved undercover infiltration and analysis of thousands of exchanges; suspects included individuals from various professions, some with access to children.
Link: https://www.cnn.com/2025/05/22/europe/france-pedophile-ring-telegram-latam-intl
Former monk arrested after Telegram account linked to child sex abuse
Date: Feb 25, 2026
Description: Thai cyberpolice arrested a former monk after linking his Telegram account to child sexual abuse material and allegations of assault; the account was used to distribute abusive content and collect membership fees.
Link: https://thethaiger.com/news/national/former-monk-arrested-telegram-account-linked-to-child-sex-abuse
Cyber police arrest Telegram group admin for child porn distribution
Date: Feb 1, 2026
Description: Thai cyber crime investigators arrested the administrator of a Telegram group that recruited paid members to access child pornography clips, seizing electronic devices as evidence.
Link: https://thethaiger.com/news/national/cyber-police-arrest-telegram-group-admin-for-child-porn-distribution
Child porn creators arrested in nationwide raids (including Telegram links)
Date: Mar 19, 2026
Description: Multiple men were arrested in coordinated raids in Thailand for selling child pornography online — with platforms involved including Telegram.
Link: https://www.bangkokpost.com/thailand/general/3220029/child-porn-creators-arrested-in-nationwide-raids
Boston Man Charged with Receipt of Child Pornography
Date: April 30, 2025
Description: “During a search of [the man’s] cell phone approximately 100 media files that depicted CSAM were allegedly found saved in Telegram Messenger. The minor victims in the files are alleged to be between approximately three and 10 years old.”
Link: https://www.justice.gov/usao-ma/pr/boston-man-charged-receipt-child-pornography
Image-based sexual abuse
Telegram has a long history of facilitating image-based sexual abuse. In 2022 it was reported that nude photos of women were “being shared to harass, shame and blackmail them on a massive scale.” The problem continues to persist.
A woman recently spoke out after friends sent her screenshots showing that sexual images of her were circulating on Telegram. She stated: “It made me feel really upset to think that people would go out of their way to do this…Not only to me, but to younger girls too. I heard there were nudes of girls as young as 15 being shared. It’s disgusting.”
The Revenge Porn Helpline also commented on this incident, stating: “The Revenge Porn Helpline regularly receives reports regarding NCII on Telegram and we would encourage the platform to engage with us to remove this material quickly as well as join the StopNCII.org initiative to prevent the sharing of this abusive content.”
The manager of the Revenge Porn Helpline also noted: “Where possible we have reported the images to Telegram, however, they have been unresponsive to these reports.”
Sextortion
Sextortion cases frequently involve multiple platforms (e.g., Snapchat, Discord, TikTok alongside Telegram), and Telegram is commonly cited in investigations for private chats, group coordination, or sharing how-to resources among perpetrators.
See a small sample of examples below:
Peters Township senior charged in ‘large-scale’ catfishing, sextortion scheme
Description: Zachariah Abraham Meyers, an 18-year-old high school senior in Pennsylvania, was arrested and faces over 300 charges (including sexual extortion, sexual exploitation of children, and sexual abuse of children). He allegedly operated a catfishing network targeting at least 21 juvenile victims, coercing them into sending explicit images/videos via social media including Telegram (and TikTok). Victims were deceived using fake personas; one incident involved directing a minor to record sexual acts.
Date: February 23, 2026 (arrest around February 20, 2026)
Link: https://www.wtae.com/article/pennsylvania-student-sextortion-scheme-peters-township-300-charges/70436687
Man accused of online ‘sextortion’ targeting Somerset child
Description: Sean Melko, 22, was arrested in Pennsylvania (with assistance from out-of-state police) for allegedly committing sextortion against a Somerset juvenile. The suspect used Telegram and Discord to engage in child enticement and extortion threats involving explicit material. He faces multiple charges including extortion, dissemination of obscene matter to a minor, and enticing a child.
Date: March 13, 2026 (arrest reported March 9–13, 2026)
Link: https://www.yahoo.com/news/articles/man-accused-online-sextortion-targeting-224336351.html
Sextortionists who exploited victims using fake profiles on Telegram arrested
Description: Israeli local news reported on the arrest of two men—David Bracha, 26, from Rishon LeZion, and Guy David, 53, from Holon—for operating a sexual extortion scheme. They allegedly posed as women on Telegram (including in groups), lured victims into sending intimate photos, then blackmailed them for money. Police described it as a systematic, cynical operation exploiting victims through messaging apps.
Date: February 14, 2025
Link: https://www.jpost.com/israel-news/article-842159
In the Philippines, the Department of Information and Communications Technology (DICT) publicly placed Telegram under observation in February 2026 over its role in proliferating pornography, scams, and online sexual abuse and exploitation of children (OSAEC). Officials stated that preventing OSAEC is “non-negotiable” and warned of a potential nationwide ban if violations continued. ABS-CBN News (Feb 26, 2026)
In 2025, Australia’s eSafety Commissioner reportedly fined Telegram A$957,780 (roughly US$640,000) for failing to answer basic questions about how it detects and reports child sexual abuse material. The company took nearly 160 days to respond to a formal transparency notice seeking details on its CSAM moderation systems, effectively blocking regulators from assessing whether meaningful safeguards were in place. The penalty underscored a growing global concern: Telegram operates with far less visibility and accountability than other major platforms when it comes to identifying and reporting child exploitation.
In India, Telangana’s Cyber Security Bureau (TGCSB) launched a dedicated Child Protection Unit in 2025 to address online child sexual abuse material. After a string of arrests, investigating officers noted that Telegram was the primary platform where most CSAM was traded and circulated. The material — 70% of which originated from foreign sources — was routinely shared via Telegram channels and groups alongside other social media; the accused often sourced content there before distributing it further or uploading it to inactive websites, making Telegram a key hub enabling the widespread sharing of child sexual abuse material. Earlier, in 2023, the Ministry of Electronics and Information Technology (MEITY) issued formal notices to Telegram (along with other platforms) demanding the prompt and permanent removal of child sexual abuse material from the Indian internet, warning of the loss of safe harbor protections and legal consequences for non-compliance. Official PIB (Press Information Bureau) release (Oct 6, 2023)
In 2024, Telegram CEO Pavel Durov was arrested in France, where authorities alleged that the platform’s lack of moderation and cooperation enabled crimes including the distribution of CSAM, as well as terrorism and the drug trade. Unlike prior regulatory actions, the French case pushed toward potential criminal liability, arguing that platform design and inaction can facilitate ongoing abuse.
The DarkGram study is one of the largest empirical analyses of criminal activity on Telegram, examining 339 cybercriminal channels with over 23.8 million users. It found that Telegram hosts highly organized, large-scale networks distributing illicit material and services, including hacking tools, stolen data, scams, and exploit kits. The paper does not focus specifically on CSAM or sexual abuse, but its findings are directly applicable because the core design functionalities used for these criminal networks can also be utilized by sexual criminal networks. The exact features identified as aiding cybercriminals are the same ones used in CSAM distribution networks, sextortion rings, and nonconsensual image-sharing communities.
Disturbingly, the study found that even after “takedowns” of criminal channels or groups, those involved are “quickly migrating to new channels with minimal subscriber loss, highlighting the resilience of this ecosystem.” This reality shows that reactive reports and ad hoc removals are an insufficient response from Telegram.
To meaningfully combat these harms, Telegram would need to proactively identify and remove all sexual imagery unless it verifies the age and consent of those depicted. It would also need to proactively and iteratively identify and review behaviors and language that are high-risk indicators of CSAM sharing, sex trafficking, or sextortion. In short, Telegram would need a radical overhaul of its current structure and practices.
Report suspected child sexual exploitation to the National Center on Missing and Exploited Children (NCMEC) Cyber Tipline
NCMEC’s Take It Down service: Resource for minors to remove their sexually explicit content from online platforms
Thorn’s Guide to Identify Sextortion: What to do if someone is blackmailing you with nudes
Stop Non-Consensual Intimate Image Abuse (StopNCII) – Resource for adults to remove image-based sexual abuse from online platforms
The Bark Blog: Is Telegram Safe? An App Review For Parents
Protect Young Eyes: Telegram App Review
Spread the word to hold Big Tech accountable. Use these free resources to post on social media or share via email. Your voice can create change!