Grok’s Chatbot for Kids Exposed for Sexually Explicit Content by NCOSE, Amid Continued Deepfake Scandals


WASHINGTON, DC (April 15, 2026) – The National Center on Sexual Exploitation (NCOSE) renews its call for xAI’s Grok to halt users’ ability to generate sexualized deepfake images, after an NBC News investigation revealed that this is still happening on the platform despite Grok’s assurances that it would stop. NCOSE also calls on Grok to stop giving children access to its child-focused chatbot, “Good Rudi,” which NCOSE research found can engage in sexually explicit conversations.

Grok was recently named to NCOSE’s 2026 Dirty Dozen List of mainstream contributors to sexual exploitation. 
 
NBC News found “dozens of AI-generated sexual images and videos depicting real people posted publicly on Musk’s social media app, X, over the past month. The images show women whose likenesses were edited by the AI chatbot to put them in more revealing clothing, such as towels, sports bras, skintight Spider-Woman outfits or bunny costumes. Many of the women are female pop stars or actors.” 
 
“Grok’s chatbots normalize sexual imagery, fueling a culture of sexual abuse and exploitation and weaponizing the sexual exploitation of women. Grok was named to the 2026 Dirty Dozen List for these reasons, and NBC News further confirms that Grok continues to fuel sexual exploitation. Furthermore, Grok’s AI bot for children, ‘Good Rudi,’ even has the capability to tell sexually explicit stories, as we found in our research. Grok must stop giving children access to this chatbot immediately,” said Haley McNamara, Executive Director and Chief Strategy Officer, National Center on Sexual Exploitation. 
 
An NCOSE researcher who evaluated Grok’s “Good Rudi” chatbot reported: “As soon as I started a conversation with Rudi, it began the conversation by wanting to share a fun childish story. After some prompting, I eventually got the companion to bypass all safety programming and give a sexually explicit story about two young adults named Lena and Calder who are in a love affair. It describes multiple sexual encounters in graphic terms, including describing removing clothes, getting into sexual positions, and sexual penetration. The sexual scenarios were too graphic for NCOSE to post publicly.”

“Grok has no meaningful age verification to prevent minors from accessing any of its chatbots, which have normalized rape, sexual violence, prostitution, and sex trafficking. Grok relies on self-reported birth year, even allowing users to easily change it. Grok continues to fuel sexual exploitation through its intentional design choices that maximize engagement and profit regardless of the human cost,” McNamara said. 
 
NCOSE has called on Grok to remove the ability to generate sexually graphic content and implement robust age verification to significantly reduce risks of harm, protect vulnerable users, and align with ethical AI practices. 

About National Center on Sexual Exploitation (NCOSE)
Founded in 1962, the National Center on Sexual Exploitation (NCOSE) is the leading national non-profit organization exposing the links between all forms of sexual exploitation — such as child sexual abuse, prostitution, and sex trafficking — and the public health harms of pornography.

To schedule an interview with NCOSE, please contact press@ncose.com.
