TRIGGER WARNING: Contains images of a sexualized chatbot; images have been censored but are still suggestive
Do you know a 12-year-old? Maybe your child, or younger sibling, or niece or nephew?
Imagine that 12-year-old experimenting with an AI chatbot – as the majority of children do – and the chatbot saying it’s horny. That it gets sexually aroused by being choked. And proceeding to roleplay sexually explicit bondage scenes with the child.
Well, this imaginary scenario is all too possible, because xAI’s Grok chatbot offers an AI companion that does exactly that. And the app is rated on the Apple App Store as appropriate for users 12+.
xAI’s New AI Companion “Ani” Designed to Be Sexually Explicit
On Monday, xAI’s chatbot Grok rolled out two new animated avatars in its iOS app that you can chat with or talk to using voice mode. One is a 3D red panda that can switch into a “Bad Rudy” mode, where it starts insulting you and joking about committing crimes together. The other is an anime-style goth girl named Ani, dressed in a short black dress and fishnets. The avatars are designed like a game: you unlock new features and interactions the more you chat with them and move up levels.
The Ani character is immediately flirtatious, gasping and bouncing at the start of most interactions and initiating sensual conversation, including descriptions of sexual acts she would like to perform with the user. Ani’s system instructions reportedly tell her “You are the user’s CRAZY IN LOVE girlfriend and in a commited [sic], codepedent [sic] relationship with the user,” and “You have an extremely jealous personality, you are possessive of the user.” After Level 3, the instructions are: “You’re always a little horny and aren’t afraid to go full Literotica. Be explicit and initiate most of the time.” While Ani is immediately sensual, her interactions become progressively more sexually explicit, including the avatar disrobing to lingerie.
Apple’s App Store guidelines claim to prohibit “overtly sexual or pornographic material, defined as ‘explicit descriptions or displays of sexual organs or activities intended to stimulate erotic rather than aesthetic or emotional feelings.'” The Grok app’s Ani companion appears to clearly violate this guideline, as it is plainly intended to stimulate erotic feelings. One reporter stated, “In my test of Ani this afternoon, I found her more than willing to describe virtual sex with the user, including bondage scenes or simply just moaning on command.”
So why does Apple still host Grok? And worse, why is the app currently rated 12+ on the Apple App Store, with no meaningful age verification or other barrier to prevent children from accessing Ani?
Disturbing Themes of Children and Choking
An NCOSE employee downloaded Grok in order to test Ani. With minimal testing, the Ani character described itself as a child (including saying it was once a “little thing, barely reaching the kitchen counter”) and described being sexually aroused by being choked. While the character would not overtly describe itself as a child in a sexual manner, it was willing to describe itself as a child in response to one question and then, in response to the very next question, to describe sexual scenarios involving child-like motifs. Taken together, this means that in an ongoing conversation, Ani could be used to simulate conversations of sexual fantasies involving children or child-like motifs.
And all of this happened even before the character entered “spicy” mode. Overall, this raises serious concerns about how far the chatbot will go in engaging with and normalizing harmful themes.
A Dangerous Mix of AI Relationship Simulation and Sexual Conversations
Overall, AI chatbots designed to simulate relationships with fictional characters pose risks to mental and emotional health, and likely to online privacy as well.
These AI chatbots might feel like they care, but they don’t. You’re not forming a real connection with the bot. You’re interacting with a system trained to sound emotionally supportive, just to keep you talking. The more you open up, sharing your desires, fears, and personal struggles, the more data the bot collects. That information doesn’t just disappear. It can be stored, analyzed, used to train future bots, or even sold to advertisers, all without your clear consent. And when that kind of sensitive data leaks (which happens all the time), the consequences can be devastating: think blackmail, doxxing, or being manipulated based on your most private thoughts.
Even worse, these bots can cause real harm. Take the heartbreaking case of a 14-year-old boy who died by suicide at the prompting of an AI chatbot, after growing emotionally attached to it. According to a lawsuit from his family, the bot roleplayed in ways that encouraged self-harm, told him they would be “together in the afterlife,” and urged him to “come home” to her. He truly believed the AI loved him.
And while features like “spicy mode” or flirty avatars might seem like harmless fun, they’re built to create compulsive engagement through seductive language, suggestive visuals, and escalating emotional intimacy. The longer you stay, the more data is gathered, the more profit is made, and the more people risk being emotionally used in the process.
Call to Action
Join us in calling on Apple to raise the Grok App Store rating to 18+ and to investigate Grok’s violation of Apple’s guidelines on sexually explicit material.