WASHINGTON, DC (January 7, 2026) – The National Center on Sexual Exploitation (NCOSE) is calling on the Department of Justice (DOJ) and Federal Trade Commission (FTC) to investigate X after its Grok AI chatbot reportedly created child sexual abuse material (CSAM) and allowed users to create nonconsensual intimate images (NCII) of women, which have been widely shared on the platform.
“As other countries are now doing, U.S. authorities must investigate X. U.S. federal laws prohibit the creation and distribution of child sexual abuse material, even virtually created CSAM in certain circumstances, such as when it depicts an identifiable child or depicts a child engaged in sexually explicit conduct. X may also have violated the Take It Down Act by failing to remove, and/or by continuing to generate, nonconsensual intimate images of people. We urge the DOJ and FTC to investigate child exploitation crimes and violations of the Take It Down Act,” said Dani Pinter, Chief Legal Officer and Director of the Law Center for the National Center on Sexual Exploitation.
The Take It Down Act allows the FTC to take enforcement action against a platform for violations, and the DOJ can prosecute violations of federal law.
“Despite X’s claims that it takes CSAM violations seriously, the mere fact that its AI chatbot is permitted to ‘undress’ images of people without their consent means it cannot prevent sexualized images of minors. Business Insider reported that the xAI model was being trained on user requests for CSAM and child exploitation themed content. Had X then rigorously culled such content from its training data and banned users requesting illegal content, this would not have happened. X was warned about these risks, and yet it did not take the necessary steps to prevent Grok from generating child sexual abuse imagery and other abuse images.
“X’s flagrant behavior and callous attitude towards children and women should be a wake-up call to Congressional leaders that they must prioritize passage of legislation to confront the tidal wave of sexual exploitation created by Big Tech,” Pinter said. “We are only starting to see the monster Big Tech has created, but if not confronted now, AI-generated sexual exploitation will only grow.”
The National Center on Sexual Exploitation Law Center, the Haba Law Firm, and the Matiasic Firm represent two young boys in a lawsuit alleging that X profited from their sexual abuse: child sexual abuse material (child pornography) depicting them was posted on X without their knowledge or consent, X refused to remove it, and X claimed immunity for hosting it. A Ninth Circuit Court of Appeals ruling in August 2025 found that X does not have Section 230 immunity for certain products liability and negligence per se claims based on its abysmal child exploitation reporting mechanisms and its failure to report John Doe #1’s and John Doe #2’s CSAM (“child pornography” under the law) when it had actual knowledge of this content on its website.
About National Center on Sexual Exploitation (NCOSE)
Founded in 1962, the National Center on Sexual Exploitation (NCOSE) is the leading national non-profit organization exposing the links between all forms of sexual exploitation, such as child sexual abuse, prostitution, sex trafficking, and the public health harms of pornography.
To schedule an interview with NCOSE, please contact press@ncose.com.