Kids Are Using AI to Violate Their Peers – and Microsoft’s GitHub is the Cause
Microsoft’s GitHub is the primary source of AI-generated child sexual abuse material and image-based sexual abuse.
WASHINGTON, DC (August 26, 2024) – The National Center on Sexual Exploitation (NCOSE) said that Telegram CEO Pavel Durov, who was arrested in France, should
Driven by feedback from survivors and advocates, Google has announced enhanced protections against deepfake and AI-generated pornography.
WASHINGTON D.C.—The National Center on Sexual Exploitation (NCOSE) and the PHASE Alliance™ (Prevention and Healing Against Sexual Exploitation) will host the Coalition to End Sexual
WASHINGTON, DC (August 1, 2024) – The National Center on Sexual Exploitation (NCOSE) has called on OpenAI CEO Sam Altman to forbid the use of
WASHINGTON, DC (July 31, 2024) – The National Center on Sexual Exploitation (NCOSE) commended Google for implementing significant updates to its policies and processes to
Currently, there is NO federal criminal penalty for distributing sexual images nonconsensually. The TAKE IT DOWN Act would resolve this.
Episode 2 of “The Movement” has launched! We discuss a new bill addressing image-based sexual abuse, a helpful resource published by the NCOSE Research Institute, our petition to the Supreme Court, and more.
WASHINGTON, DC (June 18, 2024) – The National Center on Sexual Exploitation (NCOSE) supports the TAKE IT DOWN Act, introduced today by Senators Ted Cruz
WASHINGTON, DC (June 4, 2024) – The National Center on Sexual Exploitation (NCOSE) said that X’s (formerly Twitter) announcement that users must mark their “consensually-produced”
WASHINGTON, DC (May 24, 2024) – The National Center on Sexual Exploitation (NCOSE) said that tech companies must heed the new White House directive to
“Boys for Sale.” This was the text displayed on a Reddit post found by an NCOSE researcher this month. The post also featured an image