TAKE IT DOWN Act Approved by Senate: Hope for Survivors of Image-Based Sexual Abuse!
The TAKE IT DOWN Act, a key bill that would protect survivors of image-based sexual abuse, unanimously passed the Senate!
On this episode of The Movement, we’re exploring the ways new technology can aid the fight against exploitation.
Microsoft’s GitHub is the primary source of AI-generated child sexual abuse material and image-based sexual abuse.
Driven by feedback from survivors and advocates, Google has announced enhanced protections against deepfake and AI-generated pornography.
WASHINGTON D.C.—The National Center on Sexual Exploitation (NCOSE) and the PHASE Alliance™ (Prevention and Healing Against Sexual Exploitation) will host the Coalition to End Sexual
OpenAI is considering allowing their tools to be used to create “NSFW” content. Help us tell them why this would be a mistake!
WASHINGTON, DC (August 1, 2024) – The National Center on Sexual Exploitation (NCOSE) has called on OpenAI CEO Sam Altman to forbid the use of
WASHINGTON, DC (July 31, 2024) – The National Center on Sexual Exploitation (NCOSE) commended Google for implementing significant updates to its policies and processes to
Thanks to you, Apple removed 4 nudifying apps! Yet they continue to host CSAM on iCloud and endanger children with lackluster safety features.
Predators use AI to create child sexual abuse material (“child pornography”) of existing or fictitious children. What can we do?
WASHINGTON, DC (March 19, 2024) – The National Center on Sexual Exploitation (NCOSE) called on the Department of Justice (DOJ) to launch an investigation into
If Taylor Swift has no protection against deepfake pornography, what does that mean for the rest of us?