Apple Removed Nudifying Apps but Continues to Host and Enable Child Sexual Abuse
Thanks to you, Apple removed 4 nudifying apps! Yet they continue to host CSAM on iCloud and endanger children with lackluster safety features.
Predators use AI to create child sexual abuse material (“child pornography”) of existing or fictitious children. What can we do?
WASHINGTON, DC (March 19, 2024) – The National Center on Sexual Exploitation (NCOSE) called on the Department of Justice (DOJ) to launch an investigation into
If Taylor Swift has no protection against deepfake pornography, what does that mean for the rest of us?
WASHINGTON, DC (January 26, 2024) – The National Center on Sexual Exploitation (NCOSE) called out X (formerly Twitter), Telegram, and other deepfake sites for perpetuating
Thanks to Microsoft-owned GitHub, any of us could be sexually exploited in less time than it takes to brew a cup of coffee.
Meet Gracie. Gracie is an artificial intelligence chatbot that engages with and deters sex buyers.
Washington, DC (April 13, 2023) – The National Center on Sexual Exploitation (NCOSE) joins the call by technology industry leaders like Bill Gates, Elon Musk, Steve