
Fake Love, Real Harm: How AI Girlfriends Pose Risks of Sexual Exploitation
“AI Girlfriends” or “AI Companions” might seem harmless on the surface, but their capabilities pose grave risks for sexual exploitation.
Discord is a hotspot for child sexual abuse material and image-based sexual abuse (i.e. deepfake pornography).
Twitter reviewed and verified child sexual abuse material, then declared it did not violate their policies and would not be taken down.
The TAKE IT DOWN Act, a key bill that would protect survivors of image-based sexual abuse, unanimously passed the Senate!
Meta’s platforms, namely Instagram, are advertising a plethora of “nudifying” apps to its users, including to children.
On this episode of The Movement, we’re exploring the ways new technology can aid the fight against exploitation.
Microsoft’s GitHub is the primary source of AI-generated child sexual abuse material and image-based sexual abuse.
Teen Vogue, fueled by Big Tech lobbyists, claims that removing the Internet’s liability shield will harm children. Learn why they’re wrong.
Predators use AI to create child sexual abuse material (“child pornography”) of existing or fictitious children. What can we do?
“Boys for Sale.” This was the text displayed on a Reddit post found by a NCOSE researcher this month. The post also featured an image
Pornhub has released its annual “Year in Review.” But here’s what the year REALLY looked like for them.
Meta’s dangerous new policy nullifies its ability to detect and report child sexual exploitation.