As the youngest woman and the youngest Latina ever to serve in the U.S. Congress, Alexandria Ocasio-Cortez commands respect across the nation. Yet even she was not immune to being violated by AI-generated sexually explicit deepfakes.
Ocasio-Cortez was in a meeting with her staffers, discussing legislation and scrolling through X, when she saw the photo: an AI-generated image of someone forcing her into a sexual act.
Shock and horror coursed through her. Trauma from a physical sexual assault she’d endured flooded back to the surface.
Months later, when Ocasio-Cortez told the story, she was still shaken: “There are certain images that don’t leave a person, they can’t leave a person,” she said. “… [O]nce you’ve seen it, you’ve seen it.”
She continued: “It parallels the same exact intention of physical rape and sexual assault, [which] is about power, domination, and humiliation. Deepfakes are absolutely a way of digitizing violent humiliation against other people.”
Ocasio-Cortez knows she is far from the only one who has experienced this kind of AI-generated image-based sexual abuse (IBSA). That is why she is co-leading the effort to pass the DEFIANCE Act, a bill that gives survivors of AI-generated IBSA the ability to sue their exploiters.
This week we celebrate a momentous step forward: The Senate has unanimously passed the DEFIANCE Act! This is an incredible victory for survivors across the nation.
What is the DEFIANCE Act?
The DEFIANCE Act grants survivors of AI-generated IBSA (i.e., sexual deepfakes) the ability to sue their exploiters. Anyone who uses AI to generate and/or distribute sexually explicit images of someone without their consent can be found liable.
The Act expands on existing law covering IBSA that is not generated with AI, commonly known as “revenge pornography.” As of today, it is illegal and grounds for a civil lawsuit to distribute authentic sexually explicit images of a person without their consent. However, there is currently no private right of action for individuals when a perpetrator uses AI to generate sexually explicit images of them.
The Act also explicitly states that even if a person consented to the creation of a sexually explicit image or video using AI, that does not mean they consented to its distribution or monetization. Under this legislation, if the perpetrator is found liable, victims can collect damages as well as any profits the offender made from distributing the media.
The DEFIANCE Act goes hand-in-hand with the TAKE IT DOWN Act, passed in May 2025. The TAKE IT DOWN Act requires platforms to remove content that qualifies as image-based sexual abuse, including AI-generated content, within 48 hours of the media being flagged by a user. That law allows for enforcement by the Federal Trade Commission (FTC) and makes possessing and distributing AI-generated image-based sexual abuse a federal crime.
Legislative Response to Grok Headlines and Increasing Image-Based Sexual Abuse
Grok, X’s AI chatbot, has been dominating headlines lately as users flock to it to create deepfake pornography and AI-generated child sexual abuse material (CSAM). The global response has been swift. In the UK, Ofcom has opened an investigation into X. Indonesia and Malaysia have even blocked the chatbot.
Now, the U.S. is following suit by prioritizing legislation that protects victims of AI-generated IBSA, as Grok has put the problem at center stage.
Help ensure this important legislation makes it to the finish line by asking your U.S. Representative to support the DEFIANCE Act!

