A woman was sex trafficked by her husband. He posted videos of the abuse online.
Even after she sent Google a legal request, including proof that her husband was in prison for sex trafficking her, Google refused to remove the abuse videos, claiming it saw “no signs of coercion.”
Fortunately, after NCOSE arranged for Google to meet with survivors, the company made it significantly easier for survivors to get their images removed. We thank Google for this.
But how could this appalling situation have ever happened to begin with? And how many other survivors are left in their misery due to other tech companies’ callous inaction?
The woman in this story is currently being represented by the NCOSE Law Center in her fight for justice. Yet she is only one of the many who have been victimized in this way. And far too often, these survivors are denied the justice they deserve, due to appalling gaps in our nation’s law.
As it stands today, there is NO federal criminal penalty for those who distribute or threaten to distribute nonconsensual sexually explicit images – a form of sexual violence known as image-based sexual abuse (IBSA). The TAKE IT DOWN Act, formally titled the “Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks Act,” was introduced by Senators Cruz (R-TX) and Klobuchar (D-MN) on June 18, 2024 to close this gap.
The TAKE IT DOWN Act (S.4569) will protect and empower survivors of image-based sexual abuse (IBSA) while also holding tech companies accountable for hosting nonconsensual sexually explicit images. The bill is a bipartisan effort with 14 cosponsors from both sides of the aisle.
What is the TAKE IT DOWN Act?
The TAKE IT DOWN Act is pivotal in the fight against image-based sexual abuse (IBSA) because it establishes IBSA as a federal crime. The bill would combat IBSA through the following provisions:
Effective Removal:
In practical terms, this legislation means that when take-down requests are sent to online platforms, the content will be removed within 48 hours. Currently, take-down requests are often ignored by major platforms like Reddit and Twitter.
As Dawn Hawkins, CEO of NCOSE said at Senator Cruz’s press conference on the TAKE IT DOWN Act:
“It’s outrageous that tech companies can instantly remove copyrighted material—upload a clip from Disney or a pop song, and it’s gone in a heartbeat, risking an immediate ban from the platform. Yet, if you upload a rape, hide a camera in a locker room, or want to destroy someone’s life by sharing private images, you can do so without scrutiny. The platforms won’t blink and the perpetrators can sleep easily, knowing they face no real consequences. Current legislative gaps allow both the platforms and the initial uploaders of this abuse to completely evade accountability.”
The TAKE IT DOWN Act would require social media and other websites to have a takedown process in place to remove nonconsensual sexually explicit images, pursuant to a valid request from a victim, within 48 hours. Websites must also make reasonable efforts to remove copies of the images, enforceable by the Federal Trade Commission.
Targeting Perpetrators:
The act targets not only the person who creates abuse videos but also anyone who uploads sexually explicit content without affirmative consent. This broadens the scope of accountability and protection.
The bill’s specific language differentiates consent to creation from consent to publication. This is vital for survivors, like the survivor mentioned above, because it criminalizes the act of knowingly publishing or threatening to publish IBSA.
Under the Act, any user who reposted or reuploaded the videos of abuse could be held criminally liable. This is a very important step to discourage the proliferation of image-based sexual abuse material.
Getting ahead of Computer-Generated IBSA (a.k.a. “Deepfake Pornography”):
The act addresses the emerging problem of computer-generated image-based sexual abuse, commonly known as “deepfake pornography”, by establishing a duty for platforms to remove this harmful content.
By giving “deepfake pornography” a legal definition, the act addresses the problems arising at the intersection of AI and consent. “Nudifying” apps are being used in record numbers to strip images of girls and women of their clothing, without consent. This terrifying phenomenon is happening to teenagers and adult women alike—it could happen to any of us or any of the people we love.
The act would criminalize the creation of computer-generated IBSA/deepfakes and hold the platforms that host it liable.
“Today, Senator Ted Cruz introduced the TAKE IT DOWN Act to protect victims of non-consensual intimate imagery. Victims need a voice and Senator Cruz is fighting to protect the security and privacy of all Americans.” pic.twitter.com/g0NgTyhMsa
— Team Cruz (@TeamTedCruz) June 18, 2024
Protecting Free Speech and Expression
The bill is not an infringement on free speech or freedom—quite the opposite. Protecting survivors, dissuading potential proliferators of IBSA, and holding Big Tech accountable for the harms it facilitates allows for more freedom.
More freedom for markets to adapt and evolve to profit in less exploitative ways.
More freedom for teen girls and women to not feel threatened or blackmailed with the distribution of their images.
More freedom for women’s and girls’ likenesses to not be used without their knowledge in computer-generated IBSA.
More freedom for survivors to not be reminded of, and haunted by, their trauma.
This is an Act that calls for freedom. It is a step towards a free world, a world free of sexual exploitation.
Read the bill text here
Read a one-page summary of the bill here