Taylor Swift Isn’t the Only One Threatened by AI-generated/Deepfake Pornography…

Imagine a day in the life of the talented singer and songwriter, Taylor Swift.

What do you envision it to be like? Multi-billion-dollar album sales? Breaking records at the Grammys? NFL football and Super Bowl LVIII, anyone?

Whatever grand vision comes to mind, it likely does not include suddenly learning that images of you have been used to create violent AI-generated pornography, without your consent. But tragically, this is exactly what happened to Taylor Swift just last month.

What is AI-generated/deepfake pornography?

AI-generated pornography, sometimes called “deepfake pornography,” is a rapidly growing form of image-based sexual abuse (IBSA) in which everyday images are turned into pornography. According to a December 2023 research study by Home Security Heroes, the amount of deepfake pornography available online increased by a staggering 550% between 2019 and 2023.

So here’s the scary truth: If someone has a digital image or video of you, taken with or without your consent, it can be manipulated into deepfake pornography or other types of synthetic sexually explicit material (SSEM). And it could look so realistic that no one would be able to tell the difference.

Currently there are three main categories of SSEM:

  1. the alteration of an image of an identifiable person using so-called “deepfake” technology, so that a person who is not in pornography appears to be in pornography;
  2. the “nudifying” of a person, so that they are “stripped” of their clothing via “nudify apps” or other technology; and
  3. AI-generated amalgamated pornography, which uses AI and mass collections of images of persons scraped from assorted websites to create “new” pornography, typically depicting an unidentifiable, computer-generated “person.”

So, if this has just happened to megastar Taylor Swift, what does it mean for the rest of us?

Celebrities are not the only ones being targeted in synthetic sexually explicit material. Recent research has shown that SSEM, such as “nudifying” technology, is increasingly being used to target people in the general public. In 2019, Sensity uncovered a deepfake ecosystem with a DeepNude bot at its center; according to self-reports from users of the bot, 70% of targets were “private individuals whose photos are either taken from social media or private material.”

Reflecting the deeply ingrained misogyny among the producers of this material, 99% of all synthetic sexually explicit material features women. However, men, LGBTQ+ individuals, and children are being victimized as well. Truly, no one is safe from this threat.

Congress must pass laws to protect against AI-generated/deepfake pornography

At this point in time, there are NO federal laws protecting people from being victimized through AI-generated/deepfake pornography, or through any other form of image-based sexual abuse! This is a terrifying gap, and we are working hard to fix it.

Help us urge Congress to pass these four critical bills, which would help protect against image-based sexual abuse, including AI-generated/deepfake pornography:

  1. The PROTECT Act, recently introduced, would require websites that allow sexually explicit material to obtain verified consent both from individuals uploading content and from individuals appearing in uploaded content, and would also require websites to remove images uploaded without consent.
  2. The DEFIANCE Act, also recently introduced in the Senate, solidifies the right to relief for individuals victimized through “intimate digital forgeries” (e.g., AI-generated/deepfake pornography).
  3. The SHIELD Act, which was passed out of the Senate Judiciary Committee and awaits a vote in the full Senate, makes it a criminal offense to knowingly distribute (or attempt or threaten to distribute) an “intimate visual depiction” of an individual without their knowledge or consent.
  4. The Preventing Deepfakes of Intimate Images Act, introduced in the House of Representatives, which creates a civil right of action for victims of AI-generated/deepfake intimate images and establishes the sharing of such images as a crime.

TAKE ACTION NOW, urging your Congressional Representatives to co-sponsor these crucial bills!

Who is the key corporate player enabling this society-wide digital sexual exploitation? Microsoft’s GitHub.

GitHub is a leading platform for software development and collaboration, and arguably the most prolific space for Artificial Intelligence development. Unfortunately, it is also a major contributor to AI-generated/deepfake pornography.

GitHub hosts three of the most notorious technologies for creating AI-generated/deepfake pornography: DeepFaceLab, DeepNude, and Unstable Diffusion. It also contains direct links to dedicated deepfake pornography websites and community forums.

GitHub has radically changed the ease with which people can be violated through AI-generated/deepfake pornography. In less time than it takes to brew a cup of coffee, a realistic sexual image of you can be created and distributed without your consent or knowledge through the Artificial Intelligence technologies GitHub hosts. A few clicks can devastate a person and entirely shift the trajectory of their life. This is the stuff nightmares are made of, and it is increasingly possible thanks to GitHub.

GitHub needs to become part of the solution, NOT the problem

When there is so much good to be pursued and achieved in the world through innovation and technology, why is GitHub paving the way to destroying people’s lives? Why are they perpetuating a sub-culture that enables the sexual violation of individuals and their families?

Leading and innovative companies have such an opportunity to rise up and elevate humanity, and it comes down to a simple choice: What is GitHub going to choose to build? Something that lifts society to new heights, or something that will surely make lives crumble to the ground?

GitHub has a massive responsibility to become part of the solution to the scourge of image-based sexual abuse, rather than a key force perpetuating the problem.

TAKE ACTION NOW, demanding GitHub stop facilitating AI-generated pornography!

The Numbers


NCOSE leads the Coalition to End Sexual Exploitation with over 300 member organizations.


The National Center on Sexual Exploitation has had over 100 policy victories since 2010. Each victory promotes human dignity above exploitation.


NCOSE’s activism campaigns and victories have made headlines around the globe, averaging 93 mentions per week by media outlets and shows such as Today, CNN, The New York Times, BBC News, USA Today, Fox News, and more.


