The Terrifying Reality of GitHub: How Any of Us Can Be Exploited in Seconds

[Jump to Action]

Let’s play a game.

Raise your hand if there’s a picture of your face online.

Now, raise your other hand if there’s one of your daughter, your girlfriend, your sister, or any other loved one.

It’s probably safe to assume that most or all of you have both hands in the air right now (if you were good sports and played along!). That was fun.

But now comes the punch line. And unfortunately, it isn’t fun at all.

Did you know that, with new developments in technology, any of those photos could be used to create synthetic pornographic videos or images (a.k.a. “deepfake pornography”) of you or your loved ones? Did you know that this pornographic material could be so realistic that no one would be able to tell the difference?

Thanks to Microsoft-owned GitHub, it now takes less than 60 seconds to sexually violate someone in this way. This very trauma has already happened to countless women and even children.

And the terrifying problem only continues to grow.

What is GitHub?

There’s been a lot of buzz in the news lately about Artificial Intelligence. GitHub, owned by Microsoft, is arguably the most prolific space for Artificial Intelligence development.

Described as a mix between Google Docs and a social media app for programmers, GitHub is a platform designed for individuals to collaborate on, manage, and share code for developing software. It is the largest and most prominent code-hosting platform in the world.

Unfortunately, GitHub is also a significant contributor to the creation of sexually exploitative technologies, such as those used to victimize women and children through image-based sexual abuse (including but not limited to “deepfake pornography”) and synthetic child sexual abuse material.

The exponential growth in innovation and creative solutions offered by artificial intelligence is to be celebrated. However, technological advances must not supersede or come at the cost of people’s safety and well-being. Technology should be built and developed to fight, rather than facilitate, sexual exploitation and abuse.

By providing a space not only to create abusive technologies, but also to collaborate on and amplify them, GitHub is not merely condoning criminal conduct – it is directly contributing to it.

Take Action! Call on GitHub to stop facilitating sexual exploitation!

GitHub Facilitates Image-based Sexual Abuse (IBSA)

Image-based sexual abuse (IBSA) is a broad term encompassing a range of harmful activities that weaponize sexually explicit or sexualized materials against the persons they depict. IBSA includes the creation, theft, extortion, threatened or actual distribution, or any use of sexually explicit or sexualized materials without the meaningful consent of the person or persons depicted and/or for purposes of sexual exploitation.

GitHub has earned itself a reputation as a significant contributor to the growing crime of image-based sexual abuse. In particular, GitHub plays a massive role in the creation of synthetic sexually explicit material (SSEM). A type of IBSA, synthetic sexually explicit material involves altering images of a person so that someone who never appeared in pornography seems to appear in pornography (“deepfake pornography”), or so as to “strip” the person depicted of their clothing (via “nudify apps”).

GitHub is a hotbed of sexual deepfake repositories (where the code is hosted) and forums (where people chat) dedicated to the creation and commodification of synthetic media technologies, as well as nudify apps. Open-source repositories dedicated to IBSA thrive on GitHub, allowing users to replicate, favorite (“star”), and collaborate on sexually exploitative technology without repercussions.

GitHub hosts three of the most notorious technologies used for synthetic sexually explicit material: DeepFaceLab, DeepNude, and Unstable Diffusion. DeepFaceLab, the most popular technology for creating sexual deepfakes (estimated to be used in the creation of 95% of all sexual deepfakes), has been starred on GitHub over 38,000 times and replicated over 8,700 times. Further, DeepFaceLab’s GitHub repository contains direct links to the most prolific sexual deepfake website in the United States: Mr.DeepFakes. In fact, GitHub is one of the top 10 referral sites for Mr.DeepFakes.

GitHub has made it shockingly easy for anyone to become a victim of image-based sexual abuse. All a perpetrator needs is a few images of a person’s face or a short video clip. With that, in less time than it takes to brew a cup of coffee, they can create synthetic sexually explicit material that will likely haunt the victim for the rest of their life.

If there is one thing survivors of IBSA have repeatedly noted, it is that the abuse and trauma are perpetual in nature. The images are often circulated online, making it virtually impossible for survivors to ever permanently remove them.

Thanks to Microsoft-owned GitHub, ANY of us could become victims of deepfake pornography. The abusive material can be created in less time than it takes to brew a cup of coffee. @github, stop facilitating sexual exploitation!

GitHub Facilitates Child Sexual Abuse Material (CSAM)

The sexually exploitative technologies hosted on GitHub can also be used to create sexually explicit images of children—i.e. child sexual abuse material (CSAM).

For example, in April 2023, male classmates used nudifying technology to generate CSAM of five teenage girls. The boys stole photos from the 13-year-old girls’ social media accounts and used a bot to “strip” the girls in the images of their clothing. The resulting images were then circulated among other peers at their middle school.

Also in 2023, a Quebec man used deepfake technology to create at least seven videos of child sexual abuse material. He was also found to be in possession of 545,000 computer files of child sexual abuse material. According to reports, this was believed to be the first Canadian case involving deepfake technology used for child sexual exploitation.

Although GitHub uses automated detection tools such as PhotoDNA to proactively scan for CSAM, it does not have comprehensive moderation practices to remove technologies found to have generated CSAM. Thus, these technologies continue to thrive.

ACTION: Demand GitHub Stop Facilitating Sexual Exploitation!

We currently live in a terrifying reality where any of us can be sexually exploited in seconds. This HAS to change. Please take 30 seconds to contact GitHub, urging them to proactively remove sexually exploitative technologies!

The Numbers

300+

NCOSE leads the Coalition to End Sexual Exploitation with over 300 member organizations.

100+

The National Center on Sexual Exploitation has had over 100 policy victories since 2010. Each victory promotes human dignity above exploitation.

93

NCOSE’s activism campaigns and victories have made headlines around the globe, averaging 93 mentions per week by media outlets and shows such as Today, CNN, The New York Times, BBC News, USA Today, Fox News, and more.



