Mainstream Contributors To Sexual Exploitation

GitHub is the Go-To Place To Create Sexually Exploitative Technology

Deepfakes, “nudify” apps, and AI-generated pornography originate on this collaboration platform for software development.

Imagine: you receive a call from a family member…a boss…a boyfriend. Someone sent them a sexually explicit video of you that they found on a pornography site.

It’s not possible! It can’t be you – you know that.

And it’s not you…but it is your face. And it looks so incredibly, terrifyingly real.

Your image was stolen to make deepfake pornography – and the technology used to make it was likely created on GitHub.

This very trauma has happened to countless women and continues to escalate due to Microsoft-owned GitHub.

So what is GitHub?

Described as a mix between Google Docs and a social media app for programmers, GitHub is a platform where individuals collaborate on, manage, and share code for developing software. It is the largest and most prominent code-hosting platform in the world. It’s popular among developers because of its open-source design, allowing anyone to access, use, change, and share software. Some of the biggest names in the tech industry (Google, Amazon, Twitter, and Microsoft) use GitHub for various initiatives.

GitHub is arguably the most prolific space for Artificial Intelligence development.

Unfortunately, GitHub is also a significant contributor to the creation of sexually exploitative technology and a major facilitator of the growing crime of image-based sexual abuse: the capture, creation, and/or sharing of sexually explicit images without the subject’s knowledge or consent.

GitHub is a hotbed of sexual deepfake repositories (where the code is stored) and forums (where people chat) dedicated to the creation and commodification of synthetic media technologies, as well as ‘nudify’ apps, such as DeepNude, that take women’s images and “strip them” of clothing. Currently, nudifying technology only works on images of women. Open-source repositories dedicated to IBSA thrive on GitHub, allowing users to replicate, favorite (“star”), and collaborate on sexually exploitative technology without repercussions.

The exponential growth in innovation and creative solutions offered by artificial intelligence is to be celebrated. However, technological advances must not supersede or come at the cost of people’s safety and well-being. Technology should be built and developed to fight, rather than facilitate, sexual exploitation and abuse.

By providing a space not only to create abuse but also to collaborate on and amplify it, GitHub is not merely condoning criminal conduct – it is directly contributing to it.

Microsoft could drastically reduce the number of people victimized through image-based abuse. Challenge them to lead the tech industry in ethical approaches to AI.

Review the proof we’ve collected, read our recommendations for improvement, and see our notification letter to GitHub for more details. 

*Please also see our Discord and Reddit pages for examples of the type of deepfake pornography and other forms of image-based sexual abuse that originate on GitHub and proliferate across platforms. 

See our Notification Letter to Microsoft’s GitHub here.

Take Action

Our Requests for Improvement


Evidence of Exploitation

WARNING: Any pornographic images have been blurred, but are still suggestive. There may also be graphic text descriptions shown in these sections. POSSIBLE TRIGGER.

GitHub significantly contributes to the creation of sexually exploitative technology and has been noted for facilitating the growing crime of image-based sexual abuse: the capture, creation, and/or sharing of sexually explicit images without the subject’s knowledge or consent. GitHub hosts guides, code, and hyperlinks to sexual deepfake community forums dedicated to the creation, collaboration, and commodification of synthetic media technologies, as well as to AI-leveraged ‘nudifying’ websites and applications that take women’s images and “strip them” of clothing.(1)

Open-source repositories dedicated to image-based sexual abuse (IBSA) thrive on GitHub, allowing users to replicate, favorite (“star”), and collaborate on sexually exploitative technology without repercussions. GitHub hosts three of the most notorious technologies used for synthetic sexually explicit material (SSEM) abuse: DeepFaceLab, DeepNude, and Unstable Diffusion.

In 2019, GitHub made a public statement against sexually exploitative source code, citing code such as DeepNude as violating its policies and promising that similar code and replications would be banned and removed. Yet as of April 2023, similar code remained active: NCOSE researchers searched “deepnude” on GitHub and received 54 results for similar repositories, some of which contained the original DeepNude source code.

Because of GitHub’s failure to effectively remove the original source code, ban replications, and moderate the platform for code similar to DeepNude, replicas and “copycat” technology continue to proliferate.

Recent research has shown that synthetic sexually explicit material (SSEM), such as nudifying technology, is increasingly being used to target women in the general public. In 2020, Sensity uncovered a deepfake ecosystem with a DeepNude bot at its center; according to self-reports from users of the bot, 70% of targets were “private individuals whose photos are either taken from social media or private material.”

The DeepFaceLab repository on GitHub directly sends users to the Mr.DeepFakes website – a website dedicated to creating, requesting, and selling sexual deepfakes of celebrities and ‘other’ women. As a 2020 Motherboard (Vice) article, “It Takes 2 Clicks to Get From ‘Deep Tom Cruise’ to Vile Deepfake Porn,” pointed out, “[i]t is impossible to support the DeepFaceLab project without actively supporting non-consensual porn.”

GitHub actively contributed to the creation of a dataset, circulated among deepfake and AI communities, that contained content depicting actual instances of sex trafficking, physical abuse, drugging, and rape of women. Videos from Girls Do Porn and Czech Casting were found to have been used to build this dataset, which deepfake and AI communities used to generate sexual deepfakes and AI-generated pornography.

Additionally, when NCOSE researchers conducted a search on GitHub for “onlyfans,” more than 300 results containing image scrapers for downloading content from OnlyFans were returned.

Pornography and sexually explicit imagery scrapers and save bots thrive on GitHub; these have been used to scrape and save images from pornography and social media websites. Considering the growing body of evidence, survivor testimony, and lawsuits against the most-visited pornography sites (Pornhub, XVideos, XHamster) for hosting actual depictions of sex trafficking, child sexual abuse material, rape, and various forms of image-based abuse (“revenge porn,” “upskirting/downblousing,” and “spycam shower footage”), GitHub’s allowance of these scrapers further contributes to the dissemination of criminal and/or nonconsensually captured content. It further demonstrates GitHub’s pivotal role in contributing to IBSA and sexual exploitation by hosting such repositories.

  1. Ajder et al., “The State of Deepfakes: Landscape, Threats, and Impact,” September 2019; Henry Ajder, Giorgio Patrini, and Francesco Cavalli, “Automating Image Abuse: Deepfake Bots on Telegram,” Sensity, October 2020; Patrini, “The State of Deepfakes 2020: Update on Statistics and Trends,” March 2021; Volkert et al., “Understanding the Illicit Economy for Synthetic Media,” March 2020; “Hello World”; Hany Farid, “Creating, Using, Misusing, and Detecting Deep Fakes,” Journal of Online Trust and Safety 1, no. 4 (2022).
  2. Victoria Rousay, “Sexual Deepfakes and Image-Based Sexual Abuse: Victim-Survivor Experiences and Embodied Harms,” Master’s thesis, Harvard University Division of Continuing Education, April 20, 2023; SimilarWeb, “Top 10 Incoming Sites to Mr.DeepFakes.Com, January 2023 – February 2023,” January 2023.

See all of the proof we’ve compiled in this downloadable PDF.

NCOSE is deeply concerned about aspects of GitHub that may increase the risk of harm to minors, such as limited content moderation policies and a lack of caregiver and teacher/administrative controls.

GitHub rates itself as 13+ and actively promotes numerous educational opportunities for students from middle school to college to learn software development and coding. By 2019, over 1.5 million students had signed up for its Student Developer Pack, which is geared towards minor-aged users and student-teacher collaboration and development on GitHub. GitHub allows teachers and administrators to connect GitHub accounts to learning management systems such as Google Classroom, Canvas, and Moodle. The use of Google Classroom in primary education rose exponentially during the pandemic, surpassing 150 million student users by 2021. GitHub requires a valid email to create an account but fails to implement other basic safety features for students, such as age verification and meaningful content moderation.

NCOSE researchers did not find evidence of teacher/administrative safety controls for underage student users. While teachers can restrict public access to joining classroom repositories, “by default visibility to student repositories is public.” GitHub places the responsibility on administrators to upgrade their accounts to access private student repositories – for a fee. Every child should be safe online, especially in an environment promoted for education and learning. GitHub is asking parents and teachers to put a price on their child’s safety. This feature should be free and on by default.

Further, NCOSE researchers also did not find evidence of parental controls for underage users. Considering the open-source format of the platform and the amount of sexually exploitative source code available, parental controls are urgently needed to protect GitHub’s youngest users.

Below are just a few examples of the types of content minors can freely and openly access on and through GitHub:

  • Minors are able to access links and communities dedicated to image-based sexual abuse such as Mr.DeepFakes. 
  • Minors can access repositories dedicated to the scraping of pornography and sexually explicit imagery. 


See all of the proof we’ve compiled in this downloadable PDF.

GitHub’s lack of effective content moderation practices allows sexually exploitative technology, including technology used to create child sexual abuse material (CSAM), to thrive. GitHub uses automated detection tools such as PhotoDNA to proactively scan for CSAM, but it fails to implement comprehensive moderation practices to remove technology found to have generated CSAM; rather, it only moderates for the CSAM itself.

In April 2023, male classmates used nudifying technology to generate CSAM of five teenage girls. The boys stole photos from the 13-year-old girls’ social networks and used a bot on a messaging platform called BikiniOff to strip the images of the girls naked. The images circulated among other peers at their middle school. Authorities were called, and the juvenile prosecutor’s office opened a full investigation into the production of ‘child pornography.’ The two teenage boys who generated the images were unaware of the harm and trauma they had caused the victims, stating, “It’s just a joke.”

However, child sexual abuse material is not a joke; it is illegal and inflicts lasting trauma on victims, especially given the online nature of the incident. GitHub’s permitting of source code such as DeepNude has resulted in a proliferation of replicas. GitHub must ban and immediately remove repositories dedicated to the creation of sexually exploitative technology.

Additionally, in 2023, a Quebec man created at least seven videos of child sexual abuse material using deepfake technology. He was also found to be in possession of 545,000 computer files of child sexual abuse material. According to reports, this was believed to be the first case of deepfake technology being used for child sexual exploitation in Canada.

The presiding provincial court judge wrote that “the creation of the new images of sexual abuse encourages the market for child pornography, which craves novelty, and puts children at risk by ‘fuelling fantasies that incite sexual offences against children.’”

All of the harms outlined above are hosted on GitHub. GitHub must act immediately and accept responsibility for the type of software being developed on its platform.

See all of the proof we’ve compiled in this downloadable PDF.

Below is a list of news articles related to the exploitation facilitated by GitHub.


See all of the proof we’ve compiled in this downloadable PDF.

Fast Facts

Microsoft owns GitHub and is a major investor in OpenAI (creator of two of the most prominent AI technologies, ChatGPT and DALL-E)

90% of Fortune 100 companies are customers of GitHub, including Amazon (Amazon Web Services) and Google (Alphabet)

GitHub has over 100 million active developers, more than 330 million repositories (39.7 million of them public), 413 million open-source contributions in 2022, and more than 4 million organizations

GitHub hosts the source code of DeepFaceLab, the software used to create 95% of deepfakes, which directly sends interested users to the most prolific sexual deepfake website in the United States.

GitHub is one of the top 10 referral sites for Mr.DeepFakes.

Similar to GitHub


AI-generated deepfakes are moving fast; policymakers can’t keep up. A safety and policy expert at AI company Hugging Face said, “I look at these generations multiple times a day and I have a very hard time telling them apart. It’s going to be a tough road ahead.”


Stay up-to-date with the latest news and additional resources

Recommended Resources

Found through Google, bought with Visa and Mastercard: Inside the Deepfake Porn Economy

Digitally edited pornographic videos featuring the faces of hundreds of unconsenting women are attracting tens of millions of visitors

Deepfake Porn: Could You Be Next?

Film exploring the rise of deepfake porn, a new form of image-based sexual abuse where harmless images are turned into hardcore porn and used against victims, with devastating consequences.

What is GitHub and How to Use It?

Learn about GitHub in this helpful guide by Ishan Gaba from Simplilearn

Deepfake Porn Wrecks Lives

But, as one woman discovered, it takes just 8 seconds to make an image

It Takes 2 Clicks to Get From ‘Deep Tom Cruise’ to Vile Deepfake Porn

The ‘ethical’ AI company that made the viral deepfake of Tom Cruise is tied to non-consensual AI-generated porn


Help educate others and demand change by sharing this on social media or via email:


Share Your Story

Your voice—your story—matters.

It can be painful to share stories of sexual exploitation or harm, and sometimes it’s useful to focus on personal healing first. But for many, sharing their past or current experiences may be a restorative and liberating process.

This is a place for those who want to express their story.