Mainstream Contributors To Sexual Exploitation
Deepfakes, “nudify” apps, and AI-generated pornography originate on this collaboration platform for software development.
Imagine: you receive a call from a family member…a boss…a boyfriend. Someone sent them a sexually explicit video of you that they found on a pornography site.
It’s not possible! It can’t be you – you know that.
And it’s not you…but it is your face. And it looks so incredibly, terrifyingly real.
Your image was stolen to make deepfake pornography – and it was likely created on GitHub.
This very trauma has happened to countless women and continues to escalate due to Microsoft-owned GitHub.
So what is GitHub?
Described as a mix between Google Docs and a social media app for programmers, GitHub is a platform designed for individuals to collaborate on, manage, and share code for developing software. It is the largest and most prominent code-hosting platform in the world. It’s popular among developers because of its open-source design, which allows anyone to access, use, change, and share software. Some of the biggest names in the tech industry (Google, Amazon, Twitter, and Microsoft) use GitHub for various initiatives.
GitHub is arguably the most prolific space for artificial intelligence development.
Unfortunately, GitHub is also a significant contributor to the creation of sexually exploitative technology and a major facilitator of the growing crime of image-based sexual abuse (IBSA): the capture, creation, and/or sharing of sexually explicit images without the subject’s knowledge or consent.
GitHub is a hotbed of sexual deepfake repositories (where the code is stored) and forums (where people chat) dedicated to the creation and commodification of synthetic media technologies, as well as ‘nudify’ apps such as DeepNude that take women’s images and “strip them” of clothing. Currently, nudifying technology only works on images of women. Open-source repositories dedicated to IBSA thrive on GitHub, allowing users to replicate, favorite (“star”), and collaborate on sexually exploitative technology without repercussions.
The exponential growth in innovation and the creative solutions offered by artificial intelligence are to be celebrated. However, technological advances must not supersede or come at the cost of people’s safety and well-being. Technology should be built and developed to fight, rather than facilitate, sexual exploitation and abuse.
By providing a space not only to create abuse but to collaborate on and amplify it, GitHub is not merely condoning criminal conduct – it is directly contributing to it.
Microsoft could drastically reduce the number of people victimized through image-based abuse. Challenge them to lead the tech industry in ethical approaches to AI.
Review the proof we’ve collected, read our recommendations for improvement, and see our notification letter to GitHub for more details.
*Please also see our Discord and Reddit pages for examples of the type of deepfake pornography and other forms of image-based sexual abuse that originate on GitHub and proliferate across platforms.
GitHub significantly contributes to the creation of sexually exploitative technology and facilitates the growing crime of image-based sexual abuse. GitHub hosts guides, code, and hyperlinks to sexual deepfake community forums dedicated to the creation, collaboration on, and commodification of synthetic media technologies, as well as to AI-leveraged ‘nudifying’ websites and applications that take women’s images and “strip them” of clothing.(1)
Open-source repositories dedicated to IBSA thrive on GitHub, allowing users to replicate, favorite (“star”), and collaborate on sexually exploitative technology without repercussions. GitHub hosts three of the most notorious technologies used for synthetic sexually explicit material (SSEM) abuse: DeepFaceLab, DeepNude, and Unstable Diffusion.
Despite GitHub’s 2019 public statement against sexually exploitative source code – in which it cited code such as DeepNude as violating its policies and said that similar code or replications would be banned and removed from GitHub(iii) – similar code remained active as of April 2023.(iv) NCOSE researchers searched “deepnude” on GitHub and received 54 results for similar repositories, some of which contained the original source code for DeepNude.
Because of GitHub’s failure to effectively remove the original source code, ban replications, and moderate the platform for source code similar to DeepNude, replicas and “copycat” technology continue to proliferate.
Recent research has shown that SSEM, such as that produced by nudifying technology, is increasingly being used to target women in the general public. In 2019, Sensity uncovered a deepfake ecosystem with a DeepNude bot at its center; according to self-reports from users of the bot, 70% of targets were “private individuals whose photos are either taken from social media or private material.”
The DeepFaceLab repository on GitHub directly sends users to the Mr.DeepFakes website – a website dedicated to creating, requesting, and selling sexual deepfakes of celebrities and ‘other’ women. As a 2020 Motherboard (Tech by Vice) article, “It Takes 2 Clicks to Get From ‘Deep Tom Cruise’ to Vile Deepfake Porn,” pointed out, “[i]t is impossible to support the DeepFaceLab project without actively supporting non-consensual porn.”
GitHub actively contributed to the creation of a dataset, circulated among deepfake and AI communities, that contained content depicting actual instances of sex trafficking, physical abuse, drugging, and rape of women. Videos from Girls Do Porn and Czech Casting were used to build this dataset, which those communities then used to generate sexual deepfakes and AI-generated pornography.
Additionally, when NCOSE researchers conducted a search on GitHub for “onlyfans,” more than 300 results containing image scrapers for downloading content from OnlyFans were returned.
Scrapers and save bots for pornography and sexually explicit imagery thrive on GitHub; these have been used to scrape and save images from pornography and social media websites. Considering the growing body of evidence, survivor testimony, and lawsuits against the most visited pornography sites (Pornhub, XVideos, XHamster) for hosting actual depictions of sex trafficking, child sexual abuse material, rape, and various forms of image-based abuse (“revenge porn,” “upskirting/downblousing,” and “spycam shower footage”), GitHub’s allowance of these scrapers further contributes to the dissemination of criminal and/or nonconsensually captured content. It further demonstrates GitHub’s pivotal role in IBSA and sexual exploitation by hosting such repositories.
See all of the proof we’ve compiled in this downloadable PDF.
NCOSE is deeply concerned about the number of features on GitHub that may increase the risk of harm to minors, such as limited content moderation policies and a lack of caregiver and teacher/administrative controls.
GitHub rates itself as 13+ and actively promotes numerous educational opportunities for students from middle school to college to learn software development and coding. By 2019, over 1.5 million students had signed up for the Student Developer Pack, which is geared toward minor-aged users and student-teacher collaboration and development on GitHub. GitHub allows teachers and administrators to connect GitHub accounts to learning management systems such as Google Classroom, Canvas, and Moodle. The use of Google Classroom in primary education rose exponentially during the pandemic, surpassing 150 million student users by 2021. GitHub requires a valid email to create an account but fails to implement other basic safety features for students, such as age verification and meaningful content moderation.
NCOSE researchers did not find evidence of teacher/administrative safety controls for underage student users. While teachers can restrict public access to joining classroom repositories, “by default visibility to student repositories is public.” GitHub places the responsibility on administrators to upgrade their accounts to access private student repository capabilities – for a fee. Every child should be safe online, especially in an environment promoted for his or her education and learning. GitHub is asking parents and teachers to put a price on their child’s safety. This feature should be free and on by default.
Further, NCOSE researchers did not find evidence of parental controls for underage users. Considering the open-source format of the platform and the amount of sexually exploitative source code available, parental controls are urgently needed to protect GitHub’s youngest users.
Below are just a few examples of the types of content minors can freely and openly access on and through GitHub:
See all of the proof we’ve compiled in this downloadable PDF.
GitHub’s lack of effective content moderation practices allows sexually exploitative technology and IBSA to thrive. GitHub implements automated detection tools such as PhotoDNA to proactively scan for child sexual abuse material (CSAM), but it fails to implement comprehensive moderation practices: it moderates for CSAM itself while permitting technology found to have generated CSAM.
In April 2023, nudifying technology was used to generate CSAM of five teenage girls by male classmates. The boys stole photos from the 13-year-old girls’ social networks and used a bot called BikiniOff on a messaging platform to “strip” the images of the girls naked, generating child sexual abuse material (CSAM). The images circulated among other peers at their middle school. Authorities were called, and a full investigation was opened by the juvenile prosecutor’s office for the production of ‘child pornography.’ The two teenage boys who generated the images were unaware of the harm and trauma they had caused the victims, stating, “It’s just a joke.”
However, child sexual abuse material is not a joke; it is illegal and results in traumatic experiences for victims, especially given the online nature of the incident. GitHub’s permitting of source code such as DeepNude to remain hosted has resulted in a proliferation of its replications. GitHub must ban and immediately remove repositories dedicated to the creation of sexually exploitative technology.
Additionally, in 2023, a Quebec man created at least seven videos of child sexual abuse material using deepfake technology. He was also found to be in possession of 545,000 computer files of child sexual abuse material. According to reports, this was believed to be the first case involving deepfake technology being used for child sexual exploitation in Canada.
The presiding provincial court judge wrote that “the creation of the new images of sexual abuse encourages the market for child pornography, which craves novelty, and puts children at risk by ‘fuelling fantasies that incite sexual offences against children.’”
GitHub hosts technology implicated in every concern outlined above. GitHub must act immediately and accept responsibility for the type of software being developed on its platform.
See all of the proof we’ve compiled in this downloadable PDF.
Below is a list of news articles related to the exploitation facilitated by GitHub.
See all of the proof we’ve compiled in this downloadable PDF.
Microsoft owns GitHub and is a major investor in OpenAI (creator of two of the most notorious AI technologies, ChatGPT and DALL-E)
90% of Fortune 100 companies are customers of GitHub, including Amazon (Amazon Web Services) and Google (Alphabet)
GitHub has over 100 million active developers, more than 330 million repositories (39.7 million of them public), 413 million open-source contributions in 2022, and more than 4 million organizations
GitHub hosts the source code for DeepFaceLab, the software used to create 95% of deepfakes, and the repository directly sends interested users to the most prolific sexual deepfake website in the United States.
GitHub is one of the top 10 referral sites for Mr.DeepFakes.