Kids Are Using AI to Violate Their Peers – and Microsoft’s GitHub Is the Cause

Updated 10/11/2024:

After this blog was published, the White House announced in September a set of voluntary commitments from AI model developers to combat image-based sexual abuse (IBSA), which included a public pledge from Microsoft’s GitHub to address the issue. The announcement drew our attention to a policy change GitHub had quietly made in May, prohibiting projects that are “designed for, encourage, promote, support, or suggest in any way” the creation of IBSA.

Although NCOSE repeatedly reached out to multiple Microsoft and GitHub contacts asking whether they had made any of the changes requested through the Dirty Dozen List campaign, none of them mentioned this new policy, or even acknowledged our emails or the campaign.

Nonetheless, we joyfully celebrate this victory with GitHub, which YOUR voice and action made possible!

Read more about GitHub’s new policy and their public commitment to combat IBSA here.


Jessica*, a 15-year-old from New Jersey, returned to her history class one day to find a cluster of girls whispering to each other. She approached them to ask what they were discussing. What she heard next was horrifying:

Boys at her high school were using images of their female peers’ faces and editing them onto AI-generated nude bodies.

The girls decided to discuss the problem with administrators at the school. But doubtful that she would personally fall victim to this sexually exploitative behavior, Jessica continued to go about her day, focusing on her schoolwork and extracurriculars. As time went on, she made the devastating discovery that she was one of the girls of whom her male classmates had created AI-generated nude photos.

As she left the front office following her meeting with the administration, she noticed a few of the girls whose faces were in the pictures crying. But what really made her stomach churn was the group of boys laughing at them from just a few feet away.

“I didn’t think my classmates could do this to me,” she said.

*Survivor name changed

The Alarming Prevalence of AI-Generated Sexual Abuse Images

In an era where major technological developments occur practically daily, many of us have learned to be wary of the information we allow to be public.

But now, it seems, nobody is safe.

With the abuse of newly developed AI technology, it doesn’t matter if you’ve never taken a nude photo in your life—you could still be victimized through image-based sexual abuse (IBSA) or child sexual abuse material (CSAM).

This traumatic experience for women and young girls is becoming far too common. In fact, recent research from Thorn, a leading child safety organization, found that 1 in 10 minors say their friends or classmates have used AI to generate nude images of other kids.

Microsoft’s GitHub Stands at the Root of the Problem

One corporation sits at the root of all this sexually exploitative AI technology: Microsoft’s GitHub.

Microsoft’s GitHub is arguably the world’s most prolific space for artificial intelligence development. Described as a mix between Google Docs and a social media app for programmers, GitHub is a platform where individuals collaborate on, manage, and share code for developing software.

However, an unsettling fact that Microsoft refuses to acknowledge is that GitHub has been a significant enabler of AI-generated CSAM and IBSA.

GitHub is a hotbed of sexual deepfake repositories (where the code is stored) and forums (where people chat) dedicated to the creation and commodification of synthetic media technologies. These technologies have aided the development of “nudifying apps”, which allow users to digitally “strip” women of their clothing in images. Currently, this type of nudifying technology only works on images of women, underscoring the highly gendered nature of sexual violence.

The platform also hosts code and datasets used to create AI-generated CSAM. Unfortunately, this phenomenon has become drastically more prevalent in the past year. In December of 2023, the Stanford Internet Observatory discovered over 3,200 images of suspected CSAM in LAION-5B, the dataset used to train Stable Diffusion, a popular generative AI model. This dataset was available on GitHub.

Whether it’s teenage boys creating AI-generated “nudes” of their female peers, or adult predators indulging their pedophilic appetites—the problem of AI-generated CSAM is clearly exploding.

GitHub Has Neglected to Make Any Requested Safety Changes

In 2023, GitHub was placed on NCOSE’s Dirty Dozen List—an annual campaign that calls out 12 mainstream contributors to sexual abuse and exploitation. The Dirty Dozen List mobilizes concerned citizens (people like YOU!) to contact these 12 entities, urging them to make much-needed safety changes. As a result, many Dirty Dozen List targets have made sweeping improvements to their policies and practices.

Yet GitHub has done nothing. 

Even after being placed on the Dirty Dozen List for a second time in 2024, GitHub failed to even acknowledge the campaign—let alone make the safety changes requested.

GitHub Markets to Children, Fueling the Rise in AI-Generated CSAM

Disturbingly, GitHub is marketed for ages 13 and up. Given this, is it any wonder that 1 in 10 minors now say their friends and classmates have created AI-generated nudes of other children?

While GitHub presents opportunities for kids to educate themselves on coding and software development, the lack of safety standards and the high rates of abuse on the platform make it highly inappropriate for users of this age. Furthermore, GitHub has no age verification feature; a valid email address is all that is required to set up an account. This means that users even younger than 13 could access this highly dangerous platform.

Teen girls are being sexually exploited by their peers, their privacy and mental health torn to pieces. Women are losing their jobs and reputations because of sexually explicit images that are entirely fabricated. Countless lives are being destroyed because of the technologies that are created on Microsoft’s GitHub.

Microsoft is the second richest company in the world—they have the resources to fix this. How much longer will they turn a blind eye to the way their platform is wreaking havoc on the lives of women and children?

ACTION: Call on Microsoft’s GitHub to Stop Enabling Sexual Exploitation!

The rampant abuses enabled by Microsoft’s GitHub are relatively uncharted territory. With AI technology developing so rapidly, it is imperative that we act NOW, before the problem grows even worse. Please take 30 SECONDS to complete the quick action below!


Learn more about how Microsoft’s GitHub is contributing to sexual exploitation here.
