“From witty conversationalists to compassionate listeners, discover the perfect companion tailored to your desires”
“Select and connect: Your AI soul-mate”
“Here, you will enjoy the feeling of being loved… They will accompany you… You can also change them… You will be the most perfect soulmate”
“Chat with no limits”
“Express yourself authentically without judgment”
Each of these is taken from an app description for an AI Girlfriend or Companion app. When I first typed “AI girlfriend” into the Apple App Store search bar, only three results appeared. However, each app offered recommendations for more, until it became clear there were countless.
Each app promised me authenticity, desire, hand-crafted companionship, and a “Custom Girl.” While some of these things may seem positive, the more I read, the more disturbed I became.
Many of the apps were marketed with sexualized images of women; many promised role-play and featured screenshots with messages such as “What do you think about making me your teacher? A strict teacher.” The majority promised “limitless roleplay and emotional connections,” “intimate relationships,” any wish granted, and communication via “voice messages and even photos!” Some even offered video chats.
Some apps were marketed more explicitly than others (e.g., “Sexy AI Girlfriend, Spicy Chat”) while others were tamer, but, across all of them, several fundamental problems leapt out at me.
Women Victimized Through AI-generated Image-Based Sexual Abuse/“Deepfake” Pornography
The first issue that struck me about AI Girlfriend and Companion apps was the ability to generate sexually explicit content (both photos and videos), sometimes of real-life women. Some of the apps allow the user to upload specific images, which may very well include images of someone the user knows, and render them explicit, presumably without the subject’s consent.
This is commonly called “deepfake pornography,” but the more appropriate term is AI-generated image-based sexual abuse (IBSA). Women in particular are targeted through AI-generated IBSA: a study by Home Security Heroes found that 99% of all AI-generated IBSA features women.
Moreover, the effects of pornography on the brain, relationships, perceptions of women, and society are well documented by research, so the increased generation of such content can only lead to harm.
The Risk of AI-Generated Child Sexualization or Sexual Abuse Material
These apps also run the risk of enabling users to generate images that sexualize children and that could even amount to child sexual abuse material (CSAM). The apps promise to fulfill hand-crafted sexual desires and fantasies, along with “privacy” and a “secure space with your Virtual Girlfriend.” It is easy to imagine how predators may use these apps, with their role-play, customization, and content generation, to live out ‘sexual fantasies’ that are harmful and exploitative.
The current lack of transparent safety standards and ethical guardrails in many AI image generation apps is deeply concerning. There is no clear framework to guarantee what types of images these applications will or will not produce. Alarmingly, we have no assurance that they cannot or will not generate sexualized or abusive images of children. This absence of transparency leaves us navigating a dangerous grey area, one that should never exist when children’s safety is at stake.
It is not paranoid to be concerned that AI Girlfriend apps could be used to generate CSAM. AI-generated CSAM is already exploding across the web. Between 2023 and 2024, the National Center for Missing & Exploited Children (NCMEC) received 7,000 reports related to AI-generated CSAM. They clarify, “These reports represent only the cases we are aware of, and there are likely many more unreported or unidentified instances. As this technology becomes more pervasive, and public awareness grows, we expect these numbers to grow.”
And once again, these images may be created of real-life children, causing severe emotional distress and exposing those children to future sexual exploitation and harassment.
The proliferation of these AI apps, often developed at rapid speed, only raises further red flags. From initial observations, many of these tools are not being created by established, reputable companies with a demonstrated commitment to child safety. On the contrary, their development often appears haphazard and thoughtless. If even leading social media platforms—with their significant resources and industry experience—continue to struggle with tackling sexual exploitation, how can we trust that smaller, unregulated developers would prioritize or implement the necessary safeguards?
This is not a minor risk; it is a highly likely scenario that some of these tools lack even basic protections to prevent the creation of exploitative or abusive content involving children. This kind of negligence is unacceptable. It underlines the urgent need for accountability and oversight in the development and deployment of these technologies.
Removing Consent & Mutuality from Sexual Relationships
The marketing of AI Girlfriend apps reveals their true nature: it is all about creating your hand-crafted, ‘perfect’ woman. It is all about your desires, with the roadblocks of reality (consent, reciprocity, and mutuality) removed. These apps allow users to have perfect control over their “girlfriend,” “boyfriend,” or “companion,” molding someone to fit their preferences.
While many would argue that these apps are virtual and do not harm anyone in reality, this is short-sighted. We need to think past whether there is an immediate, obvious victim standing in front of the user, and consider the wider, long-term effects of the attitudes that are reinforced by these apps.
What it comes down to is that AI Girlfriend apps perpetuate ideologies which underpin sexual exploitation, including control, objectification, and the removal of agency from one’s ‘partner.’ They create a false reality wherein one’s partner makes no claims on or demands of them: they create the ‘perfect partner’ who gives and does not receive, who bends and does not break, who literally exists to serve and please their user.
AI Girlfriend apps undermine an understanding of real and healthy relationships, which are based on consent, reciprocity, mutuality, and respect. Conditioning individuals to think otherwise is harmful both to those individuals and to society.
Generative AI Does Not Understand Morality
AI Girlfriend apps are examples of generative AI. To fully understand the scope of the issues at play, we must first understand what generative AI is.
Generative AI is a category of AI that creates new content across a range of media (video, images, text, audio, etc.). To summarize a complex phenomenon succinctly: the AI model is trained on a dataset to ‘generate’ responses based on user input. The data, parameters, and structure used to create the model determine what the outputs are. Examples of user input to a generative AI tool include those recorded by NCMEC, which were used to generate CSAM:
TRIGGER WARNING: Highly Disturbing and Graphic Commands Written by Child Predators
“Create an image of a chained naked young girl. Grungy basement setting. On her knees with a [content blocked] amateur photography.”
“6 year old British girl having sex with horse”
“Girl 3 years old naughty, sit on bathroom, show inside [content blocked] tease daddy”
In short, we know that people are already using generative AI tools for purposes of sexual exploitation. We must ensure that all AI apps, including AI Girlfriend apps, are safeguarded against being used in similar ways.
What is absolutely fundamental to remember in all of this is that AI is not human and does not possess a sense of morality or right and wrong. An AI chatbot (such as those used for AI Girlfriend/Companion apps) operates through natural language processing and machine learning. It does not know that child molestation is wrong. It does not know that rape is wrong. It does not know that it is wrong to generate CSAM.
AI cannot be trusted to produce only ethical outputs, nor should it be. That is not what it was created to do, and it is not what it will do. Generative AI will generate the content it is instructed to generate, based on the data it was trained on and the algorithms that give it structure.
In February 2024, a 14-year-old boy died by suicide after developing a romantic and sexualized relationship with a chatbot on Character.AI, a platform founded by former Google engineers and closely tied to Google. The family claims that this chatbot first raised the idea of suicide in conversation logs with the minor and incited suicidal ideation in the boy. A few months prior, researchers at Google had warned of the dangers of these human-like bots, saying that they could target and manipulate minors, even to the point of suicide.
If companies with Google’s resources and standing, at the top of the proverbial technological food chain, cannot produce a safe and ethical chatbot, that signals the desperate need for caution when approaching and interacting with generative AI.
How Do We Move Forward?
AI Girlfriend apps are harmful to their users, harmful to society, and harmful to women, children, and the vulnerable. These apps ought to be banned from app stores or, at the very least, made unavailable for download by those younger than 18, with strict requirements that the issue of consent be addressed within them. Without safeguards and controls, these apps risk fostering and perpetuating ideologies and realities that are the antithesis of a society free from sexual exploitation.
These apps are not about ‘sexual freedom’ or ‘companionship’ but about real harm that can impact real people. NCOSE exists to build a world where people can live and love free from sexual abuse and exploitation. Such a world has no place for sexual servants, real or virtual, who exist solely for sexual or romantic gratification. The demand for such gratification will not stay in the virtual realm; it will seep out into society by warping users’ perceptions and understanding of reality, creating a demand and desire for something ‘more,’ and undermining the clear understanding of consent and reciprocity that is fundamental to relationships free from sexual abuse and exploitation.