Elise (pseudonym) was using Meta’s virtual reality headset one day when something troubling came across her screen: a message from a man she didn’t know, requesting explicit photographs. He followed up by asking if she wanted to come over to his house.
Elise was 9 years old.
This same man was later found guilty of exploiting another child online and sentenced to 35 years in prison.
According to former Meta researchers, interactions like this are all too common on Meta Quest, the virtual reality headset developed by the tech giant. Cayce Savage, a Meta researcher turned whistleblower, estimates that every single child who enters a social space on Meta Quest will be sexually propositioned, sexually abused, or exposed to sexual content. “I see it every time I use the headset,” she says.
Given that Meta makes a large portion of its income from child users (many of whom are under the platform’s designated age limit, like Elise), it’s clear that the company has consistently turned a blind eye to dangerous situations like this.
Meta does, in fact, conduct research on child exploitation on its platforms—research that it instructs employees to erase and doctor in order to preserve its reputation while still making millions off the backs of vulnerable children.
Meta’s Virtual Reality Tool is “Full of Underage Children”
Former researchers blew the whistle on the tech giant at a recent hearing held by the Senate Judiciary Committee. These whistleblowers testified that Meta has been actively trying to cover up any evidence of child exploitation on its platforms, especially on its virtual reality tool.
“From my first days in virtual reality labs, Meta leadership and legal teams were in complete control of the research I was conducting. This was crucial research, because this was a largely untested technology, but I soon learned that Meta had no interest in VR safety unless it could drive interaction and thus profit,” said Dr. Jason Sattizhan, former staff researcher for Meta.
Cayce Savage, another former staff researcher at Meta, testified alongside Sattizhan at the hearing. Savage, who also worked heavily in virtual reality research, said that Meta’s VR tool was rife with minors, many under the age of 13, the platform’s minimum age requirement.
“Meta purposely turns a blind eye to this knowledge, despite it being obvious to anyone using their products,” she said. “If Meta were to acknowledge the presence of underage users, they would be required to kick off those users from their platform in order to remain COPPA compliant.”
COPPA refers to the Children’s Online Privacy Protection Act, which requires websites to obtain parental consent before collecting data on users under the age of 13.
Savage continued: “This isn’t happening because it would decrease the number of active users.”
Sen. Josh Hawley recalled Meta CEO Mark Zuckerberg’s testimony in front of the Senate in January of 2024, where he said, “We don’t allow people under the age of 13 on our service and if we find them, we remove them from our service.” Zuckerberg continued: “We don’t want users under the age of 13.”
But Savage said VR is “full of underage children” and this fact is “apparent to anyone who uses the product.”
When asked by Hawley whether Meta CEO Mark Zuckerberg was aware of the number of children on his VR platform, Savage responded, “The only way that he would not be aware is if he had never used his own headset.”
Pushes by Meta Executives to Erase Research on Harms in VR
Sattizhan and Savage both testified that during their time at Meta, high-level executives frequently told them to erase or doctor research that showed children were being harmed on the platform. Sattizhan was asked to conduct research on the Meta VR headset in Germany to test if it was safe for German users.
“When our research uncovered that underage children using Meta VR in Germany were subject to demands for sex acts, nude photos, and other acts that no child should ever be exposed to, Meta demanded that we erase any evidence of such dangers that we saw.”
Savage testified that she was told not to investigate harms that children were experiencing in VR. “I was made to feel I was risking my job if I pressed the matter.”
Virtual Reality Feels Like Real Life
There’s a reason it’s called virtual reality: it feels real. The user puts the headset on and plays games and has interactions as if they were happening right in front of them—right in their living room. While virtual reality is not inherently dangerous, when harms do occur in the metaverse, they carry an extra layer of gravity, because VR is designed to make users feel as though they are experiencing these interactions in real life. We are not against VR, but like all tech, we want it to be designed safely.
Australian-based survey data collected by the eSafety Commissioner found that negative experiences were extremely common in virtual reality: 71% of users reported a negative experience in VR in the last 12 months. Common experiences include the following:

Repeated unwanted messages or online contact from someone other than cold calling/marketing (16%); being sent unwanted inappropriate content online, e.g. pornographic or violent content (13%); grooming attempts (9%); threats to share private photos of you online or electronically (9%); and private photos/videos (nude/semi-nude/sexual) of you shared online or electronically without your consent (8%).
Further, the study points out how VR is different from other interactive online platforms because of the physicality of it. Certain behaviors can “manifest differently” in VR than in another online setting, especially when they are sexually exploitative or unwanted.
For example, many users (61% in this study) reported using haptic technology to enhance their VR experience. Devices such as haptic suits, gloves, and backpacks make the experience more immersive by allowing users to feel physical sensations—vibrations, impacts, even temperature changes—corresponding to what is taking place in the metaverse.
However, this amplifies the effect of unwanted sexual touching (which 9% of users reported experiencing), because the haptics make it seem as though someone is physically touching you, despite the contact taking place remotely. Put simply: a person can be physically and sexually assaulted in VR without ever meeting their abuser in real life.
Sattizhan described witnessing scenarios exactly like this in his research:
“The audio that’s transmitted isn’t just solicitation or speech. There will also be instances that we have seen, where you can hear people sexually pleasuring themselves, transmitted over audio in a spatial sense, as you are being surrounded and brigaded and harassed. So, it’s not just simple statements, it is actually the transmission of the motion and the audio of sex acts.”
Meta Must Take Action to Stop Abuse in VR and on All of Its Platforms
Sadly, thanks to Section 230 of the Communications Decency Act, Meta has repeatedly gotten away with ignoring harms that users experience on its platforms.
Section 230 gives tech companies near-blanket immunity from liability for abhorrent behaviors committed by third parties on their platforms. And as these former Meta employees have exposed, Meta has no intention of trying to mitigate harms on its platforms if doing so inhibits financial returns. This is why we must compel it to act through legislative action.