Roblox: A Tool for Sexual Predators, a Threat to Children’s Safety

S.U. started playing Roblox when she was 9 or 10. She was manipulated, exploited, abused, and by 11 years old she had attempted suicide twice. 

She is not the only one.  

Children 12 and under make up the largest demographic of Roblox’s users, yet the company is doing an astonishingly poor job of protecting its young user base. This is why we have added Roblox to our Dirty Dozen List for the second year in a row, and we are inviting you to join us in urging the company to change!

What is Roblox? 

With over 70 million daily active users worldwide, Roblox is a gaming platform that allows users to program and play games, which are called “experiences.” The most popular of these experiences, by far, are categorized as “Role-playing,” which allows users to actualize and enact their dreams on screen.

The reality is that Roblox’s child safety measures are grossly insufficient. Adults can easily impersonate minors because the platform has no age verification requirements, and the chat restrictions on minors’ accounts are easily subverted through the use of signs and other unmoderated forms of communication.

Roblox is an under-moderated platform that exposes minors to content that should be restricted for their protection. The Roblox Corporation prioritizes the return on investment for their shareholders over the safety of their minor users, and their Terms of Use attempt to place the entirety of responsibility on the parents and children themselves, instead of taking on any accountability for designing their products safely. 

Child Sexual Exploitation and Grooming on Roblox 

Our efforts to end sexual abuse and exploitation at NCOSE are rooted in prevention efforts. Roblox’s experiences are being used as predators’ playgrounds—the platform often acts as a first point of contact between abusers and their victims. Improving safety standards on platforms like Roblox is key in preventing sexual exploitation for the millions of young children who use them. 

A simple Google search yields numerous results about grooming and exploitation occurring on Roblox. To list just a few recent examples:

  • In October 2023, an 11-year-old girl from New Jersey was kidnapped by a man she played video games with on Roblox.  
  • In June 2023, a minor “under 16” was sexually abused in California by a 21-year-old man she met on Roblox.  
  • In April 2023, a 13-year-old boy from Utah met a predator on Roblox, was groomed publicly on Twitter, then kidnapped and sexually assaulted before being rescued.  
  • In April 2023, a 14-year-old girl from Ohio was sexually assaulted by a man she met on Roblox. The man posed as a 17-year-old on the platform and convinced the girl to send him nude images before picking her up from school and sexually assaulting her.  

Sexualized Themes and Activities 

Sexualized themes and activities are common on Roblox, and children can easily encounter them. NCOSE researchers experienced this firsthand when they created a fake account for a 10-year-old boy and were exposed to virtual sex acts and strip clubs.

Upon creating a fake account for an 11-year-old girl, NCOSE researchers were humped by a character in a bunny costume, solicited for group sex acts, witnessed what looked like prostitution, and more.  

It is unacceptable that, even with all the documented harms, Roblox still does not automatically default children’s accounts to the safest settings. The “parental controls” Roblox offers are grossly inadequate, and oftentimes a parent is unaware their child even has an account.

Flawed Safety Measures 

NCOSE has identified seven major flaws in Roblox’s child safety measures: 

  1. Lack of Age Verification: Roblox does not verify the age of its users. This allows children to be exposed to harmful content and adult users to lie about their ages to gain easier access to minors. 
  2. Inadequate Filtering: Sexual content and extreme violence are not being filtered effectively, even with parental controls in place. 
  3. Limited Visibility: Roblox’s Parental Controls do not give parents full visibility into their child’s activity on the platform. 
  4. Limited Customization: The filtering system, blocked words, and experiences lack customization options. Parents also cannot stop strangers from friending their children. 
  5. Insufficient Guidance: Roblox’s parental controls do not give parents enough information about how to effectively monitor their child’s activity, yet they place the entirety of the responsibility on the parents. 
  6. Lack of Transparency: Many parents trust Roblox’s parental controls. Roblox owes it to parents to explain the shortcomings of their safety features and clarify how their systems are commonly worked around (using signs, for example). 
  7. Complicated Setup: Parents must go to multiple places within the Roblox app to restrict what content their children can access and who can interact with them. Instructions for setting up parental controls are not easy to follow, and again, the responsibility of keeping children safe is forced onto parents. 

As of May 10, 2024, Roblox Corporation’s market cap is over $20B; the company certainly has the financial capacity to implement more comprehensive child safety measures. Instead, the corporation’s moral compass is directed toward profitability, sacrificing child safety across the globe. 

NCOSE’s requests for improvement: 

  1. Default minors’ accounts to the highest safety settings. 
  2. Disable direct messaging between children and adult strangers; consider blocking access to “Roblox Connect” and other means of direct messaging for children 15 years old and under. 
  3. Expand caregiver tools to help parents and guardians protect their children; require more extensive permissions to be granted by parents and guardians for children 12 years old and under to access the platform. 
  4. Implement age verification measures to restrict adult users from lying about their age to gain access to children. 
  5. Improve prevention and moderation measures to block/remove all inappropriate content that contains sexual themes. At minimum, do not allow minors to access such games. If sexual and violent themes are to remain on the platform at all, an additional age rating of 17+ must be added. 
  6. Monitor for and enforce against predatory behavior. Regularly review reports, take action against violators, and escalate monitoring reports to the board of directors. 

S.U. was virtually approached and coerced by adults on Roblox who posed as other children or as moderators of Roblox experiences. S.U.’s first point of contact with her abusers was on Roblox. How many other children are victims? 

We cannot allow corporations like the Roblox Corporation to continue to prioritize their profit margins over the safety of children. 

ACTION: Take 30 SECONDS to email Roblox executives, using the quick action button below.  
