Sarah (pseudonym) allowed her 9-year-old daughter to play a game on Roblox, thinking it was just an innocent children’s gaming platform. After all, 42% of its users are under the age of 13. Tragically, she later discovered that the supposed “children’s” gaming network is littered with sexual exploitation.
Sarah’s daughter was approached by another Roblox user who she thought was “really trendy and cool.” Not long after, this user lured the 9-year-old off Roblox and onto a messaging app, where he bombarded her with sexual images, videos, and text messages describing “really disgusting things,” Sarah recalled.
Tragically, this experience is far too common. Pedophiles know that millions of children play Roblox for fun and creative online games. These child predators flock to the gaming platform as a means to manipulate, groom, and extort vulnerable children.
On this week’s episode of “The Movement,” we’ll be discussing some of the recent improvements Roblox made to combat the pervasive sexual abuse that occurs on its platform—and many more thrilling victories!
Roblox Makes Improvements to Parental Controls
Roblox, the popular children’s gaming platform, is a breeding ground for exploitation and was recently described as a “pedophile hellscape for kids.” It is rife with racist and misogynistic games; groups in which pedophiles distribute child sexual abuse material (CSAM); and sex-themed games, some even featuring “red-light districts” (as a reminder, the vast majority of Roblox users are under the age of 18). It is also a hotbed for the grooming of child users.
Recently, after two years on the Dirty Dozen List, the gaming platform made some significant changes you have joined us in calling for since 2023, including defaulting minor accounts to stricter safety settings and limiting access to certain chat features and content. NCOSE is hopeful these changes will curb the harmful effects that Roblox has long been inflicting on its child users. However, more must be done to protect ALL minors from the colossal amount of sexual exploitation that occurs on Roblox. Here is a brief summary of the updates:
Restrictions for Users Under Age 13
- Accounts for users under 13 now default to stricter safety settings (e.g., limited access to certain chat features). However, there is no age verification at account sign-up, so kids could easily lie about their birthdate.
- Users under the age of 13 will no longer be able to directly message others on Roblox outside of games or experiences (also known as platform chat).
- Roblox introduced a built-in setting that will limit users under age 13 to public broadcast messages only within a game or experience.
Parent Privileges
- For users under 13, parents can create their own “parent account” and link it to their child’s account, giving them control over many of the safety settings on their kid’s account, such as monitoring screen time, friends, etc.
- The caveat is that previously configured parental controls no longer work – parents can no longer set a parent PIN, use Account Restrictions, or receive account-related notifications at their parental email unless they set up a parent account. Furthermore, for users aged 13 and up, parents are not permitted to set up parental controls on their child’s Roblox account at all, leaving those child users vulnerable to sexual exploitation by predators on Roblox. NCOSE believes this is a serious mistake: teens need to be protected by Roblox too.
- Forcing parents to create accounts further inflates Roblox’s user numbers and leaves children who lack informed, involved parents even more vulnerable.
Updated Content Maturity Settings
- Roblox updated its content maturity ratings to give parents more clarity on the types of content available on its platform. Users under the age of 9 will now need a parent’s permission to access experiences rated at the “Moderate” content maturity level, which may contain images and themes such as moderate violence or moderate crude humor.
- There are new age restrictions on certain experiences for users under age 13 based on the type of user behaviors sometimes found in those experiences. These new restrictions apply to experiences primarily designed for socializing with users outside of their friends list and experiences that allow free-form writing and drawing, such as on a chalkboard or a whiteboard or with spray paint.
While we are happy to see Roblox taking some long-overdue steps to protect its youngest users, it must extend those protections to all minors. Parents: you should still remain vigilant if your children and teens play on this dangerous platform.
TAKE IT DOWN Act UNANIMOUSLY Passes the Senate!
In a massive victory, the TAKE IT DOWN Act passed the Senate unanimously this week! It now moves to the House for consideration.
Imagine waking up to find sexually explicit images of yourself, shared without consent, for the world to see. This is the devastating reality for countless people who have been victimized through image-based sexual abuse (IBSA).
It is an uphill battle for survivors to have these images removed from websites. The TAKE IT DOWN Act changes this entirely, ensuring that tech platforms remove IBSA within 48 hours of receiving a victim’s removal request – this is a life-saving solution for survivors.
The fight against IBSA is a fight for the dignity, privacy, and safety of all individuals. The TAKE IT DOWN Act ensures that tech platforms act swiftly and decisively to combat this insidious form of abuse. And the TAKE IT DOWN Act rightly criminalizes the act of uploading IBSA.
We urge the U.S. House to quickly consider three bills that combat image-based sexual abuse – the TAKE IT DOWN Act, the DEFIANCE Act, and the SHIELD Act – and to ensure they become law by the end of the year.
Pam Bondi, Veteran Anti-trafficking Advocate, Nominated for Attorney General
We at NCOSE are thrilled by the nomination of Pam Bondi for Attorney General of the United States! Bondi’s career as an attorney shows a significant focus on combatting human trafficking. As Florida’s first female Attorney General, she championed many anti-trafficking initiatives, hosted groundbreaking conferences to combat sexual exploitation, and pioneered innovative solutions that other states have since followed.
Our hope is that with a fierce anti-exploitation advocate in the nation’s highest law enforcement position, we can make substantial, viable change to propel this movement forward.
NCOSE Files Amicus Brief in Support of Texas Age Verification Law
Free Speech Coalition v. Paxton is a case up for review by the Supreme Court. The Free Speech Coalition (FSC), which lobbies on behalf of the pornography industry, challenged a Texas law that protects children from accessing harmful online pornography. This law would require pornography websites to verify that users are 18 years or older before allowing them access to the site.
The Court’s decision on this case will impact the ability of elected officials and communities to protect children from technology-distributed pornography. It will set the tone for how laws like this are interpreted in states across the country.
We commend the members of the Texas State Senate for enacting and defending this law. The NCOSE Law Center filed an Amicus Brief on behalf of 15 members of the Texas State Senate in a legal challenge that will be heard by the U.S. Supreme Court on January 15, 2025.
Combatting Sextortion on Social Media
Thousands of youth – especially teen boys – have fallen victim to sextortion on Instagram and Snapchat. Tragically, many have even died by suicide after manipulative criminals threatened them and their families if they didn’t do what was demanded.
Finally, after years of pleading for change, these two platforms have taken steps to reduce the risk of minors being victimized through sextortion. Instagram made teen accounts private by default – and kids 15 and under will need a parent’s permission to make their accounts public.
Snapchat added in-app warnings alerting teens to potentially suspicious friend requests and invested resources in identifying and removing accounts showing signs of sextortion behavior. While these platforms remain extremely risky, we hope these improvements will save lives.
Now, we need to call for legislation that will require ALL social media companies to prioritize child safety! Urge your representatives to pass the Kids Online Safety Act (KOSA)!
The Dire Need for the Kids Online Safety Act: Watch the Issue Briefing
The Kids Online Safety Act is vital to protecting children online. With the legislative session drawing to a close, the time is NOW to get this bill passed. We cannot wait any longer.
In the time it takes for this bill to be marked up and voted on AGAIN in the next Congress, how many more children will die because of child predators on the Internet?
But don’t just take it from us. Take it from survivor parents who have lost their children at the hands of social media. Watch our issue briefing to hear their stories and why, to them, KOSA is personal.
Additionally, if you are a person of prayer, consider joining in a Day of Prayer for KOSA’s passage, on Dec. 10 at 11 AM ET via Zoom, or pray individually if you cannot attend.