Campaign to #FixAppRatings Highlights Danger of Grooming and Sexual Abuse on Apps

Last December, we sat down with survivors of domestic minor sex trafficking. They knew about our work at NCOSE trying to change policies that facilitate sexual exploitation and said they had something they wanted us to understand. The oldest was 15.

These young girls shared their heartbreaking stories with us, and all of them included a common thread: Snapchat and Instagram, used by their sex traffickers to groom them and then to sell them to others for sex over and over again.

These girls also showed us message after message they had received from random men (strangers) containing flirtatious come-ons, “dick pics,” and even blatant requests for sex acts in exchange for money. The girls were getting these direct messages even though some of their profiles were set to private.

Upon hearing their accounts, we reached out to other survivors and to survivor-serving organizations to see if this was a trend. It seems to be: many reported similar experiences of being groomed and then sold over these popular platforms.

I then started reaching out to youth-serving groups to see if other young people were getting this kind of attention from strangers. Many shared that they were: random strangers, mostly adult men, sending unsolicited private direct messages to teenage girls.

We have long advocated for solutions to help protect young people from sexually explicit media and hardcore pornography on the Internet. But when we spoke with young people about the grooming and sexual harassment directed at them on these platforms, they made it very clear that other dangers trouble them as well. Many popular apps, including Instagram and Snapchat, are riddled with pornographic videos and images, as well as other content glamorizing and pushing risky sexual behaviors, like group sex or sex with strangers. Some of these teens told me, “There is just nowhere we can go without being bombarded by this kind of stuff.”

Many kids might be able to simply ignore this kind of attention or scroll past the porn. But why should we put the responsibility on them, with their not-yet-mature brains, to withstand the barrage of dangers coming at them online?

Part of the solution:

Right now, app developers self-rate their own apps based on their own whims. There’s no governing body that oversees the ratings process. As a result, many apps popularly used by children are given misleading, I would even say dishonest, ratings with generic explanations. This leaves parents with a false sense of security and minimal warnings when trying to decide what apps are appropriate for their family. How are parents supposed to help protect their kids when big tech is hiding the dangers rampant on their platforms?

Both of the apps discussed above, Snapchat and Instagram, are rated as appropriate for users ages 12 and up.

Steam, another popular app with more than 35 million monthly users under age 18, rates itself as appropriate for ages 12+. Yet the platform hosts thousands of games with nudity, including games with graphic depictions of extreme sexual violence that encourage players to rape, sex traffic, and make pornography of others. The site’s parental controls are abysmal, allowing kids to bypass them in just two clicks.

Netflix, for example, claims on the App Store that it is appropriate for anyone over age 4. Yet, as we have pointed out before, Netflix is concerning for a number of reasons, too. Its poor parental controls enable kids to get into any other profile with a single click. Netflix’s algorithms also regularly recommend TV-MA, R, and NC-17 content right next to kids’ cartoons and other child-centered content. And a number of its original productions depict graphic sexual violence, normalize sex trafficking of minors, and show gratuitous nudity.

It’s time to #FixAppRatings to make apps more transparent and accountable for the risks they pose! We need an independent app rating review board to hold ratings accountable, the way the movie and video game industries have.

The National Center on Sexual Exploitation, together with Protect Young Eyes, Utah State Senator Todd Weiler, child safety activist Melissa McKay, and dozens of other organizations, is calling for a #FixAppRatings solution!

We have a ratings system for film and television, and there’s one for video games. Why not for apps? We are starting a national movement calling on big tech to be part of the solution in the following ways:

  1. The creation of an independent app ratings board, with the power to impose sanctions for non-compliance, to review the 500 most popular apps.
  2. The release of intuitive parental controls for iOS, Android, and Chromebooks: eliminate the loopholes and complexities, provide selective app shutoff for school and bedtime hours, and give parents default settings based on age.

Learn more about this campaign and JOIN US at FixAppRatings.com

The Numbers

300+

NCOSE leads the Coalition to End Sexual Exploitation with over 300 member organizations.

100+

The National Center on Sexual Exploitation has had over 100 policy victories since 2010. Each victory promotes human dignity above exploitation.

93

NCOSE’s activism campaigns and victories have made headlines around the globe, averaging 93 mentions per week by media outlets and shows such as Today, CNN, The New York Times, BBC News, USA Today, Fox News, and more.


