Senate Hearing Will Highlight How to Fix App Ratings for Kids

On July 9, the Senate Judiciary Committee is holding a hearing on “Protecting Innocence in a Digital World” to address the predatory targeting, grooming, sex trafficking, hardcore pornography, and more that make up the experience of so many young users of platforms like Snapchat, TikTok, and Instagram.

Chairman Lindsey Graham, Senator Dianne Feinstein, and Senator Mike Lee are to be commended for their leadership and action on these critical issues. If you’re interested in learning more or watching the hearing live, you can do so on the Senate Judiciary Committee’s website.

This hearing is something we at the National Center on Sexual Exploitation (NCOSE) have been asking for and working toward for some time, because we believe the companies behind these platforms do not adequately account for or protect children from these practices. The hearing will address the role of apps and social media in the grooming of children for sexual abuse, sex trafficking, the production of child pornography, and exposure to pornography, as well as the need for further study of these harms and for technological solutions. The hearing will also address a solution that we and our allies have put together: the creation of an independent app ratings board and system that includes default settings for users based on their age.

An independent review process for apps is an important step forward in helping protect our children, because we believe companies and corporations need to provide better transparency into their products, especially those heavily marketed to and used by children. For parents, it’s difficult to protect children in the digital space and help them steward their digital footprints well. The problem is exacerbated when the companies behind the technology our children use have little to no incentive or accountability when it comes to providing reliable, transparent ratings for their products.

Far from being a minor inconvenience, this lack of transparency and accountability has a disproportionate impact on children in real and disquieting ways. Not only do these platforms sometimes serve as prime hunting grounds for sexual predators, and even registered sex offenders, to stalk and groom their prey, but they also provide cheap, alternate venues for human traffickers to find new customers while exploiting trafficking victims of all ages, or to find new victims to traffic.

One such example comes from a group of 15-year-old sex trafficking survivors who told NCOSE that girls were often forced to perform sex acts, or nude dances, on Instagram Live in a manner that made it look as if they were consenting, even though their trafficker stood in the corner the whole time. This means that, while Instagram’s app description quietly notes that users may encounter “mild sexual content,” children as young as 12 (and, let’s be real, not infrequently even younger) can access child pornography produced through the trafficking of minors on Instagram.

An independent app ratings system may not stop every instance of sexual exploitation that can occur via smartphones and the apps created for them, but it can reduce how often such exploitation happens. When parents and guardians are equipped with information about the possibilities for abuse that exist within an app, they can make better decisions about when and where their children have access to those apps, if at all, and can have better conversations with their children about the safe use of the apps they choose to allow. When that happens, fewer children will be vulnerable to the predatory advances of bad actors who know very well how to exploit the many weaknesses and dark corners of the underregulated tech space.

Improved ratings can also shore up parental control settings on devices, so that when safety settings are turned on, they can block children from accessing apps that chronically expose them to serious harm.

That’s why it’s important that the Senate is taking an interest in bringing better transparency and accountability into the tech space. That’s why it’s important that we continue advocating for real, practical solutions. That’s why it’s important for you to be a part of the solution. 

Let’s fix app ratings.
