July 8, 2019

Senate Hearing Will Highlight How to Fix App Ratings for Kids

On July 9, the Senate Judiciary Committee is holding a hearing on “Protecting Innocence in a Digital World” to examine the predatory targeting, grooming, sex trafficking, hardcore pornography, and more that make up the experience of so many young users of platforms like Snapchat, TikTok, and Instagram.

Chairman Lindsey Graham, Senator Dianne Feinstein, and Senator Mike Lee are to be commended for their leadership and action on these critical issues. If you’re interested in learning more or watching the hearing live, you can do that here.

This hearing is something we at the National Center on Sexual Exploitation (NCOSE) have been asking for and working toward for some time, because we believe that children are not well accounted for or protected from these practices by the companies behind these platforms. The hearing will address the role of apps and social media in the grooming of children for sexual abuse and sex trafficking, and in exposing children to child pornography and pornography generally, as well as the need for further study of these harms and for technological solutions. It will also address a solution that we and our allies have put together: the creation of an independent app ratings board and system that includes default settings for users based on their age.

An independent review process for apps is an important step toward protecting our children, because companies and corporations need to provide better transparency into their products, especially those heavily marketed to or used by children. It is difficult for parents to protect children in the digital space and help them steward their digital footprints well. The problem is exacerbated when the companies behind the technology our children are using have little to no incentive or accountability when it comes to providing reliable, transparent ratings for their products.


Far from being a minor inconvenience, this lack of transparency and accountability has a disproportionate impact on children in real and disquieting ways. Not only do these platforms sometimes serve as prime hunting grounds for sexual predators, even registered sex offenders, to stalk and groom their prey, but they also provide cheap, alternate venues for human traffickers to find new customers while exploiting trafficking victims of all ages, or to find new victims to traffic.

One such example comes from a group of 15-year-old sex trafficking survivors who told NCOSE that girls were often forced to perform sex acts, or nude dances, on Instagram Live in a manner that made them appear to be consenting, even though their trafficker stood in the corner the whole time. This means that, while Instagram’s own app description quietly notes that users may encounter “mild sexual content,” children as young as 12 (and, let’s be real, not infrequently even younger) can access child pornography produced on Instagram using minors who are being trafficked.

An independent app ratings system may not stop every instance of sexual exploitation that can occur via smartphones and the apps created for them, but it can help reduce how often such exploitation happens. When parents and guardians are equipped with information about the different possibilities for abuse that exist within an app, they can make better decisions about when and where their children have access to that app, if at all, and can have better conversations with their children about the safe use of the apps they choose to allow. When that happens, fewer children will be vulnerable to the predatory advances of bad actors who know very well how to exploit the many weaknesses and dark corners of the underregulated tech space.

Improved ratings can also strengthen parental control settings on devices, so that when safety settings are turned on, they block children from accessing apps that chronically expose them to serious harm.

That’s why it’s important that the Senate is taking an interest in bringing better transparency and accountability into the tech space. That’s why it’s important that we continue advocating for real, practical solutions. That’s why it’s important for you to be a part of the solution. 

Let’s fix app ratings.

National Center on Sexual Exploitation

Founded in 1962, the National Center on Sexual Exploitation (NCOSE) is the leading national organization exposing the links between all forms of sexual exploitation, such as child sexual abuse, prostitution, sex trafficking, and the public health crisis of pornography.
