
Deceptive to the Core: How Apple App Store Age Ratings Mislead Parents


*Parvati’s story is a composite, based on the common experiences of parents and caregivers.


Parvati* thought she was being a diligent parent. Before letting her 12-year-old daughter download any app, she always checked the age ratings on the Apple App Store. She only ever allowed apps rated 12+.

But she was being misled.

Imagine Parvati’s horror when a supposedly 12+ app exposed her daughter to hardcore pornography. When it showed her advertisements for drugs, gambling, and sexual roleplay. And, worst of all, when an adult predator groomed and sexually exploited Parvati’s daughter, all through an app that Apple said was safe for kids.


A majority of parents and caregivers rely on Apple’s App Store age ratings and descriptions to determine which apps are safe and appropriate for their children. But the App Store’s age ratings are misleading, and its descriptions are grossly inaccurate.

When caregivers aren’t informed, kids pay the price.

Given that roughly 90% of US teens own an iPhone, Apple has a major responsibility as the primary gatekeeper between children and the vast, often dangerous online world.

NCOSE and ally Protect Young Eyes have been calling on Apple to fix its app ratings since 2019. Since then, other leading child safety organizations, such as 5Rights and the Canadian Centre for Child Protection, and several US state attorneys general have also called out the App Store for deceptive age ratings and app descriptions.

The risks to children on many apps, including the most popular ones, are at this point extensively documented by law enforcement, policymakers, investigative reports, and watchdog data. Yet Apple has chosen to keep parents in the dark.

This is why NCOSE named the Apple App Store to our 2023 Dirty Dozen List—an annual campaign that calls out 12 mainstream corporations whose problematic policies and practices are facilitating sexual exploitation. Many of you joined us in taking action, asking Apple to fix their deceptive app ratings.

Yet more than seven months after the 2023 Dirty Dozen List reveal, Apple still refuses to correct the blatantly false app information in its App Store.

Further, Apple still neglects to make much-needed, common-sense safety changes to its other products. It persistently refuses to scan for child sexual abuse material (CSAM) or to automatically blur sexually explicit images and videos sent to teens.

The Problem with the Apple App Store

The App Store’s age ratings and descriptions are deceptive and fail to reflect the harmful experiences and content minors may encounter in apps.

Current app descriptions omit documented dangers and risky features, such as easy exposure and connections to adults (and therefore predators), pornography and other sexually explicit content, illegal drug activity, and potential financial sextortion (a rapidly growing crime). Even apps that have been exposed as particularly rife with these harms are sometimes labeled 12+ for “infrequent or mild mature or suggestive themes” and “infrequent mild language.”

Furthermore, despite knowing the age of the account holder (based on Apple ID), Apple promotes inappropriate or dangerous 17+ apps in advertisements within children’s apps. These include kink and hookup apps, “chatroulette” apps that pair random strangers together to chat over webcam, advertisements that reference drugs and gambling, and more. Clearly, Apple does not enforce its own Developer Guidelines, which state that ads must be appropriate for the app’s age rating. And a user whom Apple knows to be a child can download the mature apps being advertised by simply clicking a box claiming they are 17+. 

The App Store’s inaccurate age ratings are problematic not only because parents may make the wrong choice about which apps to permit for their child, but also because these ratings “trigger” several aspects of Apple’s parental controls (called Screen Time), which block apps based on their designated age rating.

Apple has refused to implement safety measures that other industry leaders are using to protect kids (keep reading), putting the onus almost entirely on parents and even children to protect themselves on Apple devices. Therefore, the very least Apple can do is give families accurate information in the App Store so caregivers and kids can make better-informed decisions.

ACTION: Call on Apple to Fix App Ratings!

Keep reading below the action form for tips for parents, updates on other Apple safety issues, and more ways you can help.

An Interim Solution for Parents: The App Danger Project

Parents may be wondering: So what do I do now? If I can’t trust the App Store’s age ratings, how do I know which apps are appropriate for my kids?

As a starting point, we suggest consulting the App Danger Project, a new initiative that collects and elevates user reviews reporting instances and risks of child sexual exploitation on apps.

Updates Since the Dirty Dozen List Reveal

While Apple has made no changes to the App Store since being placed on the 2023 Dirty Dozen List, it did partially adopt improvements NCOSE has been requesting to its nudity-blurring feature (called Communication Safety for users 12 and under).

As of September 18, 2023, images and videos containing nudity are automatically blurred for kids 12 and under in iMessage, FaceTime, AirDrop, and the Photos picker. The feature is also now available to teens and adults as an opt-in Sensitive Content Warning. Further, if a child 12 or under is about to send a nude image, a warning message and resources will pop up.


These are significant improvements for which we have publicly applauded Apple. Previously, the nudity-blurring feature had to be turned on by parents, was unavailable to anyone 13 or older, only detected still images, and only worked in iMessage.

However, while we are grateful for these changes, we remain baffled and disappointed that Apple does not automatically block nudity for all minors, rather than simply blurring it for users 12 and under, who can still click and open the image or video. Further, Apple initially planned to send a notification to parents of children 12 and under who chose to view or send a nude image, but backtracked on that common-sense safety measure.

We would also like to know why Apple chose NOT to turn the nudity-blurring feature on by default for teens. Nor do teens receive a warning if they are about to send a nude image: not only a potentially criminal act, but one that leaves them open to all kinds of abuse. Multiple studies document the risks and harms young people face from sending and receiving nude images and videos (sometimes called “sexting”), including sextortion, diagnosable symptoms of PTSD, and criminal consequences. Receiving unsolicited nude images or videos is also a form of sexual harassment, even among adults. So really, Apple should be automatically blurring nudity for everyone (allowing adults the option to open it if they choose) and blocking nude images for minors (not allowing them to open it at all).

Another matter of grave concern is the fact that Apple persists in its refusal to scan for child sexual abuse material (CSAM), despite 90% of Americans agreeing that Apple has a “responsibility to identify, remove, and report child sexual abuse images and videos on all of their platforms.”

We’ve been pressing Apple for years to do this, most recently together with a coalition of child safety experts called the Heat Initiative.

ACTION: Tell Apple to Get Rid of Child Sexual Abuse Material!

Join NCOSE and the Heat Initiative in demanding Apple detect, report, and remove child sexual abuse material in iCloud. Take action here. 

