Mainstream Contributors To Sexual Exploitation
Apple App Store age ratings and descriptions mislead parents about the content, risks, and dangers children face on available apps. When caregivers aren’t informed, kids pay the price.
Updated on 9/19/2023: Our contacts at Apple reached out to let us know about major improvements to their nudity blurring feature, which NCOSE has been asking for and advising Apple about since the feature was originally announced nearly two years ago. Beginning September 18, 2023, with the iOS 17 update, images and videos containing nudity are automatically blurred for kids 12 and under in iMessage, FaceTime, AirDrop, and the Photos picker. Previously, this feature had to be turned on by parents, was unavailable to anyone over 13, only detected still images, and only worked in iMessage. The tool is also available to teens and adults as an opt-in feature. Additionally, Apple has already made its blurring technology available to other apps for free through an API (application programming interface), meaning that apps accessed through iOS can apply the nudity blurring feature to their own platforms. You can read our press release on Apple’s changes for more details!
*While not a focus of the DDL, NCOSE is deeply concerned and disappointed that Apple does not scan for child sex abuse material (CSAM). We have been pressing Apple to do so for years, most recently together with a coalition of child safety experts.
With almost 90% of US teens owning an iPhone, Apple can rightly be called a primary “gatekeeper” to what America’s children are accessing online…as well as who is accessing them. Caregivers trust and rely on Apple’s App Store age ratings and descriptions to determine what apps are safe and appropriate for their children to use. App age ratings also “trigger” several aspects of Apple’s parental controls (called Screen Time), blocking entire apps or content based on the designated age.
Yet at a time when child safety experts and mental health professionals – including the United States Surgeon General – are sounding the alarm that our kids are in crisis due in large part to social media, Apple’s app descriptions remain vague, hidden, and inconsistent, further jeopardizing our already-at-risk children.
Current app descriptions omit documented dangers such as risky features, exposure to adult strangers (including predators), harmful content, illegal drug activity, threats to healthy child development, easy access to explicit content, and, most recently, an explosion of financial sextortion. Even social media and gaming apps that have been exposed as particularly rife with predatory activity, sexual interactions, and pornographic content (videos, images, language) are labeled 12+ for “infrequent or mild mature or suggestive themes” and “infrequent mild language.” Under US federal law, children must be at least 13 years old to use social media and games not specifically designed for youth.
Furthermore, despite knowing the age of the account holder (based on Apple ID), the Apple App Store suggests and promotes sex-themed 17+ apps to children – including kink, hookup, adult dating, and “chatroulette” apps that pair random users – which children can download simply by clicking a box claiming they are 17+. Apple also fails to enforce its own Developer Guidelines, which state that ads must be appropriate for the app’s age rating: children using apps rated 9+ or 12+ are exposed to mature in-app advertisements referencing gambling, drugs, and sexual role-play. And there is no system in place to report apps that fail to adequately explain the types of content a user might experience.
With Apple’s near-limitless resources, there can be no excuse for deceiving consumers and caregivers on such a massive scale. It’s time Apple finally fixes its app rating system.
Review the proof we’ve collected, read our recommendations for improvement, and see our notification letter to Apple for more details.