Meta’s Broken Promises: Children Pay the Price

Maya was only 12 years old when she first started receiving direct messages on Instagram from a “nice” man she didn’t know. He was 28 – more than twice her age. What seemed like a harmless virtual connection quickly led to the darkest moments of Maya’s life. For months, the 12-year-old girl was groomed, sex trafficked, and repeatedly raped, all under this man’s control.

Even worse? Maya is only one of countless children experiencing this nightmare on Instagram and other Meta platforms.

And the nightmare continues. Meta has been repeatedly informed of the rampant child sex trafficking taking place on its platforms, including cases like Maya’s. Yet the company stands idly by, holding the door open for child predators.

Meta Is Not True to Its Word

Meta owns Facebook, Instagram, Threads, Messenger, and WhatsApp, making it arguably the most influential Big Tech company in the world. With this massive power comes great responsibility. Yet Meta is not living up to the responsibilities it claims to uphold, and its actions simply do not line up with its promises when it comes to protecting against the harms of sexual exploitation.

So what does Meta promise to do to protect children from online predators? How is it breaking those promises, and what is actually happening on our children’s phones?

The following outlines five of the many promises Meta has made and how its actions, or lack thereof, have led to countless stories like Maya’s.  

5 of Meta’s Broken Promises 

Promise #1 – “Meta will work to ensure that no more parents would have to endure the kind of loss that [you] and others live with.”  

Mark Zuckerberg made this statement before an audience of parents whose children had lost their lives to harms they encountered on social media. But since making this promise, Meta has not only failed to prevent further harm; it has actively made the harms worse.
 
In a recent Senate Judiciary Committee hearing, Meta whistleblower Cayce Savage said:

“Meta has promised it would change. I’m here to tell you today that Meta has changed, but for the worse. Meta has spent the time and money it could have spent making its products safer [on] shielding itself instead. All the while developing emerging technologies which pose even greater risk to children than Instagram.”  

Savage and another Meta whistleblower, Jason Sattizahn, went on to explain how Meta suppressed, altered, and even destroyed research showing just how harmful its products were to children. They also explained how Meta’s new VR technologies have multiplied the harms to children.

Savage’s claim that Meta invests in shielding itself rather than making its products safer is further substantiated by the company’s lobbying expenditures. Over the last three years, Meta and Google spent nearly $90 million on lobbying, much of it aimed at stopping the Kids Online Safety Act (KOSA), a bill that would hold tech platforms accountable for keeping children safe online.

Mary Rodee, a mother whose son died by suicide after being sexually extorted on Facebook, said, “If KOSA were law, it would have saved my son’s life.” Meta falsely promises parents that it cares about their children, all while investing millions in stalling child safety bills.

Promise #2 – “We work to find, remove, and report child sexual abuse material and disrupt the networks of criminals behind it.” 

A content moderator at Meta reported that “on one post I reviewed, there was a picture of this girl that looked about 12, wearing the smallest lingerie you could imagine. It listed prices for different [sexual acts]. It was obvious it was trafficking.” The moderator’s supervisor later told her that no further action had been taken in the case.

In another case, New Mexico investigators created a fake Facebook account for a 13-year-old girl named Issa Bee. One of Issa’s Facebook friends added her to a group chat “providing pornographic videos and naked photos of underage girls.” Issa reported the group to Facebook multiple times, yet it remained active. After her last report, Facebook simply instructed her to leave the group.

Promise #3 – “At Meta, child protection is always a top priority.” 

In 2020, a Meta employee asked colleagues in an internal chat what the company was doing to address child grooming. Another employee responded, “Somewhere between zero and negligible. Child safety is an explicit non-goal this half.”

The examples above already demonstrate Meta’s blatant failure to prioritize child protection. Another chilling example came when it was revealed that Meta was internally aware its People You May Know (PYMK) algorithm “contributed up to 75% of all inappropriate adult-minor contact.” Meta claims child protection is its top priority, yet it won’t implement a safety measure as simple as turning off PYMK recommendations between adults and children.

Promise #4 – “We’re committed to making Facebook, Instagram, Messenger and Threads safe places.”  

In 2023, Meta began rolling out end-to-end encryption on its platforms. It did so despite being warned that encryption would virtually nullify its ability to detect child sexual abuse material (CSAM), causing an estimated 92% of CSAM reports from Facebook and 85% from Instagram to be lost.

That same year, a Meta child safety expert strongly advised against moving forward with end-to-end encryption on youth accounts because of the increased protection it would give child predators. He was quickly removed from his role, and Meta proceeded with implementation.

With end-to-end encryption in place on youth accounts, evidence of child exploitation has become inaccessible to law enforcement. Does this sound like Meta is working to make its platforms safe places?  

Promise #5 – “We invest in the best tools and expert teams to detect and respond to suspicious activity.”  

UC Berkeley professor Hany Farid partnered with Microsoft to invent PhotoDNA, the technology Meta still uses today to identify harmful content such as child sexual abuse material. But in 2023, Farid said that Meta could be doing much more to invest in better safety tools, such as technology that would “flag suspicious words and phrases on unencrypted parts of the platform – including coded language around grooming.” Meta’s failure to combat sexual exploitation on its platform, Farid said, “is, fundamentally, not a technological problem, but one of corporate priorities.”

Recent research from Fairplay, Meta whistleblower Arturo Béjar, and other child safety organizations found that nearly two-thirds (64%) of the safety tools on Instagram’s teen accounts were ineffective, with just 17% working as described by Meta.

Additionally, in interviews, former Meta moderators revealed how negligent their protocols were for identifying child sexual abuse material:

“If a human trafficker is using a codeword for selling girls, we didn’t get into that. We didn’t really get trained on those. You don’t even give it a second thought or even dig into that kind of stuff at all.” 

In 2024, a serious case of child sexual abuse on Facebook was discovered not by Meta moderators, but by Kik Messenger, another social media platform. If other social media companies are finding child abuse content on Meta’s platforms, then Meta’s tools and protocols for detecting child sexual exploitation are clearly not even up to industry standard, let alone the best in the industry as claimed.

What’s the Takeaway? 

Meta’s promises don’t line up with its actions. And who’s paying the price for this irresponsibility? Maya. Your friends. Your neighbors. Your children.

Meta’s business model of profit over child safety is an affront to human dignity. As former Facebook employee turned whistleblower Frances Haugen said, “If the platforms actually wanted to keep these kids safe, they could.”

Meta must be held accountable for the harms it inflicts on children and be required to make its platforms safe for them. Yet because of Section 230 of the Communications Decency Act, Meta has been allowed to ignore blatant sex trafficking.

The courts have interpreted Section 230 as providing blanket immunity for Big Tech, shielding these companies from lawsuits even when they facilitate rampant sexual abuse and exploitation. Until Section 230 is repealed, tech corporations like Meta have no incentive to make their products safer.

ACTION: Call on Congress to Repeal Section 230! 
