Meta Blinds Itself to 20+ Million Cases of Child Sexual Abuse per Year

Catalina* was one of the fortunate ones.

When her stepdad used Messenger to share video footage of himself sexually abusing her, Meta detected it. The video was reported to the authorities and Catalina was rescued.

But from now on, there won’t be any more fortunate ones. 


Last week, Meta (the parent company of Facebook, Instagram, and WhatsApp) made the truly devastating decision to nullify its ability to detect and report child sexual exploitation on its platforms. This means the tens of millions of reports Meta makes each year to the National Center for Missing & Exploited Children will be almost entirely lost.

Children will no longer be identified and saved from their abusers.

Predators will trade child sexual abuse material (CSAM, the more apt term for “child pornography”) with impunity.

Kids will be groomed in secret, without any chance of intervention.

This is the reality that Meta has chosen. It has chosen to condemn millions of children to sexual abuse, without hope of prevention or relief.

Pedophiles and other criminals are rejoicing.

Meta Implements End-to-End Encryption (E2EE) with No Ability to Detect Child Sexual Exploitation

On Wednesday of last week, Meta rolled out end-to-end encryption (E2EE) by default on Messenger and Facebook chats and videos, without providing any means of detecting child sexual exploitation. It has also announced plans to do the same for Instagram Direct.

E2EE is “a method of secure communication that prevents third parties from accessing data while it’s transferred from one end system or device to another.” In the age of hacking and concerns about online privacy, the appeal of this technological tool is clear.
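To make the concept concrete, here is a minimal sketch of E2EE using the PyNaCl library. The key names and message are illustrative only; this shows the general shape of the technique, not Meta’s actual protocol:

```python
# Minimal E2EE sketch using PyNaCl (Python bindings for libsodium).
# Illustrative only; this is not Meta's actual protocol.
from nacl.public import PrivateKey, Box

# Each party generates a keypair; private keys never leave their device.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts using her private key and Bob's public key.
sending_box = Box(alice_key, bob_key.public_key)
ciphertext = sending_box.encrypt(b"hello")

# Any server relaying `ciphertext` sees only opaque bytes.
# Only Bob, holding his private key, can decrypt it.
receiving_box = Box(bob_key, alice_key.public_key)
assert receiving_box.decrypt(ciphertext) == b"hello"
```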

Unfortunately, end-to-end encryption can also act as a cloak for criminal activity, including child sexual exploitation, allowing it to go on unseen and therefore unreported. Under Meta’s current approach to E2EE, an estimated 92% of CSAM reports from Facebook and 85% from Instagram will be lost. Worse, exploitation is highly likely to increase on Meta’s platforms (already consistently ranked among the most dangerous for children), as criminals are emboldened by the secrecy E2EE affords them.

Further, under E2EE it is not currently possible to detect other kinds of sexual exploitation, such as grooming, sex trafficking and prostitution, and adult image-based sexual abuse (IBSA). We guarantee the world will see a rise in these heinous crimes on Meta’s platforms and on any others that adopt E2EE.

The FBI and other law enforcement agencies, child safety experts across the globe, and survivors all repeatedly warned Meta about this, begging the company for years not to implement end-to-end encryption, and certainly not without proper safeguards in place. But Meta did not listen.

Other leading child safety organizations and experts are also speaking out against this profit-driven move by Meta.

We Need a Nuanced Approach to Privacy That Does Not Sacrifice Millions of Children

While privacy is of course necessary, it cannot come at the cost of the privacy and safety of children and other vulnerable populations. Technology expert Professor Hany Farid, creator of the first tool to detect CSAM online, states it perfectly:

There is nothing in the offline world that is immune to a lawful warrant. There is nothing that is absolute in terms of your privacy. If there is a lawful warrant, you can search my body, you can search my home, you can search my car, you can search my place of business, you can search my bank, you get access. And we have deemed that appropriate in liberal democracies because we have this trade-off between law enforcement, security and individual privacy.

What is most unfortunate is that there are ways to keep messages private, with no human ever seeing them, while still scanning for known child sexual abuse material through a method called “hashing.” Hashing assigns a digital fingerprint (a “hash”) to known images and videos of CSAM; these hashes are entered into a database that companies can access, and messages can then be checked for matching hashes without anyone ever viewing their content. (For a great primer on how hashing works with encryption, read more here.)
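For readers who want to see the mechanics, here is a minimal sketch of hash matching in Python. Note the assumptions: production systems such as PhotoDNA or PDQ use perceptual hashes that survive resizing and re-encoding, whereas SHA-256 below is a simple stand-in, and the database entry is an illustrative value:

```python
# Minimal sketch of fingerprint matching against a database of known CSAM.
# SHA-256 stands in for a perceptual hash (e.g. PhotoDNA, PDQ); the hash
# value shown is illustrative, not a real database entry.
import hashlib

KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def fingerprint(file_bytes: bytes) -> str:
    """Compute a digital fingerprint of an image or video file."""
    return hashlib.sha256(file_bytes).hexdigest()

def matches_known_csam(file_bytes: bytes) -> bool:
    """Check the fingerprint against the database; no human ever views
    the content of the message at this stage."""
    return fingerprint(file_bytes) in KNOWN_HASHES
```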

It is also possible for end-to-end encrypted platforms to continue scanning for CSAM using a method called “client-side scanning,” in which the hash check happens on the user’s own device before a message is encrypted and sent. However, Meta has vehemently opposed this method, without indicating what other safeguards it would implement instead.
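As a rough sketch of how client-side scanning could combine with the hashing approach above (again, hypothetical: `report_match` is an invented hook for illustration, not a real Meta API):

```python
# Client-side scanning sketch: the fingerprint check runs on the sender's
# device *before* encryption, so known material can be flagged while the
# message itself remains end-to-end encrypted. Hypothetical design only.
import hashlib
from nacl.public import Box

KNOWN_HASHES: set[str] = set()  # loaded from the shared fingerprint database

def report_match(digest: str) -> None:
    """Hypothetical hook: escalate only the matching fingerprint,
    never the message content itself."""

def send_attachment(attachment: bytes, sending_box: Box) -> bytes:
    digest = hashlib.sha256(attachment).hexdigest()
    if digest in KNOWN_HASHES:      # on-device check, pre-encryption
        report_match(digest)
    # Encrypt and send regardless; the relay server sees only ciphertext.
    return sending_box.encrypt(attachment)
```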

This is why NCOSE is calling on Meta to immediately reverse its implementation of default E2EE on Messenger and Facebook, and to commit to not implementing E2EE on its other platforms. Meta must take a more nuanced approach to online privacy, one that does not sacrifice millions of children and vulnerable adults to sexual abuse. If it refuses to reverse course, it must at the very least ban E2EE for minor accounts and assure users of significant safeguards to prevent and identify CSAM, IBSA, sex trafficking, and every other form of sexual abuse and exploitation.

Meta and Apple Dodge Accountability and Set a Terrifying Example

End-to-end encryption allows tech companies to pull a veil over the criminal activity occurring on their platforms. The criminal activity still happens, but nobody can see it. Even the tech company itself can’t see it, which means it is “cleared” of the responsibility to do anything about it.

This deliberate turning of a blind eye was the path Apple took from the beginning. Yet until now, Meta stood in favorable contrast to Apple. Apple hid child sexual exploitation under the cloak of E2EE and therefore unfairly escaped scrutiny. Meta, however, took pains to detect and report child sexual abuse material, even knowing it might get bad press when people saw how much of this illegal content was being traded on its platforms. This is why Meta’s platforms accounted for over 85% of the total reports made to the National Center for Missing & Exploited Children in 2022: it was actually making a considerable effort to report. Now, however, Meta is throwing all that out the window and joining the cowardly likes of Apple.

It is worth noting that Meta rolled out E2EE in the midst of increased scrutiny from Congress and the media regarding the harms its platforms pose to children. Meta’s response to this scrutiny has been to tie its own hands—effectively saying, “If we can’t see it, we can’t do anything about it!”

It is nothing short of an emergency that two of the largest, most influential tech companies have chosen to take this destructive approach. We urgently need laws regulating how E2EE can be implemented, or Meta and Apple’s example will spread like wildfire—leaving the most vulnerable to suffer the burns.

Please TAKE ACTION NOW, asking Congress to hold Meta accountable for this dangerous policy and to require tech platforms to prioritize child safety!


*Composite story based on common survivor experiences

