
WhatsApp Has A Child Pornography Problem

First it was Tumblr, now it’s WhatsApp.

Last November, it was discovered that child sexual abuse images (i.e., child pornography) were being shared on the popular social media platform Tumblr. As a result, Apple removed the Tumblr app from its App Store.

In December, Tumblr decided the only way to truly correct the issue was to remove all pornographic content from its platform.

Now, WhatsApp is facing the same problem. WhatsApp, owned by Facebook, is one of the most popular messaging apps in the world. Early last year, WhatsApp hit 1.5 billion monthly users with over 60 billion messages sent per day.

It was recently discovered that some WhatsApp users were creating group chats to share child sexual abuse images (child pornography). While these groups aren’t natively searchable within the app, multiple third-party apps were created to discover them and provide links to join. Over 130,000 accounts have already been banned for participating in these groups.

There are two major factors that contribute to this issue:

1. WhatsApp allows adult pornography on their platform.

While child pornography is banned on WhatsApp, adult pornography is allowed. As Tumblr learned first-hand, permitting adult pornography opens the door to a toxic environment: boundaries are slowly pushed, and content becomes gradually more extreme.

Obviously, age is the key distinction here. However, we know from the testimony of former child pornography prosecutors that the downward spiral into viewing more extreme content, including child pornography, sometimes starts with “regular” adult pornography, driven by a need for escalation. When a platform becomes a gathering place for consumers of pornography to share content, it is no surprise that child pornography eventually shows up.

2. WhatsApp lacks proper moderation.

Facebook has a team of 20,000 employees monitoring content posted on its platform, but none of them work for the Facebook-owned WhatsApp. Instead, WhatsApp handles its own moderation separately, with a staff of just 300. How could 300 employees monitor 60 billion messages a day? They can’t. And, as many platforms have learned, relying on automated moderation alone simply isn’t enough.

It’s time for WhatsApp to follow Tumblr’s example and remove all adult content from the platform. Until it does, there is no foreseeable future in which the problem of child pornography on the platform goes away.


