What’s Up with WhatsApp?

What Exactly is WhatsApp?

WhatsApp is the most popular messaging platform in the world. It allows users to send private messages, voice notes, images, and more. Bought by Meta (formerly Facebook) in 2014, it made $5 billion in 2020 and boasts over 2 billion users. It is free, easy to use, and it has revolutionized the way we connect via our smartphones.

It is also lauded by many for its privacy features. WhatsApp uses what is called end-to-end encryption, which means that no one, not even WhatsApp itself, can see the messages sent and received in a WhatsApp conversation, thus protecting people’s privacy. It also only permits users aged 16 and over.
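For readers curious about how this works in practice, below is a minimal sketch of the end-to-end idea using the open-source PyNaCl library. This is an illustration only, with made-up names; WhatsApp’s actual implementation is the far more sophisticated Signal protocol.

```python
# A toy sketch of end-to-end encryption using PyNaCl (pip install pynacl).
# Illustrative only: WhatsApp actually uses the Signal protocol.
from nacl.public import PrivateKey, Box

# Each user generates a key pair; the private key never leaves the device.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts with her private key and Bob's public key.
ciphertext = Box(alice_key, bob_key.public_key).encrypt(b"hi Bob")

# The server relays only `ciphertext`. Without either private key,
# it sees random-looking bytes, not the message.

# Bob decrypts with his private key and Alice's public key.
plaintext = Box(bob_key, alice_key.public_key).decrypt(ciphertext)
assert plaintext == b"hi Bob"
```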

On the face of it, WhatsApp seems like a safe and reliable platform, especially for parents whose children want to be able to communicate with their friends and join popular WhatsApp groups.

FYI: WhatsApp is Owned by Meta

So then, what is up with WhatsApp?  

Its parent company, Meta, is on NCOSE’s 2022 Dirty Dozen List, an annual campaign calling out twelve mainstream entities for facilitating or profiting from both adult and child sexual abuse and exploitation. But what does this have to do with WhatsApp, a simple messaging service?

The answer is: a lot. 

WhatsApp Was Not Made for Kids

Child online safety organization Thorn reports that, out of 2,002 surveyed minors aged 9 to 17:

  • 55% of 9-to-12-year-olds have used WhatsApp, with 39% using it at least once per day.
  • 40% of 13-to-17-year-olds have used WhatsApp, with 16% using it at least once per day.
  • WhatsApp usage by minors increased by 20% between 2019 and 2020.

For starters, no child under the age of 16 should be using WhatsApp according to its own terms of service, yet these figures clearly show that this is not the case. WhatsApp has no age verification mechanism in place to ensure that all of its users meet the age requirement.

Even more disturbing, WhatsApp is now among the top three platforms where children report experiencing harmful behavior.

Again, based on a Thorn report assessing how minors respond to harmful online behavior:

  • Snapchat (38%), WhatsApp (36%), and Instagram (35%) are [also] among the platforms on which the greatest number of users have had a potentially harmful experience of any kind. 
  • These platforms are also the ones with some of the highest rates of users who report having had a sexually explicit interaction: Snapchat (23%), Instagram (22%), and WhatsApp (21%).
  • 11% of the minors surveyed reported a ‘sexual interaction with someone they believed to be an adult’ on WhatsApp.

This includes adults sending sexually explicit photos and videos to children, as well as grooming and soliciting children to send nude and explicit photos and videos of themselves.

Reporting and Safeguards are Lacking on WhatsApp  

Despite this, WhatsApp’s reporting mechanisms are weak at best.

To report another WhatsApp user, a person must take one of the following paths:

  1. Open the chat with the user you wish to report. 
  2. Tap More options > More > Report. 
  3. Check the box if you would like to also block the user and delete messages in the chat. 
  4. Tap REPORT. 

OR

  1. Open the chat with the user you wish to report.
  2. Long-press an individual message.
  3. Tap the overflow menu.
  4. Tap Report.
  5. You then receive a confirmation notification and the option to block the user.

Note: WhatsApp receives the last five messages sent to you by the reported user or group, and they won’t be notified. WhatsApp also receives the reported group or user ID, information on when the message was sent, and the type of message sent (image, video, text, etc.).

There is no specified reporting option for child sexual abuse material, online grooming, inappropriate behavior toward a minor, or other online safety concerns. This is a barrier to reporting and prevents WhatsApp from prioritizing urgent matters of criminal activity.

Illegal Activity Hosted on WhatsApp

Furthermore, WhatsApp is used to share and distribute child sexual abuse material (CSAM), commonly known as ‘child pornography’.

The extent of the abuse and the dangers that lurk online for children are shocking. The above statistics and statements are almost incomprehensible.

But shocking as these may be, as NCOSE’s Dirty Dozen List reveals, none of this is new to the big tech companies that own and run these online platforms. In fact, it is just the cost of “doing business.” Public safety takes a backseat in the pursuit of the bottom line. 

This callous disregard for the harms its platforms cause is certainly true of Meta.

WhatsApp Has Refused to Make Meaningful Changes

WhatsApp is a prime example. Despite calls for better safety features from child advocates, legislators, and law enforcement, WhatsApp refuses to make changes to its platform, particularly in the areas of end-to-end encryption and privacy.

This protection of privacy is important. However, end-to-end encryption makes it extremely difficult to scan for and identify the sharing of child sexual abuse materials on WhatsApp. Therefore, while WhatsApp may be protecting user privacy, it is also protecting those who are exploiting children and committing crimes against them. 

In 2021, when Apple announced its now-postponed plan to scan for known (hashed) images of child sexual abuse material on its devices before they were uploaded to the cloud, it was met with backlash and derision from privacy absolutists and their tech counterparts, including the Head of WhatsApp, Will Cathcart, who stated:

“I think this is the wrong approach and a setback for people’s privacy all over the world. People have asked if we’ll adopt this system for WhatsApp. The answer is no. Apple has built software that can scan all the private photos on your phone—even photos you haven’t shared with anyone. That’s not privacy.” 

Online Safety Technology Does Not Affect User Privacy

However, Professor Hany Farid, who co-developed the PhotoDNA tool used to “hash” verified child sexual abuse materials, argues that this technology not only leaves privacy unaffected, but is also similar to technology WhatsApp already uses to scan text messages for spam and malware. In Farid’s words:

“Cathcart seems to be comfortable protecting users against malware and spam, but uncomfortable using similar technologies to protect children from sexual abuse.” 

Child sexual abuse materials found online are most often detected by scanning all the material that people send online. This cannot be done effectively with end-to-end encryption, as the platform cannot see the content.
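For a concrete picture of what this scanning involves, here is a rough sketch of hash-based matching against a denylist of known images. The denylist and helper function below are hypothetical, and SHA-256 stands in for PhotoDNA, which is a proprietary perceptual hash designed to match visually similar images rather than exact bytes.

```python
import hashlib

# Hypothetical denylist of hashes of verified abuse images, as
# distributed to platforms by clearinghouses. SHA-256 is an
# exact-match stand-in for PhotoDNA's perceptual hashing.
KNOWN_ABUSE_HASHES: set[str] = set()

def matches_known_image(image_bytes: bytes) -> bool:
    """Return True if the image's hash appears on the denylist."""
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_ABUSE_HASHES

# Under end-to-end encryption, a server only ever sees ciphertext,
# so it can never compute the hash of the original image. Scanning
# of this kind must therefore happen on the device, if at all.
```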

WhatsApp argues that it can still scan for child sexual abuse material in the unencrypted parts of its platform, and that it scans for code words related to child sexual abuse and can identify potential predators that way. However, according to law enforcement, neither of these approaches is enough.

Experts and the Public Agree that Online Safety Should be the Priority 

And it is not just law enforcement or child advocates who support safer platforms for children; it is also the public.

As John Carr argues and ECPAT research reveals, when respondents were asked about the issue of privacy and its impact on detecting CSAM:

  • 68% of respondents were in favor of the European Union strengthening legislation to mandate that tech companies turn on automated tools that can detect and identify images of child sexual exploitation.
  • 76% of respondents were willing to give up some of their personal privacy online to allow for automated technology tools to scan and detect images of child sexual abuse and detect other forms of sexual exploitation of children.

WhatsApp needs to step up. It cannot keep prioritizing privacy at the cost of protection. It can still provide robust privacy while ensuring the safety of its most vulnerable users by: 

  • Improving and investing in the privacy of child sexual abuse victims by preventing and disrupting the sharing of CSAM through messages.
  • Turning on privacy and safety settings by default for minor users.
  • Improving reporting mechanisms to allow easy, specified reporting of child sexual abuse materials, online grooming, inappropriate behavior toward a minor, and other online safety concerns.
  • Working with survivors of sexual exploitation and other subject matter experts to better understand the impact WhatsApp has when it fails to prevent and disrupt sexual crimes.
