David & Goliath: How You’ve Moved Behemoth Corporations in 2024

There are over 2.2 billion active Apple devices worldwide.

Meta has almost 4 billion users on its family of apps, which includes Facebook, Instagram, WhatsApp and Messenger.

Cash App has around 57 million monthly active users.

These staggering numbers highlight the popularity of only a few of our 2024 Dirty Dozen List targets. It’s easy to throw up your hands in defeat, thinking you will never stand a chance against these tech giants. But then again, with the vast audiences that these tech companies serve, doesn’t that make it all the more important to hold them accountable? Their actions affect billions of people!

The numbers can be discouraging to some, but to relentless advocates like you, they encourage the pursuit of justice!

In 2024, you once again achieved monumental victories in the fight against corporate-facilitated sexual exploitation.

Major Moves from Meta

This past year, Meta made many significant changes in response to being named to the Dirty Dozen List. In April, the company announced it would automatically blur nude images in Instagram direct messages for users under 18. This change came just one day after Meta was placed on the 2024 Dirty Dozen List and is a direct result of NCOSE’s advocacy. NCOSE calls on Meta to implement the same protections for all minors on Facebook and WhatsApp.

In October 2024, Instagram made changes to block accounts showing signs of potential scam activity from following teen users. These suspicious accounts will also no longer be able to see parts of a person’s profile that are often exploited for blackmail in sextortion schemes, including follower and following lists, posts they’ve been tagged in, users they’ve tagged in posts, and users who have liked their posts.

Additionally, Meta will be rolling out a new feature on Instagram direct messages and Messenger that prevents ephemeral videos or pictures shared in a private message from being screenshotted or screen-recorded. If an image or video is sent in an Instagram DM or on Messenger using the “view once” or “allow replay” feature, it cannot be screenshotted or recorded on the recipient’s device and won’t be viewable in a web browser. These features are vital in preventing sextortion, as Instagram is the top platform where sextortion occurs.

Finally, ahead of the House Energy and Commerce Committee’s markup of KOSA, Meta introduced Instagram Teen Accounts, which include default safety settings for teen users. Teen accounts are automatically set to private and cannot be messaged, tagged, or mentioned by users they are not connected with. Further, teen accounts are automatically set to the highest content restrictions to limit exposure to inappropriate content and users, and teens under 16 need a parent’s permission to change these settings. NCOSE requested these changes after naming Meta to the 2024 Dirty Dozen List.

An Intelligent Opponent: Progress Against AI-Generated Sexual Exploitation

Google

Many of these tech giants play a large role in facilitating AI-generated image-based sexual abuse (AI-generated IBSA, commonly called “deepfake pornography”). Most prominently, Google, by far the world’s most popular search engine, made key changes based on critical feedback from experts and survivor advocates. These include filtering explicit “deepfake” images once a user successfully reports them. Google’s systems will also automatically scan for and remove duplicates of sexually explicit images that have already been successfully removed, providing more comprehensive protection for victims.

Google also updated the ranking algorithm for search results, demoting AI-generated explicit content and promoting high-quality, non-explicit content instead, ensuring harmful material is less likely to appear prominently in search results. Websites with a high volume of removal requests will be demoted in search rankings, discouraging them from hosting explicit fake content and protecting individuals from repeated exposure. You can read more about these updates here.

Following an NCOSE-informed Forbes exposé, Google removed over 120 YouTube videos and 11 channels promoting AI-generated IBSA, at least 27 ads for “nudifying” bots/apps from Google Ads, and several AI-generated IBSA apps from the Google Play Store. Additionally, only one week after the Forbes article was published, Google implemented a new policy banning ads and promotions for synthetic sexually explicit media, bots, and sites, a significant response to NCOSE’s advocacy and reporting efforts.

Further, Google strengthened its online child safety measures by adopting Safety by Design Generative AI principles and enhancing AI safety protocols to stem the creation and dissemination of AI-generated child sexual abuse material (CSAM). They also expanded the Priority Flagger Program to include instances of AI-generated CSAM reports and announced partnerships to support the U.S. Department of Homeland Security’s Know2Protect campaign and the National Center for Missing and Exploited Children (NCMEC)’s No Escape Room Safety to combat sexual extortion (sextortion).

Microsoft’s GitHub

Until last year, Microsoft’s GitHub was a major player contributing to the rise of AI-generated IBSA. Last May, just a month after the Dirty Dozen List was released, GitHub quietly rolled out a policy prohibiting projects that are “designed for, encourage, promote, support, or suggest in any way” the creation of IBSA.

As a result, several repositories hosting IBSA code have since been removed from GitHub, including DeepFaceLab, which hosted the code used to create 95% of deepfakes and sent users directly to the most prolific sexual deepfake website, MrDeepfakes. This is a major victory, which your voice and action made possible! Read more about GitHub’s new policy here.

Following this, in September, GitHub signed a White House voluntary public commitment to combat IBSA in the private sector.* Among other companies who signed the commitment were Cash App, Meta, and Snap, all targets of the 2024 Dirty Dozen List.

*This commitment was signed under the Biden administration and is no longer publicly available on whitehouse.gov.

Snapchat Snaps Back at Sextortion

This year, Snapchat made significant moves to combat the rampant sextortion that occurs on the platform. NCOSE met with Snapchat’s Global Head of Platform Safety, Jacqueline Beauchere, who shared several new safety features rolled out for teens, including expanded in-app warnings, enhanced friending protections, simplified location-sharing, and blocking improvements.

In response to NCOSE’s recommendations, Snap will now show a pop-up warning when a friend request comes from someone who has been blocked or reported by others, or who is from a region outside the teen’s typical network. This is a significant change to reduce the risk of sextortion.

A Relentless Battle with Roblox Yields a (Small) Victory

Roblox, despite 42% of its users being under the age of 13, has been notoriously poor at protecting its child users. With the platform described as an “X-rated pedophile hellscape,” the need for change is dire. After being named to the Dirty Dozen List for two years in a row, Roblox has (finally) updated its parental controls and safety settings.

Accounts for children under 13 will default to stricter safety settings. This includes being unable to message users via “platform chat,” meaning direct messages outside of games or experiences. Content will also carry maturity ratings, and users under the age of 9 will not be able to access content rated “moderate,” which may include moderate violence or crude humor. Parents can also create their own Roblox account and link it to their child’s account to monitor screen time, friends, and access to content.

Many of these changes are ones NCOSE had been requesting for almost two years, so we are thankful for these notable updates. However, they are not adequate for a platform where such a large share of users are minors. Much more must be done to protect children on Roblox.

Pressure on Cash App Results in Hiring of an Anti-Human Exploitation Program Manager

Cash App was named to the 2024 Dirty Dozen List for being the top financial service used for sextortion, buying and selling of CSAM, and other forms of sexual exploitation. The company immediately responded to our Dirty Dozen List letter, acknowledging NCOSE’s recommendations. Two months later, they posted a position for an Anti-Human Exploitation Program Manager, a role that has been filled by a former employee of International Justice Mission and the National Center for Missing and Exploited Children (NCMEC).

Cash App leaders, including the new Anti-Human Exploitation Program Manager, met with NCOSE in October. They shared that they’ve made additional progress based on our recommendations, including requiring birthdays, using machine learning to estimate age, making teen accounts “unfindable” without knowing their exact “cashtag,” and expanding parental controls (including options to block certain accounts and to prevent teens from making peer-to-peer payments). The company has also updated its user policy to ban IBSA and is expanding partnerships with law enforcement and NCMEC.

Telegram CEO’s Arrest Serves as Catalyst for App Reform

A few months after Telegram was named to the Dirty Dozen List, CEO Pavel Durov was arrested in France as part of a larger investigation into “complicity” in illegal transactions and the possession and distribution of child sexual abuse material (CSAM). Following his arrest, Durov announced that Telegram had removed “problematic content” and updated its terms of service to make clear that Telegram would share IP addresses and phone numbers with law enforcement in response to valid legal requests. NCOSE renews its call for the U.S. Department of Justice to investigate Telegram.

Four months later, Telegram announced a partnership with the UK’s Internet Watch Foundation (IWF) to help combat child sexual abuse that has run rampant on the platform. Telegram will now use IWF’s hashing technology to proactively detect and remove known CSAM being shared in public parts of the site. The platform will also deploy tools to block non-photographic depictions of child sexual abuse, such as known AI child sexual abuse imagery, and tools to block links to webpages known to host CSAM. This is a significant step for Telegram, which has previously refused to engage with such child protection schemes.

NCOSE named Telegram to the Dirty Dozen List for serving as a safe haven for criminal communities, including sexual torture rings, sextortion gangs, deepfake bots, and more. While NCOSE applauds this first step toward preventing child sexual abuse on the platform, Telegram must also do more to combat the grooming of children and image-based sexual abuse. With nearly 30 million U.S. downloads in 2023, Telegram should voluntarily report CSAM to NCMEC as well, something it has never done.

Apple and LinkedIn Halt Promotion of “Nudifying” Apps

Since being named to the 2024 Dirty Dozen List, Apple has removed four “nudifying” apps from the App Store that NCOSE brought to their attention in our notification letter. Read more about this win and what Apple still needs to do to prevent sexual exploitation here.

After the Daily Mail published an article featuring LinkedIn’s placement on the Dirty Dozen List for allowing promotion of “nudifying” apps, LinkedIn removed “nudifying” bot ads and articles from the platform. As of May 2024, NCOSE was unable to find any more posts promoting “nudifying” apps on LinkedIn.

THANK YOU!

All of these victories spotlight your tireless work as supporters of NCOSE and grassroots advocates. Your support is invaluable when taking on these massive, deep-pocketed corporations. We are deeply grateful to you all for an incredible year and we can’t wait to see what we can accomplish together in 2025!

To get updates on more corporate victories and important news in the movement to end sexual exploitation, sign up for our newsletter below.

The Numbers

300+

NCOSE leads the Coalition to End Sexual Exploitation with over 300 member organizations.

100+

The National Center on Sexual Exploitation has had over 100 policy victories since 2010. Each victory promotes human dignity above exploitation.

93

NCOSE’s activism campaigns and victories have made headlines around the globe, averaging 93 mentions per week by media outlets and shows such as Today, CNN, The New York Times, BBC News, USA Today, Fox News, and more.

Defend Human Dignity. Donate Now.