The Movement Ep. 4: Dirty Dozen List Victories, Advocacy with Senators, and More!


The Movement Episode 4 is here!

“The Movement” is a regular video update, where we share the progress and challenges in the fight to end sexual exploitation.

This week we discuss Big Tech companies that have taken steps to prioritize safety thanks to YOUR action, how NCOSE & Senator Lee are linking arms to take on the pornography industry, and more!

Watch the episode and read more below:

Dirty Dozen List Victories!

NCOSE’s annual Dirty Dozen List Campaign targets tech companies that consistently enable sexual exploitation for profit. Thanks to people like YOU participating in this campaign and taking action, many tech companies are now taking concrete steps to fight sexual exploitation!

Cash App Hires Anti-Human Trafficking Program Manager

Cash App was ranked by law enforcement and researchers as the top payment platform for sextortion, buying and selling of child sexual abuse material, prostitution, sex trafficking, and a host of other crimes. After being named to the 2024 Dirty Dozen List, Cash App posted a job opening for an Anti-Human Exploitation Program Manager, a position that was recently filled, to help curb the sexually abusive behaviors enabled by its platform.

The new Anti-Human Exploitation Program Manager, Sally Frank, formerly worked for the National Center for Missing and Exploited Children (NCMEC) and the International Justice Mission’s Center to End Online Sexual Exploitation of Children. We hope that her wealth of experience will provide Cash App with the expertise it needs to stop facilitating sexual exploitation.

In the meantime, join us in continuing to press Cash App, to ensure this new hire is followed up with concrete changes!

GitHub Takes Steps to Combat Image-Based Sexual Abuse

Microsoft’s GitHub is one of the foremost developers of artificial intelligence, but it has also been one of the foremost enablers of AI-generated sexually explicit images, a phenomenon that has wreaked havoc in people’s lives. The platform has historically been an incubator for software code used to create AI-generated image-based sexual abuse (IBSA) and child sexual abuse material (CSAM). AI-generated sexually explicit images (commonly known as “deepfake pornography”) have been weaponized by men and boys all over the world, including by teen and preteen boys creating pornographic images and videos of their female peers.

Thankfully, after two years of being on the Dirty Dozen List, Microsoft’s GitHub is finally hearing your demands!

Just a month after the 2024 DDL reveal, Microsoft’s GitHub announced a new policy prohibiting the use of its technology for the purposes of creating and distributing IBSA and CSAM. As a result, GitHub removed several repositories that were harboring code used to generate IBSA, including DeepFaceLab, which hosted the code used to create 95% of all deepfakes and sent users directly to the most prolific sexual deepfake website. And in September, Microsoft and GitHub signed a public commitment with the White House to combat IBSA.

Read more about Microsoft’s GitHub changes here.

Meta Makes Several Improvements … But Much More is Needed!

Earlier this year, Meta started automatically blurring nudity in direct messages for all minors (not just users under 16, as originally announced). This was a direct result of your advocacy, as the announcement was made just a day after Meta was placed on the 2024 Dirty Dozen List. We have also called on Meta to implement these same measures on Facebook and WhatsApp.

In September, Instagram introduced new, safer “teen accounts.” Teen accounts have safety settings enabled by default for all users 18 and under, including automatically setting the account to private and preventing teens from sending direct messages to, or receiving them from, people they are not connected with. For users under 16, parental permission is required to change these settings. Again, these are specific changes for which NCOSE has been advocating since 2019.

These two victories are major steps forward, as Instagram is one of the top platforms used by sexual predators – such as sexual extortionists – to identify, groom, and exploit children.

However, Meta still has its work cut out for it. Meta’s platforms continue to promote nudifying apps to users, including children, even as the use of nudifying apps by teens to exploit their peers becomes more prevalent.

We applaud Meta for making some of the necessary changes thanks to your grassroots advocacy, but we urge the company to continue implementing more protections for users across its applications.

While it’s encouraging that our advocacy is resulting in so many victories with individual tech companies, the Kids Online Safety Act (KOSA) would allow us to stop playing whack-a-mole! This crucial legislation would require ALL tech companies likely to be accessed by children to automatically turn on common-sense safety features and design their products with child safety in mind.

Ask your legislators to pass KOSA now!

Senator Lee & NCOSE Link Arms to Tackle Pornography

Senator Lee (UT) and NCOSE are working together to address the harms of pornography through legislation that would require meaningful protections against those harms.

Senator Lee is spearheading two important bills: the PROTECT Act and the SCREEN Act.

The PROTECT Act (Preventing Rampant Online Technological Exploitation and Criminal Trafficking Act) addresses the harms of pornography by preventing the production of nonconsensual pornography. It would require websites to verify the age and consent of all individuals depicted in pornographic content, and quickly remove content upon receiving notice that it was uploaded without consent. This is something for which NCOSE has been advocating for years.

Currently, there is NO federal law requiring age and consent verification in pornography, which allows websites to distribute child sexual abuse material (CSAM, the more apt term for “child pornography”), recorded rape and sex trafficking, and other forms of image-based sexual abuse with impunity. This must end!

The measures the PROTECT Act proposes are common-sense and long overdue, and this bill has NCOSE’s full support.

Meanwhile, the SCREEN Act (Shielding Children’s Retinas from Egregious Exposure on the Net Act) combats the harms of pornography to minors. It would require pornography websites to verify the age of their users, thereby preventing children from accessing these sites.

This would resolve the puzzling double standard under which our society currently operates. If a child attempts to buy a pornographic magazine or DVD in a store, they will be ID’d and denied purchase. But the same child can go online and watch the most hardcore pornography available, with no age check to prevent it.

Multiple U.S. states have been pursuing legislation to resolve this double standard and require age verification for online pornography as well as offline. Now, the SCREEN Act seeks to make age verification a requirement under federal law.

Senator Lee met with NCOSE at our office in May to discuss strategies for furthering these two bills.


THANK YOU for making all these victories and progress possible with your support! Onward, to a world where all can live and love, free from sexual abuse and exploitation!


