

The Dirty Dozen List is an annual campaign calling out twelve mainstream entities for facilitating or profiting from sexual abuse and exploitation. Since its inception in 2013, the Dirty Dozen List has galvanized thousands of individuals like YOU to call on corporations, government agencies, and organizations to change specific policies to instead promote human dignity. This campaign has yielded major victories, including significant changes at Google, Netflix, TikTok, Hilton Worldwide, Verizon, Walmart, US Department of Defense, and many more.

The Dirty Dozen Watch List serves dual purposes. In some cases, it puts entities on notice that they may soon be named to the Dirty Dozen List unless they demonstrate significant and sustained efforts to address their role in fueling sexual exploitation. In other cases, placement on the Watch List affirms a positive step an entity has taken to address its role in sexual exploitation. However, because such steps may represent only modest progress relative to the entity’s total contribution to sexual exploitation, or because we have concerns about the entity’s intent to follow through, placement on the Watch List also signals our lingering concerns about its commitment to ending sexual exploitation.

Take Action!

Your voice is needed to move the Dirty Dozen List members to change! Click on the logos above and take action against each target on its webpage. Each action takes less than one minute!

The 2021 #DirtyDozenList from the National Center on Sexual Exploitation includes @amazon, @WishShopping, @Google, @reddit, and more. Get the full list and use @ncose's easy-to-use Actions to make a difference today. DirtyDozenList.com

Past Victories

For years, NCOSE has been asking Amazon Prime Video to create systems that prevent kids from easily accessing material harmful to young viewers. In 2020, Amazon finally rolled out some key developments to address our concerns. You can now set up multiple users under one account, with optional “Kids profiles” that make available only films and shows appropriate for children under age 12.

In 2015, American Apparel stopped using nudity and sexually explicit advertising for its clothing line and took extensive measures to remove these types of ads from its online and print catalogs, which led to its removal from the Dirty Dozen List.

Carl’s Jr., one of four brands under CKE Restaurants (which also includes Hardee’s), announced that it will stop producing hyper-sexualized, misogynistic ads for its fast food products. CKE Restaurants received substantial negative press for its demeaning ads after being placed on NCOSE’s 2015 Dirty Dozen List.

Over several years of advocacy, NCOSE has seen Comcast make significant improvements to the usability of parental control settings for cable and Internet users. In 2019, when Comcast added filters at the router level and removed access to pornography from some of its cable search features, Comcast executives told NCOSE, “We heard your feedback and made improvements.” While NCOSE is still petitioning Comcast for further improvements, these important innovations make Comcast a leader on family safety within the telecommunications industry.

Beginning in 2018, out of concerns about female objectification, CVS removed the Sports Illustrated Swimsuit Issue from its checkout areas and promotional displays. CVS told NCOSE: “We share your organization’s concerns about female objectification.”

In 2014, in the wake of a growing crisis of sexual assault and the Dirty Dozen List highlighting problematic policies, the Department of Defense stopped the sale of pornography in all U.S. Army and Air Force base exchanges. The DOD also ordered all military branches to conduct regular searches for, and removal of, sexual materials in public and work spaces, and updated human trafficking training for all servicemen and women to include the harms of pornography and the reasons sex buying is harmful. As we continued the pressure, the U.S. Navy finally stopped selling pornography in 2019.

In 2013, after being placed on the Dirty Dozen List and meeting with NCOSE, Facebook took significant steps to improve efforts to block and report child sexual abuse material (CSAM) on its site. Facebook’s example, its continued work to find solutions that reduce CSAM, and its transparency in reporting and talking about the issue have set industry-wide standards and moved many other social media platforms to better prioritize child safety. We believe more can be done and are grateful for the open door to keep these conversations going with Facebook and the platforms it owns, such as Instagram.

In June 2014, Google enacted AdWords policies to no longer accept ads that promote graphic depictions of sexual acts or that link to websites containing such material. In 2021, Google changed its policies to no longer allow ads for prostitution or “compensated dating.” We applaud Google for making it harder for sex buyers to purchase people for their own pleasure.

In 2020, after much urging, Google improved Google Images to decrease exposure to hardcore pornography for users looking up unrelated or innocent terms. Previously, searches for basic anatomical terms did not yield scientific drawings, but instead returned images of and links to hardcore pornography. The search term “happy black teen” returned thousands of images of rape and extreme sexual violence. [Note: pornographic images may still appear in Google Image searches for search terms more closely related to the pornography industry.]

In 2019, Google’s YouTube was revealed to be widely used for the eroticization of children and for pedophile networking. Thankfully, after public outcry and contact from NCOSE, YouTube took action to remove a large number of sexually graphic comments under children’s videos. Note: while significantly improved, this problem is not entirely resolved.

In 2013, after its first year on our list, GooglePlay instituted policies prohibiting pornographic apps in its app store, though lax enforcement of this policy followed. In 2014, following a second year on the list, GooglePlay removed all apps in violation.

Hilton Hotels Worldwide publicly announced it would stop selling pornography and issued orders to implement this policy in all of its brand contracts around the world. The policy was expected to be in full force by July 2016.

Hyatt Hotels & Resorts revised its brand standard to stop profiting from all in-room pornography film offerings and has demanded that all of its properties comply. Compliance is ongoing across Hyatt properties.

InterContinental Hotel Group performed an audit of its more than 4,800 properties around the world and insisted that all hotels immediately cease selling porn films or risk losing good standing as an IHG brand. IHG made this move without having to be publicly named to the Dirty Dozen List.

Marsh Supermarkets, a chain of produce markets and convenience stores in Indiana and Ohio, removed Cosmopolitan magazine from its checkout lanes. As a result, Marsh customers can enjoy a sexploitation-free checkout experience.

After thousands of emails were sent to Netflix by grassroots supporters through the Dirty Dozen List, Netflix took a step forward. Netflix’s parental controls have improved so that the 4-digit PIN codes used to block certain shows or ratings remain consistent across profiles, closing a loophole through which children could accidentally access sexually graphic content. Further, there are now content warnings at the beginning of every show. These policies affect nearly 150 million subscribers!

RiteAid and Food Lion adopted policies requiring the sexually explicit Cosmopolitan magazine to be placed behind blinders in their retail shops so that customers are not forced to view Cosmopolitan’s sexually degrading and objectifying themes.

Snapchat discontinued Snapcash—which was being used to buy and sell pornographic images and videos, often acting as advertisements for prostitution and sex trafficking. The removal of Snapcash was one of the key requests we made when Snapchat was placed on the 2018 Dirty Dozen List.

We brought concerns about Snapchat being used to facilitate pornography and advertisements for sex trafficking and prostitution to Snapchat’s headquarters in Washington, D.C. Snapchat has since made improvements that allow Discover publishers to age-gate content and allow users to delete specific Discover publishers. Snapchat has also updated its Safety Center.

Just a few weeks after our public announcement that TikTok had been named to the 2020 Dirty Dozen List, the social media company announced new safety features that NCOSE had requested, including a fix for the problem of safety settings turning off every 30 days and a new Family Pairing mode for better parental controls. There is still more TikTok needs to improve, but this is a big step forward.

Twitter is blocking direct searches for porn within the “Photos” and “Videos” tabs, although not in the general search tab. Also, as of November 2019, Twitter requires pornography to be marked as “sensitive media” to somewhat curb unintentional exposure to it. However, this requirement appears to be rarely enforced, and Twitter continues to inadequately prioritize the removal of child sexual abuse images and red flags for commercial sexual exploitation.