Dirty Dozen List - YouTube

The Problem

UPDATE: FEBRUARY 22, 2019

Advertisers like Walt Disney Co. and AT&T are suspending advertising with YouTube over mounting concerns about the platform being used for the eroticization of children and for pedophile networking.

Two days after Google claimed it was finally fixing this problem, the National Center on Sexual Exploitation's Haley Halverson researched claims of pedophile rings, child erotica, and child exploitation on YouTube and found alarming results confirming the original reports and the platform's continued use for exploitive purposes.

A video displaying proof (edited to not endanger children’s identities) is here:

[Embedded video]

Google’s YouTube is an Internet conduit to user-generated videos where the latest cute kitten videos are mixed together with sexually graphic material.

In late 2017, YouTube came under scrutiny for hosting disturbing videos that targeted children, often with children’s characters put in sexually charged or violent situations, many of which were monetized. YouTube states that it removed these ads from approximately 2 million videos and shut down over 50,000 channels that featured this kind of content.

However, YouTube is still rife with problems, including some reports that simple search terms like “how to have” would autocomplete with child sex themes.

While YouTube fixes problems ad hoc whenever they receive concentrated media attention, the website does little to proactively monitor or restrict inappropriate content, and it forces users to go through a rigorous process if they want to report content for removal. It appears that whenever it can get away with it, YouTube allows inappropriate content to remain on its platform in order to generate views and more profit.

Currently, YouTube has over a billion users—almost one-third of all people on the Internet—and every day people watch hundreds of millions of hours of content on YouTube, generating billions of views. YouTube overall, and even YouTube on mobile alone, reaches more 18-49 year olds than any cable network in the U.S. Every 60 seconds, 300 hours of video are uploaded to YouTube, and 323 days' worth of YouTube videos are viewed on Facebook. Its 2014 revenue was estimated to be $4 billion.

Despite YouTube’s Community Guidelines, which specifically prohibit pornography and other sexually explicit content, particularly regarding “violent, graphic, or humiliating fetishes,” masses of such content are uploaded to the website each day and Google allows much of it to remain, thus amassing millions of views and handsome profits.

When you turn on the TV, you are not barraged with massive amounts of this kind of material. Why is it that when you open YouTube you are flooded with suggestions to watch explicit sexual videos, even when typing in innocent searches? Given the popularity, quantity, and reach of content uploaded to the website, Google’s YouTube has a social responsibility to increase and improve efforts to curb sexual exploitation.

While NCOSE has successfully urged Google to curb exploitation on other fronts, and congratulates the company for many positive changes regarding Google Play and AdWords, Google has remained resistant to changing anything relating to the content on YouTube. While the launch of the YouTube Kids app is a step in the right direction, it does not go far enough.

Together, with thousands of concerned parents and users, NCOSE urges Google to:

  1. Turn SafeSearch and Restricted Mode on automatically for all YouTube users, so that they have to opt in to more graphic or adult content instead of being automatically bombarded with sexually exploitive material.
  2. Improve the ease and accessibility of reporting videos that violate its Terms of Use.
  3. Apply the same image filtering software currently used to identify child pornography to flag all forms of adult pornography or sexualized nudity as well.
  4. Develop a more thorough review process for channels applying to monetize their videos.
  5. Extend the AdWords policy to YouTube and refuse to profit from sexually exploitive content.
  6. Update YouTube to work more efficiently with third-party filters.
  7. Most importantly, use its creativity and immense talent to develop effective solutions to this growing problem.

Google’s current system depends largely on reactive moderation, which relies on users to flag and report offensive or sexually explicit content. YouTube prefers this method to taking responsibility upon itself to provide more active moderation and policing in real time, because the views on explicit videos help serve YouTube’s bottom line.

When users must act as moderators, they must first watch the explicit content and then report exact time stamps with descriptions. The main problem with this procedure for content removal is that users must first be exposed to the harmful content and then continue to view the offensive material as they alert Google to the violation. Many of the explicit videos on the site have hundreds of thousands and even millions of views because Google refuses to improve this process and instead facilitates further exploitation by making its viewers its Terms of Use enforcers.

Often, the audience viewing this material is composed of children. Pornography and other sexually graphic material have a profound negative impact on the development of children, and exposure puts them at greater risk of falling victim to exploitation themselves.

Sexually explicit videos on YouTube often amass many views, becoming eligible for lucrative pre-roll video ads that make YouTube and the uploading channel lots of money.

Many innocent search terms used on YouTube will bring up hardcore and violent explicit videos because uploaders use misleading descriptions when adding the content. Certainly, Google has the ability to develop a system for analyzing the images and not just the text descriptions of uploaded content.

Another problem is that once users view an explicit video, other videos with sexually explicit thumbnails will fill the list of suggested content on the right side of the screen or will be listed as suggestions in the video player when the desired film finishes. This is especially dangerous for younger audiences using YouTube.

It appears that Google is willing to let the Terms of Use slide for celebrities who upload content that is in direct violation. Beyoncé, Justin Timberlake, and Robin Thicke are just a few examples of celebrities who have amassed millions of views (and $$$ for Google) with music videos that include full frontal nudity.

Community Guidelines must apply to the entire community on YouTube, no matter how rich or famous someone might be.

WARNING: There are graphic images and text descriptions shown in these sections.
POSSIBLE TRIGGER.

Proof

YouTube's Policies


Despite YouTube’s Community Guidelines, which specifically prohibit sexual nudity, pornography, and other sexually explicit content, such content is uploaded in droves to the website each day and Google allows much of it to remain, amassing millions of views and handsome profits.

Community Guidelines:

“YouTube is not for pornography or sexually explicit content.”

“Most nudity is not allowed, particularly if it is in a sexual context… [or] if a video is intended to be sexually provocative…”

Terms of Service:

All users digitally agree to these terms when creating an account to upload content to the site.

“YouTube reserves the right to decide whether Content violates these Terms of Service for reasons other than copyright infringement, such as, but not limited to, pornography, obscenity, or excessive length.”

Examples From YouTube


Many innocent search terms used on YouTube will bring up hardcore and violent explicit videos because uploaders use misleading descriptions or foreign languages when adding the content. Certainly, Google has the ability to develop a system for analyzing the images, and not just the text descriptions, of uploaded content.

Another problem is that after users view a video, videos with sexually explicit thumbnails will fill the list of suggested content on the right side of the screen or will be listed as suggestions in the video player when the desired film finishes. This is especially dangerous for younger audiences using YouTube.

A mother in this first video described the videos as sexually predatory, and a policeman noted that while the videos are not illegal, they could be used by predators to groom children for later abuse.

https://www.youtube.com/watch?v=DDPMzvExwkA

https://www.youtube.com/watch?v=acTcMUJ6qBI

[Images available upon request]


Share Your Story

Have you or your kids been exposed to pornography and sexually explicit content on YouTube?

