Summary of Problems on YouTube
Update: February 22, 2019
Advertisers like Walt Disney Co. and AT&T are suspending advertising with YouTube over mounting concerns about the platform being used for the eroticization of children and for pedophile networking.
Two days after Google claimed to be finally fixing this problem, The National Center on Sexual Exploitation’s Haley Halverson researched claims of pedophile rings, child erotica, and child exploitation on YouTube and found alarming results confirming the original reports and continued use of the platform for exploitive purposes.
A video displaying proof (edited to not endanger children’s identities) is here:
Google’s YouTube is an Internet conduit for user-generated videos, where the latest cute kitten videos are mixed together with sexually graphic material.
In late 2017, YouTube came under scrutiny for hosting disturbing videos that targeted children, often with children’s characters put in sexually charged or violent situations, many of which were monetized. YouTube states that it removed these ads from approximately 2 million videos and shut down over 50,000 channels that featured this kind of content.
However, YouTube is still rife with problems, including some reports that simple search terms like “how to have” would autocomplete with child sex themes.
While YouTube is fixing problems ad hoc whenever they receive concentrated media attention, the website does little to proactively monitor or restrict inappropriate content and it forces users to go through a rigorous process if they want to report the content for removal. It appears that whenever they can get away with it, YouTube allows inappropriate content to remain on its platform in order to generate views and more profit.
Tweet at YouTube to ask them to improve some key policies and systems:
- .@YouTube please turn on Safe Search automatically to prevent accidental exposure to graphic material on YouTube.
- .@YouTube please improve reporting and filtering software for sexually exploitive content.
- .@YouTube please more thoroughly vet monetized videos, and extend your AdWords policy so YouTube doesn't profit from pornography.
- .@YouTube please allow third-party filters to work better on your site, so users can avoid sexually exploitive content.
Currently, YouTube has over a billion users—almost one-third of all people on the Internet—and every day people watch hundreds of millions of hours of content on YouTube, generating billions of views. YouTube overall, and even YouTube on mobile alone, reaches more 18-49 year olds than any cable network in the U.S. Every 60 seconds, 300 hours of video are uploaded to YouTube, and 323 days’ worth of YouTube videos are viewed on Facebook. Its 2014 revenue was estimated to be $4 billion.
Despite YouTube’s Community Guidelines, which specifically prohibit pornography and other sexually explicit content, particularly regarding “violent, graphic, or humiliating fetishes,” masses of such content are uploaded to the website each day, and Google allows much of it to remain, thus amassing millions of views and handsome profits.
When you turn on the TV, you are not barraged with massive amounts of this kind of material. Why is it that when you open YouTube you are flooded with suggestions to watch sexually explicit videos, even when typing in innocent searches? Given the popularity, quantity, and reach of content uploaded to the website, Google’s YouTube has a social responsibility to increase and improve its efforts to curb sexual exploitation.
Requests to Google’s YouTube:
While NCOSE has successfully urged Google to curb exploitation on other fronts, and congratulates them for many positive changes regarding GooglePlay and AdWords, Google has remained reticent regarding changes to anything relating to the content on YouTube. While the launch of the YouTube Kids App is a step in the right direction, it does not go far enough.
Together with thousands of concerned parents and users, NCOSE urges Google to:
- Turn Safe Search and Restricted Mode on automatically for all YouTube users, so that they have to opt-in for more graphic or adult content, instead of being automatically bombarded with sexually exploitive material.
- Apply the same image filtering software currently used to identify child pornography to flag all forms of adult pornography or sexualized nudity as well.
- Develop a more thorough review process for channels applying to monetize their videos.
- Extend the AdWords policy to YouTube and refuse to profit from sexually exploitive content.
- Update YouTube to work more efficiently with third-party filters.
- Most importantly, we call on Google to use its creativity and immense talent to develop effective solutions for this growing problem.
PROBLEM: Users Must Serve As Moderators
Google’s current system depends largely on reactive moderating, which relies on users to flag and report offensive or sexually explicit content. Users must first watch the explicit content and then report exact times with descriptions if they wish to report it at all. YouTube prefers this method to taking responsibility upon itself to provide more active moderation and policing in real-time, because the views on explicit videos help to serve YouTube’s bottom line.
Often, the audience viewing this material is composed of children. Pornography and other sexually graphic material have a profound negative impact on the development of children, and exposure puts them at greater risk of falling victim to exploitation themselves.
PROBLEM: Ads on Explicit Videos
Sexually explicit videos on YouTube often amass many views, becoming eligible for lucrative pre-roll video ads that make YouTube and the uploading channel lots of money.
PROBLEM: Searches and Related Content
Many innocent search terms used on YouTube will bring up hardcore and violent explicit videos because uploaders use misleading descriptions when adding the content. Certainly, Google has the ability to develop a system for analyzing the images and not just the text descriptions of uploaded content.
Another problem is that once users view an explicit video, other videos with sexually explicit thumbnails fill the list of suggested content on the right side of the screen, or are listed as suggestions in the video player once the desired film finishes. This is especially dangerous for younger audiences using YouTube.
PROBLEM: Enforce the Standards, Regardless of the Partner
Community Guidelines must apply to the entire community on YouTube, no matter how rich or famous someone might be.
Get Educated & Be Involved
Have regular discussions with your children about digital safety and your rules surrounding social media. For example: Make sure they are communicating only with people they know and that they realize the pictures they send don’t just vanish forever. Remind them, “Once on the Internet, always on the Internet!” Visit here, here, and here for more resources on teaching digital safety.
Consider using the social media tools that your children use, not only so that you are aware of how they are used, but also as a way to show your children you care about their world, and to connect and communicate with them.
NCOSE has successfully urged Google to curb exploitation on other fronts, and congratulates them for many positive changes regarding GooglePlay and AdWords. The launch of the YouTube Kids App is a step in the right direction, though it does not go far enough. THANK THEM for these many positive policy changes here.
Share your STORY
Personal stories help elected and business leaders to see the grave harm associated with this material and can be very helpful in getting them to change their policies. All will be shared anonymously. Please email your story to email@example.com.