Google’s YouTube is an Internet hub for user-generated videos, where the latest cute kitten videos are mixed together with sexually graphic material.
In late 2017, YouTube came under scrutiny for hosting disturbing videos that targeted children, often featuring children’s characters in sexually charged or violent situations; many of these videos were monetized. YouTube states that it removed ads from approximately 2 million such videos and shut down over 50,000 channels that featured this kind of content.
However, YouTube is still rife with problems, including reports that a simple search phrase like “how to have” would autocomplete with child sex themes.
While YouTube fixes problems ad hoc whenever they receive concentrated media attention, the website does little to proactively monitor or restrict inappropriate content, and it forces users through a rigorous process if they want to report content for removal. It appears that whenever it can get away with it, YouTube allows inappropriate content to remain on its platform in order to generate views and more profit.
Tweet at YouTube to ask them to improve some key policies and systems:
- .@YouTube please turn on Safe Search automatically to prevent accidental exposure to graphic material on YouTube.
- .@YouTube please improve reporting and filtering software for sexually exploitive content.
- .@YouTube please more thoroughly vet monetized videos, and extend your AdWords policy so YouTube doesn't profit from pornography.
- .@YouTube please allow third-party filters to work better on your site, so users can avoid sexually exploitive content.
Currently, YouTube has over a billion users—almost one-third of all people on the Internet—and every day people watch hundreds of millions of hours of content on YouTube, generating billions of views. YouTube overall, and even YouTube on mobile alone, reaches more 18-49 year-olds than any cable network in the U.S. Every 60 seconds, 300 hours of video are uploaded to YouTube, and 323 days’ worth of YouTube videos are viewed on Facebook. YouTube’s 2014 revenue was estimated at $4 billion.
Despite YouTube’s Community Guidelines, which specifically prohibit pornography and other sexually explicit content, particularly content involving “violent, graphic, or humiliating fetishes,” masses of such content are uploaded to the website each day, and Google allows much of it to remain, amassing millions of views and handsome profits.
When you turn on the TV, you are not barraged with massive amounts of this kind of material. Why is it that when you open YouTube you are flooded with suggestions to watch sexually explicit videos, even after typing in innocent searches? Given the popularity, quantity, and reach of content uploaded to the website, Google’s YouTube has a social responsibility to increase and improve its efforts to curb sexual exploitation.
Requests to Google’s YouTube:
While NCOSE has successfully urged Google to curb exploitation on other fronts, and congratulates the company for many positive changes regarding Google Play and AdWords, Google has remained resistant to changing anything relating to the content on YouTube. While the launch of the YouTube Kids app is a step in the right direction, it does not go far enough.
Together with thousands of concerned parents and users, NCOSE urges Google to:
- Turn Safe Search and Restricted Mode on automatically for all YouTube users, so that they have to opt-in for more graphic or adult content, instead of being automatically bombarded with sexually exploitive material.
- Apply the same image filtering software currently used to identify child pornography to flag all forms of adult pornography or sexualized nudity as well.
- Develop a more thorough review process for channels applying to monetize their videos.
- Extend the AdWords policy to YouTube and refuse to profit from sexually exploitive content.
- Update YouTube to work more efficiently with third-party filters.
- Most importantly, we call on Google to use its creativity and immense talent to develop effective solutions for this growing problem.
PROBLEM: Users Must Serve As Moderators
Google’s current system depends largely on reactive moderation, which relies on users to flag and report offensive or sexually explicit content. Users must first watch the explicit content and then submit exact timestamps with descriptions if they wish to report it at all. YouTube prefers this method to taking responsibility for more active, real-time moderation and policing, because the views on explicit videos serve YouTube’s bottom line.
Often, the audience viewing this material is composed of children. Pornography and other sexually graphic material have a profound negative impact on the development of children, and exposure puts them at greater risk of falling victim to exploitation themselves.
PROBLEM: Ads on Explicit Videos
Sexually explicit videos on YouTube often amass many views, becoming eligible for lucrative pre-roll video ads that make YouTube and the uploading channel lots of money.
PROBLEM: Searches and Related Content
Many innocent search terms used on YouTube bring up hardcore and violent explicit videos because uploaders attach misleading titles and descriptions to their content. Certainly, Google has the ability to develop a system for analyzing the images themselves, not just the text descriptions, of uploaded content.
Another problem is that once a user views an explicit video, other videos with sexually explicit thumbnails will fill the list of suggested content on the right side of the screen, or will be listed as suggestions in the video player once the chosen video finishes. This is especially dangerous for younger audiences using YouTube.
PROBLEM: Enforce the Standards, Regardless of the Partner
Community Guidelines must apply to the entire community on YouTube, no matter how rich or famous someone might be.
Get Educated & Be Involved
Have regular discussions with your children about digital safety and your rules surrounding social media. For example: Make sure they are communicating only with people they know and that they realize the pictures they send don’t just vanish forever. Remind them, “Once on the Internet, always on the Internet!” Visit here, here, and here for more resources on teaching digital safety.
Consider using the social media tools that your children use: you will not only learn how they are used, but also show your children you care about their world, and gain another way to connect and communicate with them.
NCOSE has successfully urged Google to curb exploitation on other fronts, and congratulates the company for many positive changes regarding Google Play and AdWords. The launch of the YouTube Kids app is a step in the right direction, though it does not go far enough. THANK THEM for these many positive policy changes here.
Share your STORY
Personal stories help elected and business leaders to see the grave harm associated with this material and can be very helpful in getting them to change their policies. All stories will be shared anonymously. Please email your story to firstname.lastname@example.org.
Stay updated on these projects
YouTube has a group of “Trusted Flaggers”: volunteer moderators who help identify and flag content that violates YouTube’s Community Guidelines. The group includes individual volunteers, some charities, and law enforcement agencies. YouTube says “reports of violations by Trusted Flaggers are accurate more than 90% of the time.” In November […]
At the end of November 2017, Buzzfeed News confirmed reports from dozens of users that a YouTube search starting with “how to have” auto-filled with pedophiliac phrases, including “how to have s*x with your kids” and “how to have s*x kids”. Even on incognito browsers that don’t take into account past search […]
YouTube Kids was created as a safe space for kids and families to enjoy appropriate, educational, and entertaining videos, including ones from popular shows by Disney and Nickelodeon. However, it has become apparent that even a platform designed for kids is not a guaranteed refuge from inappropriate videos. The first question is how […]
While we’ve had a number of victories, the targets on the 2017 Dirty Dozen List have avoided changing their policies. They are still facilitating sexual exploitation. Examples of minimal progress: Comcast promised to implement changes, but still insists on selling extremely violent pornography. EBSCO Information Services has made some efforts to clean up their K-12 school […]
Internet pirates and pornographers are taking advantage of a YouTube loophole that shields them from content review. For years YouTube has insisted that its policies prohibit sexually explicit material like pornography. Of course, this claim has always fallen flat against the virtually countless pornographic videos that are easily accessible on the video-sharing platform. There are several reasons […]
YouTube, for the second consecutive year, has been included on the National Center on Sexual Exploitation’s annual Dirty Dozen List. The popular video-sharing site returns to the annual list for the original reason: failing to eliminate sexually explicit and pornographic content—both softcore and hardcore—in videos posted on its site. While YouTube’s Community Guidelines clearly delineate […]
National Center on Sexual Exploitation Encourages Policies for Human Dignity Washington, DC – Last Friday, Senior Vice President of Google Search, Amit Singhal, wrote on the Google Public Policy Blog that the technology company would create a complaint form for victims of revenge porn so that their images could no longer be found through Google’s […]
Ten seconds is all it takes for a child to open YouTube and come across a video filled with vulgar language, disturbing concepts, and/or inappropriate images. Although YouTube has a rich supply of kid-friendly, uplifting videos, these videos are often entangled with videos inappropriate for children. But a partial solution to this problem has come. […]