Their Policies

Despite YouTube’s Community Guidelines, which specifically prohibit sexual nudity, pornography, and other sexually explicit content, such content is uploaded to the site in droves each day, and Google allows much of it to remain, amassing millions of views and handsome profits.

Community Guidelines:

“YouTube is not for pornography or sexually explicit content.”

“Most nudity is not allowed, particularly if it is in a sexual context… [or] if a video is intended to be sexually provocative…”

Terms of Service:

All users digitally agree to these terms when they create an account to upload content to the site.

“YouTube reserves the right to decide whether Content violates these Terms of Service for reasons other than copyright infringement, such as, but not limited to, pornography, obscenity, or excessive length.”


PROBLEM: Users Must Serve As Moderators

Google’s current system depends largely on reactive moderation, which relies on users to flag and report offensive or sexually explicit content. To file a report at all, users must first watch the explicit content and then note exact timestamps along with descriptions. YouTube prefers this method to providing more active moderation, policing content in real time at the company level, because views on explicit videos serve YouTube’s bottom line.

The main problem with the established procedure for content removal is that users must first be exposed to the harmful content, and then they have to view the offensive material at length as they endure an arduous process just to alert Google to the violation. Many explicit videos on the site have hundreds of thousands, even millions, of views because Google refuses to improve this process and instead facilitates further exploitation by making its viewers its Terms of Service enforcers.

Often, the audience viewing this material consists of children. Pornography has a profoundly negative impact on children’s development, and exposure puts them at greater risk of falling victim to exploitation themselves.

PROBLEM: Ads on Explicit Videos

Pornographic videos on YouTube often amass enough views to become eligible for lucrative pre-roll video ads that earn both YouTube and the uploading channel substantial revenue.

PROBLEM: Searches and Related Content

Many innocent search terms on YouTube bring up hardcore and violent explicit videos because uploaders attach misleading descriptions to their content. Google certainly has the ability to develop a system that analyzes the images of uploaded content, not just the text descriptions.

Another problem is that after a user views a video, other videos with sexually explicit thumbnails often fill the list of suggested content on the right side of the screen, or appear as suggestions in the video player once the desired film is finished. This is especially dangerous for younger audiences using YouTube.

PROBLEM: Standards Not Enforced for Partners

It appears that Google is willing to let the Terms of Service slide for celebrities who upload content in direct violation of them. Beyoncé, Justin Timberlake, and Robin Thicke are just a few examples of celebrities who have amassed millions of views (and ad revenue for Google) with music videos that include full frontal nudity.

Community Guidelines must apply to the entire community on YouTube, no matter how rich or famous some might be.
