YouTube Explains How it Enforces Community Guidelines

The Google subsidiary releases a report outlining its strategy for flagging and removing unwanted content from its video hosting platform.

YouTube today released a blog post outlining its enforcement of community guidelines against pornography, harassment, and incitement to violence on its video hosting platform. The release of this document follows the promise made by YouTube executives last December to offer greater transparency in its removal process and to be more vigilant about the content it allows on its platform. The company came under intense pressure following the 2016 presidential election, when YouTube was flooded with content allegedly intended to influence voters and wage a campaign of misinformation, what has come to be called “fake news.” Then there was what many content creators referred to as the “Adpocalypse,” when they saw their ad revenues drop under a vaguely defined new policy of automated demonetization meant to discourage content deemed undesirable, a system that lacked the discerning touch of human oversight.

YouTube also released a short video explaining the “life of a flag.”

The company said in an official post, “We are taking an important first step by releasing a quarterly report on how we’re enforcing our Community Guidelines. This regular update will help show the progress we’re making in removing violative content from our platform. By the end of the year, we plan to refine our reporting systems and add additional data, including data on comments, speed of removal, and policy removal reasons.”

YouTube is also introducing a Reporting History dashboard that will allow each YouTube user to monitor the status of videos they’ve flagged for review against the Community Guidelines.

The graphic shows the volume of videos removed by YouTube, broken down by the source of first detection: automated flagging or human review.


Highlights from the post, reflecting data from October through December 2017, show that YouTube removed over 8 million videos during these months. The majority were spam or attempts to upload adult content, and they represent a fraction of a percent of YouTube’s total views during the period. Of those, 6.7 million videos were first flagged for review by machines rather than humans, and 76 percent of that machine-flagged group were removed before they received a single view, showing a high rate of efficiency.

“We introduced machine learning flagging in June 2017,” the company said. “Now more than half of the videos we remove for violent extremism have fewer than ten views.”

Google used to say ‘Don’t be evil’ but ditched it for ‘do the right thing.’ Let’s watch and see if it lives by its new adage.

