Following the recent YouTube controversy, Google has announced that as of September it will monitor YouTube more closely than ever before in order to demonetise channels deemed to contain or promote hate speech and terrorism.

This comes after significant losses caused by advertisers withdrawing from the platform, after it came to light that their ads for products and services were being displayed alongside videos widely considered offensive.

Demonetisation, Not Censorship

Google has no plans to remove the videos themselves; instead, it is placing restrictions on viewing, sharing and monetising them. Any content deemed not to meet the relevant guidelines will be prohibited from carrying ads.

These plans were first mentioned in June of this year, but with such issues currently being hotly debated in the US, Google has sped up implementation in order to head off criticism.

According to Google's general counsel, Kent Walker, “These videos will have less engagement and be harder to find. This strikes the right balance between free expression and access to information without promoting extremely offensive viewpoints.”

Minority Of Videos Affected

Google estimates that these changes will affect only a small fraction of the 400+ hours of video uploaded to YouTube every minute of every day.

In addition to being unable to run ads, these videos won’t be able to receive comments, and they won’t appear in any lists of recommended videos on the site. It will be impossible to embed them in external sites, and they will display a warning label as well.

Changing Policies

This is at least the third time this year that Google has changed its video policies for YouTube. Earlier changes included implementing new software to monitor videos and restricting the advertising shown with them, after big advertisers pulled out over ads running alongside extremist content.