YouTube staffs up to tackle inappropriate content

YouTube has announced plans to grow its ‘trust and safety teams’ to 10,000 people in 2018 to clamp down on content that violates its policies.

In an open letter, YouTube CEO Susan Wojcicki said that over the past year YouTube has taken action to protect its community from violent and extremist content, and that it is now applying the same lessons to tackle “other problematic content”.

“Human reviewers remain essential to both removing content and training machine learning systems because human judgment is critical to making contextualized decisions on content,” said Wojcicki.

“Since June, our trust and safety teams have manually reviewed nearly two million videos for violent extremist content, helping train our machine-learning technology to identify similar videos in the future.”

Wojcicki said that YouTube is taking “aggressive action on comments”, launching new comment moderation tools and, in some cases, shutting down comments altogether.

Since June the site has removed more than 150,000 videos for violent extremism, and today 98% of the videos removed for violent extremism are first flagged by YouTube’s machine-learning algorithms, she added.

“We’re also taking actions to protect advertisers and creators from inappropriate content,” said Wojcicki. “We want advertisers to have peace of mind that their ads are running alongside content that reflects their brand’s values. Equally, we want to give creators confidence that their revenue won’t be hurt by the actions of bad actors.”

“We believe this requires a new approach to advertising on YouTube, carefully considering which channels and videos are eligible for advertising. We are planning to apply stricter criteria, conduct more manual curation, while also significantly ramping up our team of ad reviewers to ensure ads are only running where they should.”

Read the full letter here.
