YouTube’s battle for advertiser approval

YouTube’s battle to create a safe space for advertisers took another turn this week when it announced on Tuesday that it is introducing stricter criteria for which channels can run ads.

The move follows a turbulent year of negative headlines and questionable content that has laid bare the advertising industry's concerns about the content its ads are placed against on YouTube. It also came hot on the heels of the latest YouTube scandal: Logan Paul's visit to Japan and his subsequent vlog, titled 'We found a dead body in the Japanese Suicide Forest'.

YouTube said it was changing the rules so that channels will now need 1,000 subscribers and 4,000 hours of watch time within the past 12 months to be eligible for ads. The move will, by YouTube's own admission, impact a “significant number of channels”.

YouTube also said that its flagship Google Preferred content – top channels that YouTube aggregates into easy-to-buy packages for brand advertisers – will from now on be manually reviewed. Ads will run only on videos that have been verified to meet YouTube’s ‘ad-friendly guidelines’.

The move comes almost a year after the start of YouTube’s so-called ‘Adpocalypse’, which began after the Wall Street Journal launched an investigation into popular YouTuber PewDiePie, highlighting a string of posts that included antisemitic jokes or Nazi imagery.

PewDiePie, real name Felix Kjellberg, claimed his words and humour had been taken out of context, but Disney’s Maker Studios promptly cut ties, releasing a statement that read: “Although Felix has created a following by being provocative and irreverent, he clearly went too far.” YouTube also cancelled PewDiePie’s YouTube Red series on the back of the revelations and failed to include the star in its YouTube Rewind review video of 2017 – a video he had featured in for each of the previous four years.

The PewDiePie scandal seemed to spark wider media scrutiny over what content was being monetised on YouTube. Soon newspapers including London’s The Times started to run damning reports on ads appearing next to extremist content, a tidal wave of negative press that caused top brands like Verizon and AT&T to pull advertising from the site.

While the action threatened to have serious financial implications for YouTube, in reality the negative publicity had little immediate impact on Google parent company Alphabet’s bottom line. Announcing its Q1 earnings in April, Alphabet reported large growth in both revenues and profits, driven by gains in its core advertising business.

However, the press coverage and advertiser push-back had set the wheels of change in motion. By June YouTube had introduced new guidelines to take a tougher stance on hateful, demeaning and inappropriate content on the site. It said at the time that it had held “thousands of productive conversations” with advertisers, and had implemented additional controls to restore advertiser confidence.

Come November, another damaging exposé by The Times accused Google of making “millions of pounds in advertising revenue from videos that exploit young children and appeal to paedophiles.” The fallout from this story and much follow-up coverage caused advertisers like Mars, Deutsche Bank and Hewlett-Packard to state they would back away from the site.

YouTube CEO Susan Wojcicki published an open letter in December announcing plans to grow YouTube’s ‘trust and safety teams’ to 10,000 people in 2018 to clamp down on content that violates its policies. Having taken action earlier in the year against extremist content, YouTube was now going after “other problematic content”.

“Human reviewers remain essential to both removing content and training machine learning systems because human judgment is critical to making contextualised decisions on content,” said Wojcicki. She claimed that since June the site had removed more than 150,000 videos for violent extremism and that as of December 98% of violent videos that were removed were first flagged by YouTube’s machine-learning algorithms.

As the year drew to an end, YouTube’s troubles did not. Logan Paul’s decision to upload a video that appeared to contain footage of a suicide victim’s dead body caused an outcry – all the more so because the video, before Paul took it down, was monetised, as was his subsequent apology video.

Oversight is a tricky issue for YouTube. It is an open platform and must rely to a large extent on AI to police the massive 480 hours of content uploaded to the site every minute. However, manually reviewing Google Preferred content should go some way to easing advertiser concerns that have been raised repeatedly in the past year. It should also put a stop to embarrassments like the Logan Paul incident.

However, the real consequence of this week’s policy changes will fall on small, hobbyist YouTube creators: users who don’t have a large profile and don’t make a career out of the site, but who dutifully upload videos for which they have, until now, been rewarded with a trickle of ad dollars. With diminished hope of financial reward, it is these YouTubers who will feel the brunt of YouTube’s policy change as it continues its quest to provide a safe space for brands.
