YouTube tightens moderation policy to ban Covid-19 vaccine misinformation

Google-owned video platform YouTube has tightened its content moderation policies to remove videos containing misinformation about Covid-19 vaccines.

YouTube first launched its Covid-19 Medical Misinformation Policy earlier this year, when fake news stories about the virus were running rampant on the site, but the company has said it will now actively remove videos that contradict guidance from health experts and the World Health Organization.

The policy initially targeted videos suggesting the virus does not exist or is not contagious, as well as those discouraging viewers from following medical care guidelines in favour of alternative therapies such as drinking bleach.

A spokesperson for YouTube explained the expansion of its policies: “A Covid-19 vaccine may be imminent, therefore we’re ensuring we have the right policies in place to be able to remove misinformation related to a Covid-19 vaccine from the platform. Any content that includes claims about Covid-19 vaccinations that contradict expert consensus from local health authorities or the World Health Organization (WHO) will be removed from YouTube.”

YouTube said that it has removed over 200,000 videos containing misleading or dangerous information about the deadly virus since February.

The policy change comes days after Facebook and Twitter implemented similar measures to prevent the spread of disinformation. An international survey of journalists published this week found that Facebook has hosted more Covid-19 misinformation than any other platform since the virus surged at the beginning of the year, while a separate Cornell University study identified US president Donald Trump as the largest individual driver of the “infodemic.”

As for YouTube, WHO digital solutions manager Andy Pattison said the health body meets with the Google-owned company on a weekly basis to discuss ways to curb potentially damaging content. Pattison said the WHO has been pleased by YouTube’s announcement.