YouTube has expanded its pandemic misinformation policies to officially ban videos that spread conspiracy theories and other lies about COVID-19 vaccines.
Content will be removed if it contains health information that contradicts expert consensus from the World Health Organization and other regional medical authorities, the platform tells Tubefilter. It’s specifically cracking down on conspiracy theories, including claims that the shot will kill or sterilize recipients, or implant them with government microchips.
Videos that engage in general discussion about “broad concerns” over the vaccine will still be allowed, YouTube told Reuters. Only videos that deliberately spread harmful misinformation will be taken down.
YouTube also tells Tubefilter that it will suppress COVID-related borderline content. (Suppression usually means videos will be excluded from search results and hidden from recommendation algorithms.) It adds that in the coming weeks, it will reveal more about what it’s doing to combat vaccine misinformation and boost knowledge from authoritative sources.
The policies being expanded today were introduced at the beginning of this year; YouTube says that since February, it has removed more than 200,000 videos containing COVID misinformation.
YouTube isn’t alone in rolling out new rules ahead of a potential vaccine’s arrival. Yesterday, Facebook revealed that it has stopped accepting ads that discourage people from receiving not just a COVID vaccine, but any “safe and effective” vaccination.
“We don’t want these ads on our platform,” Facebook said.