Anti-Vaccination Videos May Be YouTube’s Latest Front In Battle Against Conspiracies

02/21/2019

In addition to reportedly serving as a breeding ground for ‘flat earth’ conspiracy theorists, YouTube is now coming under fire for propagating yet another strain of misinformation: anti-vaccination videos. The issue has come to light amid a measles outbreak in Washington state.

In its own test, BuzzFeed found that after searching for ‘immunization’ on a fresh account and playing the first suggested video — a pro-vaccination clip — the top video in the ‘Up Next’ section was an anti-vaccination video. After playing that clip, BuzzFeed reports, YouTube’s recommendation algorithm suggested others like it. In a report earlier this month, The Guardian similarly found that YouTube’s algorithm was “steering viewers from fact-based medical information toward anti-vaccine misinformation.”

While the anti-vaccination movement has gained steam in recent years, fueled by unsubstantiated speculation that vaccinations cause autism, BuzzFeed notes that the scientific consensus is that vaccines are safe. The Guardian notes that the World Health Organization (WHO) lists vaccine hesitancy among the 10 greatest threats to global health in 2019.

The issue has recently garnered the attention of prominent lawmakers. California Rep. Adam Schiff sent letters to both Google and Facebook last week, per BuzzFeed, stating that YouTube’s recommendations represented “a direct threat to public health, and reversing progress made in tackling vaccine-preventable disease.”

Accordingly, Facebook says it’s tweaking its algorithm to prevent health-related misinformation from being recommended, per BuzzFeed. And Pinterest, for its part, has taken the most drastic measure of all, banning vaccination-related searches across its platform full-stop, given that most images shared on the topic recommended against vaccination, according to The Wall Street Journal.

Earlier this month, YouTube revamped its algorithm to stem the spread of conspiracy theories — and the company told The Guardian that some anti-vaccine videos would be affected. It is unclear, however, which videos those will be.

“Over the last year, we’ve worked to better surface credible news sources across our site for people searching for news-related topics, begun reducing recommendations of borderline content and videos that could misinform users in harmful ways, and introduced information panels to help give users more sources where they can fact check information for themselves,” a YouTube spokesperson told BuzzFeed in a statement. “Like many algorithmic changes, these efforts will be gradual and will get more and more accurate over time.”