One day after expanding its pandemic misinformation policies to include conspiracies and other lies about COVID vaccinations, YouTube is taking aim at QAnon.
“Today, we are taking another step in our efforts to curb hate and harassment by removing more conspiracy theory content used to justify real-world violence,” the platform wrote in a blog post.
YouTube has been combating conspiracy content for years; it already had policies in place barring videos that endorse particularly harmful theories, like Holocaust denial or the claim that the Sandy Hook massacre was staged by the government. Now, it’s specifically banning “content that targets an individual or group with conspiracy theories that have been used to justify real-world violence.” It names both Pizzagate and QAnon as examples.
For the unfamiliar, Pizzagate spawned during the 2016 election and centers on the inaccurate belief that Hillary Clinton and fellow top Democrats are running a child sex ring out of—among other locations—the basement of Washington, D.C., pizzeria Comet Ping Pong. In December 2016, a Pizzagate proponent opened fire at the restaurant (which does not have a basement) while “self-investigating” its supposed involvement.
Pizzagate went on to become one of the key tenets of QAnon, a rapidly growing movement that believes Donald Trump is an ultimate warrior secretly battling a cabal of Satan-worshipping, adrenochrome-drinking, child-trafficking politicians and celebrities. It is built on drops of (generally wildly false) information from “Q,” an anonymous message board poster claiming to work within the federal government.
Like Pizzagate did to Comet Ping Pong, the broader QAnon movement frequently locks onto individuals and baselessly alleges they are involved in criminal activities. YouTube’s updated policies target Pizzagate and QAnon by removing “content that threatens or harasses someone by suggesting they are complicit in one of these harmful conspiracies.”
This is not the first action YouTube has taken to contain QAnon content. Tweaks it made to its recommendation algorithm in 2018 have dropped QAnon channels’ “views that come from non-subscribed recommendations” by more than 80%, YouTube said. And, under its general harmful misinformation policies, it has already removed “tens of thousands” of QAnon videos and terminated “hundreds” of channels posting them, it added.
Per usual, the policies permit videos that discuss Pizzagate, QAnon, and similar conspiracies, as well as news content about them. Only endorsement will result in a takedown.
YouTube says it will begin enforcing the policy today “and will ramp up in the weeks to come.”