YouTube Bans Content Endorsing QAnon, Pizzagate

One day after expanding its pandemic misinformation policies to include conspiracies and other lies about COVID vaccinations, YouTube is taking aim at QAnon.

“Today, we are taking another step in our efforts to curb hate and harassment by removing more conspiracy theory content used to justify real-world violence,” the platform wrote in a blog post.

YouTube has been combating conspiracy content for years; it already had policies in place barring videos that endorse particularly harmful theories, like Holocaust denial and claims that the Sandy Hook massacre was faked by the government. Now, it’s specifically banning “content that targets an individual or group with conspiracy theories that have been used to justify real-world violence.” It names both Pizzagate and QAnon as examples.

For the unfamiliar, Pizzagate spawned during the 2016 election and centers around the inaccurate belief that Hillary Clinton and fellow top Democrats are running a child sex ring out of—among other locations—the basement of Washington, D.C., pizzeria Comet Ping Pong. In December 2016, a Pizzagate proponent opened fire at the restaurant (which does not have a basement) while “self-investigating” its supposed involvement.

Pizzagate went on to become one of the key tenets of QAnon, a rapidly growing movement that believes Donald Trump is an ultimate warrior secretly battling a cabal of Satan-worshipping, adrenochrome-drinking, child-trafficking politicians and celebrities. It is built on dispensations of (generally wildly false) information from “Q,” an anonymous message board poster claiming to work within the federal government.

Like Pizzagate did to Comet Ping Pong, the broader QAnon movement frequently locks onto individuals and baselessly alleges they are involved in criminal activities. YouTube’s updated policies target Pizzagate and QAnon by removing “content that threatens or harasses someone by suggesting they are complicit in one of these harmful conspiracies.”

This is not the first action YouTube has taken to contain QAnon content. Tweaks it made to its recommendation algorithm in 2018 have dropped QAnon channels’ “views that come from non-subscribed recommendations” by more than 80%, YouTube said. And, under its general harmful misinformation policies, it has already removed “tens of thousands” of QAnon videos and terminated “hundreds” of channels posting them, it added.

Per usual, the policies permit videos that discuss Pizzagate, QAnon, and similar conspiracies, as well as news content about them. Only endorsement will result in a takedown.

YouTube says it will begin enforcing the policy today “and will ramp up in the weeks to come.”

Facebook also recently enacted policies against QAnon content.

Published by
James Hale
