Misinformation about the 2020 election, and every presidential election before it, is now allowed on YouTube.
“In the current environment, we find that while removing this content does curb some misinformation, it could also have the unintended effect of curtailing political speech without meaningfully reducing the risk of violence or other real-world harm,” the platform said in a blog post. “With that in mind, and with 2024 campaigns well underway, we will stop removing content that advances false claims that widespread fraud, errors, or glitches occurred in the 2020 and other past US Presidential elections.”
For months leading up to the 2020 election, YouTube (like many other platforms, post-Cambridge Analytica) touted its election integrity policy, promising to “better support” democracy by minimizing the spread of misinformation. But on election day and in the weeks after, it was slow to contain conspiracy videos claiming Joe Biden had stolen the presidency.
When YouTube finally did start removing election misinformation in December 2020, the effect was so significant that the amount of misinformation spreading on other platforms, like Twitter and Facebook, also dropped, according to a study from the Center for Social Media and Politics at New York University.
YouTube says that under the policy, it removed tens of thousands of videos, but it's now "reevaluat[ing] the effects of this policy in today's changed landscape."
“The ability to openly debate political ideas, even those that are controversial or based on disproven assumptions, is core to a functioning democratic society–especially in the midst of election season,” it said.
It's not clear what changed between 2020 and now to make the "landscape" more hospitable to misinformation. We asked YouTube if it could share any further data or insights that contributed to this decision. It declined.
We also asked if this policy change means YouTube will not remove misinformation about the upcoming 2024 presidential election. It declined to comment. A spokesperson said it will “have more details to share about our approach towards the 2024 election in the months to come.”
YouTube says its other election misinformation policies remain in place, meaning it still does not allow “content aiming to mislead voters about the time, place, means, or eligibility requirements for voting; false claims that could materially discourage voting, including those disputing the validity of voting by mail; and content that encourages others to interfere with democratic processes.”
It additionally points out that it is "ensuring" that when users look for information about elections, "they see content from authoritative sources prominently in search and recommendations."
Following the 2020 election, “videos from authoritative sources like news outlets represented the most viewed and most recommended election videos on YouTube,” it says. “And our 2020 election information panels, with relevant context from voting locations, to live election results, were collectively shown over 4.5 billion times.”