YouTube lets creators lie about election outcomes. It’s making money off their videos.

In June 2023, YouTube made the baffling decision to reverse its ban on election misinformation and start allowing people to spread lies about the outcome of the 2020 presidential race. Since then, surprising no one, the platform has become a megaphone for far-right conspiracy theorists to crow about how Donald Trump was robbed of his rightful re-election, according to a new report from The New York Times.

YouTube’s reasoning for allowing people to claim the 2020 election was rigged in favor of Democrats was, “we find that while removing this content does curb some misinformation, it could also have the unintended effect of curtailing political speech without meaningfully reducing the risk of violence or other real-world harm.”

“The ability to openly debate political ideas, even those that are controversial or based on disproven assumptions, is core to a functioning democratic society–especially in the midst of election season,” it said.

Well, we’re now in the thick of another election season, and this policy has allowed some of the most-watched misinformation spreaders on YouTube to lay the groundwork for claims that the 2024 election will be rigged, too.

The Times looked at data from Media Matters for America, a progressive group that monitors conservative messaging. (The Times notes that despite this political lean, Media Matters uses objective data-collection methodology, which is why journalists and academics frequently cite it as an authoritative source. The Times also independently verified all of Media Matters’ research.)

From May through August 2024, Media Matters tracked 30 of the most popular YouTube channels “identified as persistently spreading election misinformation,” the Times reports. Channels included those belonging to well-known right-wing commentators like Tucker Carlson, Ben Shapiro, and Rudy Giuliani.

Collectively, these channels posted 286 individual videos containing election misinformation during that timeframe and racked up 47 million views–a third of which were monetized, with YouTube running ads alongside content that would’ve explicitly violated its policies prior to June 2023. That’s worth noting, considering YouTube just announced unprecedented quarterly ad revenue.

Giuliani was reportedly the worst offender, posting more misinformation than anyone else during Media Matters’ research period. He took that as a badge of honor: Over on X, he posted that he was “proud to be included with Ben and Tucker–two GREAT Patriots,” and added that “we are particularly honored by the designation as #1 among ‘major YouTube creators.'”

When Media Matters brought its research to YouTube, the platform reviewed some of the videos it flagged and said those did not violate Community Guidelines.

“The ability to openly debate political ideas, even those that are controversial, is an important value–especially in the midst of election season,” a YouTube spokesperson reiterated to the Times. They added that “most” of the channels Media Matters found are “ineligible for advertising,” and argued that Media Matters’ research actually “demonstrates our consistent approach to enforcing our policies.”

Media Matters doesn’t agree. Kayla Gogarty, the research director who led this investigation, said, “YouTube is allowing these right-wing accounts and channels to undermine the 2024 results.”

The Times also points out that YouTube placed information panels–one of the misinformation-combating features it promotes most–on 21 of the videos Media Matters found, but later removed the majority of them. It’s not clear why they were removed.

When YouTube originally changed this policy, we asked if it would also fail to remove misinformation about the 2024 election. It declined to comment. A spokesperson told us YouTube would “have more details to share about our approach towards the 2024 election in the months to come.”

But it’s been months, and while YouTube has done its usual touting–saying it’ll surface authoritative information about election results, label AI-generated deepfakes (but won’t automatically remove them), and remove any videos that spread misinformation about how/when/where to vote–it hasn’t given any updates about this particular policy.

Based on its response to Media Matters’ investigation, we think election outcome misinformation will continue to find a home on YouTube. And that could be damaging to the rest of the internet: research has shown that when YouTube cracks down on misinformation, the amount of it on other platforms like Facebook drops dramatically. With the 2024 election just days away, YouTube stands to become the epicenter of a new wave of theft claims.

Published by
James Hale
