YouTube users in the U.S. will now see panels of fact-checked information pop up when they search for conspiracies or debunked claims, including those about COVID-19.
“Over the past several years, we’ve seen more and more people coming to YouTube for news and information,” the company wrote in a blog post. “More recently, the outbreak of COVID-19 and its spread around the world has reaffirmed how important it is for viewers to get accurate information during fast-moving events.”
The panels build on the “information cues” that YouTube launched in 2018. Those cues are deployed in pop-up boxes that appear over individual YouTube videos propagating conspiracies, such as claims that the earth is flat or that certain attacks were false flags, and they contain bits of information from “fact-based websites” like Wikipedia and Encyclopedia Britannica.
Fact-check panels, on the other hand, appear at the top of search result pages rather than on individual videos. And instead of containing snippets from encyclopedia entries, they contain links to recent articles from a network of trustworthy fact-checkers and news publishers. Current partners (YouTube says there are around a dozen) include The Dispatch, FactCheck.org, PolitiFact, and The Washington Post Fact Checker.
Panels also include a brief description of the conspiracy being searched for, as well as the fact checker’s or publisher’s conclusion on whether the claim is true or false. If multiple partners offer information about one claim, multiple information panels will pop up, YouTube says.
The platform offers up an example: If a user searches for “COVID and ibuprofen,” a panel from FactCheck.org will appear. It summarizes the claim that the “vast majority of people who died had ibuprofen/Advil in their system” (which appears to have originated from a Twitter user) and rates that claim False. The panel then links to an article explaining that there’s “no evidence to back COVID-19 ibuprofen concerns.”
In its blog post about the feature, YouTube says fact-check panels won’t appear on every search. They’ll only appear when there’s an article about the claim available from one of YouTube’s partners, and when a user specifically searches for that claim. “For example, if someone searches for ‘did a tornado hit Los Angeles,’ they might see a relevant fact-check article, but if they search for a more general query like ‘tornado,’ they might not,” the platform says.
YouTube added that, as part of its efforts “to bolster fact-checking and verification efforts across the world,” it’s donating $1 million to the International Fact-Checking Network.
Fact-check panels have been available to users in Brazil and India since last year. For now, they’re only expanding to the U.S., but YouTube’s chief product officer, Neal Mohan, told The Verge there are plans to expand them to more countries in the future.
These panels are the latest in YouTube’s efforts to contain the spread of conspiracy theories and misinformation. The platform has been especially focused on COVID-19 since January, when it began cracking down on videos endorsing racist claims about the virus, spreading harmful healthcare advice, and hawking fake cures. It later added boxes of information about COVID-19 from the Centers for Disease Control, the World Health Organization, and regional healthcare groups to users’ home pages.
Earlier this month, Mohan said YouTube had removed thousands of coronavirus conspiracy and misinformation videos so far. He also noted the pandemic had sparked a 75% increase in traffic for YouTube’s news content.