YouTube, Facebook, Twitter Try To Crack Down On Coronavirus Conspiracies

As the Wuhan coronavirus spreads, so too does misinformation about it across YouTube, Facebook, and Twitter.

There are nearly 3,000 confirmed cases of the SARS-like respiratory virus in China, five in the U.S., and two in Canada. So far, 82 people have died. Fifteen cities in China have been placed under full or partial lockdown, Hong Kong’s chief executive Carrie Lam has declared the outbreak an emergency, and the U.S. State Department today issued an official advisory telling citizens to reconsider traveling to China.

Scientists believe the virus may have originated in snakes, but some denizens of the internet have instead latched on to conspiracy theories to explain its origins. According to a new report from the Washington Post, people on Facebook have spread claims that the U.S. government created the virus or bought a patent for it, and have suggested the virus is a form of population control (again, created/deployed by the government). Twitter users are sharing racist claims that Chinese dietary habits sparked the virus. YouTube videos making similar claims are popping up, with one reportedly reaching at least 430K views.

On top of all this, and perhaps more dangerously, users are beginning to hawk supposed cures or preventative measures, talking up things like oregano oil or colloidal silver (neither of which is effective against the coronavirus). One YouTube video, which WaPo reports has more than 20K views, falsely claims the virus has already killed 180,000 people, and offers fake cures.

YouTube tells Tubefilter it combats the spread of false information by surfacing authoritative content like trustworthy news sources in its search results and Up Next panels. For breaking news topics like coronavirus, it adds short previews of text-based news articles interspersed between videos, and posts a reminder that breaking and developing news can rapidly change, the platform says.

When we searched for “coronavirus” on YouTube, we were served dozens of videos from outlets like Global News, CBC, Bloomberg, TIME, NBC, and ABC. We did not see text articles or a reminder.

YouTube also tells us that videos containing false information generally don’t violate its Community Guidelines unless they cross the line into hate speech, harassment, inciting violence, or propagating scams.

All three platforms tweak content so users are less likely to see misinformation

As for how Facebook and Twitter are handling the situation: Facebook has mobilized its third-party fact-checkers to mark as false posts that claim the coronavirus is fake and/or a government invention, as well as posts hawking fake cures. It also lowers those posts' rankings, so they're less likely to show up in users' main feeds even if shared by friends. Additionally, organizations that work with Facebook are issuing statements of fact about the coronavirus that directly contradict conspiracy theories being spread on the site, WaPo reports.

“This situation is fast-evolving and we will continue our outreach to global and regional health organizations to provide support and assistance,” Andy Stone, a Facebook spokesman, told the outlet.

Twitter, like YouTube, is pushing legitimate information in its search results. Spokesperson Katie Rosborough said it's expanding a feature to the Asia-Pacific region so "when an individual searches a hashtag they're immediately met with authoritative health info from the right sources up top." (It's not clear where else that feature is available, or whether Twitter is using it to combat coronavirus misinformation in regions outside APAC.)

Tweets that spread misinformation are apparently being removed as violations of Twitter’s policy against coordinated efforts to mislead users.

All three platforms have faced criticism for how they keep misinformation and unwanted content like graphic videos from spreading across their sites, but YouTube is perhaps the most embattled. Most recently, a study alleged that some climate change-related videos recommended in its Up Next panel contain false information, such as outright denial that climate change is happening or claims that humans play no part in global warming. YouTube has called the study's methodology into question, and issued a statement reiterating that it surfaces "authoritative voices" on topics often cluttered with misinformation.

Published by
James Hale
