YouTube is taking decisive action to combat an avalanche of conspiracy theory videos that have spread across the site in the wake of the Parkland shooting, after many far-right creators mischaracterized survivors as “crisis actors” seeking to push stricter gun laws.
YouTube CEO Susan Wojcicki announced at South By Southwest last night that the platform will add Wikipedia links — as well as information from other “fact-based websites” — to conspiracy videos so that viewers will be able to consume them in their rightful context. The links will appear in new text boxes called “information cues” in coming weeks, The Verge reports, and for now will be used for conspiracies with notable social relevance, like those surrounding chemtrails and the moon landing.
“We’re using a list of well-known internet conspiracies from Wikipedia,” Wojcicki said of the theories that YouTube plans to address, noting that more will be added over time. “What I like about this unit is that it’s actually pretty extensible.”
While conspiracy videos about topics like chemtrails and the moon landing will remain on YouTube, albeit contextualized with the new cues, Wojcicki noted that conspiracy videos that violate YouTube’s community guidelines — like a video about a Parkland shooting survivor by InfoWars founder Alex Jones — would be removed. That video, which falsely characterized survivor David Hogg as a crisis actor, was taken down by YouTube for harassment and bullying. As a result, Jones’ channel also received a strike.
A study by researcher Jonathan Albright found last month that YouTube’s monetization model and algorithm have provided a unique haven for conspiratorial content — though the site has yet to announce changes to either of those components. It’s also worth noting that Wikipedia’s crowdsourced nature means it is mostly accurate, but occasionally prone to falsehoods.