Updated Sept. 19, 3:20 p.m., to add information from YouTube.
Back in January, YouTube introduced a new term to its lexicon: “borderline content,” aka content that doesn’t blatantly violate its Community Guidelines, but which the platform says can still “misinform users in harmful ways.” The new category is laser-focused on minimizing things like conspiracy videos, which YouTube had previously combated by creating a feature that adds text blocks of actual facts to conspiracy-related content.
Now, a YouTube executive in Europe has said that some conspiracy content is allowed to remain on YouTube because the platform has deemed it not dangerous. (To be clear, this isn’t a policy update or a change-up in how YouTube does things — it’s just a new glimpse into how the platform thinks about conspiracy videos.)
“If you’re searching for ‘9/11 conspiracy’ or the names around those particular conspiracies, those do come up,” Ben McOwen Wilson, YouTube U.K.’s managing director, said during yesterday’s Royal Television Society conference, The Telegraph reports. “Our position is that if those videos are available, they’re not causing harm.”
He added that folks who search general terms like “9/11” aren’t going to find conspiracy content; it’s only those who specifically go looking for it that will be served search results containing conspiracy videos.
Wilson also clarified that YouTube removes all videos containing conspiracy theories about the Sandy Hook Elementary School massacre. The 2012 shooting, which killed 20 children and six educators, has been targeted by rabid conspiracists who believe it wasn’t a real shooting, but instead a “false flag” attack organized by the government. Conspiracists allege that the people killed in the massacre never existed in the first place, and that their distraught family members are actually so-called “crisis actors” carrying on a lengthy ruse with the end goal of encouraging the public to support increased gun control. (A notable driver of the theory is Alex Jones, who’s currently being sued for defamation by multiple Sandy Hook families, and was just ordered to pay all court costs for one of them.)
Wilson called Sandy Hook conspiracy videos an example that’s “different” from 9/11 conspiracy videos, and said YouTube has classed them wholesale as “harmful and offensive to some of the families involved.”
Tubefilter reached out to YouTube for more information about how the platform distinguishes conspiracy content that does violate its policies from conspiracy content that doesn’t. According to YouTube, the difference lies in a policy update it made back in June. That update specified that survivors of tragedies like 9/11, Sandy Hook and the Holocaust, as well as their next of kin, are protected groups. Thus, videos targeting those people by alleging the tragedies didn’t happen, or that survivors and/or their families are crisis actors, are in violation of YouTube’s Community Guidelines.
Unlike with Sandy Hook, some categories of 9/11 conspiracies don’t infringe on survivors’ and next of kin’s rights as a protected group, per YouTube, and thus are deemed borderline instead of policy-violating. In those cases, as Wilson mentioned, YouTube doesn’t remove the videos, but instead limits their ability to be discovered by and recommended to viewers.
The platform says that since it introduced the borderline content category in January, the watch time borderline content receives from recommendations has dropped by 50% in the U.S.