In January, YouTube introduced a new algorithm trained to hunt down “borderline content,” aka videos that don’t actually violate its Community Guidelines, but which can still “misinform users in harmful ways.” The algorithm doesn’t remove said videos from YouTube, but instead prevents YouTube’s recommendation algorithm from spreading them.
When it first debuted, the seek-and-suppress algorithm was instructed to comb through videos uploaded by users in the U.S., looking for three specific types of borderline content: videos “promoting a phony cure for a serious illness,” flat earth conspiracy videos, and videos that deny or “make blatantly false claims about” events like 9/11, the Holocaust, and the Sandy Hook massacre.
Now, YouTube has confirmed to TechCrunch that it’s expanding that algorithm to videos uploaded by users in the U.K.
A YouTube spokesperson said that over the past nine months in the U.S., videos the algorithm flagged as borderline content saw their views from recommendations drop by more than 50%. They added that the algorithm will take time to have a similar impact in the U.K., but didn’t say when it will roll out there or how many videos it will affect. (When it was introduced in the U.S., it was first rolled out on “a very small set of videos,” per YouTube, so it’s possible the U.K. will see an equally quiet debut.)
They also said the algorithm is still being tested, reiterating YouTube’s January remarks that training the new algorithm to maximum efficiency and accuracy is a lengthy process that involves using human content evaluators and subject matter experts from locations where the algorithm is rolled out.
Additionally, they told TechCrunch that YouTube acknowledges it still has work to do on its recommendation system, which has come under near-constant fire for recommending conspiracy videos, weird and gory faux-kids’ content, alt-right/white supremacist content, and videos fetishizing young children.
The spokesperson said that, on average, YouTube’s recommendation algorithm surfaces mainstream videos to viewers.
While this is YouTube’s first time moving the new borderline content-hunting algorithm over to the U.K., it’s not the first time YouTube has applied the “borderline content” label to a U.K. YouTuber. Back in April, it ruled that British far-right activist Tommy Robinson’s (pictured above) videos — which generally involve complaining about the supposed evils of Islam and political correctness — counted as borderline content. It subsequently stripped them from search results, kept other creators’ videos from appearing as suggestions alongside his content, and barred him from livestreaming.