YouTube’s Algorithm Recommends Disturbing Videos To Kids, But Viewers Can Help Clean It Up

11/09/2017

In the online video industry, one of the most talked-about articles of the moment is “Something is wrong on the internet,” a thorough examination of the insidious world of YouTube’s children’s videos. In the piece, author James Bridle explains how creators are employing automated processes to design kids’ videos that sync perfectly with YouTube’s recommendation algorithm.

These videos, which are often dark, weird, and inappropriate, are part of a booming industry of unusual children’s content on YouTube that has drawn questions for months, if not years. They contain the right keywords, run for the right amount of time, and relate to recent trending formats, so they zip to the top of recommendation boxes and play out in front of the eyeballs of curious toddlers. With a single click, viewers can go from watching traditional kids’ content to witnessing some of the stranger videos Bridle highlighted. Sometimes, it doesn’t even take any user action to reach these clips due to YouTube’s autoplay feature.

Bridle’s article, which arrived on the heels of a New York Times report on the salacious nature of some videos on the YouTube Kids app, has drawn responses across the internet. Some outlets have joined Bridle in likening the spread of inappropriate children’s videos to the proliferation of fake news on platforms like Facebook, and have urged YouTube to address issues within its platform. “For YouTube, that acknowledgment comes in addressing that the algorithm it has in place is, and has been for a while, deeply flawed,” reads a post from Polygon.

Other readers have decided that the presence of disturbing content within YouTube’s youngest community makes the world’s top video site unsuitable for children. Two well-known journalists, for example, have tweeted advice urging parents to keep their kids off YouTube entirely.

Poke around Twitter, and you’ll find many more suggestions along those lines. It’s true that children will not be exposed to dangerous content if their parents direct viewership to more trusted sources like PBS Kids. At the same time, it’s worth recognizing the significant role viewers play in “fixing” the YouTube algorithm. The video site bears responsibility for the errors in judgment its engine makes, but a little nuance and forethought from parents can help clean up recommendations.


In his piece, Bridle admits that he’s not sure how to fix YouTube. “I have no idea how they can respond without shutting down the service itself, and most systems which resemble it,” he writes. “We have built a world which operates at scale, where human oversight is simply impossible, and no manner of inhuman oversight will counter most of the examples I’ve used in this essay.”

In response to his piece, many readers have argued that parents should not let children use YouTube unsupervised without expecting those kids to stumble upon something they shouldn’t be seeing. That is a smart impulse, and I would take it one step further: Rather than taking YouTube away from kids, parents should encourage supervised, curated consumption of it. The amount of YouTube content that is appropriate for children is seemingly infinite. There are hundreds of Peppa Pig episodes, plentiful unboxing clips, and endless streams of nursery rhymes. Many of these videos are organized into playlists on platforms like YouTube Kids.

YouTube has also taken steps to prevent twisted videos aimed at children from accruing revenue. In a June update, it announced its plan to cut off monetization for any clip that features “inappropriate use of family entertainment characters.” If you’re not sure what that means, seek out an example at your own risk.

If, as we’ve established, there is a mountain of good, wholesome kids’ content on YouTube, content the site itself says it wants to promote, why do the weird videos still rise to the top of the algorithm? Bridle is right that the children’s video industry is far from the only internet community struggling against an engine that makes poor recommendations, but the problem is far worse among videos targeted at kids than in most of YouTube’s other content verticals. Adult viewers can feel much safer about watching videos that appear in their suggestion boxes. Those clips might not be up their alley, but they probably won’t be snuff films, either.

If parents set ground rules for YouTube use, disable autoplay, and vet clips before presenting them to youngsters, they won’t just prevent their sons and daughters from viewing objectionable content: They might also help YouTube retrain its algorithm. If viewers show the so-called “black box” that they want to watch family-friendly videos rather than the disturbing kind described above, the box might take that preference into account.

That said, it’s hard to make any concrete declarations about the YouTube algorithm, which is driven by mysterious principles. It’s worth noting, however, that the broken system described in Bridle’s article is also the one driven by consumers who are too young to know how to click responsibly. They are feeding the box bad information, and the box is generating bad recommendations. Perhaps, if the actor making the clicks were a parent who knew more about what constitutes appropriate content, the bizarre horrors Bridle discusses would become less common.
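To make that feedback loop concrete, here is a deliberately crude sketch in Python. It is my own toy model, with invented video titles, and it bears no resemblance to YouTube’s real, undisclosed ranking system; it simply surfaces whatever accumulates the most views:

```python
# Toy model of a click-driven recommender (NOT YouTube's actual system):
# videos are ranked purely by accumulated views, so whatever gets watched
# most gets recommended most, mirroring the feedback loop described above.
from collections import defaultdict

watch_counts = defaultdict(int)  # video title -> number of recorded views

def record_view(video: str) -> None:
    """Every view, intentional or autoplayed, is a vote the box counts."""
    watch_counts[video] += 1

def recommend(top_n: int = 3) -> list:
    """Surface the most-watched videos, with no notion of quality."""
    return sorted(watch_counts, key=watch_counts.get, reverse=True)[:top_n]

# An unsupervised toddler autoplaying junk teaches the box to serve more junk...
for _ in range(50):
    record_view("keyword-stuffed knockoff")
record_view("curated nursery rhymes")
print(recommend())  # ['keyword-stuffed knockoff', 'curated nursery rhymes']

# ...while supervised, curated viewing pushes the signal the other way.
for _ in range(100):
    record_view("curated nursery rhymes")
print(recommend())  # ['curated nursery rhymes', 'keyword-stuffed knockoff']
```

In any system that behaves even remotely like this one, curated viewing isn’t just protective; it’s a training signal.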

This is pure speculation, but it could be a solution that helps clean up YouTube Kids while still allowing parents to keep their kids entertained for free.

There are, of course, flaws in this argument. For any individual parent, the extra work of policing their kids’ YouTube diet may not seem worthwhile, especially when plenty of trusted sources are just a few clicks away. Even if scrupulous attention from adults would improve the quality of children’s YouTube videos as a whole, it is not necessarily in the interest of any single individual to make that community safer. It’s the tragedy of the commons set against a digital media backdrop.

Bridle also suggests the possibility of bot views propping up the clips he discusses, though he offers no evidence to back up that point, so it should be taken with a grain of salt.


It’s easy to read this piece and conclude that I am going too easy on YouTube, which, at the end of the day, is in charge of its own algorithm. Bridle calls YouTube’s lack of attention to the faults of its engine “despicable,” and he’s not necessarily wrong. The algorithm currently seems to favor channels that upload a lot of long videos in rapid succession, and as observers like Matthew Patrick have noted, that makes it difficult for individual, driven creators to keep up with well-funded media operations — or, in this case, automated systems that churn out repetitive takes on the same kid-friendly keywords. Even if parents help YouTube clean up its mess, they can’t undo its current disposition toward certain mass-produced video formats.

Ultimately, though, the consternation over dark and weird children’s videos on YouTube is the latest example of the video site being blamed for a scale problem it cannot solve alone. Consider another category where YouTube has made headlines of late: music. Artists and labels have accused the platform of creating a value gap that results in paltry royalties. They decry YouTube’s “safe harbor” status, which prevents it from being legally culpable for pirated music uploaded by its users.

But safe harbor exists for a reason. In recent legal cases related to this issue, courts have repeatedly stressed that sites like YouTube and Vimeo can’t possibly police every video their users upload. Instead, they can only be asked to play their part: When presented with examples of pirated music, they must take action. Beyond YouTube, both labels and viewers are also responsible for eliminating any perceived value gap. Labels have to be willing to modernize their offerings, and viewers have to be willing to avoid pirated material, even if that means sitting through pre-roll ads before music videos.

YouTube would no doubt like to excise the strange, violent tumor that exists within its children’s community, but the video site cannot make that change on its own. It needs help from parents who are willing to take an active role in their kids’ media diets in order to preserve a platform that has the potential to be an indispensable entertainment tool.

Bridle’s main point is that the force he called “infrastructural violence” can be seen all over the web, and he laments that the conversation around that topic is still somewhat limited. “We’re still struggling to find a way to even talk about it, to describe its mechanisms and its actions and its effects,” he writes.

Well, here’s one way to talk about it. If we want to improve discussions around infrastructural violence, the first step should be to realize that systems like YouTube’s algorithm are derived from viewer input. The things we watch, share, and click on form the basis of what is recommended to us. For good people to eschew broken systems entirely is counterproductive. Instead, they should strive to make those systems better by telling them that they want sing-alongs, strong morals, and big smiles — not car crashes, violent acts, and sexual fetishes.
