YouTube is willing to bend its ostensibly strict content rules for creators who bring in the big bucks, according to a number of current and former content moderators whose job is to screen videos for potential violations of the platform’s Community Guidelines.
The moderators, who work or worked for third-party outsourcing companies contracted by YouTube, anonymously told The Washington Post that YouTube fosters “a demoralizing work environment marked by ad hoc decisions, constantly shifting policies, and a widespread perception of arbitrary standards when it came to offensive content.”
The way YouTube’s content moderation hierarchy is structured positions the kind of folks who spoke to the Post as the first line of defense. Their job is to review videos and, based on YouTube’s guidelines, recommend whether ads should be stripped from a video or whether the video should be pulled down entirely. But they don’t make the final call — that’s up to other moderators above them.
The first-line moderators told the Post that higher-ups often dismiss their recommendations when it comes to popular creators whose videos are bringing in millions of views (and thus millions of views’ worth of advertising revenue) for YouTube. They specifically pointed to Logan Paul and Steven Crowder as creators who’ve received especially lax application of the rules because they have such large viewer bases.
For those who don’t know, in January 2018, Paul (19.7 million subscribers; 75 million views per month) infamously vlogged the body of a suicide victim in Japan’s Aokigahara forest. As a result, he was dropped from Google Preferred — a program that aggregates YouTube’s top creators into higher-priced packages for advertisers — and his original project with YouTube, a sequel film to his first project The Thinning, was cancelled. He also received a two-week monetization suspension.
But, a year and a half later, The Thinning’s sequel is back on, and Paul has retained his channel and monetization (save the two-week suspension) despite further incidents that ostensibly cross YouTube’s content guidelines, like tasing a dead rat on camera.
Moderators said that, per their understanding of YouTube’s guidelines, they thought Paul’s channel would certainly be terminated after the Aokigahara vlog. One said YouTube’s decision to instead just suspend ads for two weeks was “like a slap in the face” because “Logan Paul broke one of their biggest policies and it became like it never happened.”
Alex Joseph, a YouTube spokesperson, told the Post that YouTube considered the two-week suspension to be sufficient, and clarified that Paul has never received three Community Guidelines strikes against his channel in one 90-day period, which is the requirement for channel termination.
That requirement is exactly why conservative commentator Steven Crowder’s channel should have been terminated, moderators added. In June, Vox journalist Carlos Maza went public with complaints that YouTube refused to do anything about Crowder’s (4 million subscribers; 8 million monthly views, down from 40 million prior to the incident) ongoing racist and homophobic harassment targeting him. Following a public outcry, YouTube issued a flurry of contradictory decisions, ultimately settling on demonetizing Crowder’s channel and vaguely decreeing that he would need to address “all of the issues” with it before he could have ads back.
But a team of Austin-based moderators told the Post they warned YouTube about Crowder’s content weeks before Maza spoke out. They said their investigations of Crowder’s videos showed many of them violated YouTube guidelines. However, when they took their concerns to a higher-up, that person came back to them a week later and said YouTube had decided not to demonetize his channel. It wasn’t until Maza’s complaints were amplified by a mass public outcry and subsequently picked up by media outlets that YouTube actually did something about Crowder’s behavior.
“YouTube’s stance is that nothing is really an issue until there is a headline about it,” one team member said.
While Crowder and Paul make wildly different content, they have one thing in common: they bring YouTube millions upon millions of views. The moderators’ consensus is that despite obvious content guideline violations, YouTube is unwilling to bar ads from or terminate their channels, and channels like theirs, because they’re financial assets that add to its billions of dollars in revenue.
When Tubefilter reached out to YouTube for comment about the moderators’ allegations, Joseph, the same spokesperson who replied to the Post, issued this statement:
“Over the last several years we’ve made significant investments in the people, tools, and technology needed to live up to our responsibility. These investments, like deploying machine learning to detect bad content at scale or expanding the teams working to combat violative content to over 10,000 people across Google, are accompanied by a systematic review of our policies to make sure we’re drawing the line in the right place. We apply these policies consistently, regardless of who a creator is.”