YouTube Officially Separates Ad Policies For Video Game And Real-World Violence

12/02/2019

Following up on CEO Susan Wojcicki’s promise to gaming creators last month, YouTube has rolled out a policy change that officially separates real-world violence and video game violence.

“We know there’s a difference between real-world violence and scripted or simulated violence — such as what you see in movies, TV shows, or video games — so we want to make sure we’re enforcing our violent or graphic content policies consistently,” Team YouTube wrote in a post about the update.

The nitty-gritty details: Videos containing in-game violence can now be approved for all ages and all ads. There will also be “fewer restrictions for violence in gaming” in general, the platform said, without giving further detail. It did note that it could still age-restrict gaming content if violence and/or gore is the sole focus of the video. (So “Here’s my Grand Theft Auto gameplay” is fine, even if you run over a few people, but “Here’s me running over 50 people in GTA, look at their limbs go flying” might get age-restricted.)

This change comes after numerous creators expressed concerns about how YouTube treats video game violence in the context of its Self Certification system. The system, which will eventually be deployed to all creators, requires users to self-report any ad-inappropriate content in their videos during the upload process. In order for a video to be considered appropriate for all ads, its uploader must check a box certifying that it contains no sex, swearing, or violence.

Those may seem like simple requirements, but what about sniping someone in Fortnite? Or dragging a dastardly villain behind your horse in Red Dead Redemption 2? Or taking someone down with Wraith’s kunai in Apex Legends? It was vital for creators to know if their actions in those games and many others constituted violence, because Self Certification runs on an honor system — the more a creator’s self-report aligns with YouTube’s assessment of their video, the more trustworthy the platform considers them. Crucially, that works the other way as well: if a creator repeatedly self-reports something different from what YouTube finds, that could harm their trustworthiness.

It turned out creators were right to worry. Last month, at YouTube’s first Gaming Creator Summit, Wojcicki told longtime gaming creator MatPat that YouTube’s systems did consider real-world violence and video game violence the same.

Now, though, gaming creators can (in most cases) check that no sex/swearing/violence box without fear of repercussions.