Sorry kids, no more mowing down NPCs: YouTube’s new Community Guidelines age-restrict some video game violence (and also NFTs)

10/30/2025

YouTube is updating its Community Guidelines around gambling and video game violence to “adapt to the evolving digital world,” spokesperson Boot Bullwinkle tells Tubefilter.

Let’s get into the gambling half of this first. Back in March, YouTube rolled out a sizable patch for its gambling policies that did two things: one, made it against the rules to even verbally mention the name of a gambling site that isn’t “certified by Google,” aka confirmed to be operating legally; and two, age-restricted all content that contains “depictions or promotions of online casino sites or apps,” so no one under 18 could watch content about virtual gambling.

These changes were an effort to protect its community, YouTube said, “especially younger viewers.”


Now, further updates to that same policy are expanding it to include not just things like your typical online poker, slots, or sports betting, but also digital goods like NFTs and video game skins as well as “social casino” websites and apps.

For those who don’t know, “skin gambling” is popular with players of games like Counter-Strike, where you can roll loot boxes and pull different weapon and character skins at random. Valve, which publishes CS and also runs game market/launcher Steam, allows players to sell these skins to one another for real-world cash, resulting in a thriving market. (That is, until this past week, when said market experienced a major crash.) Some skins are extremely rare and desirable, and can pull hundreds of thousands or even millions of actual dollars at auction. So, just like with sports and Pokémon cards, people are willing to drop big cash on digital loot boxes, hoping to hit it big with a skin that’ll sell for real-world dollars.

That chase is what people tend to chronicle in their YouTube videos, and it's what, YouTube says, will be age-restricted starting Nov. 17.

As we mentioned above, the policy also now applies broadly to NFTs. YouTube officially classifying the entire concept of an NFT as gambling is pretty funny, though it is kinda too little too late considering how long ago the NFT bubble burst. (YouTube couldn’t put this out when CryptoZoo was a thing?) We do imagine it might upset some of the crypto bros still clinging to that particular wheezing life raft, though.

Also as we mentioned, “social casino” content is being age-restricted, too. According to YouTube, social casinos are “websites or applications that simulate traditional casino games but no real money is wagered.” The lack of IRL cash and risk doesn’t matter–YouTube considers these sites to be age-inappropriate for kids.

“These updates reflect the way value is exchanged online continues to evolve, including with online gambling,” YouTube says. “We’re strengthening enforcement of our existing policies to ensure that content is age-appropriate and that YouTube remains a responsible platform.”

OK, on to the video game violence portion of the update. YouTube has a tough history with this particular topic. Up until the introduction of its ad-safety self-certification system in 2019, it painted real-world violence and video game violence with the same brush, leaving creators with unbalanced, inconsistent enforcement (why was Apex Legends gameplay being treated the same as a real shooting?) and an overall lack of clarity about which rules applied to their content, and when.

In 2019, based on feedback from creators, YouTube finally separated out real violence and video game violence, giving laxer rules on the latter so people could show themselves, you know, playing video games. Video game violence was given general approval for all ages and all ads, with a few exceptions.

Now it’s adding to those exceptions: Starting Nov. 17, it will age-restrict videos that show “violent gaming content featuring realistic human characters that focuses on scenes of certain kinds of extreme violence against non-combatants.”

YouTube says it’ll consider several factors when judging whether a video should be restricted, including the duration of the violent scene, the prominence of the scene (“whether the violent imagery is zoomed-in or the main focus”), and the presence of “realistic human characters,” meaning the violence is happening “to a character that looks like a real human.”

It would be helpful if YouTube gave specific examples here, but it doesn’t. We’re guessing the note about “non-combatants” refers to people mowing down NPCs in games, but it could also impact parts of games like Dead by Daylight, where characters can perform special animated executions on one another.

But this all depends on what YouTube considers "realistic human characters." Obviously the company is eyeballs deep in supporting generative AI and the resulting flood of deepfakes, and though video games are steadily becoming more and more photorealistic, they're usually not aiming for Midjourney's uncanny-valley realism. Video game people still look animated. How many pixels qualify as "realistic"? Do we need to see pores?

Creators will likely have similar questions, and will want them answered before they start getting Community Guidelines violations for a line they didn’t know to avoid crossing.

Hopefully they can get those questions answered before Nov. 17.
