News

YouTube gives creators an inside look at its Community Guidelines

Violative content has become one of YouTube's hottest topics. With millions of new videos arriving on the platform each day, it can be difficult for moderators to stay ahead of misinformation, hateful content, and bad actors. Overreacting to those issues can cause problems as well.

So how does YouTube enforce its Community Guidelines without affecting too many of its fair-playing users? That question, along with several others, is addressed in an FAQ posted on the YouTube blog by execs Matt Halprin and Jennifer Flannery O’Connor.

Creators who are struggling to keep up with YouTube’s rules and regulations will want to check the FAQ out. As the post explains, the Community Guidelines are written, updated, and enforced with one major goal in mind: The reduction of real-world harm. That’s why YouTube has banned misinformation about COVID-19 vaccines even though other conspiracy theories are allowed to stay up (albeit with significant penalties).


If you’re wondering why YouTube has posted this guide now, consider that it might be targeting a wider intended audience than mere creators. Recently, the video site has come under attack from both left-wing and right-wing politicians in the U.S. While conservative lawmakers attempt to end all online video moderation, liberals are calling for stricter rules in order to prevent users from becoming radicalized. By explaining its Community Guidelines, YouTube is demonstrating the exact criteria that help it draw a line between those two forces.

“We watch dozens or even hundreds of videos to understand the implications of drawing different policy lines,” reads the post. “Drawing a policy line is never about a single video; it’s about thinking through the impact on all videos, which would be removed and which could stay up under the new guideline.”

YouTube doesn’t take these actions alone. The company worked with the CDC and the WHO to develop its policies related to COVID-19. More recently, a pact between YouTube and Poynter has committed more than $13 million to fact-checkers around the globe.

As new challenges arise for the world’s biggest online video platform, YouTube will continue to use a combination of partnerships, manpower, and AI in its attempt to root out violative videos. The combined power of its in-house Intelligence Desk and machine-driven technical solutions can curb misinformation, so long as YouTube remains vigilant and continues to communicate its policies with its creators.

Published by
Sam Gutelle
