Violative content has become one of YouTube's hottest topics. With millions of new videos arriving on the platform each day, it can be difficult for moderators to stay ahead of misinformation, hateful content, and bad actors. Overreacting to those issues can cause problems as well.
So how does YouTube enforce its Community Guidelines without affecting too many of its fair-playing users? That question, along with several others, is addressed in an FAQ posted on the YouTube blog by execs Matt Halprin and Jennifer Flannery O’Connor.
Creators who are struggling to keep up with YouTube's rules and regulations will want to check out the FAQ. As the post explains, the Community Guidelines are written, updated, and enforced with one major goal in mind: the reduction of real-world harm. That's why YouTube has banned misinformation about COVID-19 vaccines even though other conspiracy theories are allowed to stay up (albeit with significant penalties).
If you're wondering why YouTube has posted this guide now, consider that it may be aimed at a wider audience than creators alone. Recently, the video site has come under attack from both left-wing and right-wing politicians in the U.S. While conservative lawmakers attempt to end all online video moderation, liberals are calling for stricter rules in order to prevent users from becoming radicalized. By explaining its Community Guidelines, YouTube is laying out the exact criteria that help it draw a line between those two forces.
“We watch dozens or even hundreds of videos to understand the implications of drawing different policy lines,” reads the post. “Drawing a policy line is never about a single video; it’s about thinking through the impact on all videos, which would be removed and which could stay up under the new guideline.”
YouTube doesn’t take these actions alone. The company worked with the CDC and the WHO to develop its policies related to COVID-19. More recently, a pact between YouTube and Poynter has committed more than $13 million to fact-checkers around the globe.
As new challenges arise for the world’s biggest online video platform, YouTube will continue to use a combination of partnerships, manpower, and AI in its attempt to root out violative videos. The combined power of its in-house Intelligence Desk and machine-driven technical solutions can curb misinformation, so long as YouTube remains vigilant and continues to communicate its policies with its creators.