
YouTube is axing all individual volunteers from its Trusted Flagger program

Moving forward, YouTube’s Trusted Flagger program will “focus exclusively on key partnerships with a variety of NGOs and Government Agencies,” a company spokesperson tells Tubefilter.

YouTube launched the program in 2012 as a kind of communal volunteer effort aimed at eradicating violative videos from its platform.

Trusted Flagger has always worked with both governmental and non-governmental organizations (NGOs). But since its inception, it’s also worked with a league of individual users, all of whom were hand-picked for their high rate of accurate flagging and personally invited to become Trusted Flaggers for YouTube.


Trusted Flaggers get access to a handful of tools regular users don’t, including a bulk-flagging feature that lets them report multiple videos at the same time, prioritized reviews of their flags/reports, and inside info about how YouTube makes decisions about potentially violative content.

At least, that’s the way it used to be.

Last October, we wrote about how some individual users had begun noticing a sharp decline in the quality of their experience within the Trusted Flagger program. One, LightCode Gaming, took to Reddit to allege YouTube was “killing” the program for individual flaggers.

They said that in 2018, “team meetings stopped happening [and] communication slowed to a crawl,” and flaggers’ reports “started taking months to be reviewed.” These issues worsened in 2020 with the onset of COVID, LightCode Gaming said.

At the time, YouTube spokesperson Ivy Choi told us individual flaggers were still welcome to participate in the program. Simultaneously, flaggers who reached out to Derek Slater, Google's global director of information policy, government affairs, and public policy, were allegedly told YouTube was shifting the Trusted Flagger program away from individuals and toward agencies and organizations.

LightCode Gaming pointed out that from 2018 to 2021, 200 NGOs and 70 governmental agencies in the Trusted Flagger program reported 157,437 and 700 videos, respectively.

During that same timeframe, 12 individual flaggers reported 9.5 million videos.

YouTube says its AI can handle the content-hunting

Now, YouTube confirms it’s no longer working with individual flaggers at all.

YouTube spokesperson Jack Malon indicates the platform is moving away from individual flaggers because it feels its AI systems can take over their workload.

“Over the past few years, we’ve made significant technical improvements to our automated flagging system, and in Q4 of 2021, 92% of videos removed from YouTube were first detected automatically,” he says. “In an effort to continue improving these systems, we’re revamping our Trusted Flagger program to focus on the expansion of partnerships with specialized organizations who have deep knowledge in fields like misinformation and hate speech, which we view as an important component to our systems in the future.”

YouTube adds that 74% of the videos detected by its automated content moderation systems were removed before they received more than 10 views.

So, what happens to the individual flaggers being cut out? YouTube says they’re all welcome to join its YouTube Contributors Program, “where they can engage directly with YouTube Community Managers on issues related to flagging.”

These changes take effect May 27, 2022.

YouTube’s also making updates to its general flagging systems

YouTube added that it’s working on updates to “improve the flagging experience for everyone.”

Users who report videos will now receive notifications informing them if and when YouTube removes those videos. On the comment side of things, YouTube has added more reasons for reporting a comment, so users will have more options to choose from when they file reports.

On top of these, YouTube said it’s “invested heavily to scale our social support teams, with the goal of ensuring that anyone can quickly report content.”

Its @TeamYouTube Twitter account now operates 24/7 and responds to 90% of inquiries within 60 minutes, it says.

YouTube says improving flagging is a top priority, and it’ll share more updates in the coming months.

Published by
James Hale
