Shifting Focus Within YouTube’s ‘Trusted Flagger’ Program Has Angered Some Participants

There’s some consternation afoot within YouTube’s Trusted Flagger program, which the video giant launched in 2012 to clamp down on violative content.

The program sees YouTube partnering with third-parties — including individual volunteers, governmental agencies, and non-governmental organizations (NGOs) — and providing them with special tools to help enforce its Community Guidelines. These include: a bulk-flagging tool to report multiple videos at the same time, visibility into decisions on flagged content, and prioritized reviews.

The Trusted Flagger program is invite-only for individuals, with YouTube looking for those who have flagged a large volume of videos with a high rate of accuracy. Organizations can apply to the program by contacting their resident YouTube reps.


While the program has mostly flown under the radar, a recent Reddit post from a disgruntled Trusted Flagger known online as LightCode Gaming, which has been viewed upwards of 30,000 times, alleges that YouTube is “killing” the program for individual flaggers.

YouTube confirms to Tubefilter that the program is now focusing on governmental agency and NGO flaggers, who possess deep knowledge in specific fields and geographies. That said, “Individual users are still eligible to participate in the Trusted Flagger program and have access to the same tools as specialized organizations,” YouTube spokesperson Ivy Choi said.

A gradual descent

LightCode Gaming, who became a Trusted Flagger in 2016, describes his experience in the program as a gradual descent of sorts. (It is unknown how many individual Trusted Flaggers there are, but he says he knew of roughly 12 others). In 2018, he recounts, “team meetings stopped happening…communication slowed to a crawl” — which got worse in 2020 amid the onset of COVID.

“Reports started taking months to be reviewed,” he shared on Reddit, “and our escalations that were originally taking a week? They jumped to months.”

At the time, YouTube acknowledged closing some content moderation offices and said it would temporarily rely on automated systems.

In Feb. 2021, LightCode Gaming says, YouTube contacted Trusted Flaggers to let them know it was deprecating certain key tools. As frustrations mounted, he says, several flaggers contacted Derek Slater, Google‘s global director of information policy, government affairs, and public policy, who informed them that YouTube was changing the direction of the program to focus on governmental agencies and NGOs.

This felt like a blow, LightCode Gaming recounted, because the 200 NGOs and 70 governmental agencies in the program had reported just 157,437 and 700 videos, respectively, over the past four years, during which time the roughly 12 individual flaggers had reported 9.5 million videos.

“Over the past few years, we’ve made significant technical improvements to our automated flagging systems, and due to these ongoing investments, in Q2 2021, 94.4% of the videos removed were first detected by automated flagging,” Choi says, citing the company’s most recent Transparency Report. During this quarter, 0.86% of removed videos were first detected by individual trusted flaggers — the majority of which were spam.

“The expertise provided by Trusted Flagger organizations is an important complement to our systems and we remain committed to the program as it continues to evolve,” Choi said.

An uncertain fate

As a result, the fate of some individual flaggers remains unknown.

Another Trusted Flagger, who goes by LiamSeys on Twitter, tells Tubefilter that he joined the program six years ago because he felt personally compelled to help keep the community safe. But after tens of thousands of reports, he says he recently left the program due to poor communication, a lack of support, and slow response times.

LightCode Gaming, for his part, has not yet severed ties, but is asking YouTube for better communication, as well as an explanation as to why the company has removed certain tools and is focusing on NGOs and government agencies — given that individual flaggers are more productive numerically. If nothing changes, he says, he will likely stop participating.

“Most of the others have already stopped reporting, I only report severe abuse at this point,” he tells Tubefilter. He’s also begun looking to bring his services elsewhere. “I’ve already started making tools to fight abuse on Twitch live streams, so I might just go to that with my time instead.”

Published by
Geoff Weiss
