YouTube tells Tubefilter it did not instruct Accenture, a company that manages its content moderators, to make employees sign a document acknowledging that their job could give them post-traumatic stress disorder (PTSD) and pledging to disclose any negative mental health changes to their supervisor or HR rep.
The document was distributed to workers in December, four days after The Verge published an extensive report about the trauma faced by moderators who spend hours each workday watching disturbing videos containing things like graphic violence, child pornography, and animal abuse. The workers profiled are part of YouTube’s 10,000-person content moderation cadre, which operates around the clock to review and remove videos that violate the platform’s terms of service. Doing so is a massive undertaking: 500 hours of content are uploaded to YouTube every single minute, and in the latter half of 2019, those 10,000 people processed and took down around 3 million violating videos each month. (87% of those were first flagged by YouTube’s automated systems as potential violations, then sent to moderators for second-tier examination.)
According to YouTube, the vast majority of videos are removed because they’re spam. But there’s still enough graphic content uploaded to fill five-hour shifts’ worth of work for an unknown number of moderators who review the “violent extremism” queue of flagged videos.
The Verge’s December report alleged that in Accenture’s hiring materials, reviewers were told they’d only spend one or two hours per week looking at graphic material. But moderators working out of Accenture’s largest YouTube/Google office in Austin are actually required to review at least five hours per day (contrary to YouTube CEO Susan Wojcicki’s promise to cut that time down to four hours per day). It also alleged that moderators are forced to work through breaks to handle overstuffed queues, and that they report suffering anxiety, depression, panic attacks, and “other severe mental health consequences,” including diagnosed PTSD, after as little as six months of moderating.
The document given to workers shortly after that report reads, “I understand the content I will be reviewing may be disturbing. It is possible that reviewing such content may impact my mental health, and could lead to Post Traumatic Stress Disorder. I will take full advantage of the weCare program and seek additional mental health services if needed. I will tell my supervisor/or my HR People Advisor if I believe that the work is negatively affecting my mental health.”
If workers don’t display “strict adherence to all the requirements in this document,” it “would amount to serious misconduct and for Accenture employees may warrant disciplinary action up to and including termination,” the document explains.
It adds that “this job is not for everyone.”
Hugh Baran, an attorney with the nonprofit National Employment Law Project, told The Verge that the document reads as "reverse psychology," pushing employees to blame themselves for any mental health issues they develop. "It seems like these companies are unwilling to do the work that's required by OSHA and reimagine these jobs to fit the idea that workers have to be safe," he said. The Occupational Safety and Health Act of 1970 requires employers to maintain a workplace free of recognized hazards that could cause serious harm or death. "Instead they're trying to shift the blame onto workers and make them think that there's something wrong in their own behavior that is making them get injured."
Contractors said they were threatened with being fired if they refused to sign the document. Accenture told The Verge signing is voluntary.
"The wellbeing of our people is a top priority," a company spokesperson said. "We regularly update the information we give our people to ensure that they have a clear understanding of the work they do – and of the industry-leading wellness program and comprehensive support services we provide."
YouTube tells Tubefilter, “Moderators do vital and necessary work to keep digital platforms safer for everyone. We choose the companies we partner with carefully and require them to provide comprehensive resources to support moderators’ wellbeing and mental health.”
The company says its mandatory wellness program for suppliers requires that workers do not spend more than five hours each day reviewing content, that they receive a lunch break and two 15-minute breaks, and that they have comprehensive healthcare, access to group counseling, and at least 2.5 hours per day of team activities like yoga classes and mindfulness sessions.
Worth noting: This is all happening just a couple of months after a current Facebook moderator filed a lawsuit against the platform, alleging they suffered significant psychological trauma that led to PTSD.