YouTube is now facing a second class-action lawsuit alleging it unfairly restricts and removes marginalized creators’ content.
Four Black creators (Lisa Cabrera, Catherine Jones, Denotra Nicole Lewis, and Kimberly Carleste Newman) have filed suit against the platform and its parent company Google, accusing it of “knowingly, intentionally, and systematically” using algorithms to “restrict access and drive them off YouTube.”
Essentially, creators claim that YouTube’s machine-learning systems target them and their content not because they’re violating the Community Guidelines or terms of service, but simply because they’re Black. The lawsuit argues that YouTube uses its power to “restrict and block” them solely “based on racial identity or viewpoint.”
Cabrera, Lewis, and Newman claim that YouTube removed numerous seemingly rule-abiding videos of theirs without explanation. Jones, on the other hand, had her entire channel terminated, with YouTube citing its policies against nudity. But none of her videos contained nudity, she says. The creators also collectively allege that YouTube did nothing about hate speech comments left on their videos.
The suit notably contains a sworn statement from YouTuber Stephanie Frosch, who alleges that in September 2017, YouTube officials asked her to sign a nondisclosure agreement, then told her that the platform’s algorithms do categorize creators based on their race and other characteristics. This data is used “when filtering and curating content and restricting access to YouTube services,” Frosch claims.
Frosch is one of the creators involved in Divino Group LLC et al. v. Google, which was filed in August 2019 and is substantially similar to Cabrera, Jones, Lewis, and Newman’s suit. That suit, also a class action, alleges YouTube discriminates against LGBTQ+ creators by restricting their videos and channels based on their identities rather than the actual content of their videos. (Peter Obstler, a partner at Browne George Ross, is representing creators in both suits.)
YouTube says its systems do not identify a creator’s race or sexual orientation
YouTube CEO Susan Wojcicki categorically denied creators’ claims, telling the Washington Post, “It’s not like our systems understand race of any of those different demographics.”
She added that YouTube works to ensure that “our machines haven’t by accident learned something that isn’t what we intended.” If YouTube does find that the algorithm is doing something it’s not supposed to, the company will retrain it so “that whatever that issue was has been removed from the training set of our machines,” she said.
YouTube reiterated Wojcicki’s claims to Tubefilter, telling us that its automated systems are not designed to identify any user by their race, ethnicity, or sexual orientation.
YouTube and Google recently asked the court to dismiss the Divino Group lawsuit on the grounds that they are protected by Section 230, a part of the Communications Decency Act that allows platforms to establish content guidelines and take action against users in “good faith” when those guidelines are violated. Tubefilter asked whether YouTube will also seek to have this lawsuit dismissed; the company said it’s currently reviewing the claim.