YouTube wants “No Fakes,” so it’s endorsing a Congressional bill that addresses AI concerns

04/10/2025

A Congressional effort to address the rise of deepfaked images and videos has the support of several major tech companies, including YouTube. As the Google-owned platform expands its own suite of deepfake-combatting tools, it is throwing its weight behind the No Fakes Act, which was reintroduced in Congress this week.

The No Fakes Act is a bipartisan proposal that gives individuals the right to control digital replications of their likeness. That ownership can be transferred to an estate in the event of the affected individual’s death, and digital likeness rights would not expire until 70 years postmortem.

The bill is sponsored by Senators Marsha Blackburn (R-TN), Chris Coons (D-DE), Thom Tillis (R-NC), and Amy Klobuchar (D-MN). It failed to gain traction upon its initial introduction last year, with organizations like the Motion Picture Association (MPA) questioning whether the statute would violate First Amendment free speech protections.


Blackburn and co. have now reintroduced the No Fakes Act with support from more than a dozen prominent media companies, including YouTube, Disney, OpenAI, and the Creative Artists Agency (CAA). “For nearly two decades, YouTube has been at the forefront of handling rights management at scale, and we understand the importance of collaborating with partners to tackle these issues proactively,” reads a statement from YouTube VP of Public Policy Leslie Miller. “Now, we’re applying that expertise and dedication to partnership to ensure the responsible deployment of innovative AI tools.”

YouTube’s stable of deepfake hunters includes MrBeast, Marques Brownlee, and Mark Rober

The blog post that contains Miller’s statement also runs through the steps YouTube has taken to stamp out illicit AI-generated content on its platform. Those efforts include a system that lets creators request takedowns of unauthorized deepfakes, new tools that offer “likeness management,” and support for proposals like the No Fakes Act.

YouTube added a new safeguard against generative AI last year, when it announced a partnership with CAA to develop and test a toolkit that lets high-profile users find and moderate AI-generated uploads that feature their likeness. Several high-profile creators, including MrBeast, Mark Rober, and Marques Brownlee, have signed on to use the likeness matching system during the next phase of testing.

Brownlee is one of the creators who has called for more sophisticated solutions to combat the “evolving problem” of AI misuse. YouTube has responded to that demand with a mix of technological and policy-based measures, but misleading generative content and other forms of AI slop continue to run rampant.

YouTube believes that a mix of internal advances and government regulation can provide a path toward a more sustainable AI future. Now it’s Congress’ turn to decide whether it agrees with that roadmap.
