OpenAI’s video generator Sora is here, and Marques Brownlee has some questions about it.
You may remember Sora as the genesis of that devastatingly memed clip where OpenAI’s then-CTO couldn’t confirm whether or not the product was trained on YouTube videos. After the clip went viral, YouTube said it would investigate, but hasn’t released any findings to date. As for OpenAI, it still, to this day, will not give a direct answer when asked if Sora was trained on YouTube videos.
Brownlee, however, suspects it might have been. As one of YouTube’s most prominent tech reviewers, he got early access to test Sora, and made a video about his experiences using the tool. He put it through its paces, and praised both its capabilities and the apparent restraints built into it (for example, it wouldn’t depict people engaging in dangerous acts). But then, in one clip, he asked it to generate a video of a tech reviewer talking about a smartphone.
When Sora churned out the video, Brownlee’s attention went to the fake reviewer’s desk, which was adorned with a plant that looked suspiciously like the one that’s been on his own desk for dozens of videos.
“Are my videos in that source material?” Brownlee asked. “Is this exact plant part of the source material? Is it just a coincidence? I don’t know.”
While a tech reviewer having a plant on set isn’t exactly unique or proprietary, and the plant on both Brownlee and Sora’s fake reviewer’s desks is pretty generic-looking, the detail is a little hmm-worthy—and, for Brownlee, it once again opens the question of whether or not Sora was trained on YouTubers’ content. If it wasn’t, why would it include a piece of set dressing seen in so many tech reviewers’ content?
Business Insider reached out to OpenAI to ask about Brownlee’s suspicions, but a company spokesperson would only repeat OpenAI’s oft-used line about how Sora was trained on stock footage and publicly available videos. Does “publicly available videos” include YouTube content? We doubt we’ll ever know, unless and until some sort of legal action takes place that requires OpenAI to divulge the precise sources of its training data.
Plant detail aside, Brownlee (who’s an advisor to self-described “community engagement engine”/AI company Glystn, and recently faced backlash for allowing artists to submit AI-made wallpapers to his app Panels) seemed optimistic about Sora from a tech standpoint, but cautious about it from an impact-on-the-world perspective.
“It’s still an extremely powerful tool that directly moves us further into the era of not being able to believe anything you see online,” he said. “The craziest part of all of this is the fact that this tool, Sora, is going to be available to the public, to millions of people, all at once.”
Brownlee is far from the only creator to comment on the growing prevalence of generative AI in the creator industry. Platforms including YouTube and creator services companies like Spotter have been very bullish about gen AI, pitching it as a productivity tool that can help creators put out more content.
OpenAI CEO Sam Altman echoed this messaging during Sora’s Dec. 9 launch, saying, “There’s a new kind of co-creative dynamic that we’re seeing emerge between early testers that we think points to something interesting about AI creative tools and how people will use them.”
But it may be difficult to convince creators to embrace AI unless the companies making these tools can assure them their content isn’t being used as raw material.