YouTube appears to have settled on the next step it will take to boost child safety on its platform.
Executives are finalizing plans to stop running targeted advertisements on videos kids are likely to watch, according to three people familiar with the matter who spoke to Bloomberg. It’s likely, but not confirmed, that this decision is a response to the Federal Trade Commission’s (FTC) soon-to-be-concluded investigation into whether YouTube has collected personal data from kids under 13, a violation of the Children’s Online Privacy Protection Act (COPPA).
A brief rundown of the investigation: The FTC is concerned that YouTube knows a number of kids under the age of 13 use its platform, presumably having signed up for over-13 accounts. (Like many platforms, YouTube requires users to be at least 13 to sign up for an account; that’s because COPPA permits data collection from users 13 and up, but restricts it for younger kids.) Since those kids are using the platform as if they’re over 13, YouTube is collecting the same data from them as it would from any user who actually is over 13.
How YouTube is supposed to identify accounts belonging to under-13 users who signed up for over-13 accounts isn’t clear. Regardless, the FTC is expected to hit YouTube with a multimillion-dollar fine, and is currently working with the platform to settle on a method of protecting under-13 users’ privacy.
Solutions proposed since word of the investigation broke in June include shuttling all kids’ content over to YouTube’s child-safe app YouTube Kids (which does not run targeted ads), and asking creators who make videos for kids to voluntarily turn off advertising on their own content.
This new solution, while it would impact YouTube’s bottom line for ad sales, is the least radical of those suggested so far. For folks unfamiliar with YouTube’s ad sales process: the platform sells two broad types of advertising, targeted ads and what we’ll call contextual ads. Contextual ads are matched to an individual video’s content, meaning that if a video’s about a cat climbing a tree, it may come paired with an ad for a cat food brand. Basically, these ads run based on what’s in the video, not who’s watching the video.
Targeted ads, however, are all about who’s watching the video. They use various data points about a particular user to serve that user an ad based on their interests. So, if they watch that same cat-climbing-a-tree video, they might be served an ad for a new car model. Not because new cars are related to cats climbing trees, but because maybe that user recently searched for information about said car model, or has watched all the way through ads for it in the past.
If YouTube implements this plan, videos intended to be seen by kids would only run contextual ads, so none of the accounts’ data would be used to decide which ads to serve. The plan, however, doesn’t seem to contain measures that would stop under-13 users (again, with over-13 accounts) from being served targeted ads on videos not aimed at kids, or from having data collected about their accounts in general, per Bloomberg. It’s also not apparent how YouTube will identify every video intended to be seen by kids.
In a July letter to the FTC, multiple child advocacy groups, including Campaign for a Commercial-Free Childhood and the Center for Digital Democracy, said if YouTube decided just to focus on changing how it deals with targeted ads, that wouldn’t be enough. In that same letter, the organizations concluded their ideal solution was for YouTube to move all kids’ content over to YouTube Kids, something YouTube appears to have decided against.
The people who spoke to Bloomberg said YouTube’s plan could still change before it’s finalized.
Tubefilter reached out to YouTube for more information, but YouTube declined to comment.