YouTube Considering Moving All Children’s Content To ‘Kids’ App As Sundar Pichai Becomes More Involved In Day-To-Day Decision-Making (Report)

06/19/2019

YouTube is reportedly weighing its most drastic change yet as it further attempts to improve child safety on the platform.

A report from the Wall Street Journal indicates YouTube executives are mulling a move that would push all children’s content off YouTube’s main site and into its standalone YouTube Kids app. The app is currently geared toward kids up to the age of 12, and offers content from YouTube creators as well as partner channels from outlets like Sesame Workshop, PBS Kids, and Kidz Bop. The app has comprehensive parental controls, including a setting that allows parents to hand-pick every single video a child watches on it.

Exactly what content YouTube would classify as children's content isn't clear: whether it's just content specifically aimed at children, or whether it would also sweep in all content made by young creators and by creators who feature their young children in their videos.


If YouTube’s executives do decide to make the change (which the WSJ reports is “not considered imminent”), it would be an unprecedented shift — but one that’s not entirely unexpected, considering YouTube has spent years weathering scandals related to child safety.

The latest, and the one that precipitated this potential change, began in February, when YouTuber Matt Watson uncovered what he called a “wormhole” of videos and comments fetishizing young children. YouTube implemented a number of new policies, including deleting comments on tens of millions of videos of kids, and barring most channels that regularly upload videos featuring minors from receiving comments at all (a decision that’s received widespread backlash from creators).

However, despite these efforts, the “wormhole” hasn’t gone away. As a New York Times report pointed out earlier this month, there are still thousands of videos and playlists on YouTube that show children wearing things like bathing suits and gymnastics leotards. Some of them are genuine home movies uploaded innocently by families or kids, but others appear to be stolen and reuploaded to channels with random English, Cyrillic, and Portuguese names that post dozens or even hundreds of videos of different children for apparently nefarious purposes.

Harvard-based internet experts cited in the Times' report had one primary suggestion for YouTube: turn off its recommendation algorithm on videos featuring kids. The main cause of the "wormhole," they said, is that the recommendation algorithm's behavior is tantamount to automatically assembling a "catalog" of videos that sexually exploit children.

At the time, Jennifer O’Connor, YouTube’s product director for trust and safety, denied the recommendation algorithm is at fault. “It’s not clear to us that necessarily our recommendation engine takes you in one direction or another,” she said.

Now, though, some YouTube employees are pushing the company to do what Harvard's experts suggested, the WSJ reports. It's not clear whether the platform's executives are also considering this, or if the push is purely a groundswell from lower-level employees.

About these potential changes, a YouTube spokesperson tells Tubefilter, “We consider lots of ideas for improving YouTube and some remain just that — ideas.”

The WSJ also reports that Google CEO Sundar Pichai — who usually does not have a hand in the day-to-day running of YouTube — has been personally involved in the platform’s direction after multiple missteps. People familiar with the matter said YouTube CEO Susan Wojcicki has apologized to employees in a memo, saying YouTube’s recent decisions have been “disappointing and painful.”

YouTube's decisions about child safety are not the only ones under fire. After videos of the Christchurch massacre were disseminated across its site, the platform was criticized for failing to more stringently flag and remove hateful content (something it has since addressed by expanding its removal policies). And most recently, YouTube ruled that conservative commentator Steven Crowder's harassment of journalist Carlos Maza did not violate its policies against hate speech, because while Crowder referred to Maza as an "anchor baby" and a "lispy queer," he was not directly encouraging his viewers to harass or physically harm Maza.

Following the ruling, Pichai hosted a private meeting for queer Google and YouTube employees where he fielded questions about why YouTube didn’t better enforce its policies, or develop new policies to address harassment like Crowder’s, people familiar with the matter told the WSJ. Pichai’s response was that YouTube prefers an incremental approach to policy changes, and that changes are not likely to be implemented this year.

Also relating to the Crowder decision, YouTube has developed a new internal mantra that's been supported by Wojcicki in staff meetings, per the WSJ. That mantra is, "It's not about free speech, it's about free reach." The phrase appears to respond to repeated criticism, largely from right-wing and alt-right sources, that YouTube's removal of extremist content is tantamount to restricting free speech. In other words, YouTube is now specifying that it can support free speech while declining to provide free reach to those espousing extremist views.
