There is a wormhole on YouTube, and those who enter it — even accidentally — find themselves surrounded by videos of underage girls.

The videos are pervasive. Click on one and YouTube’s algorithm will populate the ‘Up Next’ section of the site entirely with dozens more videos just like it. All the videos are ostensibly innocent. They’ll often feature titles with the words “gymnastics” or “yoga” or “pool day.” Sometimes “popsicle” or “Twister” (as in the popular body-bending party game). None of them contain sexual content.

But the videos’ comments sections are a different story. There, people openly say how attractive they find the girls. Some engage in an activity called “timestamping,” where they go through the videos to find moments where the young girls are in compromising positions. Then they post comments with lists of links to those moments, down to the second, so other viewers can easily find those exact spots. Other comments link to unlisted YouTube videos with similar content. Some post links that lead to sites outside of YouTube.

A number of those external links lead to sites and videos containing explicit child pornography, YouTuber Matt Watson (aka MattsWhatItIs) says in his latest video. That video, titled YouTube is Facilitating the Sexual Exploitation of Children, and it’s Being Monetized, follows Watson’s journey through the subculture.

“Once you are in this wormhole,” he says, “there is nothing but more videos of little girls. How has YouTube not seen this?”

He says it’s remarkably easy for anyone to find the videos on YouTube. To prove it, he loads up a fresh YouTube account and searches for haul videos, a common subgenre where people show off what they’ve bought. Watson specifically looks for bikini hauls, then clicks on a random video by an apparently of-age YouTuber. That video leads (by way of the Up Next sidebar) to a video called Rating My Girlfriend’s Bikinis. Again, a video featuring of-age YouTubers.

But in the Up Next section next to that video (remember, because Watson was using a fresh account, YouTube’s algorithm didn’t have any information about what he personally wanted to watch), Watson found one called Gymnastics Video. The thumbnail is of a girl obviously under the age of 10. Clicking on that video led him into the wormhole: dozens upon dozens of videos of adolescent and preadolescent girls, most of them posted by accounts with random English or Cyrillic usernames. (Presumably the videos were originally posted innocently by a child or parent, and then stolen by uploaders with nefarious purposes.)

When Tubefilter attempted to recreate Watson’s experience, we didn’t find the gymnastics video in the Up Next under Rating My Girlfriend’s Bikinis. But in that sidebar, we did find a video where a young girl (again, obviously under 10 years old) wearing a bathing suit splashes in a kid’s pool. Comments below the video include someone calling the girl “very beautiful.” Another says, “Sexy sexy.” One comment offers a list of timestamps.

That video we found leads to dozens more like it. Beneath the majority are comments sections with hundreds of similar inappropriate remarks. A few, though, have comments disabled, something Watson mentions in his report.

Watson points out the fact that in late 2017 and in the wake of Elsagate, YouTube toughened its approach to content that “attempts to pass as family-friendly, but is clearly not.” In YouTube’s blog post at that time about its more aggressive approach, the platform says it’s “historically used a combination of automated systems and human flagging and review to remove inappropriate sexual or predatory comments on videos featuring minors. […] Starting this week we will begin taking an even more aggressive stance by turning off all comments on videos of minors where we see these types of comments.”

Watson makes the point that on a number of videos he saw, the comments were disabled, indicating that YouTube had recognized there were sexual or predatory comments. Other uploads appear to have been deleted or removed by YouTube. But the majority of videos he came across — which, again, were mostly uploaded by accounts that did not appear to be made by the girls in the videos or their parents — were still live.

“So this is significant because that means that we know YouTube has an algorithm in place that detects some kind of unusual predatory behavior on these kinds of videos,” Watson says. “And yet all that’s happening is that the comments are being disabled?”

What’s more, Watson says at least a handful of the videos he found were monetized, with advertisements from large companies running alongside them. As Watson notes, this finding harks back to the early days of the 2017 Adpocalypse, when several major marketers discovered their ads ran next to YouTube content promoting hate speech and terrorism. (It’s worth noting that when we replicated Watson’s investigation, we didn’t see any advertising, but we also didn’t travel as extensively into the wormhole.)

Watson, whose YouTube channel appears to have been scrubbed of all other videos except this one and some related followups, says this is his last upload, and that he can no longer support YouTube knowing what kind of subculture is thriving on the platform. At the end of his video, he encourages YouTube users and creators to use the hashtag #YouTubeWakeUp to raise awareness about his findings.

His video has garnered more than a million views in less than 24 hours. It’s also gathered steam on Reddit, where users are sharing their own experiences with this subculture. One Redditor says they first encountered it way back in 2011, when they found out older people were friending their young daughter after she uploaded videos of her gymnastics routines.

“I followed their ‘liked’ trail and found a network of YouTube users whose uploaded and liked videos consisted only of preteen girls,” they say. “Innocent videos of kids, but the comments sickened me. For two weeks, I did nothing but contact their parents and flag comments. A few accounts got banned, but they probably just started a new account.”

Tubefilter reached out to YouTube for comment about all of Watson’s allegations. A YouTube spokesperson sent us this statement:

“Any content — including comments — that endangers minors is abhorrent and we have clear policies prohibiting this on YouTube. We enforce these policies aggressively, reporting it to the relevant authorities, removing it from our platform and terminating accounts. We also have strict policies that govern where we allow ads to appear and we enforce these policies vigorously. When we find content that is in violation of our policies, we immediately stop serving ads or remove it altogether. We continue to invest heavily in technology, teams and partnerships with charities to tackle this issue.”

The spokesperson also added that YouTube works with the National Center for Missing and Exploited Children and the Internet Watch Foundation, and reports any content involving child sexual abuse.

Watson’s findings are not an isolated incident in the category of recently discovered questionable content masquerading as family-friendly videos. In December, hundreds of videos that purported to simply show moms doing everyday household chores with their families were found to be peppered with gratuitous shots of the women deliberately flashing their underwear and positioning themselves provocatively while playing with and around little kids.

And just as the videos in this story were uncovered thanks to an investigation from Watson, those videos were brought to light by an investigation from another YouTuber, PaymoneyWubby. Soon after Tubefilter contacted YouTube about the moms-and-kids videos, the channels posting them were terminated.

We have reported all the videos we found while investigating this story.
