A three-week investigation by Business Insider found that disturbing videos appearing to show underage girls in sexually suggestive situations were being recommended to viewers by Instagram’s just-launched standalone vertical video app IGTV.
IGTV recommends videos to users in multiple ways, including a ‘For You’ section that’s apparently curated to reflect a user’s interests based on previous viewership, a ‘Popular’ section to show what’s trending on the app, and a section that shows content from followed accounts. To see what IGTV’s algorithm would recommend, Business Insider reporters used their own IGTV accounts — as well as a blank slate account set up to look like it belonged to a 13-year-old (which is the minimum age required by IGTV’s terms of service).
According to the report, a video titled Hot Girl Follow Me was soon recommended to a staffer in their ‘For You’ section, and also trended in the app’s ‘Popular’ section. The clip appeared to be a teaser video showing a girl, whom Business Insider estimated to be 11 or 12 years old, beginning to remove her shirt in a bathroom — though the video cut to black just before her top came off. The same video was then recommended in the ‘For You’ section of the 13-year-old account, which had no prior record of any activity on IGTV.
A second video uploaded by the user who posted Hot Girl Follow Me showed “another clearly underage girl exposing her belly and pouting for the camera,” Business Insider reports. Duplicates of these two videos were then separately uploaded by a second user, and that user’s copies also appeared in IGTV’s recommendations.
Instagram removed all four copies of the videos five days after Business Insider reported them for TOS violations. By that time, they had garnered more than one million combined views. However, the accounts that posted both videos are still active, and still posting suggestive content. Furthermore, Business Insider reports that a video containing graphic genital mutilation was recommended to its child account. The videos appearing to show underage girls were reviewed by the U.K.-based National Society for the Prevention of Cruelty to Children, which forwarded them to police.
An Instagram spokesperson told Business Insider that its zero-tolerance policy on child abuse applied to the accounts’ content, but not to the accounts themselves.
“We have community guidelines in place to protect everyone using Instagram and have zero tolerance for anyone sharing explicit images or images of child abuse,” the company told Business Insider in a statement. “We have removed the videos reported to us and apologize to anyone who may have seen them.” The company added that it has “a trained team of reviewers who work 24/7 to remove anything which violates our terms.”
Tubefilter has reached out to Instagram for additional comments, and will update this post with any new information.