Following a mass shooting that left 49 dead in Christchurch, New Zealand, people are calling on YouTube, Facebook, and Twitter to more stringently monitor hate content and stop the spread of white supremacy on their platforms.

One of four suspected gunmen who targeted two mosques on Muslims’ day of worship used all three sites before and during the massacre to spread his message.

On Twitter, prior to the shooting, he posted a 74-page manifesto that echoed common far-right sentiments, saying he believes the world is suffering a “white genocide” and that he wants to create “an atmosphere of fear” for Muslims, per The Guardian. On the day of the shooting, he livestreamed 17 minutes of footage on Facebook using a head-mounted camera. And just before he entered the Masjid Al Noor mosque, he told those watching his livestream to “subscribe to PewDiePie,” referring to YouTube’s most-subscribed creator. (He also targeted the Linwood mosque, where seven victims were killed.)

Twitter told The Washington Post that it suspended the account of the gunman who posted the manifesto, and is removing videos showing the attack. It said both the manifesto and the videos violate its terms of service.

Facebook spokesperson Mia Garlick said Facebook has taken a similar tack, terminating the shooter’s Facebook and Instagram (which is owned by Facebook) accounts. “We’re also removing any praise or support for the crime and the shooter or shooters as soon as we’re aware,” she added.

Numerous uploads of the shooter’s livestreamed footage have been posted to YouTube. As of publishing time, Tubefilter has confirmed that videos showing the full attack remain live on the platform. (We have reported all the videos we found.) YouTube issued a statement on Twitter.

PewDiePie (real name Felix Kjellberg) issued a statement on Twitter as well, saying he’s “absolutely sickened” over having his name mentioned by the shooter.

Twitter is also where many people are calling for action from these three platforms. YouTube, Facebook, and Twitter have previously been criticized for not doing enough to stop hateful ideologies from taking root, but the gunman’s targeted use of social media has renewed users’ push for these platforms to act now.

These platforms have made efforts in the past to eradicate content related to terrorism. In 2017, YouTube, Facebook, Twitter, and Microsoft launched the Global Internet Forum to Counter Terrorism, where they collaborated to develop effective means of extinguishing extremism on their respective platforms. And last year, YouTube, Facebook, and Twitter were all commended by the European Commission for their work in removing hateful content.

But as users are pointing out in their calls to action, people are still spreading white supremacist ideals, and they are still using social media platforms to do it. And those users’ claims that tech companies need to be more responsible are being backed up by experts. Researchers who study extremism in online communities are now speaking up, saying they’ve been warning tech companies for years that the spread of hate content on their platforms can and will result in real-world violence, per NBC.

A number of funds have been set up for those affected by the Christchurch shooting. The New Zealand Islamic Information Centre has set up a crowdfunding campaign to distribute money to victims and their families. The New Zealand Council of Victim Support Groups has also set up a crowdfunding campaign. And Al Manar Trust, a bank that serves the Muslim community, has set up a special fund for victims.
