Following a mass shooting that left 49 dead in Christchurch, New Zealand, people are calling on YouTube, Facebook, and Twitter to more stringently monitor hate content and stop the spread of white supremacy on their platforms.
One of four suspected gunmen who targeted two mosques on Muslims’ day of worship used all three sites before and during the massacre to propagate his message.
On Twitter, prior to the shooting, he posted a 74-page manifesto that echoed common far-right sentiments, saying he believes the world is suffering from a “white genocide” and that he wants to create “an atmosphere of fear” for Muslims, per The Guardian. On the day of the shooting, he livestreamed 17 minutes of footage on Facebook using a head-mounted camera. And just before he entered the Masjid Al Noor mosque, he told those watching his livestream to “subscribe to PewDiePie,” referring to YouTube’s most-subscribed creator. (He also targeted the Linwood mosque, where seven victims were killed.)
Twitter told The Washington Post that it suspended the account of the gunman who posted the manifesto, and is removing videos showing the attack. It said both the manifesto and the videos violate its terms of service.
Facebook spokesperson Mia Garlick said Facebook has taken a similar tack, terminating the shooter’s Facebook and Instagram (which is owned by Facebook) accounts. “We’re also removing any praise or support for the crime and the shooter or shooters as soon as we’re aware,” she added.
Numerous uploads of the shooter’s livestreamed footage have been posted to YouTube. At posting time, Tubefilter can confirm there are videos showing the full attack still live on the platform. (We have reported all the videos we found.) YouTube issued a statement on Twitter:
Our hearts are broken over today’s terrible tragedy in New Zealand. Please know we are working vigilantly to remove any violent footage.
— YouTube (@YouTube) March 15, 2019
PewDiePie (real name Felix Kjellberg) issued a statement on Twitter as well, saying he’s “absolutely sickened” over having his name mentioned by the shooter.
Just heard news of the devastating reports from New Zealand Christchurch.
I feel absolutely sickened having my name uttered by this person.
My heart and thoughts go out to the victims, families and everyone affected by this tragedy.
— ƿ૯ωძɿ૯ƿɿ૯ (@pewdiepie) March 15, 2019
Twitter is also where many people are calling for action from these three platforms. YouTube, Facebook, and Twitter have previously been criticized for not doing enough to stop the fostering of hateful ideologies, but the gunman’s targeted use of social media has renewed users’ push for these platforms to do something now.
You have a lot more than that to remove. You built, plank by plank, the stage from which a virulent racist ideology reached so many children. We live in the nightmare your irresponsibility and greed helped build. https://t.co/vzMv7oLyFU
— Anthony Oliveira (@meakoopa) March 15, 2019
i am tired of a world where we all have to sit seething in a bath of racist poison because it is generating ad revenue
— Anthony Oliveira (@meakoopa) March 15, 2019
This is why people say ban the fucking Nazis. You give them a platform and they use it radicalize each other and livestream murder. Any platform that allows white supremacists to congregate is complicit
— Butt “Municipal Power for California” Praxis (@buttpraxis) March 15, 2019
The Christchurch shooter posted a manifesto on 8chan and shouted “subscribe to PewDiePie” before opening fire on the mosque, livestreaming the whole thing. The internet is 1000% a vehicle for right-wing radicalisation.
— Andrew Todd (@mistertodd) March 15, 2019
The New Zealand massacre was livestreamed on Facebook, announced on 8chan, reposted on YouTube, commentated about on Reddit, and mirrored around the world before the tech companies could even react.
— Drew Harwell (@drewharwell) March 15, 2019
We warn time & again that the Far Right is using social media to recruit & radicalize. They don’t just use Gab or 8chan. They use Facebook, Twitter, Instagram, & Youtube. The response by tech companies is to simper at these murderous scumbags & “both sides” the issue. #Complicit
— New York City Antifa (@NYCAntifa) March 15, 2019
And Twitter, Facebook and Instagram allow white supremacists accounts on their sites, which is exactly how this shooter was able to upload his video. If they took their own rules seriously, they would ban bigots. They choose not to. https://t.co/0zvC90bpRn
— Jill Filipovic (@JillFilipovic) March 15, 2019
These platforms have made efforts in the past to eradicate content related to terrorism. In 2017, YouTube, Facebook, Twitter, and Microsoft launched the Global Internet Forum to Counter Terrorism, where they collaborated to develop effective means of extinguishing extremism on their respective platforms. And last year, YouTube, Facebook, and Twitter were all commended by the European Commission for their work in removing hateful content.
But as users are pointing out in their calls to action, people are still spreading white supremacist ideals, and they are still using social media platforms to do it. Those users’ claims that tech companies need to be more responsible are backed up by experts: researchers who study extremism in online communities say they have been warning tech companies for years that the spread of hate content on their platforms can and will result in real-world violence, per NBC.
A number of funds have been set up for those affected by the Christchurch shooting. The New Zealand Islamic Information Centre has set up a crowdfunding campaign here to distribute money to victims and their families. The New Zealand Council of Victim Support Groups has also set up a crowdfunding campaign, here. And Al Manar Trust, a bank that serves the Muslim community, has set up a special fund for victims here.