
Muslim Advocacy Group Files Suit Against YouTube, Facebook For Hosting Christchurch Shooting Videos

A French Muslim advocacy group is suing YouTube and Facebook for not acting quickly enough to remove footage of the Christchurch shooting, which killed 50 mosque-goers less than two weeks ago.

Prior to and during the massacre, the shooter wore a head-mounted camera and livestreamed a total of 17 minutes of footage on Facebook. The video was reportedly up for about half an hour, including the 17 minutes during which the shooter was actively streaming, before Facebook found and pulled it with the help of New Zealand authorities.

Facebook’s removal wasn’t fast enough to prevent copies from being made by other users, though. The footage was rapidly disseminated across Facebook and YouTube, and both sites — along with Twitter — scrambled to contain what YouTube called an “unprecedented” number of reuploads from thousands of users.


Now, the French Council of the Muslim Faith (CFCM), an elected body that liaises with the French government on behalf of the country's Muslim population, has filed suit against the French branches of YouTube and Facebook. The suit alleges they're responsible for "broadcasting a message with violent content abetting terrorism, or of a nature likely to seriously violate human dignity and liable to be seen by a minor."

"Facebook must take their part of responsibility in this and must do everything to anticipate these livestreams, as much as [they do with] hate messages and Islamophobia on their networks," Ahmet Ogras, CFCM's president, told CNN.

Under French law, those found to be broadcasting pro-terrorism or dignity-violating content where children may be exposed can be punished by up to three years in prison and a €75,000 ($84,911 U.S.) fine, CNN reports. A spokesman for CFCM said that if the group is awarded any money in the suit, it will distribute that money among the Christchurch victims' families.

This appears to be the first lawsuit filed against YouTube and Facebook over Christchurch, but its core demand is not new: in the wake of the shooting, social media users en masse called on YouTube, Facebook, and Twitter to monitor hateful and violent content, including text and videos, more stringently and efficiently at all times.

Published by James Hale
