A French Muslim advocacy group is suing YouTube and Facebook for failing to quickly remove footage of the Christchurch shooting, which killed 50 mosquegoers less than two weeks ago.
Prior to and during the massacre, the shooter wore a head-mounted camera and livestreamed a total of 17 minutes of footage on Facebook. The video was reportedly up for about half an hour, including the 17 minutes during which the shooter was actively streaming, before Facebook found and pulled it with the help of New Zealand authorities.
Facebook’s removal wasn’t fast enough to prevent copies from being made by other users, though. The footage was rapidly disseminated across Facebook and YouTube, and both sites — along with Twitter — scrambled to contain what YouTube called an “unprecedented” number of reuploads from thousands of users.
Now, the French Council of the Muslim Faith (CFCM) — an elected body that liaises with the French government in favor of the country’s Muslim population — has filed suit against the French branches of YouTube and Facebook. The suit alleges they’re responsible for “broadcasting a message with violent content abetting terrorism, or of a nature likely to seriously violate human dignity and liable to be seen by a minor.”
“Facebook must take their part of responsibility in this and must do everything to anticipate these livestreams, as much as [they do with] hate messages and Islamophobia on their networks,” Ahmet Ogras, CFCM’s president, told CNN.
Under French law, those found to be broadcasting pro-terrorism or dignity-violating content where children may be exposed can be punished by up to three years in prison and a €75,000 ($84,911 U.S.) fine, CNN reports. A spokesman for CFCM said that if the group is awarded any money in the suit, it will distribute that money among the families of the Christchurch victims.
This appears to be the first Christchurch-related lawsuit filed against YouTube and Facebook, but its underlying demand is not new: in the wake of the shooting, social media users called en masse for YouTube, Facebook, and Twitter to monitor hateful content, both text and video, more stringently and efficiently at all times.