The father of Virginia-based local news correspondent Alison Parker, who was shot and killed on air in 2015, is working with the Georgetown Law Civil Rights Clinic to push Google into better addressing YouTube’s longtime problem with conspiracy theory content.
In an opinion piece for the Washington Post published today, Andy Parker says he found a “rabbit hole of painful and despicable” YouTube videos about his daughter, “including claims that Alison had plastic surgery and was living a secret life in Israel.”
After Alison’s death, Andy and his wife Barbara began publicly campaigning for increased gun control in the U.S. Like many people tied to recent shootings (notably school shootings like those at Sandy Hook and Marjory Stoneman Douglas High School), they subsequently faced an onslaught of accusations on social media platforms that their daughter’s death was faked as part of a “false flag” attack arranged by the government.
Andy realized those conspiracists were also spreading their theories on YouTube when he tried to look up videos about the For Alison Foundation, the children’s arts nonprofit he and Barbara established. “They have taken the gruesome footage of my daughter’s murder, edited it into videos selling these lies and flooded YouTube,” Andy writes.
For Andy, more concerning than the content being on YouTube in the first place were indications that Google’s recommendation algorithm was promoting it (as the world’s largest video sharing site has been shown to promote other conspiracy theories, like flat Earth). “As much as I want to blame the sick creators for the pain I feel, I blame Google even more,” he writes. “By surfacing this content and profiting from the data Google collects from those who view it, Google is monetizing Alison’s death and our family’s pain.”
Furthermore, when Andy tried to have videos removed, he says he encountered pushback. He found and flagged videos one by one, then “implored” both YouTube and Google to take them down. But “removal of videos that violate [YouTube’s] terms of service seemingly happens on an ad hoc basis,” he writes.
Tubefilter reached out to YouTube for comment about how it’s handled Andy’s requests, and whether any conspiracy videos about Alison remain on the platform. (We searched and were unable to find any.)
A YouTube spokesperson issued this statement:
“Our hearts go out to the families who have suffered incredibly tragic losses due to violent events. We’ve heard feedback from victims’ families and that is why we updated our harassment policy in 2017. Since then we’ve removed flagged videos that target the victims or their families and claim these events were ‘hoaxes’ or didn’t happen and we’ve worked on training our machine detection systems to flag videos to our reviewers. We do not allow ads to run on videos about tragic or sensitive events.”
The 2017 update to YouTube’s harassment policy specifically bans content that claims “specific victims of public violent incidents or their next of kin are actors, or that their experiences are false.”
The process of hunting down conspiracy videos of his daughter became traumatizing for Andy, so he’s passed the case on to the Georgetown Law Civil Rights Clinic, a public interest legal clinic that has dealt with conspiracy theorists before. In March 2018, it filed suit on behalf of Brennan Gilmore against Alex Jones and his company InfoWars, alleging that Jones had led a campaign of harassment against Gilmore. Gilmore was protesting white supremacists at the “Unite the Right” rally in 2017 when he caught the car attack that killed Heather Heyer on film. After he shared the footage on Twitter, Jones accused him of being a government pawn — an accusation similar to the ones Andy and his family have faced.
Andy says he hopes the clinic’s involvement will “urge Google to adequately monitor its own platforms,” but he also calls for the U.S. Congress to get involved (again) if Georgetown Law is unable to achieve results.
YouTube has addressed conspiracy theories numerous times, and over the past year has added features like Information Cues and Top News to point users toward trustworthy sources they can use to judge the veracity of videos. Most recently, YouTube introduced a new algorithm that finds and flags conspiracy theory videos, then blocks the platform’s usual recommendation algorithm from suggesting users watch them.