YouTube’s Drive to Clean Up its Platform is Erasing Vital Evidence of Potential War Crimes
Sites like Bellingcat, which document human rights abuses and identify state-sponsored assassins, are worried overzealous moderation is destroying evidence
In June, the award-winning investigative outfit Bellingcat, which uses open source intelligence to investigate human rights abuses, had its YouTube account suspended for four hours.
Eliot Higgins, who founded and now runs Bellingcat, learned of the suspension only from an email YouTube sent at 7.30am that day; the account was reinstated after an outcry from other media organizations and journalists.
For some, a YouTube account suspension is annoying but not devastating. For Bellingcat, and other investigative groups like it, YouTube is a vital source of information for their investigative work.
YouTube is the world’s largest video-sharing site, with two billion monthly active users visiting the website to watch the 500 hours of footage uploaded every minute. Its ease of access and ubiquity means that it functions as a living, growing archive. In some cases, the site records light-hearted cultural trends or memes that fade into obscurity, like the Numa Numa kid. But in the last decade, YouTube has also become the place to turn to if you have footage from an airstrike or a war zone.
“It’s a vast resource of the sort of video content you couldn’t possibly hope to find from conflict zones in such quantities prior to the existence of YouTube.”
Social media’s value in documenting and publicising unrest became obvious during the Arab Spring. Open source intelligence (OSINT) investigations often use information garnered from social media, including videos, photographs, tweets, Facebook posts and YouTube footage, to uncover the timeline behind a contentious story, or to bring little-known information to light. Footage of violence in Syria found its way to YouTube and was used by newsrooms around the world to inform their coverage, especially when they couldn’t get reporters on the ground.
Other news stories that have emerged from OSINT investigations include the identification of suspects in the Skripal poisoning in March 2018. Human rights collectives are using this evidence, alongside other research and information gathered from Facebook and Twitter, to further their investigations.
Some, such as Bellingcat or WITNESS, make visual evidence a key component of their investigations, while others, such as the Digital Verification Corps at Amnesty International’s Evidence Lab and Forensic Architecture, use video evidence to illustrate the conclusions of other research. A video of one specific airstrike or extra-judicial killing can lead to others, particularly given YouTube’s tagging and recommendation systems, illuminating a connection or an incident that researchers might not otherwise have known about. This has been pivotal in ongoing conflicts such as the Syrian civil war.
“We find a lot of content shared there from conflict zones or about specific events we’re investigating,” says Higgins. “In the case of Syria we’re talking literally millions of videos, so it’s a vast resource of the sort of video content you couldn’t possibly hope to find from conflict zones in such quantities prior to the existence of YouTube.”
In the spring of 2017, YouTube received criticism from a number of corners for hosting extremist content such as recruitment videos for terror cells operating around the world. The “adpocalypse” triggered two years of bad headlines for YouTube: companies with advertisements on the site began to raise questions about the site’s lax enforcement of its policies around content moderation. “So much material is posted to YouTube daily, that manual filtering of content would be very difficult, if not impossible,” says Yvonne McDermott Rees, a professor of law at Swansea University, who works on issues around using crowdsourced footage of human rights violations in court proceedings.
As a result, YouTube developed a machine learning system that flagged content that could be deemed offensive. The problem was that the system also began flagging footage containing valuable evidence of crimes: the location, the time, and sometimes even the perpetrators’ faces. This happened with little to no warning.
At the time, OSINT investigations had become far less niche — many traditional news outlets, such as the New York Times, had set up their own departments to use similar techniques, and the International Criminal Court had started the process of incorporating this kind of evidence in a courtroom setting.
While YouTube’s terms of service are publicly accessible, the platform’s content moderation policies are closely guarded secrets.
“It was a massive shock when it happened. The scale was gigantic, and this algorithm just went to town,” says Nick Waters, a senior investigator at Bellingcat. “It’s like a three-strike rule: if an account had three videos which violated those terms of service, it took down the whole account.” More than a million videos have been preserved by the Syrian Archive (which often works with Bellingcat), a Berlin-based group that aims to create a record of the atrocities ongoing in Syria since 2012. But more than 210,000 of those videos have been made unavailable by YouTube for various reasons, a situation the collective is fighting; it posts a monthly update on its website.
“We can see the first Syrian war crimes cases coming through the courts now, and a lot of them use open source evidence like YouTube videos. When videos have been removed from the place where they were originally posted, questions can be raised about their provenance,” says McDermott Rees.
Waters says that YouTube’s removal of videos that are crucial evidence in investigations is an ongoing problem. He cites the example of a video from Sabr, Yemen, uploaded by an eyewitness, showing the moment of an airstrike in great detail. Waters says it was the only known video of the strike, and that even though it was uploaded in 2015, it was removed a day after he found it years later. “We have no idea why those videos were ever taken down. We weren’t the account holder,” he explains.
The seemingly random removal of videos can scupper open source investigations at their most crucial moment. “The investigations carry less weight because you can’t back up what you’re saying,” says Waters. “So we have to remove the part which relies on those sources, and then that piece of the argument is worthless.” When those videos are taken off YouTube, it creates a gap in an otherwise cohesive narrative. OSINT organisations are already dogged by accusations of falsifying videos and pictures, so verification and accuracy are pivotal to their work having any influence.
“What concerns me is the power of video to convince or make arguments,” says Sam Dubberley, who leads the Digital Verification Corps at Amnesty International’s Evidence Lab. “You can write a 90-page report which is highly researched, thoroughly corroborated, but how many people are going to read it? When we use verified video, we tell a story in a compelling way.” In June last year, a graphic video from Cameroon that the country’s government had tried to discredit was being shared widely on social media. Using testimony from researchers on the ground, as well as other footage of the incident, Amnesty was able to analyse and verify the video, and to pinpoint exactly where it was filmed. That video also formed part of an award-winning investigation by the BBC.
While YouTube has backtracked on certain decisions, such as the suspension of Bellingcat’s account or the deletion of thousands of videos in 2017, the fact remains that although YouTube’s terms of service are publicly accessible, its content moderation policies are closely guarded secrets that aren’t subject to much scrutiny.
“What happens at the next horrendous event?”
Videos that are taken down often have to be appealed immediately, by an account holder who may not even be aware of the removal, and even then there’s no guarantee they will be restored. Part of the problem is that an algorithm often cannot distinguish between eyewitness footage of violence and a recruitment video calling for more of it.
YouTube’s role and influence have changed hugely since it began in 2005 as a digital repository for home videos. But its heft and immense reach place more and more responsibility on YouTube as a platform; it can’t simply say that the missteps are problems it will work on at some point. Regardless of whether the platform intended to function as a gathering ground for evidence of contemporary conflict, it has done so, and those who rely on it for documentary evidence believe the platform needs to face those responsibilities head on. (YouTube did not respond to a request for comment.)
“WITNESS has been clear and vocal in its opposition to the increased use of machine learning algorithms for content removal,” says Dia Kayyali, the Technology + Advocacy program manager at WITNESS, a human rights collective which focuses on the power of video in documenting human rights abuses. “As responses to various policy documents develop, we’re hopeful that we can at least get some pragmatic harm reduction measures in place, such as having clear pathways for civil society to audit and input into training data for algorithms.”
OSINT investigation was once confined to niche corners of the internet, but bodies such as the International Criminal Court are increasingly using its tactics and methods to bring cases forward. While YouTube has previously reversed its decisions to remove certain videos, or has even provided an explanation, the systems that led to those removals remain in place.
“My bigger concern is the future of content moderation by organisations like YouTube,” says Dubberley. “In 2017, we knew that these videos existed, so we were able to bang the drum and, collectively, get a lot of them back. What happens at the next horrendous event? People are told that YouTube is where to put that evidence, and then those videos disappear, so we never get to see those stories.”