YouTube’s Drive to Clean Up Its Platform Is Erasing Vital Evidence of Potential War Crimes

Sites like Bellingcat, which document human rights abuses and identify state-sponsored assassins, are worried overzealous moderation is destroying evidence

Sanjana Varghese
Aug 2, 2019

In June, the award-winning investigative outfit Bellingcat, which uses open-source intelligence (OSINT) to investigate human rights abuses, had its YouTube account suspended.

Eliot Higgins, who founded and now runs Bellingcat, knew nothing about the suspension until an email from YouTube arrived at 7:30 a.m. that day, and the account was only reinstated four hours later, after an outcry from other media organizations and journalists.

For most users, a YouTube account suspension is annoying but not devastating. For Bellingcat and other investigative groups like it, however, YouTube is a vital source of evidence.

YouTube is the world’s largest video-sharing site, with two billion monthly active users visiting the website to watch the 500 hours of footage uploaded every minute. Its ease of access and ubiquity mean that it functions as a living, growing archive. In some cases, the site records light-hearted cultural trends or memes that fade into obscurity, like the Numa Numa kid. But in the last decade, YouTube has also become the place to turn if you have footage from an airstrike or a war zone.


News stories that have emerged from OSINT investigations include Bellingcat’s identification of the suspects in the Skripal poisoning of March 2018. Human rights collectives are using this kind of evidence, alongside other research and information gathered from social media platforms such as Facebook and Twitter, to further their investigations.

Some, such as Bellingcat and WITNESS, make visual evidence a key component of their investigations, while others, such as the Digital Verification Corps at Amnesty International’s Evidence Lab and Forensic Architecture, use video evidence to illustrate the conclusions of other research. A video of one specific airstrike or extrajudicial killing can lead to others, particularly given YouTube’s tagging and recommendation systems, illuminating a connection or an incident that researchers might not otherwise have known about. This has been pivotal in ongoing conflicts such as the Syrian civil war.

“We find a lot of content shared there from conflict zones or about specific events we’re investigating,” says Higgins. “In the case of Syria, we’re talking literally millions of videos, so it’s a vast resource of the sort of video content you couldn’t possibly hope to find from conflict zones in such quantities prior to the existence of YouTube.”


In the spring of 2017, YouTube drew criticism from several quarters for hosting extremist content, such as recruitment videos for terror cells operating around the world. The ensuing “adpocalypse” triggered two years of bad headlines for YouTube, as companies advertising on the site began to raise questions about its lax enforcement of its own content moderation policies. “So much material is posted to YouTube daily that manual filtering of content would be very difficult, if not impossible,” says Yvonne McDermott Rees, a professor of law at Swansea University who works on the use of crowdsourced footage of human rights violations in court proceedings.

In response, YouTube developed a machine learning system to flag content that could be deemed offensive. The problem was that this system began flagging footage containing valuable evidence of crimes being committed: the potential location and time of an incident, and even the perpetrators’ faces. Removals happened with little to no warning.

By this time, OSINT investigation had become far less niche: many traditional news outlets, such as the New York Times, had set up their own departments to use similar techniques, and the International Criminal Court had started the process of incorporating this kind of evidence into courtroom proceedings.


“We can see the first Syrian war crimes cases coming through the courts now, a lot of them using open-source evidence like YouTube videos. When videos have been removed from the original place where they were posted, questions can be raised about their provenance,” says McDermott Rees.

Nick Waters, an investigator at Bellingcat, says the removal of videos that are crucial evidence in investigations is an ongoing problem. He cites the example of a video from Sabr, Yemen, uploaded by an eyewitness, showing the moment of an airstrike in great detail. Waters says it was the only known video of this airstrike, and that even though it had been uploaded in 2015, it was removed a day after he found it years later. “We have no idea why those videos were ever taken down. We weren’t the account holder,” he explains.


The seemingly random removal of videos can scupper open-source investigations at their most crucial moment. “The investigations carry less weight because you can’t back up what you’re saying,” says Waters. “So we have to remove the part which relies on those sources, and then that piece of the argument is worthless.” When those videos are taken off YouTube, they leave a gap in an otherwise cohesive narrative. OSINT organizations are already dogged by accusations of falsifying videos and pictures, so verification and accuracy are pivotal to their work having any kind of influence.

“What concerns me is the power of video to convince or make arguments,” says Sam Dubberley, who leads the Digital Verification Corps at Amnesty International’s Evidence Lab. “You can write a 90-page report which is highly researched, thoroughly corroborated, but how many people are going to read it? When we use verified video, we tell a story in a compelling way.” In June 2018, a graphic video from Cameroon, which the country’s government had tried to discredit, was being shared widely on social media. Using testimony from researchers on the ground, as well as other footage of the incident, Amnesty was able to analyze and verify the video, and to pinpoint exactly where it was filmed. That video also formed part of an award-winning investigation by the BBC.

YouTube has backtracked on certain decisions, such as the suspension of Bellingcat’s account and the deletion of thousands of videos in 2017. But the fact remains that while YouTube’s terms of service are publicly accessible, the platform’s content moderation policies are closely guarded secrets, ones that aren’t subject to much scrutiny.


YouTube’s role and influence have changed hugely since it began in 2005 as a digital repository for home videos. Its heft places more and more responsibility on YouTube as a platform: it can’t simply say that these missteps are problems it will work on at some point. Regardless of whether the platform intended to function as a gathering ground for evidence of contemporary conflict, it has become one, and those who rely on it for documentary evidence believe it needs to face those responsibilities head-on. (YouTube did not respond to a request for comment.)

“WITNESS has been clear and vocal in its opposition to the increased use of machine learning algorithms for content removal,” says Dia Kayyali, the Technology + Advocacy program manager at WITNESS, a human rights collective that focuses on the power of video in documenting human rights abuses. “As responses to various policy documents develop, we’re hopeful that we can at least get some pragmatic harm reduction measures in place, such as having clear pathways for civil society to audit and input into training data for algorithms.”

OSINT investigation was once confined to niche corners of the internet, but bodies such as the International Criminal Court are increasingly using its tactics and methods to bring cases forward. And while YouTube has previously reversed decisions to remove certain videos, or even provided an explanation, the systems that led to those removals are still in place.

“My bigger concern is the future of content moderation by organizations like YouTube,” says Dubberley. “In 2017, we knew that these videos existed, so we were able to bang the drum and collectively get a lot of them back. What happens at the next horrendous event? People are told that YouTube is where to put that evidence, and then those videos disappear, so we never get to see those stories.”
