All of YouTube, Not Just the Algorithm, is a Far-Right Propaganda Machine

YouTube’s celebrity culture and community dynamics play a major role in the amplification of far-right content

Becca Lewis · Published in FFWD · Jan 8, 2020 · 7 min read


Image: Unsplash/Oleg Laptev

In recent years, the media has sounded a constant drumbeat about YouTube: Its recommendation algorithm is radicalizing people.

First articulated by Zeynep Tufekci in a short piece for The New York Times, and later corroborated by ex-Google employee Guillaume Chaslot, the theory goes something like this: YouTube, in its drive to keep eyeballs glued to the platform, nudges people toward more and more extreme content over time. For most types of content, this tendency is relatively harmless, but in the case of political content, it can drive people down “algorithmic rabbit holes” to conspiracy theories or white supremacist propaganda.

This theory has spawned a wave of understandably outraged headlines: “How YouTube Built a Radicalization Machine”; “How YouTube Drives People to the Internet’s Darkest Corners”; “How YouTube Pushes Viewers to Extremism.” In response, YouTube stated last winter that it had made changes to its algorithm to decrease recommendations of “borderline” and conspiracy content, although it remained frustratingly vague and opaque about the specifics. In fact, YouTube’s lack of transparency has made it nearly impossible to effectively research the algorithm from the outside. Most recently, a study claimed that the algorithm actually leads people to less radical, more mainstream content, only to be thoroughly criticized by scholars who argued that the researchers were working from insufficient data.

So what are we to make of this? Is the YouTube algorithm radicalizing people? Is it just a moral panic generated by an outraged media scared of losing relevance? If it is a problem, can YouTube fix it? Is it even possible for us to know?

I have been researching far-right propaganda on YouTube since 2017, and I have consistently argued that we cannot understand radicalization on the platform by focusing solely on the algorithm. I have also come to find that we don’t actually need to understand the recommendation algorithm to know that YouTube is an effective source of far-right propaganda. In fact, I will go even further…

Becca Lewis writes for FFWD and researches media manipulation and political digital media at Stanford and Data & Society.