All of YouTube, Not Just the Algorithm, is a Far-Right Propaganda Machine

YouTube’s celebrity culture and community dynamics play a major role in the amplification of far-right content

Becca Lewis
FFWD


Image: Unsplash/Oleg Laptev

In recent years, the media has sounded a constant drumbeat about YouTube: Its recommendation algorithm is radicalizing people.

First articulated by Zeynep Tufekci in a short piece for The New York Times, and later corroborated by ex-Google employee Guillaume Chaslot, the theory goes roughly as follows: YouTube, in its drive to keep eyeballs glued to its platform, nudges people toward more and more extreme content over time. For most types of content, this tendency is relatively harmless, but in the case of political content, it can drive people down “algorithmic rabbit holes” to conspiracy theories or white supremacist propaganda.

This theory has spawned a wave of understandably outraged headlines: “How YouTube Built a Radicalization Machine”; “How YouTube Drives People to the Internet’s Darkest Corners”; “How YouTube Pushes Viewers to Extremism.” In response, YouTube announced last winter that it had made changes to its algorithm to reduce recommendations of “borderline” and conspiracy content — although it remained frustratingly vague and opaque about the specifics. In fact, YouTube’s lack of transparency
