YouTube’s Deradicalization Argument Is Really a Fight About Transparency

The reason we’re all focusing on the algorithm’s effects is that we don’t know how it really works

YouTube’s algorithm has once again been thrust into the limelight as 2019 comes to an end, thanks to a new study about the code that makes the platform tick, published on arXiv, an academic repository for “pre-print” papers that have not yet been peer-reviewed.

The paper, authored by programmer Mark Ledwich and a UC Berkeley researcher, Anna Zaitsev, looks at the role YouTube’s algorithm plays in recommending videos to users. Outside the paper, Ledwich has gone further, saying his data — which shows that YouTube’s algorithm is recommending videos from mainstream publishers — demonstrates, contrary to many media headlines, that YouTube’s algorithm “deradicalizes” users.

His co-author, Zaitsev, distanced herself from Ledwich’s comments when approached by FFWD. “We don’t really claim that it does deradicalize, we rather say that it is a leap to say that it does,” she says. “The last few comments about deradicalization are blown out of proportion on Twitter. Our main point is rather in the direction of the traffic.”

When asked if she disagreed with Ledwich’s claim that “the algo has a deradicalizing influence”, Zaitsev says: “My contribution is within the paper, not the Medium posts or Twitter.”

The paper has been submitted to the open-access journal First Monday for peer review, where the rigor of the research will be checked; if it passes muster, the paper will be accepted for publication.

Ledwich’s suggestion that the YouTube algorithm deradicalizes users is a leap in a number of ways, not least because showing people videos from mainstream channels doesn’t mean you’re deradicalizing them at all. Arvind Narayanan, a Princeton computer science professor, has debunked the claims in a Twitter thread, as have a number of journalists, myself included, pointing out that Ledwich tracked YouTube’s algorithm only at the tail end of a change the platform instigated after negative headlines about the algorithm’s potential to spool people off into dangerous niches.

“The paper itself — except for the discussion at the beginning — has a lot of interesting data, a lot of interesting conclusions to draw, but not that YouTube is deradicalizing,” says Guillaume Chaslot, a former YouTube engineer who worked on the algorithm and now runs AlgoTransparency.org, which tracks YouTube’s algorithm.

Chaslot says Ledwich approached him around a year ago to ask about the methodology behind AlgoTransparency, though he has since come to disagree with Ledwich. “I think he is very partisan and trying to show one side of the story,” says Chaslot. “He seemed like a cool guy but he seems very partisan, like there is a big conspiracy from the media that is trying to destroy social media, and he feels like he has to save social media from the gatekeepers of the media.” (Ledwich initially shrugged off my request for comment, saying “You can work with someone who maybe shares more of your biases”, before offering to provide the underlying data behind his site, Recfluence.)

There’s also a question of what kind of content passes for “mainstream”. Chaslot’s analysis of YouTube’s algorithm throughout 2019 shows the most recommended political channels are Fox News and MSNBC, generally considered the right- and left-wing fringes of the mainstream media. “You can argue that’s better than Alex Jones, but it still seems YouTube is pushing people to be more and more partisan and to fight against each other,” says Chaslot. In part, that’s down to the type of content each channel creates. “It seems like MSNBC and Fox News accusing the other side of being horrible people is just more efficient for engagement. They are getting more free advertising from YouTube’s algorithm. The question I raise is: do people really want to live in this super-partisan world, or is it just that this super-partisan world is better for YouTube’s own engagement metrics?”

In many ways, the algorithm is not directly to blame for radicalization, but radicalization is a byproduct of the algorithm. “I think [Ledwich and Zaitsev] are right in saying that YouTube is not responsible for the radicalization,” says Steven Buckley, an associate lecturer at the University of the West of England who is doing a PhD on YouTube’s promotion of media sources and politics on the platform. “It’s the content itself that is the problem. YouTube feels it has to direct those who want right-wing stuff to more right-wing stuff. It’s just that that ‘stuff’ is getting more extreme of its own accord.”

I often make the argument that YouTube prioritizes car crashes over paint drying because we as human beings find the former intrinsically more interesting. And in an attention economy, on a platform designed to promote and prolong watch time, YouTube wants to promote more interesting content. Often that means “more extreme.”

“If you have an algorithm that rewards engagement, of course you’re going to push people towards borderline content,” says Chaslot. “That’s what happened for a long time and that’s what’s happening.” And because the algorithm is looking for ever more engaging content, it pushes people further to the fringes of acceptability — nudging the “borderline” as far as it’ll go.
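
The dynamic Chaslot describes can be made concrete with a toy model. What follows is a minimal, purely illustrative sketch in Python: the extremity score, the policy cutoff and the engagement model that rewards extremity are all invented for the demonstration, and none of it reflects how YouTube’s recommender actually works (which, as we’ll see, is precisely the problem). Under those assumptions, though, a ranker that greedily maximizes predicted engagement reliably surfaces the most borderline content it is allowed to show.

```python
# Toy model, not YouTube's actual system: if predicted engagement tends
# to rise with how "borderline" a video is, a ranker that greedily
# maximizes engagement keeps surfacing content at the edge of what's allowed.
import random
from dataclasses import dataclass

@dataclass
class Video:
    title: str
    extremity: float  # invented score: 0.0 = paint drying, 1.0 = policy-violating

POLICY_LIMIT = 0.9  # hypothetical cutoff: videos above this are removed outright

def predicted_engagement(video: Video) -> float:
    # The assumption doing all the work: engagement grows with extremity,
    # plus a little noise. Make this function flat and the drift disappears.
    return video.extremity + random.gauss(0, 0.05)

def recommend(candidates: list[Video], k: int = 3) -> list[Video]:
    # Rank only policy-compliant videos, purely by predicted engagement.
    allowed = [v for v in candidates if v.extremity <= POLICY_LIMIT]
    return sorted(allowed, key=predicted_engagement, reverse=True)[:k]

random.seed(0)
catalog = [Video(f"video-{i}", extremity=i / 100) for i in range(100)]
for v in recommend(catalog):
    print(f"{v.title}: extremity={v.extremity:.2f}")
# The winners cluster just under POLICY_LIMIT: the "borderline" gets
# nudged as far as it will go, with no intent to radicalize anywhere
# in the code.
```

The ranking logic itself is neutral; the drift toward the fringe comes entirely from what the metric rewards, which is Chaslot’s point.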

But the argument that has roiled over the last few days, with people who subscribe to the view that the mainstream media is out to attack YouTube embracing Ledwich’s findings while others dispute them vehemently, is evidence of the bigger problem with the debate around the algorithm and radicalization: we simply don’t have a good enough handle on how YouTube works.

Even the people closest to the platform are working semi-blind. Visit any gathering of YouTube creators and within a matter of minutes the conversation will likely turn to the algorithm. It’s like an omnipotent deity that must be appeased at all times. The problem is that no one knows how to appease it with any satisfying degree of certainty, because YouTube doesn’t share much detail about how the algorithm works. Instead it’s left to creators, and increasingly to researchers, to try to divine how it works and share their results.

“When these systems are opaque, their decisions can seem mystical or even sinister,” says Becca Lewis, a researcher at Stanford University, who Ledwich has previously singled out for criticism. “Creators are forced to develop their own theories about how things work, and academics are left attempting to imitate and recreate these systems from the outside. This erodes trust in the platform and in everyone trying to understand it.”

It’s not just the recommendation engine that YouTube is publicly circumspect about. No one reliably knows how many videos there are on the platform, and YouTube provides only periodic updates on the scale of its growth, often in vague terms and usually as casual mentions in larger stories (for instance, the site has moved relatively recently from being cited as having “500 hours” of video uploaded every minute to “more than 500 hours”). It’s left to those of us watching the platform to read the tea leaves and decide whether the change in language is significant, or just a journalist phrasing something differently. We have become 1980s Kremlinologists, guessing at what’s going on behind a giant, reinforced wall. In the absence of hard data, we’re left to rely on partial snapshots and personal experiences, enough of which exist to suggest that YouTube has had a radicalizing effect in the recent past.
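
That tea-leaf reading matters because the numbers involved are enormous. Here is a back-of-envelope sketch that simply takes the oft-quoted “500 hours a minute” figure at face value, since YouTube’s disclosures give us nothing firmer to work with:

```python
# Back-of-envelope scale, taking YouTube's "500 hours uploaded every
# minute" figure at face value; the company publishes nothing firmer.
hours_per_minute = 500
hours_per_day = hours_per_minute * 60 * 24  # 720,000 hours of new video a day
years_per_day = hours_per_day / (24 * 365)  # roughly 82 years of footage daily
print(f"{hours_per_day:,} hours per day, about {years_per_day:.0f} years of video")
```

At that scale, even the slippage between “500 hours” and “more than 500 hours” could hide thousands of extra hours of video a day, which is why the change in wording is worth parsing at all.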

Chaslot agrees that “it’s been time for a long time for YouTube to lift the lid” on how the algorithm works. “They say it’s difficult, they have plenty of reasons why they can’t give us this data, but if we don’t have this data, we don’t know where the algorithm is pushing society.”

Until they do, we’ll be second-guessing with varying degrees of specificity, based on whatever peeks we can get through that wall. As a story we’ll be publishing in early 2020 will detail, a campaign is underway to compel YouTube to release more information about how it works, though the platform has so far robustly defended its right not to shed any more light on things.

“More transparency is not a cure-all solution, and other researchers have astutely noted that it can come with its own set of problems,” says Lewis. The major argument against it lies at the heart of this current dispute: bad actors are already pushing the envelope using what little data we have about how the platform works. If the algorithm’s inner workings were opened up to everyone, the result could be a free-for-all.

“But as it currently stands, YouTube asks us to trust that it has made meaningful changes to a system, without providing us with any useful metrics for understanding those changes,” says Lewis. “And we are left with a series of competing studies about the algorithm that are all operating with insufficient data.”

If we could ask for one resolution from YouTube in 2020, it’d be this: pull back the curtain — even just a little bit.

FFWD: Getting you up to speed with the world of online video

Written by Chris Stokel-Walker, a UK-based freelancer for The Guardian, The Economist, BuzzFeed News, the BBC and more. Tell me your story, or get me to write for you: stokel@gmail.com