Maybe It’s Not YouTube’s Algorithm That Radicalizes People


YouTube is the biggest social media platform in the country, and, perhaps, the most misunderstood. Over the past few years, the Google-owned platform has become a media powerhouse where political discussion is dominated by right-wing channels offering an ideological alternative to established news outlets. And, according to new research from Penn State University, these channels are far from fringe—they’re the new mainstream, and recently surpassed the big three US cable news networks in terms of viewership.

The paper, written by Penn State political scientists Kevin Munger and Joseph Phillips, tracks the explosive growth of alternative political content on YouTube, and calls into question many of the field’s established narratives. It challenges the popular school of thought that YouTube’s recommendation algorithm is the central factor responsible for radicalizing users and pushing them into a far-right rabbit hole.

The authors say that thesis largely grew out of media reports, and hasn’t been rigorously analyzed. The best prior studies, they say, haven’t been able to prove that YouTube’s algorithm has any noticeable effect. “We think this theory is incomplete, and potentially misleading,” Munger and Phillips argue in the paper. “And we think that it has rapidly gained a place in the center of the study of media and politics on YouTube because it implies an obvious policy solution—one which is flattering to the journalists and academics studying the phenomenon.”

Instead, the paper suggests that radicalization on YouTube stems from the same factors that persuade people to change their minds in real life—injecting new information—but at scale. The authors say the quantity and popularity of alternative (mostly right-wing) political media on YouTube is driven by both supply and demand. The supply has grown because YouTube appeals to right-wing content creators, with its low barrier to entry, easy way to make money, and reliance on video, which is easier to create and more impactful than text.

“This is attractive for a lone, fringe political commentator, who can produce enough video content to establish themselves as a major source of media for a fanbase of any size, without needing to acquire power or legitimacy by working their way up a corporate media ladder,” the paper says.

According to the authors, that increased supply of right-wing videos tapped a latent demand. “We believe that the novel and disturbing fact of people consuming white nationalist video media was not caused by the supply of this media ‘radicalizing’ an otherwise moderate audience,” they write. “Rather, the audience already existed, but they were constrained” by limited supply.

Other researchers in the field agree, including those whose work has been cited by the press as evidence of the power of YouTube’s recommendation system. Manoel Ribeiro, a researcher at the Swiss Federal Institute of Technology Lausanne and one of the authors of what the Penn State researchers describe as “the most rigorous and comprehensive analysis of YouTube radicalization to date,” says that his work was misinterpreted to fit the algorithmic radicalization narrative by so many outlets that he lost count.

For his study, published in July, Ribeiro and his coauthors examined more than 330,000 YouTube videos from 360 channels, mostly associated with far-right ideology. They broke the channels into four groups, based on their degree of radicalization. They found that a YouTube viewer who watches a video from the second-most-extreme group and follows the algorithm’s recommendations has only a 1-in-1,700 chance of arriving at a video from the most extreme group. For a viewer who starts with a video from the mainstream media, the chance of being shown a video from the most extreme group is even smaller.
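One way to picture that kind of finding is as a random walk over channel tiers: a viewer watches a video, the algorithm recommends the next one, and the question is how often the walk ever reaches the most extreme tier. The sketch below is a minimal illustration of that idea only; the transition matrix is entirely hypothetical and does not come from Ribeiro’s data.

```python
import random

# HYPOTHETICAL per-step transition probabilities between channel tiers,
# for illustration only -- not the values measured by Ribeiro et al.
# Tiers: 0 = mainstream media, 1-2 = increasingly extreme, 3 = most extreme.
TRANSITIONS = {
    0: [0.950, 0.040, 0.009, 0.001],
    1: [0.300, 0.600, 0.090, 0.010],
    2: [0.100, 0.300, 0.550, 0.050],
    3: [0.050, 0.150, 0.300, 0.500],
}

def walk_reaches_extreme(start_tier, steps, rng):
    """Simulate one viewer following recommendations for `steps` videos;
    return True if they ever land on a most-extreme (tier 3) video."""
    tier = start_tier
    for _ in range(steps):
        tier = rng.choices(range(4), weights=TRANSITIONS[tier])[0]
        if tier == 3:
            return True
    return False

def estimate_probability(start_tier, steps=10, trials=50_000, seed=0):
    """Monte Carlo estimate of the chance a walk from `start_tier`
    reaches the most extreme tier within `steps` recommendations."""
    rng = random.Random(seed)
    hits = sum(walk_reaches_extreme(start_tier, steps, rng)
               for _ in range(trials))
    return hits / trials
```

Under any matrix of this shape, a walk starting in a more extreme tier reaches the most extreme tier more often than one starting at mainstream media, which is the qualitative pattern the study reports.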

Munger and Phillips cite Ribeiro’s paper in their own, published earlier this month. They looked at 50 YouTube channels that researcher Rebecca Lewis identified in a 2018 paper as the “Alternative Influence Network.” Munger and Phillips reviewed the metadata for close to a million YouTube videos posted by those channels and mainstream news organizations between January 2008 and October 2018. The researchers also analyzed trends in search rankings for the videos, using YouTube’s API to obtain snapshots of how they were recommended to viewers at different points over the last decade.


