PsyPost
The latest psychology and neuroscience discoveries.

YouTube’s recommendation system exhibits left-leaning bias, new study suggests

YouTube’s algorithm tends to direct users to centrist content, but shows an asymmetry in political “escape speed”

by Eric W. Dolan
September 15, 2023
in Political Psychology, Social Media
(Photo credit: Adobe Stock)



YouTube’s recommendation algorithm tends to steer users away from extreme right-wing political content more quickly than it does from extreme left-wing political content, according to new research published in PNAS Nexus.

“YouTube, with more than two billion monthly active users, significantly influences the realm of online political video consumption,” explained study author Yasir Zaki, an assistant professor of computer science at New York University Abu Dhabi.

“In the United States, for example, a quarter of adults regularly engage with political content on the platform. Approximately 75% of YouTube video views result from its recommendation algorithm, highlighting the algorithm’s potential ability to foster echo chambers and disseminate extremist content, a matter of paramount concern. This motivated us to design an experiment to better understand how the algorithm influences what people see on the platform.”

The researchers conducted their study by employing 360 bots to simulate YouTube users. They created new Google and YouTube accounts for each bot to isolate the “personalization” process driven by YouTube’s recommendation algorithm.

Initially, the researchers collected the top 20 recommended videos from the YouTube homepage for new accounts without any watch history. They analyzed the distribution of recommended video categories and political classes, finding that “News & Politics” videos, when recommended, were primarily Center and Left-leaning.

Next, each user (bot) watched 30 videos matching its designated political group (e.g., Far Left, Left), with the recommended videos recorded after each watch. The analysis showed that recommendations largely aligned with the user’s political classification, and that the speed at which recommendations adapted to the user’s preferences varied across groups.
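The persona-training stage described above can be sketched in code. The sketch below is purely illustrative: `toy_recommender` is a made-up stand-in for YouTube’s (unknown) personalization logic, and the class labels are an assumed set based on the groupings the article mentions, not necessarily the study’s exact taxonomy.

```python
from collections import Counter

def toy_recommender(watch_history, n=20):
    """Return n recommended class labels; the more a class has been
    watched, the more it dominates the recommendations (a crude
    stand-in for personalization)."""
    if not watch_history:
        return ["Center"] * n
    counts = Counter(watch_history)
    top = counts.most_common(1)[0][0]
    # Mostly the dominant class, with a fixed centrist remainder.
    k = min(n, 2 * counts[top])
    return [top] * k + ["Center"] * (n - k)

def train_persona(persona_class, n_watches=30):
    """Each bot watches 30 videos from its assigned class and the
    recommendations shown after every watch are logged."""
    history, recs_log = [], []
    for _ in range(n_watches):
        history.append(persona_class)              # bot watches one video
        recs_log.append(toy_recommender(history))  # record recommendations
    return recs_log

recs = train_persona("Far Right")
# Share of the persona's class among the final recommendations.
share = recs[-1].count("Far Right") / len(recs[-1])
```

In this toy model the recommendations start mostly centrist and become fully aligned with the persona by the end of the 30 watches, mirroring the adaptation the researchers observed.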

Users who completed the first stage then watched 30 videos from a different political classification, aiming to see how quickly they could “escape” their original class and “enter” the new political class. The results indicated an asymmetry in the escape speed, suggesting a skew towards left-leaning content in YouTube’s recommendations.
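One way to quantify the “escape speed” the study measures is the number of cross-class videos a bot must watch before recommendations from the new class become the majority. A minimal sketch, using made-up adaptation rates rather than the paper’s data:

```python
def escape_speed(rec_shares, threshold=0.5):
    """1-based index of the first watch after which the share of
    recommendations from the NEW class exceeds the threshold;
    None if it never does."""
    for i, share in enumerate(rec_shares):
        if share > threshold:
            return i + 1
    return None

# Synthetic trajectories (illustrative numbers, not the paper's data):
# share of new-class recommendations after each of 30 cross-class watches.
far_right_to_left = [min(1.0, 0.06 * t) for t in range(1, 31)]  # adapts fast
far_left_to_right = [min(1.0, 0.03 * t) for t in range(1, 31)]  # adapts slowly

fast = escape_speed(far_right_to_left)
slow = escape_speed(far_left_to_right)
```

Here `slow > fast`, which is the shape of the asymmetry the researchers report: escaping a Far Right persona takes fewer watches than escaping a Far Left one.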

Finally, each bot repeatedly watched the top recommended video on its homepage, with the recommendations collected after each video. This stage traced transitions in the recommendations and found that the algorithm tended to pull users back toward centrist content and away from political extremes.
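That drift toward the center can be pictured as a tiny Markov chain. The transition table below is an assumption for illustration (following the top recommendation moves one step toward Center), not behavior measured in the study:

```python
# Assumed, deterministic "one step toward Center" transitions.
TOWARD_CENTER = {
    "Far Left": "Left",
    "Left": "Center",
    "Center": "Center",
    "Right": "Center",
    "Far Right": "Right",
}

def follow_top_recommendations(start, steps=10):
    """Trace the class of the top recommended video over repeated watches."""
    path = [start]
    for _ in range(steps):
        path.append(TOWARD_CENTER[path[-1]])
    return path

path = follow_top_recommendations("Far Right", steps=4)
```

Whatever the starting class, the trajectory is absorbed at Center within a few steps, which is the qualitative pattern the final stage of the experiment observed.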

Overall, the findings indicated that YouTube’s recommendation algorithm exhibited a left-leaning bias in the distribution of recommendations, even when controlling for the number of videos in different political classes. The algorithm made it easier for users to enter left-leaning political personas and escape from right-leaning ones.

In other words, when a user starts watching Far-Right political content on YouTube, the recommendation algorithm is more effective at suggesting and promoting content that is less extreme, which could include more moderate right-wing content or even centrist content. As a result, users who initially engage with Far-Right content may find themselves exposed to a broader range of political perspectives relatively quickly.

On the other hand, when a user starts watching Far-Left political content on YouTube, the algorithm is somewhat slower in guiding them away from extreme left-wing content. It takes users more time to transition from Far-Left content to less extreme content compared to the Far-Right scenario.

“Our research highlights that YouTube’s recommendation algorithm exerts a moderating influence on users, drawing them away from political extremes. However, this influence is not evenly balanced; it’s more effective in steering users away from Far Right content than from Far Left content. Additionally, our findings reveal that the algorithm’s recommendations lean leftward, even in the absence of a user’s prior viewing history.”

The research highlights how YouTube’s recommendation algorithm can influence users’ political content consumption.

“Research on algorithmic bias has attracted significant attention in recent years, and a number of solutions have been proposed to expose and address any such biases in today’s systems. Given this, we were surprised to find that the recommendation algorithm of YouTube, one of the most popular platforms, still exhibits a left-leaning political bias. These findings prompt inquiries into the appropriateness of political biases in recommendation algorithms on social media platforms, and the significant societal and political consequences that may arise as a result.”

But the study, like all research, includes some caveats. The study was conducted over a specific period, and YouTube’s algorithms and content landscape may evolve over time. Thus, the findings may not fully capture the platform’s current state or its future developments.

Additionally, the research primarily focused on the U.S. political context. As a result, the findings may not be directly applicable to other countries or political landscapes. The dynamics of YouTube’s recommendation algorithm and its impact on political content consumption could vary significantly in different cultural and political settings.

“Our study focused primarily on U.S. politics, and more research is needed to determine the degree to which these findings hold outside of the United States.”

The study, “YouTube’s recommendation algorithm is left-leaning in the United States,” was authored by Hazem Ibrahim, Nouar AlDahoul, Sangjin Lee, Talal Rahwan, and Yasir Zaki.
