PsyPost

YouTube’s recommendation system exhibits left-leaning bias, new study suggests

YouTube's algorithm tends to direct users to centrist content, but shows an asymmetry in political "escape speed"

by Eric W. Dolan
September 15, 2023
in Political Psychology, Social Media
(Photo credit: Adobe Stock)


YouTube's recommendation algorithm tends to steer users away from extreme right-wing political content more quickly than it steers them away from extreme left-wing political content, according to new research published in PNAS Nexus.

“YouTube, with more than two billion monthly active users, significantly influences the realm of online political video consumption,” explained study author Yasir Zaki, an assistant professor of computer science at New York University Abu Dhabi.

“In the United States, for example, a quarter of adults regularly engage with political content on the platform. Approximately 75% of YouTube video views result from its recommendation algorithm, highlighting the algorithm’s potential ability to foster echo chambers and disseminate extremist content, a matter of paramount concern. This motivated us to design an experiment to better understand how the algorithm influences what people see on the platform.”

The researchers conducted their study by employing 360 bots to simulate YouTube users. They created new Google and YouTube accounts for each bot to isolate the “personalization” process driven by YouTube’s recommendation algorithm.

Initially, the researchers collected the top 20 recommended videos from the YouTube homepage for new accounts without any watch history. They analyzed the distribution of recommended video categories and political classes, finding that “News & Politics” videos, when recommended, were primarily Center and Left-leaning.

Next, each user (bot) watched 30 videos matching their designated political group (e.g., Far Left, Left, etc.), recording the recommended videos after each watch. The analysis showed that recommendations largely aligned with the user’s political classification, and the speed at which recommendations adapted to the user’s preferences varied.

Bots that completed the first stage then watched 30 videos from a different political classification, allowing the researchers to measure how quickly a bot could “escape” its original class and “enter” the new one. The results indicated an asymmetry in this escape speed, suggesting a skew towards left-leaning content in YouTube’s recommendations.

Finally, each bot repeatedly watched the top recommended video on its homepage, with the researchers collecting the recommendations after each video. This stage explored transitions in recommendations and found that the algorithm tended to pull users back toward centrist content and away from political extremes.
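The “escape speed” measure described above can be sketched in code. This is an illustrative reconstruction, not the authors' implementation: the function, the 50% threshold, and the trajectory data below are all hypothetical, chosen only to show how one might count the number of videos watched before a bot's recommendations shift away from its original political class.

```python
def escape_speed(trajectories, original_class, threshold=0.5):
    """For each bot trajectory, return the first watch step at which the
    fraction of recommendations from `original_class` drops below
    `threshold`, or None if it never does.

    `trajectories` is a list of trajectories; each trajectory is a list of
    recommendation batches (one batch per video watched), and each batch is
    a list of political-class labels for the recommended videos.
    """
    speeds = []
    for batches in trajectories:
        speed = None
        for step, batch in enumerate(batches, start=1):
            frac_original = sum(1 for c in batch if c == original_class) / len(batch)
            if frac_original < threshold:
                speed = step
                break
        speeds.append(speed)
    return speeds

# Hypothetical trajectory: a "Far Right" bot whose recommendations
# dilute below 50% Far-Right content after the second video watched.
traj = [[
    ["Far Right"] * 8 + ["Right"] * 2,    # step 1: 80% Far Right
    ["Far Right"] * 4 + ["Right"] * 6,    # step 2: 40% Far Right
    ["Right"] * 7 + ["Center"] * 3,       # step 3: 0% Far Right
]]
print(escape_speed(traj, "Far Right"))  # → [2]
```

Comparing the distribution of such escape speeds between bots moving away from Far-Right versus Far-Left content is the kind of asymmetry the study reports.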


Overall, the findings indicated that YouTube’s recommendation algorithm exhibited a left-leaning bias in the distribution of recommendations, even when controlling for the number of videos in different political classes. The algorithm made it easier for users to enter left-leaning political personas and escape from right-leaning ones.

In other words, when a user starts watching Far-Right political content on YouTube, the recommendation algorithm is more effective at suggesting and promoting content that is less extreme, which could include more moderate right-wing content or even centrist content. As a result, users who initially engage with Far-Right content may find themselves exposed to a broader range of political perspectives relatively quickly.

On the other hand, when a user starts watching Far-Left political content on YouTube, the algorithm is somewhat slower in guiding them away from extreme left-wing content. It takes users more time to transition from Far-Left content to less extreme content compared to the Far-Right scenario.

“Our research highlights that YouTube’s recommendation algorithm exerts a moderating influence on users, drawing them away from political extremes. However, this influence is not evenly balanced; it’s more effective in steering users away from Far Right content than from Far Left content. Additionally, our findings reveal that the algorithm’s recommendations lean leftward, even in the absence of a user’s prior viewing history.”

The research highlights how YouTube’s recommendation algorithm can influence users’ political content consumption.

“Research on algorithmic bias has attracted significant attention in recent years, and a number of solutions have been proposed to expose and address any such biases in today’s systems. Given this, we were surprised to find that the recommendation algorithm of YouTube, one of the most popular platforms, still exhibits a left-leaning political bias. These findings prompt inquiries into the appropriateness of political biases in recommendation algorithms on social media platforms, and the significant societal and political consequences that may arise as a result.”

But the study, like all research, includes some caveats. The study was conducted over a specific period, and YouTube’s algorithms and content landscape may evolve over time. Thus, the findings may not fully capture the platform’s current state or its future developments.

Additionally, the research primarily focused on the U.S. political context. As a result, the findings may not be directly applicable to other countries or political landscapes. The dynamics of YouTube’s recommendation algorithm and its impact on political content consumption could vary significantly in different cultural and political settings.

“Our study focused primarily on U.S. politics, and more research is needed to determine the degree to which these findings hold outside of the United States.”

The study, “YouTube’s recommendation algorithm is left-leaning in the United States”, was authored by Hazem Ibrahim, Nouar AlDahoul, Sangjin Lee, Talal Rahwan, and Yasir Zaki.


(c) PsyPost Media Inc
