
YouTube’s recommendation system exhibits left-leaning bias, new study suggests

YouTube's algorithm tends to direct users to centrist content, but shows an asymmetry in political "escape speed"

by Eric W. Dolan
September 15, 2023
in Political Psychology, Social Media
(Photo credit: Adobe Stock)

YouTube's recommendation algorithm tends to steer users away from extreme right-wing political content more quickly than from extreme left-wing political content, according to new research published in PNAS Nexus.

“YouTube, with more than two billion monthly active users, significantly influences the realm of online political video consumption,” explained study author Yasir Zaki, an assistant professor of computer science at New York University Abu Dhabi.

“In the United States, for example, a quarter of adults regularly engage with political content on the platform. Approximately 75% of YouTube video views result from its recommendation algorithm, highlighting the algorithm’s potential ability to foster echo chambers and disseminate extremist content, a matter of paramount concern. This motivated us to design an experiment to better understand how the algorithm influences what people see on the platform.”

The researchers conducted their study by employing 360 bots to simulate YouTube users. They created new Google and YouTube accounts for each bot to isolate the “personalization” process driven by YouTube’s recommendation algorithm.

Initially, the researchers collected the top 20 recommended videos from the YouTube homepage for new accounts without any watch history. They analyzed the distribution of recommended video categories and political classes, finding that “News & Politics” videos, when recommended, were primarily Center and Left-leaning.

Next, each user (bot) watched 30 videos matching their designated political group (e.g., Far Left, Left, etc.), recording the recommended videos after each watch. The analysis showed that recommendations largely aligned with the user’s political classification, and the speed at which recommendations adapted to the user’s preferences varied.

Users who completed the first stage then watched 30 videos from a different political classification, aiming to see how quickly they could “escape” their original class and “enter” the new political class. The results indicated an asymmetry in the escape speed, suggesting a skew towards left-leaning content in YouTube’s recommendations.

Finally, each user repeatedly watched the top recommended video on its homepage, with recommendations recorded after each video. This stage explored transitions in recommendations and found that the algorithm tended to pull users back toward centrist content and away from political extremes.
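The three-stage design described above can be sketched in code. The model below is purely illustrative: the class labels follow the article, but the drift-based recommender, the `rate` parameter, and the `run_stage` helper are hypothetical stand-ins for the study's actual crawling setup, not YouTube's algorithm.

```python
# Illustrative sketch of the study's bot design (NOT YouTube's real algorithm).
# Recommendations are modeled as a distribution over political classes that
# drifts toward whatever class the bot is currently watching.

CLASSES = ["Far Left", "Left", "Center", "Right", "Far Right"]

def run_stage(state, target_class, n_videos=30, rate=0.1):
    """Watch n_videos of target_class; recommendations drift toward it.

    state maps class -> share of recommendations. Returns the step at which
    target_class first becomes the modal recommendation (the "entry" point),
    or None if it never does within the stage.
    """
    entered_at = None
    for step in range(1, n_videos + 1):
        for c in state:                 # decay all shares...
            state[c] *= (1 - rate)
        state[target_class] += rate     # ...and boost the watched class
        if entered_at is None and max(state, key=state.get) == target_class:
            entered_at = step
    return entered_at

# Stage 1: a fresh account (uniform recommendations) builds a persona.
state = {c: 1 / len(CLASSES) for c in CLASSES}
run_stage(state, "Far Right")

# Stage 2: switch to the opposite class and record the escape/entry speed.
escape_step = run_stage(state, "Far Left")
print(escape_step)
```

In the study, this escape speed was compared across directions (e.g. Far Right to Left versus Far Left to Right); in this toy model the drift rate is symmetric, so it reproduces the procedure but not the asymmetry the researchers observed.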


Overall, the findings indicated that YouTube’s recommendation algorithm exhibited a left-leaning bias in the distribution of recommendations, even when controlling for the number of videos in different political classes. The algorithm made it easier for users to enter left-leaning political personas and escape from right-leaning ones.

In other words, when a user starts watching Far-Right political content on YouTube, the recommendation algorithm is more effective at suggesting and promoting content that is less extreme, which could include more moderate right-wing content or even centrist content. As a result, users who initially engage with Far-Right content may find themselves exposed to a broader range of political perspectives relatively quickly.

On the other hand, when a user starts watching Far-Left political content on YouTube, the algorithm is somewhat slower in guiding them away from extreme left-wing content. It takes users more time to transition from Far-Left content to less extreme content compared to the Far-Right scenario.

“Our research highlights that YouTube’s recommendation algorithm exerts a moderating influence on users, drawing them away from political extremes. However, this influence is not evenly balanced; it’s more effective in steering users away from Far Right content than from Far Left content. Additionally, our findings reveal that the algorithm’s recommendations lean leftward, even in the absence of a user’s prior viewing history.”

The research highlights how YouTube’s recommendation algorithm can influence users’ political content consumption.

“Research on algorithmic bias has attracted significant attention in recent years, and a number of solutions have been proposed to expose and address any such biases in today’s systems. Given this, we were surprised to find that the recommendation algorithm of YouTube, one of the most popular platforms, still exhibits a left-leaning political bias. These findings prompt inquiries into the appropriateness of political biases in recommendation algorithms on social media platforms, and the significant societal and political consequences that may arise as a result.”

But the study, like all research, includes some caveats. The study was conducted over a specific period, and YouTube’s algorithms and content landscape may evolve over time. Thus, the findings may not fully capture the platform’s current state or its future developments.

Additionally, the research primarily focused on the U.S. political context. As a result, the findings may not be directly applicable to other countries or political landscapes. The dynamics of YouTube’s recommendation algorithm and its impact on political content consumption could vary significantly in different cultural and political settings.

“Our study focused primarily on U.S. politics, and more research is needed to determine the degree to which these findings hold outside of the United States.”

The study, “YouTube’s recommendation algorithm is left-leaning in the United States”, was authored by Hazem Ibrahim, Nouar AlDahoul, Sangjin Lee, Talal Rahwan, and Yasir Zaki.


PsyPost is a psychology and neuroscience news website dedicated to reporting the latest research on human behavior, cognition, and society.


(c) PsyPost Media Inc
