PsyPost

YouTube’s recommendation system exhibits left-leaning bias, new study suggests

YouTube's algorithm tends to direct users to centrist content, but shows an asymmetry in political "escape speed"

by Eric W. Dolan
September 15, 2023
in Political Psychology, Social Media
(Photo credit: Adobe Stock)


YouTube's recommendation algorithm tends to steer users away from extreme right-wing political content more quickly than from extreme left-wing political content, according to new research published in PNAS Nexus.

“YouTube, with more than two billion monthly active users, significantly influences the realm of online political video consumption,” explained study author Yasir Zaki, an assistant professor of computer science at New York University Abu Dhabi.

“In the United States, for example, a quarter of adults regularly engage with political content on the platform. Approximately 75% of YouTube video views result from its recommendation algorithm, highlighting the algorithm’s potential ability to foster echo chambers and disseminate extremist content, a matter of paramount concern. This motivated us to design an experiment to better understand how the algorithm influences what people see on the platform.”

The researchers conducted their study by employing 360 bots to simulate YouTube users. They created new Google and YouTube accounts for each bot to isolate the “personalization” process driven by YouTube’s recommendation algorithm.

Initially, the researchers collected the top 20 recommended videos from the YouTube homepage for new accounts without any watch history. They analyzed the distribution of recommended video categories and political classes, finding that “News & Politics” videos, when recommended, were primarily Center and Left-leaning.

Next, each user (bot) watched 30 videos matching their designated political group (e.g., Far Left, Left, etc.), recording the recommended videos after each watch. The analysis showed that recommendations largely aligned with the user’s political classification, and the speed at which recommendations adapted to the user’s preferences varied.

Users who completed the first stage then watched 30 videos from a different political classification, aiming to see how quickly they could “escape” their original class and “enter” the new political class. The results indicated an asymmetry in the escape speed, suggesting a skew towards left-leaning content in YouTube’s recommendations.

Finally, each user repeatedly watched the top recommended video on their homepage, with the researchers collecting recommendations after each video. This stage explored transitions in recommendations and found that the algorithm tended to return users to centrist content, steering them away from political extremes.
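The "escape speed" idea described above can be sketched in a few lines of code. This is a hypothetical illustration, not the authors' actual implementation: the function name, the five-class labeling scheme, and the majority-vote threshold are all assumptions made for the example. The idea is simply to count how many videos a bot must watch before its recommendations stop majority-matching its original political class.

```python
# Hypothetical sketch of an "escape speed" measurement, assuming each bot's
# top recommendations are classified into one of five political classes
# after every watched video. All names and thresholds are illustrative.

CLASSES = ["Far Left", "Left", "Center", "Right", "Far Right"]

def escape_speed(recommendation_history, origin_class):
    """Return the number of watched videos before a majority of the top
    recommendations no longer match the bot's original political class."""
    for step, recs in enumerate(recommendation_history, start=1):
        matching = sum(1 for r in recs if r == origin_class)
        if matching < len(recs) / 2:
            return step
    return None  # never "escaped" within the watch budget

# Toy example: after each of three watched videos, the top-4 recommended
# videos are classified by political class.
history = [
    ["Far Right", "Far Right", "Right", "Far Right"],
    ["Far Right", "Right", "Center", "Far Right"],
    ["Right", "Center", "Center", "Far Right"],
]
print(escape_speed(history, "Far Right"))  # -> 3
```

Under this framing, the study's asymmetry amounts to the observation that the measured escape value was, on average, smaller for Far-Right starting classes than for Far-Left ones.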


Overall, the findings indicated that YouTube’s recommendation algorithm exhibited a left-leaning bias in the distribution of recommendations, even when controlling for the number of videos in different political classes. The algorithm made it easier for users to enter left-leaning political personas and escape from right-leaning ones.

In other words, when a user starts watching Far-Right political content on YouTube, the recommendation algorithm is more effective at suggesting and promoting content that is less extreme, which could include more moderate right-wing content or even centrist content. As a result, users who initially engage with Far-Right content may find themselves exposed to a broader range of political perspectives relatively quickly.

On the other hand, when a user starts watching Far-Left political content on YouTube, the algorithm is somewhat slower in guiding them away from extreme left-wing content. It takes users more time to transition from Far-Left content to less extreme content compared to the Far-Right scenario.

“Our research highlights that YouTube’s recommendation algorithm exerts a moderating influence on users, drawing them away from political extremes. However, this influence is not evenly balanced; it’s more effective in steering users away from Far Right content than from Far Left content. Additionally, our findings reveal that the algorithm’s recommendations lean leftward, even in the absence of a user’s prior viewing history.”

The findings underscore how YouTube’s recommendation algorithm can shape users’ political content consumption.

“Research on algorithmic bias has attracted significant attention in recent years, and a number of solutions have been proposed to expose and address any such biases in today’s systems. Given this, we were surprised to find that the recommendation algorithm of YouTube, one of the most popular platforms, still exhibits a left-leaning political bias. These findings prompt inquiries into the appropriateness of political biases in recommendation algorithms on social media platforms, and the significant societal and political consequences that may arise as a result.”

But the study, like all research, includes some caveats. The study was conducted over a specific period, and YouTube’s algorithms and content landscape may evolve over time. Thus, the findings may not fully capture the platform’s current state or its future developments.

Additionally, the research primarily focused on the U.S. political context. As a result, the findings may not be directly applicable to other countries or political landscapes. The dynamics of YouTube’s recommendation algorithm and its impact on political content consumption could vary significantly in different cultural and political settings.

“Our study focused primarily on U.S. politics, and more research is needed to determine the degree to which these findings hold outside of the United States.”

The study, “YouTube’s recommendation algorithm is left-leaning in the United States,” was authored by Hazem Ibrahim, Nouar AlDahoul, Sangjin Lee, Talal Rahwan, and Yasir Zaki.

PsyPost is a psychology and neuroscience news website dedicated to reporting the latest research on human behavior, cognition, and society.
© PsyPost Media Inc