TikTok’s algorithm exhibited pro-Republican bias during 2024 presidential race, study finds

Trump videos were more likely to reach Democrats on TikTok than Harris videos were to reach Republicans

by Eric W. Dolan
February 4, 2025
in Political Psychology, Social Media
(Photo credit: Adobe Stock)

TikTok, a widely used social media platform with over a billion active users worldwide, has become a key source of news, particularly for younger audiences. This growing influence has raised concerns about potential political biases in its recommendation algorithm, especially during election cycles. A recent preprint study examined this issue by analyzing how TikTok’s algorithm recommends political content ahead of the 2024 presidential election. Using a controlled experiment involving hundreds of simulated user accounts, the study found that Republican-leaning accounts received significantly more ideologically aligned content than Democratic-leaning accounts, while Democratic-leaning accounts were more frequently exposed to opposing viewpoints.

TikTok has become a major force among social media platforms, boasting over a billion monthly active users worldwide and 170 million in the United States. It has also emerged as a significant source of news, particularly for younger demographics. This has raised concerns about the platform’s potential to shape political narratives and influence elections.

Despite these concerns, there has been limited research investigating TikTok's recommendation algorithm for political biases, especially in comparison to extensive research on other social media platforms like Facebook, Instagram, YouTube, X (formerly Twitter), and Reddit.

“We previously conducted experiments auditing YouTube’s recommendation algorithms. That study, published in PNAS Nexus, demonstrated that the algorithm exhibited a left-leaning bias in the United States,” said Yasir Zaki, an assistant professor of computer science at New York University Abu Dhabi.

“Given TikTok’s widespread popularity—particularly among younger demographics—we sought to replicate this study on TikTok during the 2024 U.S. presidential elections. Another motivation was that concerns over TikTok’s Chinese ownership had led many U.S. politicians to advocate for banning the platform, citing fears that its recommendation algorithm could be used to promote a political agenda.”

To examine how TikTok’s algorithm recommends political content, the researchers designed an extensive audit experiment. They created 323 “sock puppet” accounts—fake accounts programmed to simulate user behavior—across three politically diverse states: Texas, New York, and Georgia. Each account was assigned a political leaning: Democratic, Republican, or neutral (the control group).

The experiment consisted of two stages: a conditioning stage and a recommendation stage. In the conditioning stage, the Democratic accounts watched up to 400 Democratic-aligned videos, and the Republican accounts watched up to 400 Republican-aligned videos. Neutral accounts skipped this stage. This was done to “teach” TikTok’s algorithm the political preferences of each account.

In the recommendation stage, all accounts watched videos on TikTok’s “For You” page, which is the platform’s main feed of recommended content. The accounts watched 10 videos, followed by a one-hour pause, and repeated this process for six days. Each experimental run lasted one week. The researchers collected data on approximately 394,000 videos viewed by these accounts between April 30th and November 11th, 2024.
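The two-stage protocol can be sketched as a simple schedule generator. This is a hypothetical simplification: the actual study drove real TikTok sessions through automated accounts, and the number of sessions per day (here assumed hourly, 24 per day) and the per-state account split are not specified in the article.

```python
from dataclasses import dataclass, field

STATES = ["Texas", "New York", "Georgia"]
LEANINGS = ["Democratic", "Republican", "neutral"]

@dataclass
class SockPuppet:
    state: str
    leaning: str
    watched: list = field(default_factory=list)

def conditioning_stage(account, seed_videos, limit=400):
    """Partisan accounts watch up to 400 party-aligned videos; neutral accounts skip this stage."""
    if account.leaning != "neutral":
        account.watched.extend(seed_videos[:limit])

def recommendation_stage(account, feed, days=6, sessions_per_day=24, videos_per_session=10):
    """Watch the 'For You' feed in batches of 10, with a pause between batches, for six days."""
    for _ in range(days * sessions_per_day):
        account.watched.extend(next(feed) for _ in range(videos_per_session))

# Demo: one Democratic-conditioned account in Texas.
acct = SockPuppet("Texas", "Democratic")
conditioning_stage(acct, list(range(400)))
recommendation_stage(acct, iter(range(100_000)))
```

Under these assumptions, each partisan account logs 400 conditioning videos plus 1,440 feed videos per one-week run.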

To analyze the political content of the recommended videos, the researchers downloaded the English transcripts of videos when available (22.8% of unique videos). They then used a system involving three large language models—GPT-4o, Gemini-Pro, and GPT-4—to classify each video. The language models answered questions about whether the video was political, whether it concerned the 2024 U.S. elections or major political figures, and what the ideological stance of the video was (pro-Democratic, anti-Democratic, pro-Republican, anti-Republican, or neutral). The majority vote of the three language models was used as the final classification for each question.
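The majority-vote step can be implemented in a few lines. This is a minimal sketch; the prompts sent to GPT-4o, Gemini-Pro, and GPT-4 and the exact answer format they return are assumptions here, and a video with no two-model agreement is simply left unclassified.

```python
from collections import Counter

STANCES = {"pro-Democratic", "anti-Democratic",
           "pro-Republican", "anti-Republican", "neutral"}

def majority_vote(labels):
    """Return the label at least two of the three models agree on, else None."""
    label, count = Counter(labels).most_common(1)[0]
    return label if count >= 2 else None

# Example: two of the three models call the video pro-Republican.
final = majority_vote(["pro-Republican", "pro-Republican", "neutral"])
```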

The analysis uncovered significant asymmetries in content distribution on TikTok. Republican-seeded accounts received approximately 11.8% more party-aligned recommendations compared to Democratic-seeded accounts. Democratic-seeded accounts were exposed to approximately 7.5% more opposite-party recommendations on average. These differences were consistent across all three states and could not be explained by differences in engagement metrics like likes, views, shares, comments, or followers.
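Given per-video classifications, the aligned and opposite-party shares behind percentages like these can be computed as follows. This is a sketch on toy data; the record structure and field names are hypothetical, and "aligned" here treats anti-opponent content as aligned with one's own party, an assumption about the paper's coding.

```python
# Stances counted as aligned with each party (pro-self or anti-opponent).
ALIGNED = {"Democratic": {"pro-Democratic", "anti-Republican"},
           "Republican": {"pro-Republican", "anti-Democratic"}}

def share(records, leaning, stances):
    """Fraction of videos shown to accounts of `leaning` whose stance is in `stances`."""
    shown = [r for r in records if r["account"] == leaning]
    hits = [r for r in shown if r["stance"] in stances]
    return len(hits) / len(shown) if shown else 0.0

# Toy data: which stance each conditioned account was shown.
records = [
    {"account": "Republican", "stance": "pro-Republican"},
    {"account": "Republican", "stance": "anti-Democratic"},
    {"account": "Republican", "stance": "pro-Democratic"},
    {"account": "Democratic", "stance": "pro-Democratic"},
    {"account": "Democratic", "stance": "pro-Republican"},
    {"account": "Democratic", "stance": "anti-Democratic"},
]
rep_aligned = share(records, "Republican", ALIGNED["Republican"])   # own-party share
dem_mismatch = share(records, "Democratic", ALIGNED["Republican"])  # opposite-party share
```

Comparing these shares across the Democratic- and Republican-seeded conditions yields the asymmetries the study reports.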

“We found that TikTok’s recommendation algorithm was not neutral during the 2024 U.S. presidential elections,” explained Talal Rahwan, an associate professor of computer science at New York University Abu Dhabi. “Across all three states analyzed in our study, the platform consistently promoted more Republican-leaning content. We showed that this bias cannot be explained by factors such as video popularity and engagement metrics—key variables that typically influence recommendation algorithms.”

Further analysis showed that the bias was primarily driven by negative partisanship content, meaning content that criticizes the opposing party rather than promoting one’s own party. Both Democratic- and Republican-conditioned accounts were recommended more negative partisan content, but this was more pronounced for Republican accounts. Negative-partisanship videos were 1.78 times more likely to be recommended as an ideological mismatch relative to positive-partisanship ones.

“We observed a bias toward negative partisanship in TikTok’s recommendations,” Zaki noted. “Regardless of the political party—Democratic or Republican—the algorithm prioritized content that criticized the opposing party over content that promoted one’s own party.”

The researchers also examined the top Democratic and Republican channels on TikTok by follower count. Republican channels had a significantly higher mismatch proportion, meaning their videos were more likely to be recommended to accounts with an opposite political leaning. Notably, videos from Donald Trump’s official TikTok channel were recommended to Democratic-conditioned accounts nearly 27% of the time, while Kamala Harris’s videos were recommended to Republican-conditioned accounts only 15.3% of the time.

Finally, the researchers analyzed the topics covered in partisan videos. Topics stereotypically associated with the Democratic party, like climate change and abortion, were more frequently covered by Democratic-aligned videos. Topics like immigration, foreign policy, and the Ukraine war were more frequently covered by Republican-aligned videos. Videos on immigration, crime, the Gaza conflict, and foreign policy were most likely to be recommended as ideological mismatches to Democratic-conditioned accounts.

To build on this work, future research could explore how TikTok’s algorithm behaves across different election cycles, investigate how misinformation is distributed within partisan content, and compare TikTok’s political content recommendations with those of other major platforms. Additionally, studies incorporating real user data alongside automated experiments could provide a more comprehensive understanding of how individuals experience political content on TikTok. Given the platform’s growing role in shaping public discourse, continued scrutiny of its recommendation system will be essential for assessing its impact on political knowledge and voter decision-making.

“We want to address fundamental questions about the neutrality of social media platforms,” Rahwan said.

The study, “TikTok’s recommendations skewed towards Republican content during the 2024 U.S. presidential race,” was authored by Hazem Ibrahim, HyunSeok Daniel Jang, Nouar Aldahoul, Aaron R. Kaufman, Talal Rahwan, and Yasir Zaki.

