PsyPost
The latest psychology and neuroscience discoveries.

Women who hate men: Study finds similarities in gendered hate speech on Reddit

by Karina Petrova
March 29, 2026
in Sexism
[Adobe Stock]


A new study reveals that online communities dedicated to hating men share strikingly similar behaviors and language patterns with communities dedicated to hating women. The research suggests that gender-driven hate speech is a broad phenomenon characteristic of toxic digital groups, regardless of the victim’s gender. These findings were published in the journal Scientific Reports.

Social media networks allow people around the world to share ideas and perspectives at an unprecedented scale. While these platforms can foster community building, they also create environments where discrimination and extreme ideologies can spread. One unexpected impact is the creation of echo chambers. An echo chamber is a closed environment where users only encounter information or opinions that mirror and reinforce their own.

Anonymity on the internet often accelerates the formation of these isolated spaces. Within these chambers, hate speech acts as a mechanism of communication that expresses an ideology using offensive stereotypes. This speech targets individuals based on traits like ethnicity, religion, or gender. Gendered hate speech specifically involves harassing or degrading people based entirely on whether they are men or women.

Historically, researchers and content moderators have focused heavily on misogyny, which is the hatred of or prejudice against women. A routine search of academic databases reveals hundreds of thousands of papers examining online misogyny over the past two decades. In contrast, academic attention toward misandry, defined as the hatred of or prejudice against men, remains notably scarce. Studies examining misandry only began to appear around 2014, leaving huge gaps in the scientific understanding of digital harassment.

Erica Coppolillo, a researcher at the University of Calabria and the National Research Council of Italy, initiated a project to address this literature gap. Coppolillo sought to determine if there are systematic differences between communities that target men and communities that target women. The goal was to see if the gender of the perpetrators changes the nature of the hostility. If the behavior remains identical, it suggests that the core issue is the toxicity of extremist online environments rather than the specific gender dynamics.

To investigate these questions, the study focused on Reddit. This platform is organized into thousands of individual communities, known as subreddits, dedicated to specific topics. Users interact by sharing posts and commenting on threads, creating dense networks of conversation. The researcher selected four subreddits known for extreme views on gender as the basis for the text analysis.

Two of these groups were chosen as examples of misandric communities. The first was a mainstream feminist subreddit discussing women’s issues, and the second was a radical feminist subreddit. The latter was banned by the platform in 2020 for violating hate speech policies. For the misogynistic side, the researcher selected a men’s rights subreddit and a group for involuntary celibates. The involuntary celibate community was also eventually banned for promoting hate and violence.

The primary data included text posts and comments generated between 2016 and 2022. To ensure the analysis focused strictly on gender targeting, a tight filtering process was applied. In the misandric groups, only texts mentioning terms like man, men, or husband were retained. In the misogynistic groups, the texts had to include terms like woman, women, or wife.
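A filter of this kind can be sketched in a few lines of Python. The exact term lists and matching rules used in the study are not given in the article, so the sets below are illustrative assumptions:

```python
import re

# Illustrative term lists; the study's full keyword sets are not
# specified in the article, so these are assumptions.
MISANDRIC_TERMS = {"man", "men", "husband"}
MISOGYNISTIC_TERMS = {"woman", "women", "wife"}

def mentions_target(text, terms):
    """Keep a post only if it contains at least one whole-word target term."""
    tokens = re.findall(r"[a-z']+", text.lower())
    return any(tok in terms for tok in tokens)

posts = ["My husband never listens.", "Lovely weather today."]
kept = [p for p in posts if mentions_target(p, MISANDRIC_TERMS)]
```

Whole-word matching (rather than substring search) avoids false hits such as "human" matching "man".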


The analysis began with a linguistic comparison to identify the vocabulary shaping these conversations. A computational tool designed to process human language cleaned the text by removing punctuation and numbers. The researcher then examined the twenty most frequent words in each community. The results showed that most common terms occurred with similar frequency across all four groups.
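A minimal version of this frequency count, assuming simple lowercasing and punctuation-and-number stripping (the article does not name the specific tool used), could look like:

```python
import re
from collections import Counter

def top_words(texts, k=20, stopwords=frozenset()):
    """Strip punctuation and numbers, then return the k most frequent words."""
    counts = Counter()
    for text in texts:
        tokens = re.findall(r"[a-z]+", text.lower())  # drops digits/punctuation
        counts.update(t for t in tokens if t not in stopwords)
    return counts.most_common(k)
```

In practice a stopword list would also remove filler words like "the" and "and" before comparing vocabularies across communities.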

There were no sharp linguistic boundaries separating the groups targeting men and those targeting women. Next, the study measured the toxicity of the content to see how aggressive these conversations were. Toxicity refers to how rude, disrespectful, or hateful a given comment appears to the reader. The researcher used an advanced artificial intelligence framework known as a transformer to evaluate the text.

A transformer is a deep learning model that understands the context of a word based on the surrounding sentence structure. This specific model had been trained on tens of thousands of manually annotated internet posts to learn the nuances of hate speech. It assigned a toxicity score to each post and comment, placing it on a continuous scale from completely harmless to intensely toxic.

The results of the toxicity analysis showed that the majority of content across all four communities was rated as non-toxic. Almost all the communities showed a bimodal distribution, with a large peak of harmless text and a smaller peak of highly toxic text. The two misogynistic communities showed a slightly higher peak in extreme toxicity compared to the misandric groups. Even so, the overall distribution patterns of toxicity were remarkably similar.
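The bimodal shape described above can be illustrated by binning scores into a histogram. The scores below are invented for demonstration and are not data from the study:

```python
def histogram(scores, bins=10):
    """Bin toxicity scores from [0, 1] into equal-width buckets."""
    counts = [0] * bins
    for s in scores:
        counts[min(int(s * bins), bins - 1)] += 1  # clamp 1.0 into last bin
    return counts

# Invented scores mimicking the reported shape: a large harmless peak
# and a smaller highly toxic one.
scores = [0.05] * 80 + [0.45] * 5 + [0.92] * 15
hist = histogram(scores)
```

The resulting counts cluster at the two ends of the scale, which is the pattern the study reports for all four communities.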

The third phase of the study evaluated the specific emotions expressed within the texts. The researcher used two different machine learning algorithms capable of detecting emotions like sadness, joy, fear, and anger. For this analysis, the focus was narrowed exclusively to negative emotions. The algorithms evaluated each piece of text to see if sadness, anger, fear, or hate was the dominant sentiment.

When examining the emotions at a broad content level, all four communities expressed hate most frequently. Anger was the second most common emotion across the board. The men’s rights group and the mainstream feminist group displayed nearly identical emotional patterns. The involuntary celibate group leaned slightly more toward sadness, while the radical feminist group leaned slightly toward fear.

Once again, the findings did not reveal sweeping differences between the two sides. The researcher also evaluated the same emotions at the level of individual users. Instead of treating each post in isolation, the algorithms calculated the dominant emotion expressed by each user across all of their lifetime contributions. When viewed this way, the pattern shifted dramatically.
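The switch from post-level to user-level aggregation can be sketched as below; the emotion labels are assumed to come from an upstream classifier, and the sample data is invented:

```python
from collections import Counter, defaultdict

def dominant_emotion_per_user(labeled_posts):
    """labeled_posts: (user, emotion) pairs, one per post; emotion labels
    are assumed to come from an upstream classifier. Returns each user's
    most frequent emotion across all their contributions."""
    by_user = defaultdict(Counter)
    for user, emotion in labeled_posts:
        by_user[user][emotion] += 1
    return {user: c.most_common(1)[0][0] for user, c in by_user.items()}

# Hypothetical classifier output for two users.
labels = [("alice", "hate"), ("alice", "hate"), ("alice", "sadness"),
          ("bob", "anger")]
dominant = dominant_emotion_per_user(labels)
```

Aggregating per user rather than per post weights each account equally, which is why the ranking of communities can change between the two views.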

The mainstream feminist community displayed the highest levels of user-level hate, followed by the radical feminist group and the men’s rights group. This shifted perspective suggests that misandric communities might harbor more concentrated negative sentiment among actively posting users than misogynistic ones do. Finally, the study mapped the conversational networks within each subreddit. The researcher built visual graphs in which every user was a point, and an interaction between two users was a connecting line.

This allowed the researcher to measure the structural properties of each community network. One measured property was modularity, which captures how strongly a network divides into smaller, isolated sub-communities. Another was the network diameter, which represents the longest chain of communication between two users.
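Both properties can be computed directly on a small adjacency structure. The toy graph below (two triangles of users joined by a single edge) is an invented example for illustration, not data from the study:

```python
from collections import deque

def diameter(adj):
    """Longest shortest path between any two users, via BFS from each node."""
    longest = 0
    for src in adj:
        dist = {src: 0}
        queue = deque([src])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        longest = max(longest, max(dist.values()))
    return longest

def modularity(adj, community):
    """Newman modularity Q for an undirected graph and a node -> community map."""
    two_m = sum(len(nbrs) for nbrs in adj.values())  # twice the edge count
    q = 0.0
    for i in adj:
        for j in adj:
            if community[i] == community[j]:
                a_ij = 1.0 if j in adj[i] else 0.0
                q += a_ij - len(adj[i]) * len(adj[j]) / two_m
    return q / two_m

# Two triangles bridged by one edge: a tightly clustered, high-modularity graph.
adj = {0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3},
       3: {2, 4, 5}, 4: {3, 5}, 5: {3, 4}}
community = {0: "a", 1: "a", 2: "a", 3: "b", 4: "b", 5: "b"}
```

For this toy network the diameter is 3 and the modularity of the two-triangle split is 5/14, a positive value indicating denser ties within the sub-communities than between them.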

The network structures did not align with the gender focus of the subreddits. The mainstream feminist group shared more structural features, like high modularity and wide diameter, with the men’s rights group. In contrast, the involuntary celibate community’s conversational network more closely resembled the radical feminist network. The structural analysis confirmed that the intended direction of the hate speech does not dictate how an online community organizes itself.

These findings suggest that content moderation strategies should address all hate speech neutrally. Recognizing misandric hostility as a serious issue could lead to safer digital spaces for everyone. Treating misogyny and misandry with equal seriousness pushes platforms toward universal interventions to curb toxic behavior.

However, the study relies on data scraped from an open internet platform, which inevitably contains noise and formatting errors. Real-world social data is rarely perfectly clean, which can affect automated evaluation. The study also relies heavily on artificial intelligence algorithms to evaluate toxicity and emotions. While these computerized models are highly accurate, they are not flawless.

These models occasionally misclassify internet slang or sarcasm, which could introduce a small degree of uncertainty into the results. The findings are also specific to the analyzed Reddit communities. Content dynamics on different platforms, such as Facebook or a video sharing site, might yield completely different results.

Future research could investigate whether artificial bot accounts contribute to the spread of negativity in these specific forums. Researchers could also look for heavily radicalized sub-factions hidden within the broader internet communities.

The study, “Women who hate men: a comparative analysis across extremist Reddit communities,” was authored by Erica Coppolillo.
