A new study published in The Journal of Neuroscience suggests that people who are more attuned to their internal bodily sensations are also more likely to make moral decisions that align with the values of the broader group. The researchers found that individuals with greater interoceptive awareness (the ability to accurately perceive one's own bodily signals) tended to choose responses in moral dilemmas that matched the majority's preferences. Brain imaging suggested that this connection may be supported by resting-state activity in specific brain regions involved in self-reflection and internal signal monitoring.
The researchers were interested in understanding why people so often make moral decisions that match the expectations of those around them, even when no explicit social pressure is present. Past theories have suggested that aligning with social norms helps conserve energy by minimizing social conflict, which in turn supports survival. The research team hypothesized that people who are better at sensing internal signals might use that information to more efficiently model others’ expectations and align their decisions accordingly.
“When people behave in ways that conflict with others’ expectations in social situations, it can easily lead to interpersonal conflict, and resolving this conflict may increase the use of physical resources,” said study author Hackjin Kim, a professor at Korea University and director of the Laboratory of Social and Decision Neuroscience.
“Recent theories (Constant et al., 2019; Theriault et al., 2021) suggest that our brains are designed to minimize physical resource consumption while maintaining survival. One way to do this is to learn others’ expectations to avoid social conflict. This strategy may ultimately be an important social adaptation skill that enhances survival. Based on this hypothesis, we predicted that if moral intuition is fundamentally based on the brain’s principle of minimizing bodily resource expenditure by learning others’ expectations, then individuals with better body-brain communication would more effectively adjust their moral intuitions by learning others’ expectations. Our findings provide the first evidence supporting this hypothesis.”
To test this idea, the researchers conducted two studies with Korean university students. In the first study, 74 participants completed an online task involving 48 moral dilemmas. These scenarios were designed to present ethically difficult decisions with no clear right or wrong answer—such as whether to sacrifice one person to save several others. Participants also completed a questionnaire measuring their interoceptive awareness, specifically how aware they were of bodily sensations and how they interpreted those sensations in emotional contexts. Additionally, all participants underwent resting-state brain scans.
In the second study, a separate group of 30 participants completed the same moral dilemma task and later performed a heartbeat counting exercise in a lab setting. This task measured interoceptive accuracy by asking participants to count their heartbeats without taking their pulse or relying on other physical cues, while their actual heartbeats were recorded using sensors.
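The heartbeat counting task is usually scored with a simple proportional-error formula. The study's exact scoring procedure is not spelled out here, but a minimal sketch of the commonly used calculation, with invented numbers, looks like this:

```python
# Hypothetical sketch of a Schandry-style heartbeat counting score. The study's
# exact scoring is not described here; this assumes the commonly used formula
# 1 - |recorded - counted| / recorded, averaged over counting intervals.

def heartbeat_counting_accuracy(recorded, counted):
    """Return an interoceptive accuracy score (higher means more accurate)."""
    trial_scores = [
        1 - abs(r - c) / r          # proportional counting error for one interval
        for r, c in zip(recorded, counted)
    ]
    return sum(trial_scores) / len(trial_scores)

# Invented example: three counting intervals of different lengths
recorded_beats = [34, 51, 68]   # beats registered by the pulse sensor
counted_beats = [30, 49, 60]    # beats the participant reported
print(round(heartbeat_counting_accuracy(recorded_beats, counted_beats), 3))
```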
Across both studies, participants tended to make moral decisions that aligned with the group consensus, as determined by the majority response to each scenario. Importantly, this tendency was not simply a matter of choosing utilitarian over deontological answers or vice versa. Instead, it reflected how closely a participant’s individual choices matched the specific pattern of group responses across different dilemmas.
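One simple way to quantify such alignment, purely as an illustration, is the proportion of dilemmas on which a participant picks the same option as the group majority. The study's actual similarity measure may differ, so the following sketch with simulated responses only conveys the general idea:

```python
import numpy as np

# Hypothetical illustration of an alignment score: the fraction of dilemmas on
# which a participant picks the same option as the group majority. The paper's
# actual similarity measure may differ; this only conveys the general idea.

def alignment_with_consensus(choices, group_choices):
    """choices: 0/1 responses for one participant, one per dilemma.
    group_choices: participants x dilemmas array of everyone's 0/1 responses."""
    majority = (group_choices.mean(axis=0) >= 0.5).astype(int)   # consensus choice per dilemma
    return (choices == majority).mean()                          # proportion of matches

rng = np.random.default_rng(0)
simulated = rng.integers(0, 2, size=(74, 48))    # 74 participants, 48 dilemmas, as in Study 1
print(alignment_with_consensus(simulated[0], simulated))
```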
The researchers found that individuals with higher interoceptive awareness, as measured by the self-report questionnaire in Study 1, made moral choices more similar to the group norm. Similarly, participants who performed better on the heartbeat counting task in Study 2 also showed greater alignment with group preferences. These findings were particularly strong in scenarios where there was less agreement among the group overall, suggesting that interoception plays a role in guiding decisions when norms are less obvious or intuitive.
“This study shows that people who are more aware of their internal bodily signals—like heartbeats or gut feelings—are more likely to make moral decisions that match what most people would consider acceptable or fair,” Kim told PsyPost. “This connection between the body and moral choices suggests that your physical sensations can influence how you decide what’s right or wrong.”
“Moral intuition isn’t random—it reflects what society expects. Even when people aren’t told what others think, their moral choices often align with group norms. That’s because over time, we build up internal ‘rules’ based on our social experiences. These rules help us predict what others expect from us, which in turn helps us avoid conflict and maintain social harmony.”
To understand the brain mechanisms that might support this effect, the researchers analyzed resting-state functional MRI data using a computational approach called a hidden Markov model. This allowed them to identify patterns of brain activity that fluctuate over time, even when participants are not engaged in any specific task.
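As a rough illustration of this kind of analysis, the sketch below fits a Gaussian hidden Markov model to simulated region-averaged BOLD time series using the open-source hmmlearn library and then summarizes how much time is spent in each state. The data, region count, and modeling choices are placeholders rather than the authors' actual pipeline:

```python
import numpy as np
from hmmlearn.hmm import GaussianHMM   # pip install hmmlearn

# Minimal sketch of fitting a hidden Markov model to resting-state time series.
# Assumptions: `bold` is a (time points x brain regions) matrix of region-averaged
# BOLD signal, and 11 hidden states are requested to mirror the number reported
# in the study. The authors' preprocessing, observation model, and group-level
# fitting are not reproduced here.

rng = np.random.default_rng(0)
bold = rng.standard_normal((300, 50))           # placeholder data: 300 time points, 50 regions

hmm = GaussianHMM(n_components=11, covariance_type="diag", n_iter=100, random_state=0)
hmm.fit(bold)                                   # estimate state means and transition probabilities
state_sequence = hmm.predict(bold)              # most likely state at each time point

# Fractional occupancy: proportion of time spent in each state, the kind of
# summary that can then be related to behavior across participants.
occupancy = np.bincount(state_sequence, minlength=11) / len(state_sequence)
print(occupancy.round(3))
```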
They identified eleven distinct brain states and focused on two in particular. One state, characterized by heightened activity in the medial prefrontal cortex (mPFC)—a region linked to social evaluation and internal reflection—was associated with higher interoceptive awareness. A second state, marked by decreased activity in the precuneus—a region involved in self-related thought and internal monitoring—was linked to greater deviation from group consensus in moral decision-making.
“The mPFC has long been known to play a key role in moral decision-making in dilemma scenarios,” Kim explained. “Past research has highlighted its role in enabling emotionally driven deontological decisions that favor empathy for individuals over utilitarian choices that benefit the majority.”
“However, our findings offer a different interpretation. Rather than supporting a specific moral judgment style—utilitarian or deontological—we found that the mPFC is more closely associated with internalized social norms acquired through one’s life experience. This suggests that mPFC-driven intuitions are shaped by communal norms, and depending on the context, they may support either utilitarian or deontological decisions. Our study provides a new perspective on the principles and processes underlying the formation of moral intuition, which could pave the way for future research in this area.”
Although interoceptive awareness was not directly correlated with moral alignment at a statistically robust level, a mediation analysis showed that the resting-state brain dynamics helped bridge the two. In particular, spending more time in the mPFC-associated state indirectly predicted closer alignment with group moral norms through reduced time in the precuneus-deactivated state. This suggests that the way the brain processes internal signals during rest might set the stage for how we unconsciously form moral intuitions that match the social environment.
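The logic of such a mediation analysis can be sketched with a standard product-of-coefficients approach: estimate the path from predictor to mediator, the path from mediator to outcome while controlling for the predictor, and multiply the two. The variables and data below are invented for illustration and do not reproduce the study's model or covariates:

```python
import numpy as np

# Hypothetical sketch of the mediation logic described above: does fractional
# occupancy of one resting state (predictor) relate to moral alignment (outcome)
# through occupancy of another state (mediator)?

def ols(X, y):
    """Ordinary least-squares coefficients for y ~ 1 + X."""
    X = np.column_stack([np.ones(len(y)), X])
    return np.linalg.lstsq(X, y, rcond=None)[0]

def indirect_effect(x, m, y):
    a = ols(x, m)[1]                            # path a: predictor -> mediator
    b = ols(np.column_stack([x, m]), y)[2]      # path b: mediator -> outcome, controlling for predictor
    return a * b

rng = np.random.default_rng(1)
n = 74                                          # sample size of Study 1
x = rng.standard_normal(n)                      # e.g., time spent in the mPFC-associated state
m = -0.5 * x + rng.standard_normal(n)           # e.g., time in the precuneus-deactivated state
y = -0.4 * m + rng.standard_normal(n)           # e.g., alignment with group consensus

point = indirect_effect(x, m, y)

# Percentile bootstrap for the indirect effect's confidence interval.
boot = [indirect_effect(x[i], m[i], y[i])
        for i in (rng.integers(0, n, n) for _ in range(2000))]
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"indirect effect = {point:.3f}, 95% CI [{lo:.3f}, {hi:.3f}]")
```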
The findings support the idea that our moral judgments are not formed in isolation. Instead, they appear to be influenced by the body’s internal signals and how the brain interprets them, especially in regions involved in tracking one’s own and others’ mental states. The researchers propose that internal bodily cues may help individuals form intuitive models of social expectations—essentially, a “sense of should” that guides behavior in ways that avoid conflict and conserve mental and physical resources.
While the study offers intriguing insights, it also has limitations. One important caveat is that the brain imaging data came from resting-state scans, meaning participants were not actively making decisions at the time. Although this approach is valuable for identifying stable traits, it cannot capture the specific brain activity involved in the moment of moral decision-making. Future studies using task-based brain scans could offer more direct evidence of the neural processes underlying this behavior.
Another limitation is that the sample consisted entirely of Korean university students, raising questions about the cultural generalizability of the findings. What counts as a morally acceptable decision in one cultural context might not align with group preferences elsewhere. Expanding this research to include participants from diverse backgrounds would help clarify which aspects of moral consensus are universal and which are culture-specific.
“Future research could explore how the relationship between interoception and moral behavior varies across cultural contexts, types of moral dilemmas, and individual differences such as self-esteem or emotion regulation,” Kim told PsyPost. “Experimental studies could investigate how interoception training affects the formation and revision of moral intuitions, as well as moral conformity.”
“Clinically, this work could lead to scientifically grounded and systematic interventions for individuals with social cognition impairments, such as those with autism or alexithymia. On the technical side, we hope to develop AI models that simulate interoception-based moral reasoning and wearable systems that track internal signals to support ethical decision-making.”
The authors noted that the tendency to align one’s moral judgments with group consensus, as observed in this study, is different from simply conforming to others’ opinions in the moment.
“We want to emphasize that the moral alignment tendency revealed in this study is conceptually distinct from moral conformity,” Kim explained. “Importantly, this study did not provide participants with others’ or group preferences, allowing them to freely express their own moral preferences. Another recent study (von Mohr et al., 2023) found that people with high interoceptive sensitivity were actually less susceptible to social conformity. When taken together, these findings suggest that moral intuitions, which are deeply internalized through lifelong social experiences, may resist being influenced by others’ preferences when they are in conflict.”
“In fact, our lab recently conducted an experiment designed to distinguish between moral alignment and moral conformity, in order to examine how interoception is associated with each. We hope to be able to share the results of this study as soon as the analysis is complete.”
The study, “Neural Processes Linking Interoception to Moral Preferences Aligned with Group Consensus,” was authored by JuYoung Kim and Hackjin Kim.