
How personality and culture relate to our perceptions of artificial intelligence

by Eric W. Dolan
February 23, 2026
in Artificial Intelligence, Personality Psychology
[Adobe Stock]

A recent study reveals that a person’s cultural background, personality traits, and technical skills shape how they view the impact of artificial intelligence on their overall well-being. The findings suggest that feeling competent with new technology and possessing a sense of personal control are linked to more positive experiences with artificial intelligence. The research was published in the Journal of Technology in Behavioral Science.

As artificial intelligence becomes a regular part of daily life, from personalized internet recommendations to healthcare planning, questions have emerged about how these tools affect mental health and happiness. Previous research has tended to focus on the positive contributions of these technologies in specific fields, such as education or medicine. Less is known about how an individual’s unique psychological traits influence their daily interactions with these systems.

Most existing studies also focus heavily on Western populations. This leaves a gap in understanding how people from different cultural backgrounds experience artificial intelligence.

“Public reactions to artificial intelligence are highly polarized. While some people see AI as exciting and beneficial, others express concern, discomfort, or even fear about its societal and personal consequences,” explained study authors Magnus Liebherr of the University of Duisburg-Essen and Raian Ali of Hamad Bin Khalifa University.

“Much of the public debate focuses on the capabilities of the technology itself, but psychological research suggests that people’s responses to new technologies are strongly shaped by individual differences. We therefore wanted to understand which personal factors are associated with how people perceive AI’s impact on their well-being.”

In particular, the researchers aimed to see how cultural differences, personality types, and a psychological concept called “locus of control” affect whether people view these tools as helpful or harmful. Locus of control refers to how much a person believes they have the power to influence events in their own life. Sociologists generally classify Arab cultures as collectivist, meaning they emphasize group harmony, social bonds, and community cohesion.

The United Kingdom represents a more individualistic culture, where personal autonomy and individual rights tend to take precedence. The researchers suspected these foundational cultural differences might shape how societies embrace new and potentially disruptive software.

“This geographical diversity was particularly important given the different cultural contexts in which AI is adopted and used,” Liebherr and Ali told PsyPost. “As we discussed in our work ‘Who lets AI take over? Cross-national variation in willingness to delegate socially important roles to artificial intelligence’ (Yankouskaya et al., 2026), cultural factors play a significant role in how people approach and integrate AI technologies into their lives.”


The researchers conducted an online survey involving 562 participants between the ages of 18 and 60. The sample was split evenly, featuring 281 individuals from the United Kingdom and 281 individuals from Arab countries. To qualify for the Arab sample, participants had to reside in a Gulf nation, ensuring a shared cultural and political background.

The participants completed several psychological questionnaires. First, they rated their own competency in using and managing artificial intelligence on a scale from one to six. Next, they answered questions designed to measure five major personality traits: openness, conscientiousness, extraversion, agreeableness, and neuroticism. Neuroticism is a trait associated with a tendency to experience anxiety and negative emotions.

Participants also completed an assessment to determine their locus of control, indicating whether they felt their life was guided by their own actions or by outside forces. Finally, they answered a modified well-being questionnaire. This specific survey asked them to rate how often they felt positive emotions, engagement, meaning, and accomplishment when thinking about artificial intelligence and its presence in society.

The data revealed significant cultural differences in how people view these technologies. Arab participants reported that artificial intelligence contributed much more positively to their well-being than British participants did. The British group actually scored higher on measures linking artificial intelligence to negative emotions and feelings of loneliness.

The scientists found that technical skills played a substantial role in shaping user attitudes. Across both cultural groups, individuals who reported higher competency with artificial intelligence perceived the technology as having a much more positive impact on their well-being. This provides evidence that understanding how these systems work, and knowing how to use them, helps reduce uncertainty and increases the perceived benefits of the technology.

“We expected personality traits to be the dominant predictors, but AI competency emerged as equally strong in predicting positive perceptions of AI,” Liebherr and Ali said. “This was encouraging because personality traits tend to be relatively stable, whereas competency can potentially be improved.”

Personality traits also strongly predicted user experiences, though the specific traits mattered differently depending on the culture. Individuals who scored high in neuroticism tended to view artificial intelligence as less beneficial and more concerning in both the British and Arab groups. This aligns with broader psychological concepts suggesting that people who are naturally prone to anxiety are more sensitive to the potential risks of new technologies.

“Another notable finding was the consistent role of anxiety-related traits: individuals who are generally more prone to worry tended to perceive AI more negatively across both cultural groups,” the researchers explained.

Other personality traits varied by region. Extraversion and conscientiousness predicted positive perceptions of artificial intelligence in the Arab sample. Agreeableness predicted positive perceptions in the British sample.

“This suggests that the influence of personality on attitudes toward AI may vary across cultural contexts, possibly due to other culture-related variables moderating this relationship,” Liebherr and Ali told PsyPost. “These cultural variations highlight the complexity of how personal and cultural factors interact in shaping AI perceptions.”

A consistent finding across both cultures was the importance of an internal locus of control. Participants who believed they were largely in control of their own life paths viewed artificial intelligence as a positive contributor to their well-being. The scientists suggest that feeling a strong sense of personal agency helps people feel more comfortable integrating new tools into their routines.

The statistical models used by the scientists explained a substantial share of the differences in user attitudes. The tested variables accounted for 31 percent of the variance in perceptions in the British sample and 47 percent in the Arab sample. The analysis also revealed that demographic factors like age and gender did not influence how individuals perceived the technology’s contribution to their well-being.

“A central message of the study is that people’s experiences with AI are shaped not only by how AI products are designed but also by their own characteristics and skills,” Liebherr and Ali said.

As with all research, there are a few limitations to keep in mind when interpreting these findings. The study is correlational, which means it cannot prove that specific traits directly cause positive or negative views of artificial intelligence. High technical competency might lead to more positive perceptions, but people with positive attitudes might also simply be more motivated to learn about the technology.

The researchers also point out that the survey measured people’s subjective perceptions of their well-being, rather than objective changes in their mental health. The survey did not specify which types of artificial intelligence the participants should think about when answering the questions. A person might have a very different reaction to a helpful medical tool than they would to an automated hiring system or a social media algorithm.

“Future research should examine how these psychological factors interact with specific AI applications and how perceptions change over time as people gain experience,” Liebherr and Ali said. “It is also important to investigate both potential benefits and risks of AI use. For example, conversational AI systems may provide support and information, but there is also a need to study whether heavy reliance on such systems could have unintended negative consequences for well-being.”

“As we explored in our work ‘Can ChatGPT be addictive? A call to examine the shift from support to dependence in AI conversational large language models’ (Yankouskaya et al., 2025), understanding the full spectrum of AI’s impact, from enhancement to potential problematic use, is crucial for developing responsible AI systems and usage guidelines. We are also interested in understanding how interventions aimed at improving AI competency and sense of control might positively influence well-being outcomes.”

“A practical implication of our findings is that improving people’s competency and skills in AI (not just their understanding) may help them feel more comfortable with these technologies,” the researchers added. “For developers and policymakers, this means providing transparent systems, clear explanations, and opportunities for users to build skills and maintain a sense of control. Explainable AI (XAI) is particularly important in this regard.”

“Our study found that internal locus of control (the belief that one can influence their own outcomes) was a significant predictor of positive AI perceptions. By helping users understand how AI makes decisions, XAI can enhance this sense of control, which in turn may lead to more positive perceptions of AI’s contribution to well-being. Supporting users in developing competency and providing them with tools to understand and control AI systems may be just as important as improving the technology itself.”

The study, “Artificial Intelligence vs. Users’ Well-Being and the Role of Personal Factors: A Study on Arab and British Samples,” was authored by Magnus Liebherr, Areej Babiker, Sameha Alshakhsi, Dena Al-Thani, Ala Yankouskaya, Christian Montag, and Raian Ali.


(c) PsyPost Media Inc
