PsyPost
New paper explores the blurred lines between AI and human communication

by Mane Kara-Yakoubian
July 14, 2024
in Artificial Intelligence

Artificial intelligence (AI) is becoming increasingly sophisticated. In a thought-provoking paper published in Phenomenology and the Cognitive Sciences, Thomas Fuchs argues that true understanding and empathy require the subjectivity inherent to living beings, and he cautions against the blurring of the line between real and simulated interactions.

A robot may be able to engage in conversation that mirrors one's emotions and respond with uncanny, human-like behavior, but it is just that: almost natural, not genuinely so.

The heart of human understanding hinges on the assumption of each other's subjectivity, meaning that we see the other as a sentient being with feelings, intentions, and consciousness. The concept of conviviality refers to a shared form of life that encompasses our mutual experience and existential realities; only through this shared framework can we achieve truly empathic connections, something AI fundamentally lacks.

Fuchs draws on the theory of embodied and enactive cognition to further support his stance. This theory suggests our thoughts and emotions are not just products of brain activity but are influenced by our bodily experiences. AI systems and robots, no matter how advanced, do not possess such biological embodiment. They can simulate interactions, but these simulations lack the intrinsic aliveness and subjective experience that define living beings.

One of the most profound concerns raised by Fuchs is the increasing difficulty of distinguishing between real and simulated interactions. As AI systems improve at mimicking human behavior, we run the risk of mistaking these interactions for genuine ones. This is particularly critical in sensitive areas like virtual psychotherapy. AI-driven chatbots can provide comforting responses, but they lack the depth of understanding and empathy that human therapists offer. This raises ethical issues, such as users being misled into believing they are understood by a conscious being.

Fuchs makes a distinction between empathic and semantic understanding. Empathic understanding involves grasping another’s emotional expressions through intercorporeal empathy, which is grounded in shared, embodied experiences. AI can mimic emotional expressions, but this is merely an illusion. On the other hand, semantic understanding refers to comprehending verbal utterances as expressions of intentions and feelings.

While AI might pass the Turing Test by imitating human conversation, it lacks genuine comprehension. Fuchs illustrates this point with John Searle's "Chinese Room" argument, which shows that a system can produce appropriate responses without truly understanding their content or context.

Fuchs argues that consciousness and subjectivity are inherently tied to vital embodiment. True understanding and intentionality arise from the biological processes of living beings, such as homeostasis, metabolism, and emotional experiences. AI systems, which lack these processes, cannot possess genuine subjectivity. The notion of “strong AI,” capable of replicating human intelligence and understanding, is, therefore, a misinterpretation of what consciousness entails.


Fuchs further stresses the importance of precise language when discussing AI capabilities. Terms like understanding, empathy, and intentionality should be used carefully to prevent misconceptions about AI’s abilities. This clarity is essential to maintain ethical boundaries and prevent misleading interactions with AI.

As AI becomes more lifelike, Fuchs warns of the ethical and psychological dangers of “digital animism,” where people attribute human-like characteristics to machines. This can lead to deceptive interactions, particularly for vulnerable groups such as children, the elderly, and those seeking mental health support. Fuchs advocates for transparency in AI interactions, ensuring users are aware they are dealing with artificial agents and not real human beings.

Overall, the author presents a powerful argument against attributing genuine understanding and subjectivity to AI systems. Indeed, while AI can simulate human interactions, it cannot replicate the embodied experiences and consciousness that underpin true empathy and understanding.

The paper, “Understanding Sophia? On human interaction with artificial agents,” was authored by Thomas Fuchs.

(c) PsyPost Media Inc