Psychology shows why using AI for Valentine’s Day could be disastrous

by Julian Givi, Colleen P. Kirk, and Danielle Hass
February 9, 2026
in Artificial Intelligence, Relationships and Sexual Health
[Adobe Stock]

As Valentine’s Day approaches, finding the perfect words to express your feelings for that special someone can seem like a daunting task – so much so that you may feel tempted to ask ChatGPT for an assist.

After all, within seconds it can dash off a well-written, romantic message. Even a short, personalized limerick or poem is no sweat.

But before you copy and paste that AI-generated love note, you might want to consider how it could make you feel about yourself.

We research the intersection of consumer behavior and technology, and we’ve been studying how people feel after using generative AI to write heartfelt messages. It turns out that there’s a psychological cost to using the technology as your personal ghostwriter.

The rise of the AI ghostwriter

Generative AI has transformed how many people communicate. From drafting work emails to composing social media posts, these tools have become everyday writing assistants. So it’s no wonder some people are turning to them for more personal matters, too.

Wedding vows, birthday wishes, thank you notes and even Valentine’s Day messages are increasingly being outsourced to algorithms.

The technology is certainly capable. Chatbots can craft emotionally resonant responses that sound genuinely heartfelt.

But there’s a catch: When you present these words as your own, something doesn’t sit right.


When convenience breeds guilt

We conducted five experiments with hundreds of participants, asking them to imagine using generative AI to write various emotional messages to loved ones. Across every scenario we tested – from appreciation emails to birthday cards to love letters – we found the same pattern: People felt guilty when they used generative AI to write these messages compared to when they wrote the messages themselves.

When you copy an AI-generated message and sign your name to it, you’re essentially taking credit for words you didn’t write.

This creates what we call a “source-credit discrepancy,” which is a gap between who actually created the message and who appears to have created it. You can see these discrepancies in other contexts, whether it’s celebrity social media posts written by public relations teams or political speeches composed by professional speechwriters.

When you use AI, even though you might tell yourself you’re just being efficient, you can probably recognize, deep down, that you’re misleading the recipient about the personal effort and thought that went into the message.

The transparency test

To better understand this guilt, we compared AI-generated messages with other scenarios. When people bought greeting cards with preprinted messages, they felt no guilt at all. That's because greeting cards carry no deception: everyone understands that you selected the card but didn't write it yourself.

We also tested another scenario: having a friend secretly write the message for you. This produced just as much guilt as using generative AI. Whether the ghostwriter is a human or an artificial intelligence tool doesn't matter; what matters is the dishonesty.

There were some boundaries, however. We found that guilt decreased when messages were never delivered and when recipients were mere acquaintances rather than close friends.

These findings confirm that the guilt stems from violating expectations of honesty in relationships where emotional authenticity matters most.

Somewhat relatedly, research has found that people react more negatively when they learn a company used AI instead of a human to write a message to them.

But the backlash was strongest when audiences expected personal effort – a boss expressing sympathy after a tragedy, or a note sent to all staff members celebrating a colleague’s recovery from a health scare. It was far weaker for purely factual or instructional notes, such as announcing routine personnel changes or providing basic business updates.

What this means for your Valentine’s Day

So, what should you do about that looming Valentine’s Day message? Our research suggests that the human hand behind a meaningful message can help both the writer and the recipient feel better.

This doesn't mean you can't use generative AI at all. Treat it as a brainstorming partner rather than a ghostwriter: let it help you overcome writer's block or suggest ideas, but make the final message truly yours. Edit, personalize and add details that only you would know. The key is co-creation, not complete delegation.

Generative AI is a powerful tool, but it’s also created a raft of ethical dilemmas, whether it’s in the classroom or in romantic relationships. As these technologies become more integrated into everyday life, people will need to decide where to draw the line between helpful assistance and emotional outsourcing.

This Valentine's Day, your heart and your conscience might thank you for keeping your message genuinely your own.

This article is republished from The Conversation under a Creative Commons license. Read the original article.

 
