Psychology shows why using AI for Valentine’s Day could be disastrous

by Julian Givi, Colleen P. Kirk, and Danielle Hass
February 9, 2026
in Artificial Intelligence, Relationships and Sexual Health
[Adobe Stock]


As Valentine’s Day approaches, finding the perfect words to express your feelings for that special someone can seem like a daunting task – so much so that you may feel tempted to ask ChatGPT for an assist.

After all, within seconds it can dash off a well-written, romantic message. Even a short, personalized limerick or poem is no sweat.

But before you copy and paste that AI-generated love note, you might want to consider how it could make you feel about yourself.

We research the intersection of consumer behavior and technology, and we’ve been studying how people feel after using generative AI to write heartfelt messages. It turns out that there’s a psychological cost to using the technology as your personal ghostwriter.

The rise of the AI ghostwriter

Generative AI has transformed how many people communicate. From drafting work emails to composing social media posts, these tools have become everyday writing assistants. So it’s no wonder some people are turning to them for more personal matters, too.

Wedding vows, birthday wishes, thank you notes and even Valentine’s Day messages are increasingly being outsourced to algorithms.

The technology is certainly capable. Chatbots can craft emotionally resonant responses that sound genuinely heartfelt.

But there’s a catch: When you present these words as your own, something doesn’t sit right.


When convenience breeds guilt

We conducted five experiments with hundreds of participants, asking them to imagine using generative AI to write various emotional messages to loved ones. Across every scenario we tested – from appreciation emails to birthday cards to love letters – we found the same pattern: People felt guiltier when they used generative AI to write these messages than when they wrote the messages themselves.

When you copy an AI-generated message and sign your name to it, you’re essentially taking credit for words you didn’t write.

This creates what we call a “source-credit discrepancy,” which is a gap between who actually created the message and who appears to have created it. You can see these discrepancies in other contexts, whether it’s celebrity social media posts written by public relations teams or political speeches composed by professional speechwriters.

When you use AI, even though you might tell yourself you’re just being efficient, you can probably recognize, deep down, that you’re misleading the recipient about the personal effort and thought that went into the message.

The transparency test

To better understand this guilt, we compared AI-generated messages to other scenarios. When people bought greeting cards with preprinted messages, they felt no guilt at all. That’s because greeting cards carry no deception: Everyone understands that you selected the card and didn’t write the words yourself.

We also tested another scenario: having a friend secretly write the message for you. This produced just as much guilt as using generative AI. Whether the ghostwriter is human or an artificial intelligence tool doesn’t matter. What matters most is the dishonesty.

There were some boundaries, however. We found that guilt decreased when messages were never delivered and when recipients were mere acquaintances rather than close friends.

These findings confirm that the guilt stems from violating expectations of honesty in relationships where emotional authenticity matters most.

Somewhat relatedly, research has found that people react more negatively when they learn a company used AI instead of a human to write a message to them.

But the backlash was strongest when audiences expected personal effort – a boss expressing sympathy after a tragedy, or a note sent to all staff members celebrating a colleague’s recovery from a health scare. It was far weaker for purely factual or instructional notes, such as announcing routine personnel changes or providing basic business updates.

What this means for your Valentine’s Day

So, what should you do about that looming Valentine’s Day message? Our research suggests that the human hand behind a meaningful message can help both the writer and the recipient feel better.

This doesn’t mean you have to give up generative AI entirely. Treat it as a brainstorming partner rather than a ghostwriter: Let it help you overcome writer’s block or suggest ideas, but make the final message truly yours. Edit, personalize and add details that only you would know. The key is co-creation, not complete delegation.

Generative AI is a powerful tool, but it’s also created a raft of ethical dilemmas, whether it’s in the classroom or in romantic relationships. As these technologies become more integrated into everyday life, people will need to decide where to draw the line between helpful assistance and emotional outsourcing.

This Valentine’s Day, your heart and your conscience might thank you for keeping your message genuinely your own.

 

This article is republished from The Conversation under a Creative Commons license. Read the original article.

 
