PsyPost
The latest psychology and neuroscience discoveries.

Readers struggle to understand AI’s role in news writing, study suggests

by Eric W. Dolan
June 29, 2025
in Artificial Intelligence
[Image credit: Adobe Stock]

A new study published in Communication Reports has found that readers interpret the involvement of artificial intelligence in news writing in varied and often inaccurate ways. When shown news articles with different byline descriptions—some noting that the article was written with or by AI—participants offered a wide range of explanations about what that meant. Most did not see AI as acting independently. Instead, they constructed stories in their minds to explain how AI and human writers might have worked together.

As AI technologies become more integrated into journalism, understanding how people interpret AI’s role becomes increasingly important. Generative artificial intelligence refers to tools that can produce human-like text, images, or audio based on prompts or data. In journalism, this often means AI is used to summarize information, generate headlines, or even write full articles based on structured data.

Since 2014, some newsrooms have used AI to automate financial and sports stories. But the release of more advanced tools, such as ChatGPT in late 2022, has expanded the possibilities and made AI much more visible in everyday news production. For example, in 2023, a large media company in the United Kingdom hired reporters whose work includes AI assistance and noted this in their bylines. However, readers are not always told exactly how AI contributed, which can create confusion or suspicion.

The researchers behind the new study wanted to know how people understand bylines that mention AI and whether their interpretations are influenced by their familiarity with media and attitudes toward artificial intelligence. They were especially interested in whether people could accurately infer what AI did during the creation of a news article just based on the wording in the byline. This is important because trust in journalism depends on transparency, and previous controversies—such as Sports Illustrated being accused of using AI-generated content without disclosure—have shown that unclear authorship can damage credibility.

To explore these questions, the research team designed an online study involving 269 adult participants. The sample closely reflected the U.S. population in terms of age, gender, and ethnicity. Participants were recruited through Prolific, an online platform often used for social science research, and were paid for their time. After giving consent, participants completed a short questionnaire measuring their media literacy and general attitudes toward artificial intelligence. Then, each person was randomly assigned to read a slightly edited Associated Press article on a health topic. The article was the same for everyone, except for one line at the top—the byline.

This byline varied in five different ways: some said the story was written by a “staff writer,” while others said it was written “by staff writer with AI tool,” “with AI assistance,” “with AI collaboration,” or simply “by AI.” After reading the article, participants were asked to explain what they thought the byline meant and how they interpreted the role of AI in writing the article.

The responses showed that readers tried to make sense of the byline even when it wasn’t entirely clear. This act of constructing meaning from limited information is known as “sensemaking”—a process where people use what they already know or believe to understand something new or ambiguous. In this case, people relied on their personal experiences, assumptions about journalism, and existing knowledge of AI.

Many participants assumed that AI helped in some way, even if they couldn’t say exactly how. Some thought the AI wrote most of the article, with a human editor stepping in to clean things up. Others believed that a human wrote the bulk of the article, but used AI for smaller tasks, such as checking facts or suggesting better wording.


One person imagined the journalist typed in a few keywords, and AI pulled together text from the internet to generate the article. Another described a collaborative effort where AI gathered background information, and the human writer then evaluated its accuracy. These mental models—often called “folk theories”—illustrate how readers try to fill in the gaps when information is missing or vague.

Interestingly, even when the byline said the article was written “by AI,” many participants still assumed a human had been involved in some way. This suggests that most people do not see AI as a fully independent writer. Instead, they believe human oversight is necessary, whether for guidance, supervision, or final editing.

Some participants expressed skepticism or even frustration with the byline. When the article said it was written by a “staff writer” but did not include a name, some assumed this was an attempt to hide the fact that AI had actually written it. Others said the writing quality was poor and attributed that to AI involvement, even when the article had been described as written by a human. In both cases, the absence of a named author led to negative judgments. This finding supports earlier research showing that readers expect transparency in authorship, and that when those expectations are not met, they may distrust the content.

To further understand what influenced these interpretations, the researchers grouped participants based on their media literacy and their general attitudes toward AI. Media literacy refers to how well people understand the media they consume, including how news is produced.

The researchers found that participants with higher media literacy were more likely to believe that AI had done most of the writing. Those with lower media literacy were more likely to assume that a human wrote the article, or that the work was a human-AI collaboration. Surprisingly, prior attitudes toward AI did not significantly affect how participants interpreted the byline.

This suggests that how much people know about the media may matter more than how they feel about artificial intelligence when trying to figure out who wrote a story. It also shows that simply including a phrase like “with AI assistance” is not enough to give readers a clear understanding of AI’s role. The study found that people often misinterpret or overthink these statements, and the lack of standard language around AI involvement only adds to the confusion.

The study has some limitations. Because the researchers did not include a named author in any of the byline conditions, it’s possible that participants reacted negatively because they missed seeing a real person’s name—something they expect from journalism. It’s also worth noting that the article used in the study was based on science reporting, which tends to be more objective and less interpretive. Reactions to AI involvement might be stronger for topics like politics or opinion writing. Future studies could explore how these findings apply to other types of journalism and examine how people respond when articles include a full disclosure or transparency statement about AI use.

Despite these limitations, the study raises important questions for news organizations. As AI becomes more common in the newsroom, it is not enough to say that a story was produced “with AI.” Readers want to know what exactly the AI did—did it write the first draft, summarize data, suggest edits, or merely spellcheck the final copy? Without this clarity, readers are left to guess, and those guesses often lean toward suspicion or confusion.

The researchers argue that greater transparency is needed, not only as a matter of ethics but as a way to maintain trust in journalism. According to guidelines from the Society of Professional Journalists, journalists are expected to explain their processes and decisions to the public. This expectation should extend to AI use. As with human sources, AI contributions need to be clearly cited and described.

The study, “Who Wrote It? News Readers’ Sensemaking of AI/Human Bylines,” was authored by Steve Bien-Aimé, Mu Wu, Alyssa Appelman, and Haiyan Jia.
