New cognitive science research gives insight into how infants understand on-screen animated events

by Eric W. Dolan
November 30, 2021

New research published in the cognitive science journal Open Mind provides insight into how 19-month-old infants interpret animations on a screen. The study suggests that by this age, infants understand that the animations they see are decoupled from their immediate environment and independent of the physical screen on which they appear.

“To my mind, the way in which young infants make sense of symbolic objects in their environment is a rich topic to investigate for at least two reasons,” said study author Barbu Revencu of the Central European University.

“First, while there is a lot of research on infants’ and children’s understanding of drawings, pictures, or scale models, not much work has been dedicated to the cognitive mechanism underlying these processes. What is the input to this mechanism? And what is the output? How do we tag the drawing of a pipe as a pipe while fully aware that it is not a pipe? Is there a common process underlying our interpretation of diagrams, animations, puppet shows, and (internet) memes? Why is it the case that we use visual symbols in communication to begin with, and what information can we efficiently convey through them?”

“Second, the topic has methodological consequences for the field I am working in,” Revencu explained. “In developmental psychology, we often bring infants to the lab and show them simple animated events. We then measure their reactions to these animations under the assumption that the reactions will tell us something about infants’ underlying cognitive processes. But simple animated events are so different from the real world!”

“I am not referring only to visual differences between an animation and a real-world scene, but to the fact that animations are typically communicative while real-world scenes typically are not. If infants view animations and screens as representational, we are also testing their communicative inferences when presenting them with these stimuli, which is interesting in itself.”

In a series of experiments, the researchers tested various hypotheses about how infants interpret on-screen animations. Their first experiment confirmed that 19-month-old infants could accurately and reliably follow the trajectory of a ball as it fell off a wooden seesaw and into a box.

The researchers then had infants watch as an animated ball appeared to fall from a cartoon seesaw on a television into one of two real boxes below the screen. When asked where the ball was, the infants “often preferred to point to the screen” rather than the boxes. “When they did provide a response, however, they chose boxes at random instead of basing their answers on the side in which the ball was seen falling,” the researchers said.

The results suggest that the infants did not expect the falling animated balls to end up in real boxes. However, it is also possible that the infants did not understand that the question “where is the ball” referred to the animated ball on the screen. The researchers ruled out this possibility in their third experiment — when the real off-screen boxes were replaced with animated boxes on the TV screen, the infants overwhelmingly pointed to the correct box.

In their fourth experiment, the researchers tested whether infants accept that an event displayed on one screen can move to a different screen.

The infants viewed two different TV screens, which were placed side-by-side. One screen depicted an animated bear leaving and entering a house, while the other screen depicted an animated rabbit doing the same. Although both houses were identical, the backgrounds for each animation were noticeably different.

The researchers first confirmed that the infants had learned which animal lived on which screen, then moved the two screens to new physical locations. During this process, the screens were covered and the two backgrounds were surreptitiously swapped. The screens were then uncovered and the infants were asked to identify where the animals lived. (The characters were not visible on the screen at this point.)

Infants tended to link the animated characters to their virtual environments rather than to the physical screens. In other words, they selected a screen based on its background image rather than its physical position.

The findings indicate "that by 19 months, infants have figured out that on-screen animated events are not happening in the here-and-now," Revencu told PsyPost.

However, “the pattern of findings only provides negative evidence for infants’ understanding of screens as representational, because it rules out alternative accounts,” Revencu explained. “Experiments 1-3 suggest that infants do not think that on-screen events can extend beyond the screen, while Experiment 4 suggests that they do not track animations by the physical device on which they are presented. While this pattern of findings is compatible with a representational understanding of animations, it would be ideal if we gathered direct evidence for the representational hypothesis as well.”

The study, "For 19-Month-Olds, What Happens On-Screen Stays On-Screen," was authored by Barbu Revencu and Gergely Csibra.
