Scientists use machine learning to control specific brain circuits

by Karina Petrova
February 14, 2026
in Artificial Intelligence

A team of researchers in Japan has developed an artificial intelligence tool called YORU that can identify specific animal behaviors in real time and immediately interact with the animals’ brain circuits. This open-source software, described in a study published in Science Advances, allows biologists to study social interactions with greater speed and precision than previously possible. By treating complex actions as distinct visual objects, the system enables computers to “watch” behaviors like courtship or food sharing and respond within milliseconds.

Biologists have struggled for years to automate the analysis of how animals interact. Social behaviors such as courtship or aggression involve dynamic movements where individuals often touch or obscure one another from the camera’s view. Previous software solutions typically relied on a method called pose estimation, which tracks specific body points, such as a leg joint or a wing tip, across many video frames to calculate movement.

These older methods often fail when animals get too close to one another. When two insects overlap, the computer frequently loses track of which leg belongs to which individual. This confusion makes it difficult to trigger experiments at the exact moment a behavior occurs. To solve this, a team including Hayato M. Yamanouchi and Ryosuke F. Takeuchi sought a different approach. They worked under the guidance of senior author Azusa Kamikouchi at Nagoya University.

The group aimed to build a system capable of “closed-loop” feedback. This term refers to an experimental setup where a computer watches an animal and instantly creates a stimulus in response. For example, a computer might turn on a light the moment a fly extends its wing. Achieving this requires software that processes video data faster than the animal moves.
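
To make the idea concrete, here is a minimal closed-loop sketch in Python (illustrative only, not the YORU codebase): a camera frame is read with OpenCV, passed to a detector, and a light pulse is commanded the moment the target behavior appears. The functions detect_wing_extension and pulse_led are hypothetical placeholders for whatever detector and stimulus hardware a lab actually uses.

```python
import cv2

def detect_wing_extension(frame) -> bool:
    """Hypothetical placeholder: return True when the behavior is visible in this frame."""
    return False  # swap in a real detector (e.g., a trained object-detection model)

def pulse_led(duration_ms: int) -> None:
    """Hypothetical placeholder: command the optogenetic light source (e.g., via a DAQ)."""
    pass

cap = cv2.VideoCapture(0)  # camera watching the arena
try:
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if detect_wing_extension(frame):  # the "watch" step
            pulse_led(50)                 # the "respond" step, ideally within milliseconds
finally:
    cap.release()
```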

The researchers built their system using a deep learning approach known as object detection. Unlike pose estimation, this method analyzes the entire shape of an animal in a single video frame. The team named the software YORU, an acronym for Your Optimal Recognition Utility.

YORU identifies a specific action as a distinct “behavior object.” The software recognizes the visual pattern of two ants sharing food or a male fly vibrating its wing. This approach allows the computer to classify social interactions even when the animals are touching. By viewing the behavior as a unified object rather than a collection of points, the system bypasses the confusion caused by overlapping limbs.
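
A small sketch of what a “behavior object” means in practice (illustrative, with made-up detections rather than YORU’s own output format): the detector’s class list includes whole behaviors, so a single bounding box labels an interaction even while the animals overlap.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    label: str     # e.g. "fly", "wing_extension", "trophallaxis"
    bbox: tuple    # (x_min, y_min, x_max, y_max) in pixels
    score: float   # detector confidence between 0 and 1

def behavior_present(detections, behavior, threshold=0.5):
    """Return the highest-confidence detection of the named behavior, or None."""
    hits = [d for d in detections if d.label == behavior and d.score >= threshold]
    return max(hits, key=lambda d: d.score, default=None)

# One frame's worth of hypothetical detector output: the courtship posture is
# reported as its own object, not reconstructed from individual body points.
frame_detections = [
    Detection("fly", (10, 12, 48, 60), 0.97),
    Detection("fly", (44, 15, 80, 66), 0.95),
    Detection("wing_extension", (10, 10, 82, 68), 0.88),
]
print(behavior_present(frame_detections, "wing_extension"))
```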

The team tested YORU on several different species to verify its versatility. They recorded videos of fruit flies courting, ants engaging in mouth-to-mouth food transfer—a behavior known as trophallaxis—and zebrafish orienting toward one another. The system achieved detection accuracy rates ranging from roughly 90 to 98 percent compared to human observation.

The software also proved effective at analyzing brain activity in mice. The researchers placed mice on a treadmill within a virtual reality setup. YORU accurately identified behaviors such as running, grooming, and whisker movements. The system matched these physical actions with simultaneous recordings of neural activity in the mouse cortex. This confirmed that the AI could reliably link visible movements to the invisible firing of neurons.

The most advanced test involved a technique called optogenetics. This method allows scientists to switch specific neurons on or off using light. The team genetically modified male fruit flies so that the neurons responsible for their courtship song would be silenced by green light. These neurons are known as pIP10 descending neurons.

YORU watched the flies in real time. When the system detected a male extending his wing to sing, it triggered a green light within milliseconds. The male fly immediately stopped his courtship song, and this interruption caused a statistically significant decrease in mating success.

Hayato M. Yamanouchi, co-first author from Nagoya University’s Graduate School of Science, highlighted the difference in their approach. He noted, “Instead of tracking body points over time, YORU recognizes entire behaviors from their appearance in a single video frame. It spotted behaviors in flies, ants, and zebrafish with 90-98% accuracy and ran 30% faster than competing tools.”

The researchers then took the experiment a step further by using a projector. They wanted to manipulate only one animal in a pair without affecting the other. They genetically modified female flies to have light-sensitive hearing neurons. Specifically, they targeted neurons in the Johnston’s organ, which is the fly’s equivalent of an ear.

When the male fly extended his wing, YORU calculated the female’s exact position. The system then projected a small circle of light onto her thorax. This light silenced her hearing neurons exactly when the male tried to sing. The female ignored the male’s advances because she could not hear him.
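
One plausible way to implement that targeting step, sketched below with OpenCV (the actual YORU projector pipeline is not shown here): the center of the detected fly’s bounding box is mapped from camera pixels to projector pixels with a pre-calibrated homography, and the projector frame stays dark except for a small disc at that position. The homography H is a placeholder; in a real rig it would be estimated once during calibration.

```python
import cv2
import numpy as np

# Camera-to-projector homography; identity here as a stand-in for a real calibration.
H = np.eye(3, dtype=np.float64)

def spot_frame(center_xy, proj_size=(800, 600), radius_px=15):
    """Return a projector image that is black except for a small disc at the target."""
    pt = np.array([[center_xy]], dtype=np.float64)        # shape (1, 1, 2)
    px, py = cv2.perspectiveTransform(pt, H)[0, 0]        # camera px -> projector px
    frame = np.zeros((proj_size[1], proj_size[0]), dtype=np.uint8)
    cv2.circle(frame, (int(round(px)), int(round(py))), radius_px, 255, thickness=-1)
    return frame

# e.g. the center of the female's bounding box reported by the detector:
image_for_projector = spot_frame((412.0, 233.0))
```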

This experiment confirmed the software’s ability to target individuals in a group. Azusa Kamikouchi explained the significance of this precision. “We can silence fly courtship neurons the instant YORU detects wing extension. In a separate experiment, we used targeted light that followed individual flies and blocked just one fly’s hearing neurons while others moved freely nearby.”

The speed of the system was a primary focus for the researchers. They benchmarked YORU against SLEAP, a popular pose-estimation tool. YORU exhibited a mean latency—the delay between seeing an action and reacting to it—of approximately 31 milliseconds. This was roughly 30 percent faster than the alternative method. Such speed is necessary for studying neural circuits, which operate on extremely fast timescales.
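
A simple way to reproduce that kind of measurement (a sketch assuming per-frame processing dominates the loop; camera exposure and hardware delays are ignored): time the span from frame arrival to the stimulus decision with time.perf_counter and summarize.

```python
import statistics
import time

def run_detector(frame):
    """Hypothetical stand-in for the real per-frame inference call."""
    return False

latencies_ms = []
for frame in range(1000):               # stand-in for 1000 captured frames
    t0 = time.perf_counter()
    triggered = run_detector(frame)      # detection step
    # ...stimulus command would be issued here when `triggered` is True...
    latencies_ms.append((time.perf_counter() - t0) * 1000.0)

print(f"mean latency: {statistics.mean(latencies_ms):.2f} ms "
      f"(p95: {sorted(latencies_ms)[int(0.95 * len(latencies_ms))]:.2f} ms)")
```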

The system is also designed to be user-friendly for biologists who may not be experts in computer programming. It includes a graphical user interface that allows researchers to label behaviors and train the AI without writing code. The team has made the software open-source, allowing laboratories worldwide to download and adapt it for their own specific animal models.

While the system offers speed and precision, it relies on the appearance of behavior in a single frame. This design means YORU cannot easily identify behaviors that depend on a sequence of events over time. For example, distinguishing between the beginning and end of a foraging run might require additional analysis. The software excels at spotting momentary behavioral states rather than extended sequences.

The current version also does not automatically track the identity of individual animals over long periods. If two animals look identical and swap places, the software might not distinguish between them without supplementary tools. Researchers may need to combine YORU with other tracking software for studies requiring long-term individual histories.

Hardware limitations present another challenge for the projector-based system. Fast-moving animals might exit the illuminated area before the light pulses if the projector has a slight delay. Future updates could incorporate predictive algorithms to anticipate where an animal will be millisecond by millisecond.

Despite these limitations, YORU represents a new way to interrogate the brain. By allowing computers to recognize social behaviors as they happen, scientists can now ask questions about how the brain navigates the complex social world. The ability to turn specific senses on and off during social exchanges opens new avenues for understanding the neural basis of communication.

The study, “YORU: Animal behavior detection with object-based approach for real-time closed-loop feedback,” was authored by Hayato M. Yamanouchi, Ryosuke F. Takeuchi, Naoya Chiba, Koichi Hashimoto, Takashi Shimizu, Fumitaka Osakada, Ryoya Tanaka, and Azusa Kamikouchi.
