
The secret to sustainable AI may have been in our brains all along

by Karina Petrova
October 31, 2025
in Artificial Intelligence
[Adobe Stock]

Researchers have developed a new method for training artificial intelligence that dramatically improves its speed and energy efficiency by mimicking the structured wiring of the human brain. The approach, detailed in the journal Neurocomputing, creates AI models that can match or even exceed the accuracy of conventional networks while using a small fraction of the computational resources.

The study was motivated by a growing challenge in the field of artificial intelligence: sustainability. Modern AI systems, such as the large language models that power generative AI, have become enormous. They are built with billions of connections, and training them can require vast amounts of electricity and cost tens of millions of dollars. As these models continue to expand, their financial and environmental costs are becoming a significant concern.

“Training many of today’s popular large AI models can consume over a million kilowatt-hours of electricity, which is equivalent to the annual use of more than a hundred US homes, and cost tens of millions of dollars,” said Roman Bauer, a senior lecturer at the University of Surrey and a supervisor on the project. “That simply isn’t sustainable at the rate AI continues to grow. Our work shows that intelligent systems can be built far more efficiently, cutting energy demands without sacrificing performance.”

To find a more efficient design, the research team looked to the human brain. While many artificial neural networks are “dense,” meaning every neuron in one layer is connected to every neuron in the next, the brain operates differently. Its connectivity is highly sparse and structured. For instance, in the visual system, neurons in the retina form localized and orderly connections to process information, creating what are known as topographical maps. This design is exceptionally efficient, avoiding the need for redundant wiring. The brain also refines its connections during development, pruning away unnecessary pathways to optimize its structure.

Inspired by these biological principles, the researchers developed a new framework called Topographical Sparse Mapping, or TSM. Instead of building a dense network, TSM configures the input layer of an artificial neural network with a sparse, structured pattern from the very beginning. Each input feature, such as a pixel in an image, is connected to only one neuron in the following layer in an organized, sequential manner. This method immediately reduces the number of connections, known as parameters, which the model must manage.
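The input scheme described above can be sketched in a few lines. This is an illustrative mock-up, not the authors' code: the function name, layer sizes, and the exact assignment rule are assumptions. The key idea it demonstrates is that each input feature gets exactly one ordered, locality-preserving connection, so a layer that would be dense is sparse from the start.

```python
def topographic_mask(n_inputs, n_hidden):
    """Build a binary connectivity mask where mask[i][j] = 1 means
    input feature i feeds hidden unit j. Each input connects to exactly
    one hidden unit, assigned sequentially so that neighbouring inputs
    map to neighbouring units (a simple topographic ordering)."""
    mask = [[0] * n_hidden for _ in range(n_inputs)]
    for i in range(n_inputs):
        j = i * n_hidden // n_inputs  # ordered, locality-preserving assignment
        mask[i][j] = 1
    return mask

# A dense 784-to-128 input layer would have 784 * 128 = 100,352 weights;
# this mask keeps only 784 of them, one per input pixel.
mask = topographic_mask(784, 128)
active = sum(sum(row) for row in mask)
```

Multiplying the weight matrix elementwise by such a mask (or storing only the masked weights) is one common way to realize this kind of structured sparsity in practice.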

The team then developed an enhanced version of the framework, named Enhanced Topographical Sparse Mapping, or ETSM. This version introduces a second brain-inspired process. After the network trains for a short period, it undergoes a dynamic pruning stage. During this phase, the model identifies and removes the least important connections throughout its layers, based on their magnitude. This process is analogous to the synaptic pruning that occurs in the brain as it learns and matures, resulting in an even leaner and more refined network.
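Magnitude-based pruning of the kind described here can be sketched as follows. This is a generic illustration of the technique, not the authors' exact procedure or schedule: after some training, the smallest-magnitude weights are treated as least important and zeroed out.

```python
def prune_by_magnitude(weights, sparsity):
    """Zero out the `sparsity` fraction of weights with the smallest
    absolute value, keeping the strongest connections intact."""
    flat = sorted(abs(w) for row in weights for w in row)
    k = int(len(flat) * sparsity)
    threshold = flat[k] if k < len(flat) else float("inf")
    return [[w if abs(w) >= threshold else 0.0 for w in row]
            for row in weights]

# Pruning half the connections of a toy 2x3 weight matrix:
pruned = prune_by_magnitude([[0.9, -0.01, 0.5],
                             [0.02, -0.8, 0.03]], sparsity=0.5)
# The three weakest weights (-0.01, 0.02, 0.03) are removed;
# 0.9, 0.5 and -0.8 survive.
```

In a full training loop this step would typically run once or periodically after a warm-up phase, mirroring the "train briefly, then prune" sequence the article describes.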

To evaluate their approach, the scientists built and trained a type of network known as a multilayer perceptron. They tested its ability to perform image classification tasks using several standard benchmark datasets, including MNIST, Fashion-MNIST, CIFAR-10, and CIFAR-100. This setup allowed for a direct comparison of the TSM and ETSM models against both conventional dense networks and other leading techniques designed to create sparse, efficient AI.

The results showed a remarkable balance of efficiency and performance. The ETSM model was able to achieve extreme levels of sparsity, in some cases removing up to 99 percent of the connections found in a standard network. Despite this massive reduction in complexity, the sparse models performed just as well as, and sometimes better than, their dense counterparts. For the more difficult CIFAR-100 dataset, the ETSM model achieved a 14 percent improvement in accuracy over the next best sparse method while using far fewer connections.
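To make the scale of 99 percent sparsity concrete, here is the arithmetic for a small hypothetical MLP (the layer sizes are illustrative, not taken from the paper):

```python
# Weights in a dense 784-1024-10 multilayer perceptron (biases ignored):
dense = 784 * 1024 + 1024 * 10   # 813,056 connections

# At 99% sparsity, only 1% of those connections remain:
sparse = int(dense * 0.01)       # roughly 8,130 connections
```

Every removed connection is a multiply-accumulate that never has to be computed or stored, which is where the reported savings in training time, memory, and energy come from.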


“The brain achieves remarkable efficiency through its structure, with each neuron forming connections that are spatially well-organised,” said Mohsen Kamelian Rad, a PhD student at the University of Surrey and the study’s lead author. “When we mirror this topographical design, we can train AI systems that learn faster, use less energy and perform just as accurately. It’s a new way of thinking about neural networks, built on the same biological principles that make natural intelligence so effective.”

The efficiency gains were substantial. Because the network starts with a sparse structure and does not require complex phases of adding back connections, it trains much more quickly. The researchers’ analysis of computational costs revealed that their method consumed less than one percent of the energy and used significantly less memory than a conventional dense model. This combination of speed, low energy use, and high accuracy sets it apart from many existing methods that often trade performance for efficiency.

A key part of the investigation was to confirm the importance of the orderly, topographical wiring. The team compared their models to networks that had a similar number of sparse connections but were arranged randomly. The results demonstrated that the brain-inspired topographical structure consistently produced more stable training and higher accuracy, indicating that the specific pattern of connectivity is a vital component of its success.

The researchers acknowledge that their current framework applies the topographical mapping only to the model’s input layer. A potential direction for future work is to extend this structured design to deeper layers within the network, which could lead to even greater gains in efficiency. The team is also exploring how the approach could be applied to other AI architectures, such as the large models used for natural language processing, where the efficiency improvements could have a profound impact.

The study, “Topographical sparse mapping: A neuro-inspired sparse training framework for deep learning models,” was authored by Mohsen Kamelian Rad, Ferrante Neri, Sotiris Moschoyiannis, and Roman Bauer.

(c) PsyPost Media Inc