Researchers have developed a new type of artificial neuron that physically emulates the electrochemical processes of biological brain cells. This innovation, which relies on the movement of atoms rather than electrons, could lead to computer chips that are vastly smaller and more energy-efficient. The study was published in the journal Nature Electronics.
The primary motivation for this research stems from the immense energy demands of modern artificial intelligence. Large AI models require vast computational resources, consuming electricity on a scale comparable to that of entire communities. In contrast, the human brain performs complex tasks like learning and recognition with remarkable efficiency, operating on only about 20 watts of power.
Neuromorphic computing is a field dedicated to designing systems that replicate the brain’s principles to achieve this level of efficiency. While many existing brain-inspired chips simulate neural activity using conventional digital electronics, this new work sought to create a device that physically embodies the analog dynamics of a real neuron.
To understand the innovation, it helps to first consider how a biological neuron functions. Neurons communicate through a combination of electrical and chemical signals. An electrical pulse travels along the neuron until it reaches a junction called a synapse, where it is converted into a chemical signal that crosses the gap to the next neuron. There, the chemical messengers open channels that allow charged particles called ions, such as sodium and potassium, to flow across the cell membrane, which can trigger a new electrical pulse. This movement of ions is fundamental to how the brain processes information and learns.
The researchers constructed an artificial neuron that mirrors this ion-based mechanism. The device is built from just three components: a specialized element called a diffusive memristor, one transistor, and one resistor. This compact design allows an entire artificial neuron to occupy the footprint of a single transistor, a substantial reduction from the tens or even hundreds of transistors needed for conventional artificial neuron circuits. Instead of moving electrons, the standard for nearly all modern electronics, the system operates by controlling the movement of silver ions within a thin oxide material.
When a voltage is applied to the device, the silver ions migrate and form a conductive channel, generating an output spike of electricity, an action analogous to a biological neuron firing. The physics governing the motion and diffusion of these silver ions closely resembles the dynamics of ions moving across a brain cell’s membrane. By relying on the physical movement of atoms, the device directly emulates the ion-driven signaling of biological “wetware” rather than merely simulating it in software.
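The grow-then-dissolve behavior described above can be caricatured in a few lines of code. This is a toy model, not the device equations from the paper: the state variable, growth and decay rates, and threshold are all hypothetical stand-ins for filament extent, field-driven ion migration, and spontaneous diffusion.

```python
def simulate_diffusive_memristor(voltage_trace, growth_rate=0.3,
                                 decay_rate=0.1, threshold=1.0):
    """Toy sketch: applied voltage grows a silver-filament state variable;
    when the filament completes (state >= threshold) the device conducts
    and emits a spike, after which the filament ruptures. With no voltage,
    diffusion relaxes the state back toward rest. Parameters are
    illustrative, not measured device values."""
    state = 0.0      # filament extent: 0 = ruptured, >= threshold = connected
    spikes = []
    for t, v in enumerate(voltage_trace):
        state += growth_rate * v       # field-driven ion migration
        state -= decay_rate * state    # spontaneous diffusion toward rest
        state = max(state, 0.0)
        if state >= threshold:
            spikes.append(t)           # conductive channel formed: output spike
            state = 0.0                # filament dissolves after firing
    return spikes
```

A sustained voltage drives the state past threshold and produces spikes, while a zero-voltage trace produces none, mirroring how the device fires only when driven.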
This approach differs from traditional computing, which relies on fast but volatile electrons. Computing with electrons is well-suited for software-based learning, where algorithms are run on general-purpose hardware. The brain, however, learns by physically reconfiguring its connections through ion movement, an inherently energy-efficient method. This is why a child can learn to recognize a new object after seeing only a few examples, while a computer often needs to be trained on thousands of images. The ion-based device brings artificial systems a step closer to this efficient, hardware-based learning style.
To verify its capabilities, the team demonstrated that their artificial neuron could successfully reproduce six key characteristics observed in biological neurons. These included leaky integration, where the neuron sums incoming signals over time but with a gradual decay. It also exhibited threshold firing, meaning it produces an output spike only after its input signals accumulate past a certain point. The researchers also confirmed cascaded propagation by showing that the output from one of their artificial neurons could successfully trigger a second one in a series.
The device also displayed more complex behaviors. It showed intrinsic plasticity, a process where a neuron’s recent firing history influences its future responsiveness, making it easier to fire again after recent activity. It also had a refractory period, a brief pause after firing during which it is resistant to firing again, which helps regulate neural activity. Finally, the neuron exhibited stochasticity, an element of randomness in its firing pattern that is also found in the brain and can be beneficial for certain computational tasks and for preventing systems from getting stuck in repetitive loops.
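Several of these behaviors, leaky integration, threshold firing, the refractory period, and stochasticity, can be sketched together in a standard leaky integrate-and-fire model. This is a generic software analogue for illustration, with hypothetical parameters; it is not the memristor's physics or the model used in the study.

```python
import random

class LeakyNeuron:
    """Illustrative leaky integrate-and-fire neuron with a refractory
    period and optional firing noise. All parameters are hypothetical."""

    def __init__(self, leak=0.9, threshold=1.0, refractory_steps=3,
                 noise=0.0, seed=None):
        self.leak = leak                          # per-step decay of the potential
        self.threshold = threshold
        self.refractory_steps = refractory_steps
        self.noise = noise                        # std. dev. of firing jitter
        self.potential = 0.0
        self.cooldown = 0
        self.rng = random.Random(seed)

    def step(self, current):
        if self.cooldown > 0:          # refractory period: input is ignored
            self.cooldown -= 1
            return False
        # leaky integration: accumulate input while decaying toward rest
        self.potential = self.leak * self.potential + current
        jitter = self.rng.gauss(0.0, self.noise) if self.noise else 0.0
        if self.potential + jitter >= self.threshold:   # threshold firing
            self.potential = 0.0
            self.cooldown = self.refractory_steps
            return True                # output spike
        return False
```

With a steady moderate input the neuron fires at regular intervals separated by the refractory gap, while a weak input decays away without ever crossing threshold; setting `noise` above zero makes the spike timing stochastic.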
To assess how these neurons would perform in a complex network, the researchers created a detailed computational model of their device. They used this model to simulate a type of brain-inspired network called a recurrent spiking neural network. This network was then tested on a standard benchmark task: classifying spoken digits from a dataset of audio recordings. The simulated network, built from the principles of their new neuron, achieved a classification accuracy of 91.35 percent, a result that shows its potential as a building block for powerful and efficient computing systems.
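To make the phrase "recurrent spiking neural network" concrete, here is a minimal structural sketch: leaky spiking units that receive external input plus their own spikes from the previous time step (the recurrence), with a readout that accumulates weighted spike counts. The weights here are random and untrained, so this shows only the wiring, not the paper's model, training method, or its 91.35 percent result.

```python
import random

def run_recurrent_snn(input_spikes, n_hidden=8, n_out=2,
                      leak=0.8, threshold=1.0, seed=0):
    """Toy recurrent spiking network with random, untrained weights.
    input_spikes: list of time steps, each a list of 0/1 input channels.
    Returns the index of the output unit with the most accumulated drive."""
    rng = random.Random(seed)
    n_in = len(input_spikes[0])
    w_in = [[rng.uniform(0.0, 0.8) for _ in range(n_in)] for _ in range(n_hidden)]
    w_rec = [[rng.uniform(-0.3, 0.3) for _ in range(n_hidden)] for _ in range(n_hidden)]
    w_out = [[rng.uniform(0.0, 1.0) for _ in range(n_hidden)] for _ in range(n_out)]
    v = [0.0] * n_hidden
    prev = [0] * n_hidden              # hidden spikes from the previous step
    out_counts = [0.0] * n_out
    for frame in input_spikes:         # one frame of input spikes per time step
        spikes = []
        for i in range(n_hidden):
            drive = sum(w * s for w, s in zip(w_in[i], frame))
            drive += sum(w * s for w, s in zip(w_rec[i], prev))  # recurrence
            v[i] = leak * v[i] + drive         # leaky integration
            if v[i] >= threshold:              # threshold firing
                spikes.append(1)
                v[i] = 0.0
            else:
                spikes.append(0)
        prev = spikes
        for k in range(n_out):                 # readout accumulates weighted spikes
            out_counts[k] += sum(w * s for w, s in zip(w_out[k], spikes))
    return out_counts.index(max(out_counts))   # predicted class
```

In the study's benchmark, a network of this general kind, but built on a computational model of the memristor neuron and properly trained, classified spoken digits; this sketch only illustrates how spikes propagate and feed back through such a network over time.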
The research does face some practical hurdles before it can be widely implemented. The silver used in the proof-of-concept device is not easily integrated into standard semiconductor manufacturing processes. Future work will involve exploring alternative materials and ions that offer similar dynamic properties but are compatible with existing fabrication technologies.
The next step for the researchers is to build and integrate large numbers of these artificial neurons to test their collective ability to replicate the brain’s efficiency and capabilities on a larger scale. Beyond creating more powerful AI, such brain-faithful systems could offer a unique platform for neuroscientists, potentially revealing new insights into the workings of the human brain itself.
The study, “A spiking artificial neuron based on one diffusive memristor, one transistor and one resistor,” was authored by Ruoyu Zhao, Tong Wang, Taehwan Moon, Yichun Xu, Jian Zhao, Piyush Sud, Seung Ju Kim, Han-Ting Liao, Ye Zhuo, Rivu Midya, Shiva Asapu, Dawei Gao, Zixuan Rong, Qinru Qiu, Cynthia Bowers, Krishnamurthy Mahalingam, S. Ganguli, A. K. Roy, Qing Wu, Jin-Woo Han, R. Stanley Williams, Yong Chen & J. Joshua Yang.