Brain-inspired AI could cut energy use and boost performance  

Artificial intelligence (AI) could soon become more energy-efficient and faster, thanks to a new approach developed at the University of Surrey that takes direct inspiration from the biological neural networks of the human brain.

In a study published in Neurocomputing, researchers from Surrey’s Nature-Inspired Computation and Engineering (NICE) group have shown that mimicking the brain’s sparse and structured neural wiring can significantly improve the performance of artificial neural networks (ANNs) – the technology behind generative AI and other modern AI models such as ChatGPT – without sacrificing accuracy.

The method, called Topographical Sparse Mapping (TSM), rethinks how AI systems are wired at their most fundamental level. Conventional deep-learning models – such as those used for image recognition and language processing – connect every neuron in one layer to every neuron in the next, which wastes energy. TSM instead connects each neuron only to nearby or related neurons, much as the brain’s visual system organises information efficiently. This brain-like design eliminates vast numbers of unnecessary connections and computations.
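
To make the idea concrete, the short Python sketch below (illustrative only, not the authors’ published code) builds a “topographical” connectivity mask in which each hidden neuron connects to just a small, spatially local window of input neurons rather than to the whole input layer; the layer sizes and window width are hypothetical.

    # Illustrative sketch of topographically sparse wiring (hypothetical sizes,
    # not the authors' implementation): each hidden neuron connects only to a
    # small, spatially local window of input neurons.
    import numpy as np

    def topographic_mask(n_inputs, n_hidden, window=9):
        """Binary (n_hidden, n_inputs) mask with local, brain-like connectivity."""
        mask = np.zeros((n_hidden, n_inputs), dtype=np.float32)
        centres = np.linspace(0, n_inputs - 1, n_hidden).astype(int)
        half = window // 2
        for j, c in enumerate(centres):
            lo, hi = max(0, c - half), min(n_inputs, c + half + 1)
            mask[j, lo:hi] = 1.0  # keep only the local neighbourhood
        return mask

    mask = topographic_mask(n_inputs=784, n_hidden=128)
    weights = np.random.randn(128, 784).astype(np.float32) * mask  # masked weights
    print(f"connections kept: {int(mask.sum())}, sparsity: {1 - mask.mean():.1%}")

Training then only ever uses and updates the connections the mask leaves in place, which is where the savings in memory, computation and energy come from.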

An enhanced version, called Enhanced Topographical Sparse Mapping (ETSM), goes a step further by introducing a biologically inspired “pruning” process during training – similar to how the brain gradually refines its neural connections as it learns. Together, these approaches allow AI systems to achieve equal or even greater accuracy while using only a fraction of the parameters and energy required by conventional models.
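
The release does not detail the exact pruning rule, but a common, minimal way to sketch such training-time pruning is magnitude-based: every few epochs, the weakest surviving connections are removed, loosely mirroring how the brain discards little-used synapses. The snippet below is a hypothetical illustration in that spirit, reusing the mask from the previous sketch.

    # Hypothetical magnitude-based pruning step (not the paper's exact criterion):
    # periodically zero out the weakest fraction of the connections still active.
    import numpy as np

    def prune_weakest(weights, mask, fraction=0.2):
        """Return a new mask with the smallest-magnitude active weights removed."""
        active = mask > 0
        if not active.any():
            return mask
        threshold = np.quantile(np.abs(weights[active]), fraction)
        new_mask = mask.copy()
        new_mask[np.abs(weights) < threshold] = 0.0  # drop the weakest links
        return new_mask

    # Schematic use inside a training loop:
    # for epoch in range(num_epochs):
    #     ...train one epoch using weights * mask...
    #     if epoch % prune_every == 0:
    #         mask = prune_weakest(weights, mask)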

Dr Roman Bauer, Senior Lecturer at the University of Surrey’s School of Computer Science and Electronic Engineering, and project supervisor, said:

“Training many of today’s popular large AI models can consume over a million kilowatt-hours of electricity, which is equivalent to the annual use of more than a hundred US homes, and cost tens of millions of dollars. That simply isn’t sustainable at the rate AI continues to grow. Our work shows that intelligent systems can be built far more efficiently, cutting energy demands without sacrificing performance.”

Surrey’s enhanced model achieved up to 99% sparsity – meaning it could remove almost all of the usual neural connections – but still matched or exceeded the accuracy of standard networks on benchmark datasets. Because it avoids the constant fine-tuning and rewiring used by other approaches, it trains faster, uses less memory and consumes less than one per cent of the energy of a conventional AI system.
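
As a back-of-the-envelope illustration (hypothetical layer sizes, not figures from the paper), 99% sparsity means a single fully connected layer of 784 × 128 = 100,352 weights would keep only around a thousand of them:

    # Rough arithmetic for one layer at 99% sparsity (hypothetical sizes).
    dense_params = 784 * 128                          # fully connected layer
    sparse_params = round(dense_params * (1 - 0.99))  # about 1,004 weights remain
    print(dense_params, sparse_params)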

Mohsen Kamelian Rad, a PhD student at the University of Surrey and lead author of the study, said:

“The brain achieves remarkable efficiency through its structure, with each neuron forming connections that are spatially well-organised. When we mirror this topographical design, we can train AI systems that learn faster, use less energy and perform just as accurately. It’s a new way of thinking about neural networks, built on the same biological principles that make natural intelligence so effective.”

While the current framework applies the brain-inspired mapping to an AI model’s input layer, extending it to deeper layers could make networks even leaner and more efficient. The research team is also exploring how the approach could be used in other applications, such as more realistic neuromorphic computers, where the efficiency gains could have an even greater impact.

[ENDS]

Topographical sparse mapping: A neuro-inspired sparse training framework for deep learning models; Mohsen Kamelian Rad, Ferrante Neri, Sotiris Moschoyiannis, Roman Bauer; Neurocomputing; DOI: 10.1016/j.neucom.2025.131740; 30 October 2025
Regions: Europe, United Kingdom
Keywords: Applied science, Artificial Intelligence, Computing, Technology, Science, Energy
