Researchers have developed a powerful new software toolbox that allows realistic brain models to be trained directly on data. This open-source framework, called JAXLEY, combines the precision of biophysical models with the speed and flexibility of modern machine learning techniques. The study, published in Nature Methods, could be a major step toward faster and more accurate simulations of brain function.
Understanding how neurons give rise to thought, perception, or memory remains one of the biggest challenges in neuroscience. To explore these questions, researchers use computer models that simulate the activity of real brain cells. Such biophysical models are digital reconstructions of neuronal networks that reveal how messages travel as electrical signals through individual neurons or across entire networks.
But getting these models to act like real neurons is a complex technical challenge: each model contains thousands of equations, and adjusting all those parameters to accurately match experimental data is computationally demanding, often requiring weeks of manual tuning or inefficient trial-and-error simulations.
To overcome these limitations, researchers at Neuro-Electronics Research Flanders (NERF; a research initiative backed by imec, KU Leuven and VIB) and VIB.AI, together with colleagues from the University of Tübingen, have developed JAXLEY: a new open-source toolbox that brings modern machine learning techniques to detailed neuronal models, offering a faster and more realistic way for neuroscientists to simulate the electrical activity within neuronal networks.
Bridging neuroscience and machine learning
JAXLEY takes its core machinery from machine learning. By automatically computing how small changes in each parameter affect the model's output, a technique known as automatic differentiation, it allows detailed brain models to adjust themselves based on data rather than through trial and error. And because it runs on the same GPUs that power AI, JAXLEY can evaluate many models and large datasets in parallel, dramatically increasing the speed and efficiency of realistic brain modelling.
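To make this concrete, here is a minimal sketch of the idea in plain JAX, not the actual JAXLEY API: the toy passive-membrane model, its parameters, and the names used are all illustrative. Automatic differentiation provides the gradient of a fitting error with respect to a membrane parameter, so simple gradient descent can recover that parameter directly from a voltage trace.

```python
# Illustrative sketch (NOT the JAXLEY API): gradient-based fitting of one
# parameter of a toy passive neuron, using plain JAX.
import jax
import jax.numpy as jnp

def simulate(g_leak, n_steps=100, dt=0.1):
    """Euler-integrate dV/dt = -g_leak * (V - E_leak) + I_ext."""
    E_leak, I_ext = -70.0, 2.0
    def step(v, _):
        v = v + dt * (-g_leak * (v - E_leak) + I_ext)
        return v, v
    _, trace = jax.lax.scan(step, -70.0, None, length=n_steps)
    return trace

target = simulate(0.3)   # stand-in for an experimentally recorded trace

def loss(log_g):         # mean squared error to the target trace
    return jnp.mean((simulate(jnp.exp(log_g)) - target) ** 2)

grad_fn = jax.jit(jax.grad(loss))   # gradient via automatic differentiation
log_g = jnp.log(0.1)                # start from a wrong initial guess
for _ in range(300):                # plain gradient descent in log-space
    log_g = log_g - 0.02 * grad_fn(log_g)
g_fit = float(jnp.exp(log_g))       # converges toward the true value 0.3

# The same loss can also be evaluated for many candidate parameters in
# parallel, which is what makes GPU execution pay off:
batch = jax.vmap(loss)(jnp.log(jnp.array([0.1, 0.2, 0.3])))
```

A real JAXLEY model replaces the toy simulator with thousands of coupled biophysical equations, but the principle is the same: the whole simulation is differentiable, so its parameters can be tuned by gradients rather than by hand.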
By combining the precision of biophysical models with the scalability of AI methods, JAXLEY overcomes computational barriers that have so far limited the scale of brain simulations. The toolbox is fully open-source, making it accessible to the entire research community. This work opens new directions for exploring how the brain learns and remembers, replacing manual tuning with adaptive, data-driven learning inspired by the brain itself.
To demonstrate its versatility, the researchers put JAXLEY to the test on a wide range of tasks. It accurately recreated the activity of real neurons and trained large biophysical networks with up to 100,000 parameters to perform memory and visual tasks. These results show that even complex, realistic neuronal networks can now be trained directly on experimental data or computational tasks, something that was previously difficult to achieve.
“JAXLEY fundamentally changes how we approach brain modeling,” says Pedro Gonçalves, group leader at NERF and VIB.AI. “It enables us to build realistic models that can be optimized and scaled efficiently, opening new ways to understand how neural computations emerge from the brain’s underlying processes.”