Dancing on the brain: Researchers map neural responses to dancing to understand how we perceive it

20/11/2025 University of Tokyo

Dance is a form of cultural expression that has endured throughout human history, channeling a seemingly innate response to sound and rhythm. A team at the University of Tokyo and collaborators demonstrated that distinct fMRI activity patterns in the brain relate to an audience member’s level of expertise in dance. The findings were made possible by recent breakthroughs in dance motion-capture datasets and AI (artificial intelligence) generative models, facilitating a cross-modal study characterizing the art form’s complexity.

Previous studies on dance have typically been limited to artificially controlled movement, music in isolation, or coarse binary descriptors from categorized clips. Mapping holistic, cross-modal correspondences between real-world performances and local brain activity allowed the team to capture fine-grained, high-dimensional relationships in dance. This research project, led by Professor Hiroshi Imamizu of the University of Tokyo, Associate Professor Yu Takagi of the Nagoya Institute of Technology and their team, builds upon quantitative encoding advances in AI-based naturalistic modeling to compare brain responses to stimuli.

“In our research we strove to understand how the human brain directs movement of the body. As an everyday life example, dance provided the perfect theme,” said Imamizu. “Our team had great passion for genres like street dance and ballet, and by collaborating with street dance experts, the research soon took on a life of its own.”

According to the team, a major problem to date was that in order to identify and respond to the multitude of stimuli in the real world, humans must process a wealth of perceptual information.

“That’s where the release of the AIST Dance Video Database was a stroke of fortune for us. It has over 13,000 recordings covering 10 genres of street dance,” said Imamizu. “It also led to an AI model which generates choreography from music. It almost felt as if our research was being pushed along by this new era of technology itself.”

In describing the study, the researchers said one of the underlying problems they would like to solve is how the brain and AI correspond to each other. Can AI models represent the human mind? And conversely, can brain functions be used to grasp the inner workings of AI?

To answer this, the team recruited 14 participants of mixed dance backgrounds and monitored their brain responses while viewing 1,163 dance clips of varied dancers and styles.

“By linking a choreographing AI to fMRI, or functional magnetic resonance imaging, a technique that can visualize active regions of the brain by recognizing small changes in blood flow, we could pinpoint where the brain binds music and movement,” said Takagi. “Cross-modal features consistently predicted activity in higher‑order association areas better than motion-only or sound-only features — evidence that integration of different sensory modalities such as sight and sound is central to how we experience dance.”
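The encoding approach described here, in which stimulus features predict voxel-wise brain activity, can be illustrated with a minimal sketch. This is not the authors' pipeline: the data below are synthetic, the ridge regression is the simplest possible linear encoding model, and the "cross-modal" features are a toy interaction of hypothetical motion and audio dimensions. It shows only the logic of comparing held-out prediction accuracy between unimodal and cross-modal feature sets.

```python
# Illustrative sketch (synthetic data, not the study's method): fit a linear
# "encoding model" from stimulus features to voxel responses, then compare
# held-out prediction accuracy for motion-only vs. cross-modal features.
import numpy as np

rng = np.random.default_rng(0)

def fit_ridge(X, Y, alpha=1.0):
    """Ridge regression weights mapping features X to voxel responses Y."""
    n_feat = X.shape[1]
    return np.linalg.solve(X.T @ X + alpha * np.eye(n_feat), X.T @ Y)

def voxel_correlations(X_train, Y_train, X_test, Y_test):
    """Per-voxel Pearson correlation between predicted and measured responses."""
    W = fit_ridge(X_train, Y_train)
    pred = X_test @ W
    pred_c = pred - pred.mean(0)
    true_c = Y_test - Y_test.mean(0)
    num = (pred_c * true_c).sum(0)
    den = np.sqrt((pred_c ** 2).sum(0) * (true_c ** 2).sum(0))
    return num / den

# Synthetic stimuli: 400 clips with 8 motion and 8 audio feature dimensions;
# 50 simulated "voxels" whose responses depend on both modalities jointly.
n_clips, n_voxels = 400, 50
motion = rng.standard_normal((n_clips, 8))
audio = rng.standard_normal((n_clips, 8))
cross = np.hstack([motion, audio, motion[:, :4] * audio[:, :4]])
W_true = rng.standard_normal((cross.shape[1], n_voxels))
Y = cross @ W_true + 0.5 * rng.standard_normal((n_clips, n_voxels))

tr, te = slice(0, 300), slice(300, 400)  # train/test split
r_motion = voxel_correlations(motion[tr], Y[tr], motion[te], Y[te]).mean()
r_cross = voxel_correlations(cross[tr], Y[tr], cross[te], Y[te]).mean()
print(f"motion-only r = {r_motion:.2f}, cross-modal r = {r_cross:.2f}")
```

Because the simulated voxels integrate both modalities, the cross-modal features predict held-out responses better than motion alone, mirroring the comparison the researchers describe for higher-order association areas.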

The findings also suggested that the model’s next-motion prediction architecture aligns well with human cognition, revealing parallels between how biological and artificial systems process and integrate audiovisual information.

Furthermore, to identify how dance features mapped to brain responses and emotional experiences, the team compiled a list of concepts, informed by expert dancers, each with multiple rating scales. Feedback from an online survey was processed through a brain-activity simulator the team had developed, showing that different impressions correspond to distinct, distributed neural patterns; aesthetic and emotional responses were not reducible to a single rating dimension.
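The idea that distinct impressions map onto distinct, distributed patterns can be sketched in miniature. The code below is purely illustrative and hypothetical: it generates synthetic voxel responses driven by several independent rating scales, estimates one spatial pattern per scale by least squares, and checks that the recovered patterns are mutually dissimilar rather than variants of one shared axis.

```python
# Hypothetical sketch (synthetic data): regress several impression rating
# scales onto simulated voxel responses to obtain one spatial "pattern" per
# scale, then verify the patterns are distinct (low pairwise similarity).
import numpy as np

rng = np.random.default_rng(1)
n_clips, n_voxels, n_scales = 300, 80, 4

# Synthetic ground truth: each rating scale drives its own voxel pattern.
patterns_true = rng.standard_normal((n_scales, n_voxels))
ratings = rng.standard_normal((n_clips, n_scales))
Y = ratings @ patterns_true + 0.5 * rng.standard_normal((n_clips, n_voxels))

# Least-squares estimate of each scale's voxel pattern.
patterns_est, *_ = np.linalg.lstsq(ratings, Y, rcond=None)

# Pairwise cosine similarity between the estimated patterns.
unit = patterns_est / np.linalg.norm(patterns_est, axis=1, keepdims=True)
sim = unit @ unit.T
off_diag = sim[~np.eye(n_scales, dtype=bool)]
print(f"mean off-diagonal pattern similarity: {np.abs(off_diag).mean():.2f}")
```

Low off-diagonal similarity is the toy analogue of the finding that different aesthetic and emotional impressions occupy distinct distributed patterns rather than collapsing onto a single dimension.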

“Surprisingly, compared to nonexpert audiences, our brain-activity simulator was able to more precisely predict responses in experts. Even more interesting was the fact that while nonexperts exhibited individual differences in response patterns, the videos elicited an even more diverse range of patterns in experts,” said Imamizu. “In other words, the results suggest that brain responses diverged rather than converged with expertise. This has very interesting implications for understanding the relation between experience and diversity of expression in art. We believe that the demonstrated freedom to connect tightly controlled research methods with large, diverse real-world datasets opens up a new dimension of research possibilities.”

For the impassioned members of the team, the results brought them full circle. “We would love nothing more than to see our developed brain-activity simulator be used as a tool to create new dance styles which move people. We very much wish to explore applications to other forms of art also,” said Imamizu.

###

Funding: This work was supported by JSPS KAKENHI grant numbers 19H05725, JP19H05725, JP24H00172 and 22K03073, and PRESTO grant number JPMJPR23I6.

Useful links:
Graduate School of Humanities and Sociology - https://www.l.u-tokyo.ac.jp/eng/index.html
Department of Psychology - https://www.l.u-tokyo.ac.jp/psy/en/overview/
AIST Dance Video Database - https://aistdancedb.ongaaccel.jp

About The University of Tokyo:
The University of Tokyo is Japan's leading university and one of the world's top research universities. The vast research output of some 6,000 researchers is published in the world's top journals across the arts and sciences. Our vibrant student body of around 15,000 undergraduate and 15,000 graduate students includes over 4,000 international students. Find out more at www.u-tokyo.ac.jp/en/ or follow us on X (formerly Twitter) at @UTokyo_News_en.
Journal article: Yu Takagi, Daichi Shimizu, Mina Wakabayashi, Ryu Ohata, and Hiroshi Imamizu, “Cross-modal deep generative models reveal the cortical representation of dancing”, Nature Communications, November 18, 2025, DOI: 10.1038/s41467-025-65039-w
Link: https://www.nature.com/articles/s41467-025-65039-w
Attached files
  • Title: Mapping a living beat in the brain. Caption: Results from the research team’s brain-activity simulator show regions of the brain where activity was significantly predicted by the simulator from features of music and movement, employing cross-modal AI generation (center). Street dance evoked the strongest response from these regions. ©2025 Takagi et al. CC-BY-ND
Regions: Asia, Japan
Keywords: Applied science, Artificial Intelligence, Society, Psychology, Social Sciences, Arts, Performing arts
