Large Language Models Are Zero-Shot Cross-Domain Diagnosticians in Cognitive Diagnosis

04.12.2025 Frontiers Journals

This research presents a groundbreaking approach to addressing the zero-shot cross-domain cognitive diagnosis (ZCCD) challenge in intelligent education systems through the innovative application of large language models (LLMs). The study introduces the LLM-guided Cognitive State Transfer (LCST) paradigm, which effectively bridges the gap between source domains with rich response logs and target domains lacking interaction data. By reformulating cognitive diagnosis as a natural language processing task, the LCST framework leverages LLMs’ exceptional reasoning capabilities to transfer students’ cognitive states across disparate knowledge domains without requiring target-domain training data.
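As a rough illustration of what "reformulating cognitive diagnosis as a natural language processing task" can look like in practice, the minimal Python sketch below assembles a transfer prompt from a student's source-domain state. The prompt wording, the helper name build_transfer_prompt, and its parameters are illustrative assumptions, not the paper's actual prompts.

# A minimal, hypothetical sketch of posing cognitive-state transfer as a
# natural-language task for an LLM. Prompt wording and helper names are
# assumptions for illustration, not the exact prompts used in the LCST paper.

def build_transfer_prompt(source_domain, target_domain, mastered, weak, target_concepts):
    """Assemble a prompt asking the LLM to infer mastery of target-domain
    concepts from a student's diagnosed source-domain cognitive state."""
    return (
        f"You are an experienced {target_domain} teacher.\n"
        f"A student's diagnosed state in {source_domain}:\n"
        f"- Mastered concepts: {', '.join(mastered)}\n"
        f"- Weak concepts: {', '.join(weak)}\n"
        f"For each {target_domain} concept below, estimate the student's mastery "
        f"on a 0-1 scale and briefly justify it:\n"
        f"{', '.join(target_concepts)}"
    )

if __name__ == "__main__":
    print(build_transfer_prompt(
        source_domain="mathematics",
        target_domain="physics",
        mastered=["linear equations", "trigonometry"],
        weak=["logarithms"],
        target_concepts=["kinematics", "vector decomposition"],
    ))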
The methodology demonstrates particular strength in its four-module architecture: extraction of the pre-established (source-domain) cognitive state, cross-domain bridging through concept reasoning, target-domain diagnosis, and a feedback constraint mechanism. Through extensive experimentation across five real-world educational datasets (mathematics, physics, chemistry, history, and geography), the LCST paradigm consistently outperforms traditional random-initialization methods and even approaches the performance upper bound of oracle methods in certain scenarios. Notably, the framework achieves this while maintaining computational efficiency: smaller LLMs such as Llama3.2:3b demonstrate near-optimal performance at significantly reduced computational cost compared to larger models.
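The four modules can be pictured as a simple pipeline. The Python sketch below is a schematic under assumed interfaces (a CognitiveState record and an llm callable that returns a concept-to-mastery mapping); all names, signatures, and thresholds are assumptions made for illustration rather than the paper's implementation.

# Schematic sketch of the four-module LCST pipeline described above. All names,
# signatures, and thresholds are illustrative assumptions; the paper's actual
# interfaces and prompt formats may differ.

from dataclasses import dataclass

@dataclass
class CognitiveState:
    domain: str
    mastery: dict  # concept -> estimated mastery in [0, 1]

def extract_source_state(response_logs):
    """Module 1: estimate the cognitive state from source-domain response logs,
    here simply per-concept correctness rates as a stand-in for a trained diagnoser."""
    totals, correct = {}, {}
    for log in response_logs:
        c = log["concept"]
        totals[c] = totals.get(c, 0) + 1
        correct[c] = correct.get(c, 0) + int(log["is_correct"])
    return CognitiveState("source", {c: correct[c] / totals[c] for c in totals})

def bridge_and_diagnose(state, target_concepts, llm):
    """Modules 2-3: prompt the LLM to relate source concepts to the target domain
    and return per-concept mastery estimates (response parsing omitted)."""
    prompt = (
        f"Source-domain mastery estimates: {state.mastery}\n"
        f"Estimate mastery (0-1) for these target concepts: {', '.join(target_concepts)}"
    )
    return llm(prompt)  # assumed to return a {concept: mastery} mapping

def apply_feedback_constraints(estimates):
    """Module 4: clamp implausible estimates before they initialise a
    target-domain diagnostic model."""
    return {c: min(max(m, 0.0), 1.0) for c, m in estimates.items()}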
The study makes three substantial contributions to the field of AI in education: (1) it formally defines and provides the first comprehensive solution to the ZCCD problem, (2) it pioneers the use of LLMs as educational experts for cross-domain cognitive diagnosis, and (3) it establishes a new benchmark for prompt engineering techniques specifically tailored for educational diagnostic tasks. While the research acknowledges current limitations in prompt sensitivity and domain knowledge coverage, it simultaneously opens promising avenues for future work in adaptive fine-tuning and external knowledge integration. The LCST framework ultimately represents a significant step toward more flexible, data-efficient intelligent tutoring systems capable of providing personalized learning support even in resource-constrained educational contexts.
DOI: 10.1007/s44366-025-0054-y
Attached documents
  • Figure 1. Illustration of the zero-shot cross-domain task.
  • Figure 2. Overview of the LCST. LCST: large language model-guided cognitive state transfer.
Regions: Asia, China
Keywords: Humanities, Education

