A New Tool to Spot Duplicative Reviews

30/04/2026 TranSpread

The need is hard to ignore. The paper notes that the number of systematic reviews indexed in PubMed rose from an estimated 1,432 per year in 2000 to 29,073 in 2019. That explosive growth has expanded access to synthesized evidence, but it has also fueled duplication, redundancy, and inconsistency across reviews addressing the same question. Existing tools such as Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) 2020, A Measurement Tool to Assess Systematic Reviews (AMSTAR) 2, and Risk of Bias in Systematic Reviews (ROBIS) can evaluate reporting quality, rigor, and bias, yet none was designed specifically to measure whether one review substantially duplicates another. These gaps point to a need for dedicated research into how duplication in systematic reviews can be systematically identified and assessed.

Researchers from the Affiliated Traditional Chinese Medicine Hospital of Guangzhou Medical University, Lanzhou University, and Hong Kong Baptist University reported (DOI: 10.26599/eCMTA.2026.9570031) in 2026 in Evidence-Based Chinese Medicine and Technology Assessment a protocol for developing the systematic review duplication (SRD) tool. The article was received on December 16, 2025, revised on February 18, 2026, and accepted on March 20, 2026. The protocol lays out a plan to create and validate a standardized instrument for comparing pairs of intervention-based systematic reviews that address the same disease or condition, with the goal of identifying whether their overlap reflects meaningful replication or avoidable redundancy.

The proposed design is ambitious and deliberately practical. The team will build the tool in three phases: preparatory work, tool development, and dissemination. At the center is a four-domain framework covering research topic, research methods, research results, and methodological quality. Instead of forcing a simple yes-or-no decision, the tool is intended to generate a qualitative duplication profile showing where two reviews truly converge and where they differ in important ways. Development will include literature-based item generation, pilot testing with 40 systematic reviews across 17 disease categories, a two-round modified Delphi process with expert input, consensus meetings, and reliability testing. The final product is expected to be available in both web-based and Excel-based versions, including a full version for in-depth assessment and a simplified version for rapid screening.

“Medicine does not necessarily need more reviews. It needs reviews that are genuinely needed.” That is the larger message carried by this protocol. Framed that way, the proposed tool reads less like a technical checklist and more like a gatekeeper for a crowded evidence landscape. It points toward a future in which novelty is judged more transparently, duplication is identified earlier, and new evidence syntheses are expected to prove not only that they are possible, but that they are worth doing.

If validated as planned, the tool could shape the full life cycle of evidence synthesis. Researchers could use it before launching a review to determine whether a meaningful gap remains. Editors and peer reviewers could apply it when judging novelty and contribution. Guideline developers and health technology assessment teams could use it to choose among overlapping reviews more confidently. The authors also suggest that a structured duplication framework may be especially valuable in areas with heterogeneous interventions and outcome measures, where repeated reviews can fragment rather than strengthen the evidence base. In that sense, the SRD tool is positioned as both a methodological innovation and a practical step toward a leaner, clearer, and more trustworthy research system.

###

References

DOI

10.26599/eCMTA.2026.9570031

Original Source URL

https://www.sciopen.com/article/10.26599/eCMTA.2026.9570031

Funding Information

The study was sponsored by the Project of Administration of Traditional Chinese Medicine of Guangdong Province (grant no.: 20251280), Guangzhou Health Science and Technology Project (grant no.: 20242A011005), Guangzhou Science and Technology Fund (grant nos.: 2024A03J0791, 2025A03J3510, 2025A03J3512, and 2025A03J3428), Guangzhou Key Science and Technology Project of TCM (grant no.: 2025ZD010), Plan on Enhancing Scientific Research in GMU (grant nos.: GMUCR 2024-01018 and GMUCR2024-02030), and the Young Scientific and Technological Talents Research Project of the Affiliated Traditional Chinese Medicine Hospital (grant nos.: 2022RC07, 2024SZYRC13 and 2024SZYRC15).

About Evidence-Based Chinese Medicine and Technology Assessment

Evidence-Based Chinese Medicine and Technology Assessment (eCMTA) is a journal focused on strengthening the scientific foundation of Chinese medicine and integrative healthcare. It publishes multidisciplinary research that supports better clinical, public health, and policy decision-making, covering topics such as evidence synthesis, guideline development, technology assessment, health economics, clinical trials, and real-world studies. The journal places particular emphasis on building reliable evidence for Chinese medicine through modern research methods and transparent evaluation standards. It also maintains rigorous peer review and follows recognized publishing ethics principles to safeguard research integrity. Published by Tsinghua University Press in collaboration with Beijing University of Chinese Medicine, the journal serves as a platform for bridging traditional medical knowledge with contemporary evidence-based practice.

Paper title: Development of the systematic review duplication (SRD) tool: a protocol for assessing duplication in intervention-based systematic reviews
Attached files
  • Three distinct phases of tool development. Phase 1: preparatory work (team formation and tool conceptualization); Phase 2: tool development (initial item generation, pilot testing and item validation, Delphi survey, reliability testing); Phase 3: dissemination.
Regions: North America, United States, Asia, China, Hong Kong
Keywords: Health, Policy, Business, Universities & research
