Improving reproducibility of social science research



An LMU study led by sociologist Katrin Auspurg reveals that results are often hard to reproduce and suggests measures to improve reliability.

Are studies in the social sciences computationally reproducible? LMU researchers have tackled this question in a large-scale audit. The investigation was undertaken against the backdrop of the so-called replication crisis in empirical research, which has shown that researchers often cannot recreate published results even when the underlying data are available.

“When dropping an apple under the same conditions from the same height twice, it falls to the ground the same way both times,” explains Professor Katrin Auspurg, Chair of Quantitative Social Research at LMU. “Science should work the same way: applying identical methods to identical data under identical conditions should produce identical outcomes.” For the current study, Auspurg teamed up with Daniel Krähmer and Laura Schächtele from the Department of Sociology to investigate how often researchers share their materials for published articles and whether this allows the published results to be reproduced.

A matter of access to data preparation and analysis files

First, the team examined whether social scientists share their analysis code – that is, the scripts for conducting their statistical analyses. This is crucial for studies based on large survey data, as the analysis code is where researchers define which variables to use, which observations to include, and which models to employ. “Without analysis code, it’s often unclear how exactly the data were evaluated.”

The team emailed the authors of more than 1,000 articles that use data from the European Social Survey (ESS) – a widely used cross-national comparative survey in the social sciences. The result: Only 35 percent of researchers made their analysis code available upon request. The remainder either did not respond at all, could no longer find the code, or had not archived it. “For around two-thirds of studies, it was not possible to fully reconstruct how the researchers arrived at their results,” says Auspurg.

Time-consuming verification

In a second step, the researchers tested the reproducibility of those studies for which analysis code was available. Choosing 100 articles at random, they attempted to replicate 699 published results. Of these, they could numerically reproduce only around half. For 23 percent of the results, reproduction was thwarted by incomplete or poorly documented materials. For 26 percent, numerical deviations emerged. Overall, only around 18 percent of all considered results were exactly reproducible. In addition, verification was often time-consuming. “We often had to work through scripts for hours on end, reorganize materials, and correct minor errors before we could even test a result for reproducibility.”
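The verification step described above can be sketched in a few lines of code. This is a hypothetical illustration of the general idea, not the team's actual tooling: re-run the archived analysis, then compare each recomputed estimate against the published value, counting exact matches separately from small numerical deviations. All variable names and figures below are invented for the example.

```python
import math

# Hypothetical published vs. re-computed estimates (illustrative values only).
published = {"age_coef": 0.142, "income_coef": -0.031, "n_obs": 2104}
reproduced = {"age_coef": 0.142, "income_coef": -0.030, "n_obs": 2104}

def classify(pub, rep, loose_tol=0.05):
    """Label each result as an exact match, a minor deviation, or a mismatch.

    `loose_tol` is an arbitrary relative tolerance chosen for this sketch;
    any real audit would need to justify its own threshold.
    """
    report = {}
    for key, pub_val in pub.items():
        rep_val = rep[key]
        if pub_val == rep_val:
            report[key] = "exact"
        elif math.isclose(pub_val, rep_val, rel_tol=loose_tol):
            report[key] = "minor deviation"
        else:
            report[key] = "mismatch"
    return report

print(classify(published, reproduced))
```

In this toy run, two of the three results match exactly and one shows a minor deviation, mirroring the study's distinction between exactly reproducible results and those with small numerical discrepancies.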
Auspurg emphasizes that the study does not make a blanket judgment about social science research. “We found no evidence for systematic bias. Many of the deviations in our reproductions are small and do not necessarily overturn the conclusions of the original studies.” Nevertheless, the study shows that the information and documentation provided in social science articles is often insufficient to reliably verify results. “This is at odds with the principle that scientific results must be repeatable and replicable by other researchers.”

How to improve reproducibility

The team recommends a range of measures to improve reproducibility. “Scientific journals should require authors to share and archive all relevant data and materials for their articles,” explains Auspurg. This includes not only research data but also analysis code. Data providers, in addition, should clearly label all their data versions and keep older versions accessible in archives.
At LMU, transparency and reproducibility are strengthened by the Open Science Center, which advises researchers on good research practices. “This makes it easier for other teams to verify, reproduce, and extend existing analyses,” says Auspurg. “If researchers can build on each other’s work, this saves resources.”

The current study is part of the META-REP program, in which Katrin Auspurg is a member of the program committee. Coordinated by LMU Munich and funded by the German Research Foundation (DFG), the program investigates the reliability of research results in the behavioral, social, and cognitive sciences, and develops standards and tools to improve reproducibility and replicability of social science research.

Krähmer, D., Schächtele, L., & Auspurg, K.: Code sharing and reproducibility in survey-based social research: Evidence from a large-scale audit. Royal Society Open Science, 2026.
https://doi.org/10.1098/rsos.251997


