A group of 270 scientists worldwide, fifteen of whom are affiliated with the UvA, investigated the reproducibility of 100 studies that appeared in three eminent psychology journals in 2008. The results of the Reproducibility Project: Psychology, which was launched four years ago, have now been published in Science. Fewer than half of the replications of psychological research projects yielded the same results as the original studies.

The project is the most comprehensive study ever conducted into the rate and predictors of reproducibility in any field of science. The results shed light on the challenges of reproducing research findings and offer initial evidence about predictors of reproducibility and about practices that could improve it.

Reproducibility means that the results recur when the same data are analysed again, or when new data are collected and analysed using the same methods. 'It is a determining factor in scientific practice, as the reliability of a study is increased by independent replication and the expansion of ideas and evidence,' says Kai Jonas, a researcher with the team of UvA psychologists.
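As a rough illustration of the two senses of the term, the hypothetical Python sketch below first re-analyses the same data and then applies the same analysis to newly collected data. The effect size, sample sizes and analysis method are illustrative assumptions, not values from the project.

```python
# Hypothetical sketch of the two senses of reproducibility described above.
# The assumed effect size (0.4 SD) and sample sizes (30 per group) are illustrative.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

def analyse(control, treatment):
    """The 'same method': a two-sample t-test comparing group means."""
    result = stats.ttest_ind(treatment, control)
    return result.statistic, result.pvalue

# Original study: collect data once and analyse it.
control = rng.normal(0.0, 1.0, 30)
treatment = rng.normal(0.4, 1.0, 30)
print("original analysis:    ", analyse(control, treatment))

# Sense 1: re-analysing the same data reproduces exactly the same numbers.
print("same data, re-run:    ", analyse(control, treatment))

# Sense 2: a direct replication collects new data and applies the same method;
# the outcome may or may not agree with the original result.
new_control = rng.normal(0.0, 1.0, 30)
new_treatment = rng.normal(0.4, 1.0, 30)
print("new data, same method:", analyse(new_control, new_treatment))
```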

Failure to replicate does not necessarily mean incorrect

The researchers emphasise that the failure of a replication does not automatically indicate that the original findings were incorrect.

Even though most replication teams worked with the original authors to make sure they used the same materials and methods, minute differences in when, where or how the replication was carried out might have influenced the results. The replication might also have failed to detect the original result simply by chance. Another possibility is that the original result was a false positive, in which case the reported effect does not correspond to reality.
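To make the last two statistical explanations concrete, the hypothetical simulation below shows that a real but modest effect often fails to reach significance in a replication of the same size, and that a small share of findings comes out significant even when there is no effect at all. The effect size, sample size, significance threshold and number of trials are assumptions chosen for illustration.

```python
# Hypothetical simulation of the two statistical explanations mentioned above.
# Effect size, sample size, alpha and the number of trials are illustrative assumptions.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, alpha, trials = 30, 0.05, 5_000

def study_is_significant(true_effect):
    control = rng.normal(0.0, 1.0, n)
    treatment = rng.normal(true_effect, 1.0, n)
    return stats.ttest_ind(treatment, control).pvalue < alpha

# (a) A genuine but modest effect: replications of the same size often miss it
#     by chance, because statistical power is limited.
power = np.mean([study_is_significant(0.4) for _ in range(trials)])

# (b) No genuine effect: roughly a share alpha of studies is significant anyway,
#     i.e. a false positive.
false_positive_rate = np.mean([study_is_significant(0.0) for _ in range(trials)])

print(f"detection rate for a true 0.4 SD effect: {power:.2f}")                 # around 0.3-0.4
print(f"false-positive rate with no true effect: {false_positive_rate:.2f}")   # around 0.05
```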

Eric-Jan Wagenmakers of the UvA team was not surprised by the low number of successful replications: ‘There are many serious challenges that we, just like other empirical disciplines, must overcome, including “publication bias”, frequentist statistical analysis (p-values) and the reward system in academia. The project results underscore the need for more transparency and greater methodological rigour.’

Transparency and pre-registration

Increasing numbers of researchers, organisations, funders, journals and publishers are working to improve reproducibility in scientific research. 'Efforts include increasing the transparency of original research materials, code and data so that other teams can more accurately assess, replicate and extend the original research', according to team member Denny Borsboom who, together with Wagenmakers, was involved in drafting the Transparency and Openness Promotion (TOP) Guidelines published in Science in June.

Jonas adds: 'Through our efforts to improve research quality, we in the psychology field are frontrunners compared to other scientific disciplines. The University of Amsterdam's Psychology Research Institute plays a key role in this, for example through the initiation of journals such as Comprehensive Results in Social Psychology that publish only pre-registered papers. Pre-registration of research designs is intended to ensure that only statistically and methodologically sound articles with easily reproducible results make it to publication. This approach also prevents data manipulation and the ‘tweaking’ of research, for example by altering one's hypothesis after the study has been conducted. The findings of our Reproducibility Project confirm the relevance of this approach.'

Publication details

Open Science Collaboration (2015). 'Estimating the reproducibility of psychological science'. Science, 349(6251). DOI: 10.1126/science.aac4716