A new, open peer review system would solve many of the problems plaguing the current scientific publication system. This is the conclusion of an article by psychologists from the University of Amsterdam, published this week in the online journal Frontiers in Computational Neuroscience. According to the authors, the new system would result in better reviews and meritocratic editorial hierarchies, and would improve the independent verification of scientific claims.
Most scientific articles are reviewed by several anonymous experts from the relevant scientific field before being published in a professional journal. Under the current system, these peer reviews are conducted behind closed doors. Furthermore, assessing other researchers’ work tends to be a time-consuming task that yields relatively few rewards, which can have a detrimental effect on the quality of reviews. UvA psychologists Jelte Wicherts, Rogier Kievit, Marjan Bakker and Denny Borsboom illustrate these problems by outlining their own experiences during a controversial IQ study, and present a new, open peer review system in which the names of those responsible for reviewing scientific manuscripts are disclosed. The reviews would then be assessed by fellow scientists using a simple grading system (a peer review of the peer review, so to speak), thus improving the quality of reviews and facilitating a more open debate.
In addition to advocating peer reviews of peer reviews, the researchers also call for a more transparent editorial hierarchy, in which reviewers who have received positive assessments of their reviews can move up to higher editorial positions. The researchers also advocate the online publication of research data. This would allow for more effective independent verification of results, minimising the likelihood of statistical errors and inflated statistical results.
J.M. Wicherts, R.A. Kievit, M. Bakker and D. Borsboom: ‘Letting the daylight in: Reviewing the reviewers and other ways to maximize transparency in science’, in: Frontiers in Computational Neuroscience (16 March 2012).
Further information: Dr Jelte Wicherts, email: firstname.lastname@example.org.