A review of peer review has been published
James Coe is Associate Editor for research and innovation at Wonkhe, and a partner at Counterculture
Technopolis Group, commissioned by UKRI, has published a study into the impact of different interventions in the peer review process for the award of research and innovation funding. This is not a review of UKRI's work but a broader look at how peer review can be made speedier, more efficient, more risk aware, less burdensome, more equal, and higher quality.
The report is exceptionally detailed and deserves to be read in full. However, in light of the various reviews of research bureaucracy, the research ecosystem, and replicability issues in research, let's focus on some of the practical suggestions emerging from the report.
Firstly, it is worth being clear what peer review is and what peer review is for. In the context of funding it is the process by which researchers submit a funding proposal for review by others working in their field of research. The whole premise is that the quality of a submission, and by proxy the suitability for funding of a project, is best judged by someone who has similar or adjacent expertise. Whether funding submissions properly capture the quality of research is a different topic for a different blog.
The literature review within the report makes clear that there are well explored issues with peer review. It takes a long time, it can lead to risk-averse outcomes, like many research processes it is unsure what to do with interdisciplinary research, it is biased toward established names (with some evidence of gender bias), and it does not provide sufficient feedback for future work. Despite this, there are few advocates for a system with no element of peer review. If the system is not going to be replaced, it can be improved.
It is made abundantly clear in the report that no single intervention can resolve all of the issues with peer review. It is not that any one problem is insurmountable but that research exists as an ecosystem, so inevitably every action in one area of the research ecosystem has an impact elsewhere. The central thesis of the report is that interventions should be considered within this systems analysis, not as discrete activities.
The most substantial recommendations are those with the potential to become the "new normal". These include a variety of techniques to work with groups that are currently underrepresented in the funding landscape; improving quality through greater use of briefings and training for peer reviewers; exploring the use of automated peer review allocation and the expansion of anonymised review; and, perhaps most controversially, the potential use of randomisation where applications are indistinguishable or where a review panel cannot reach a consensus.
The report is cognisant that interventions require staff resources, which is particularly challenging during a period of resource constraint. However, there are opportunities for resource saving, for example in more efficient allocation of peer reviewers. The report therefore suggests that, in the round, efficiencies made in some areas could be invested in more intensive areas of peer review. All of this is underpinned by the need for more robust and adaptable IT systems.
It looks like your link to the report is broken; it can be found here:
https://www.ukri.org/news/review-of-peer-review-published/
One suggestion made in the review is to have more two-stage funding applications, like Leverhulme's: you submit a short outline application and are then invited to submit a full worked-up application if successful. This has the merit of reducing the bureaucracy whereby everyone submits a full worked-up bid (which for UKRI can be very onerous). I would like to see it rolled out to some of the public/government funding agencies.