Anton Muscatelli is Principal and Vice Chancellor at the University of Glasgow

The Research Excellence Framework (REF) provides accountability for around £2bn of public investment in research, competitively allocated across universities.

It provides evidence of the societal and economic benefits of this investment. It leverages additional research funding which universities compete for, in the UK and internationally. The UK has long been regarded as a pioneer in its research evaluations. Other countries have taken elements of our good practice and replicated it in their own systems.

The assessment process has led to an increase both in our research quality and importantly the impact that this research has on our economy and society. The government’s ambition for the UK as a global science and technology superpower relies in part on the drive for excellence which comes from the international reputation of our research.

Rewarding good practice

The next REF, planned for 2028, seeks to introduce a new element to the evaluation of environment – that it should also provide insights into the health of research in the UK’s higher education institutions. To do this, the “People, Culture and Environment” element will form an increased component of the assessment.

The hope is that this will evaluate and reward good research practice.

It should encourage, for example, collaborative team-based research, excellent research training, and a more inclusive and attractive research environment. It should also encourage greater movement between academic research and other sectors (e.g. industry and government).

None of this is controversial. An improved research culture is not an alternative to excellence but rather what will allow more of us to excel. A better research culture and excellence in research are mutually reinforcing. Indeed, my own university was one of the first to develop an action plan on research culture to drive such improvements.

The metric side

The more controversial issue is how to develop metrics and carefully codify qualitative information that will only ever be approximate indicators of research culture and environment, and how this information will be used to grade different institutions.

The existing REF has evolved over time and every new element has been piloted carefully to ensure that both the sector and the funding bodies had confidence in the results. This is a major innovation and must be handled with similar care – or else there will be a serious loss of confidence in the REF.

I welcome that, in the light of the concerns and doubts expressed, the research funding bodies are reflecting carefully, and have now suggested a more open-ended consultation on the current proposals. Let’s see the evidence on what can be genuinely and rigorously measured and compared across institutions, and then decide what weight to put on that evidence.

A pilot could provide confidence on whether the outcomes in “People, Culture and Environment” can be genuinely scored on the same scale as research output, impact and environment were in REF 2021. Alternatively, if there is no time to conduct such a pilot, one could ensure that the scores in this section do not affect the formulaic allocation of funding until the following REF.

Or, better still, instead of granular profile scoring on the REF scale, one could simply use this section to gain assurance that institutions meet the required standard, providing a simple “yes/no” on whether they should be eligible to receive public funding. After all, given the importance of this section, is it not better for it to be used to provide firm assurance for funding bodies around people, culture and environment?

Using this section of the REF to collect enough information to provide assurance to the funding bodies, rather than to score different returns, would also significantly reduce the bureaucracy of the exercise.

With change comes great responsibility

Having been involved previously with framing the 2021 REF rules through the Stern review, I know that, because of the complexity of the exercise, the detail of the “rules of the game” matters a great deal. A small change can lead to unintended consequences. One feature of REF 2021 not intended by the Stern review was the “scale effect” in the ratio of impact studies to staff returned, whereby some institutions were able to include many more eligible researchers, driving a higher volume, without having to scale up their impact studies proportionally. Hopefully this unintended feature will be addressed this time round.

If small changes can have big effects, big changes can have many more unintended consequences. This is why REF 2028 needs very careful design. For example, in previous exercises the number of research papers that universities could submit from a single individual was restricted, ensuring that the submission contained research from a wide and inclusive selection of staff at all career stages.

The new proposals for REF 2028 remove this restriction and incentivise universities to return a large number of publications from a small number of established staff. That is quite the opposite of the healthy team-based culture we are trying to encourage, and I urge the research funding bodies to maintain the inclusivity of previous assessments.

The UK has a great reputation for the quality of its research. REF provides considerable assurance of that and validates the public investment we receive. Let’s design it with care.

One response to “Changes to research assessment have unintended consequences”

  1. REF has assessed environment in many iterations, and produced a metric based on the assessment. Research culture, I argue, is a fundamental part of a vibrant and sustained environment, if not the key ingredient. A metric for culture seems at odds with the spirit of culture, so perhaps its assessment is best left to the assessors’ judgement, with the usual guidance against which to judge, as has been done relatively successfully. From the debates, discussions and conversations thus far, we’re beginning to see interesting and useful ideas emerge. Let’s not overanalyse and complicate matters too much. We should also not ignore the work going on around The Hidden REF, from which interesting ideas are emerging.