Changing the Research Excellence Funding Game to Enable Return upon Investment

  • 13 January 2021
  • General
  • Professor Victor Newman

Academics and management have learnt to “game” the £1 billion Research Excellence Framework exercise, leading to a growing stratification of UK higher education as the concentration of funds reinforces current strengths while deepening existing inequalities. At first sight this might seem like a loser’s complaint, but the question has to be asked whether the current academic research “game” is actually worth playing, or even investing in, when its outcomes remain largely virtual.

With government spending under pressure due to Covid-19, the cost of borrowing rising, and the open goal of making regional levelling-up policies meaningful without weakening centres of excellence, HMG has to ask itself: just what kind of bang is it getting for our proverbial bucks?

The UK government has placed research and innovation at the core of its industrial strategy, but involving academics is problematic when they have neither the toolkit nor the culture to play the research and innovation games simultaneously, particularly when one is significantly easier than the other.

Rationale for Policy Change

Recently, David Sweeney (the Executive Chair of Research England) made an interesting observation on the difficulties of persuading career academics to shift academic culture toward rewarding research with measurable impact (in the form of outcomes as opposed to bibliometric activities) to the effect that: “We are edging toward translational work which is probably what the government wanted… [and toward] not… pretending that academic impact is [the same as] societal impact (which some academics can’t seem to get)!” 

David Sweeney’s aside briefly lifted the lid on the Research Excellence Framework (REF) money-jar, exposing an expensive, artificial research-university habit (conducting the REF2014 exercise alone cost approximately £250m) and giving an insight into the nature of its inhabitants: liberal in outlook but deeply conservative in maintaining traditional “industrial” practices, resisting the call to “translate” research into measurable, beneficial outcomes.

Ironically, if we adopt Freud’s definition of denial as a defence mechanism for rejecting painful facts, then not attacking this initial resistance to impact measurement head-on may be beneficial, provided the trajectory is toward realising the translation of benefits; but only as long as HMG can build pressure that rewards cultural and methodological change, so that research becomes integrated within an innovation process leading to outcomes with measurable value for customers and investors.

Unfortunately, the failure to attack the research impact problem from the policy and funding centre is made worse by the absence of an accepted, objective research impact model with operational criteria that could be locally translated. An estimated £55m in extra costs thus fell on UK Higher Education Institutions attempting to articulate their research impact by translating legacy research into impactful language for submission to the REF as Impact Case Studies (ICSs). A recent exercise by the author, reviewing a major publisher’s 2019 Impact Awards, found negligible evidence of measurable impact among the award-winning case studies. This wasn’t the fault of the publisher, the award judges, or those submitting their research: the common issue was the academic tendency to conflate publication and workshop activity with measurable outcomes realised in the non-academic world, a variant on “if we build it they will come” in the form of “if we publish it, then something, somewhere, with someone just might happen that could make a measurable difference.”

The Transformation Prerequisites

If we locate the status quo within the popular Kübler-Ross five-stage grief/change curve model, then research academics remain locked into the first two stages of shock and denial, and now require an incentive to move through the frustration and depression phases into experimenting with, and deciding to invest in, the new reality. This will require at least three policy changes:

  1. Change the Rules

Funders must explicitly change the rules of the current research game to make a new, measurable-impact game more explicit, rewarded and possible, by providing a generic impact-research model connecting academic research to explicit innovative outcomes. The author’s experience is that the restricted academic research model currently excludes the customer-focused methodologies common in the world of innovation.

  2. Change the Incentives

Make the rewards and incentives for learning to adapt, by integrating research into an explicit innovation process focused on customer value, more attractive, upping the impact weighting from 25% to 40% of REF funding. This could include exercises operationalising legacy research in the marketplace.

  3. Manage the Transition

Facilitate 1 and 2 (above) by explicitly encouraging research universities to build impact-research (and innovation) capability and to take the necessary risks, by including prototype collaborative research-impact innovation projects explicitly within future research strategy portfolios.

This could be encouraged by continuing to reward traditional, publication-driven research while making an explicit investment in prototyping measurable research-impact projects a prerequisite for participation in future REF funding exercises. In other words: if you want to play the old game, you will have to learn to play the new game as well, in order to play at all.

Professor Victor Newman is an industrial fellow at the University of Greenwich Business School. Former roles include Head of Pfizer's Global Research University and Director of Strategy & Economics at InnovateUK. Victor is also an innovation practitioner, strategy board advisor, author and co-director of the impactofimpact.com consulting practice for higher education.