The university research assessment dilemma: a decision support system for the next evaluation campaigns

Paolo Fantozzi, Valerio Ficcadenti, Maurizio Naldi

Research output: Contribution to journal › Article › peer-review

Abstract

Our study examines the UK’s Research Excellence Framework (REF) 2021, employing an algorithmic method to mimic the outcomes expressed by its panels of experts, and introduces a decision support system for evaluating research outputs. Using the CrossRef and Scopus databases together with the Chartered Association of Business Schools’ journal classification, we assessed bibliometric features and found the citation-based algorithm most effective at producing results close to those of the REF panellists. When we simulate panellists manually adjusting the algorithmic paper classifications, our results closely align with the actual evaluations, demonstrating the potential of algorithms to augment human assessment. We also show that the Grade Point Average metric may lead to evaluations far from those of the panellists and should be avoided.
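The abstract refers to two computable ingredients: a citation-based classifier that assigns REF-style star ratings to outputs, and the Grade Point Average (GPA) metric aggregated over those ratings. As a hedged illustration only, the Python sketch below pairs a toy threshold classifier (the thresholds are hypothetical placeholders, not the paper's calibrated values) with the standard REF GPA, i.e. the mean star rating across a unit's outputs.

    # Illustrative sketch: citation-based star classification and REF GPA.
    # The thresholds are assumed for illustration, not taken from the paper.
    from typing import List

    # Hypothetical citation-count cutoffs for 4* down to 1* ratings.
    STAR_THRESHOLDS = [(100, 4), (30, 3), (10, 2), (0, 1)]

    def classify_output(citations: int) -> int:
        """Assign a REF-style star rating from a citation count (toy cutoffs)."""
        for threshold, stars in STAR_THRESHOLDS:
            if citations >= threshold:
                return stars
        return 1

    def grade_point_average(ratings: List[int]) -> float:
        """REF GPA: the mean star rating across a unit's submitted outputs."""
        return sum(ratings) / len(ratings) if ratings else 0.0

    if __name__ == "__main__":
        citation_counts = [250, 45, 12, 3, 78]      # toy data
        ratings = [classify_output(c) for c in citation_counts]
        print(ratings)                              # [4, 3, 2, 1, 3]
        print(f"GPA = {grade_point_average(ratings):.2f}")  # GPA = 2.60

Under these assumptions the sketch makes the paper's caution concrete: the GPA compresses the full rating distribution into a single mean, which is exactly the kind of information loss that can push an aggregate score away from the panellists' judgements.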
Original language: English
Article number: 103957
Journal: Scientometrics
DOIs
Publication status: Published - 27 Feb 2025

Keywords

  • Outputs evaluation
  • REF
  • Research Excellence Framework
  • Research evaluation
  • Research quality classification algorithms
