Abstract
Our study examines the UK’s Research Excellence Framework (REF) 2021, employing an algorithmic method to mimic the outcomes expressed by its panel experts and introducing a decision support system for evaluating research outputs. Using the CrossRef and Scopus databases and the Chartered Association of Business Schools’ journal classification, we assessed bibliometric features and found a citation-based algorithm to be the most effective at producing results close to those of the REF panellists. When we simulate panellists manually adjusting the algorithmic paper classifications, our results closely align with the actual evaluations, demonstrating the potential of algorithms to augment human assessments. We also show that the Grade Point Average metric may lead to evaluations far from those of the panellists and should be avoided.
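To make the two quantities mentioned above concrete, here is a minimal Python sketch of a citation-threshold star-rating classifier and the Grade Point Average (GPA) computed over a set of rated outputs. This is an illustration only, not the paper's actual method: the function names, citation cut-offs, and example counts are hypothetical placeholders.

```python
"""Hedged sketch: citation-based star rating and REF-style GPA.

Assumptions (not from the paper): the cut-off thresholds, function
names, and example citation counts below are illustrative only.
"""
from typing import Dict, List


def classify_by_citations(citations: int, thresholds: Dict[int, int] = None) -> int:
    """Assign a REF-style star rating (1-4) from a citation count using
    hypothetical cut-offs; the algorithm evaluated in the paper may differ."""
    if thresholds is None:
        thresholds = {4: 100, 3: 40, 2: 10}  # illustrative cut-offs only
    for stars in (4, 3, 2):
        if citations >= thresholds[stars]:
            return stars
    return 1


def grade_point_average(star_ratings: List[int]) -> float:
    """GPA of a submission: the mean star rating of its outputs
    (equivalently, the sum of each rating times its share of outputs)."""
    return sum(star_ratings) / len(star_ratings)


if __name__ == "__main__":
    citation_counts = [5, 12, 150, 48, 3]  # hypothetical outputs
    ratings = [classify_by_citations(c) for c in citation_counts]
    print(ratings)  # -> [1, 2, 4, 3, 1]
    print(f"GPA = {grade_point_average(ratings):.2f}")  # -> GPA = 2.20
```

A panellist-adjustment step, as simulated in the study, would amount to overriding individual entries of `ratings` before the GPA (or any other aggregate) is computed.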
| Original language | English |
|---|---|
| Article number | 103957 |
| Journal | Scientometrics |
| DOIs | |
| Publication status | Published - 27 Feb 2025 |
Keywords
- Outputs evaluation
- REF
- Research Excellence Framework
- Research evaluation
- Research quality classification algorithms