Research performance reporting is fallacious

Adam BUTLER, Guandong XU, Katarzyna MUSIAL

Research output: Chapter in Book/Report/Conference proceeding › Chapters

Abstract

Citation-based research performance reporting is contentious. The methods used to categorize research and researchers are misleading and somewhat arbitrary. This paper compares cohorts of social science categorized citation data and ultimately shows that assumptions of comparability are spurious. A subject area comparison using research field distributions and networks between a 'reference author', bibliographically coupled data, keyword-obtained data, social science data and highly cited social science author data shows very dissimilar field foci, with one dataset being heavily medically focused. This raises the question of whether subject area classifications should continue to be used as the basis for the plethora of rankings and lists that rely on such groupings. It is suggested that bibliographic coupling and dynamic topic classifiers would better inform citation data comparisons. Copyright © 2018 IEEE.
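Bibliographic coupling, the alternative grouping method the abstract recommends, links two papers by the number of references they share: the more overlap in their reference lists, the more related they are assumed to be. A minimal sketch of this idea (not code from the paper; the reference identifiers are hypothetical):

```python
def coupling_strength(refs_a, refs_b):
    """Bibliographic coupling strength: the count of references
    shared by two papers' reference lists."""
    return len(set(refs_a) & set(refs_b))

# Hypothetical reference lists, identified by DOI-like strings.
paper_a = ["doi:10.1000/a", "doi:10.1000/b", "doi:10.1000/c", "doi:10.1000/d"]
paper_b = ["doi:10.1000/b", "doi:10.1000/d", "doi:10.1000/e"]

print(coupling_strength(paper_a, paper_b))  # → 2 (two shared references)
```

Unlike fixed subject area labels, this measure is computed directly from citation data, which is why it can regroup papers without relying on a predefined classification scheme.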

Original language: English
Title of host publication: Proceedings of 2018 5th International Conference on Behavioral, Economic, and Socio-Cultural Computing, BESC 2018
Place of Publication: USA
Publisher: IEEE
Pages: 1-5
ISBN (Electronic): 9781728102078
DOIs: 10.1109/BESC.2018.8697265
Publication status: Published - 2018

Citation

Butler, A., Xu, G., & Musial, K. (2018). Research performance reporting is fallacious. In Proceedings of 2018 5th International Conference on Behavioral, Economic, and Socio-Cultural Computing, BESC 2018 (pp. 1-5). IEEE. https://doi.org/10.1109/BESC.2018.8697265

Keywords

  • Bibliographic coupling
  • Research performance
  • Social networks
  • Citations
  • Social science
