Bibliometrics and Research Evaluation: More Resources

Altmetrics

Altmetrics are alternatives to traditional measures of research impact, such as the h-index and the journal impact factor. Altmetrics most often describe articles, but they also cover other types of research products, such as presentations and data sets.
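Since the h-index is mentioned above as a traditional metric, a short illustration may help: a researcher's h-index is the largest number h such that h of their papers each have at least h citations. The Python sketch below, using made-up citation counts, is purely illustrative:

```python
def h_index(citations):
    """Return the h-index: the largest h such that at least
    h papers have at least h citations each."""
    cites = sorted(citations, reverse=True)
    h = 0
    for rank, count in enumerate(cites, start=1):
        if count >= rank:
            h = rank  # rank-th paper still has >= rank citations
        else:
            break
    return h

# Five papers with these citation counts give an h-index of 3:
# three papers are each cited at least three times.
print(h_index([10, 8, 5, 2, 1]))  # -> 3
```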

Limitations

Limitations of bibliometrics can include:

 - Quality: high citation counts may not indicate quality. An article may be cited frequently because other authors are refuting its findings.

 - Discipline variation: some research fields cite papers more than others. In medicine and health, for example, there is a strong culture of citing other studies to validate findings, so raw citation counts favor high-citing fields (see the normalization sketch after this list).

 - Level of researcher experience: some metrics give experienced researchers an advantage over early-career researchers. Avoid comparing researchers who are at different stages of their careers.

 - Incomplete data: the tools used to gather bibliometric data do not cover all research areas or index all publications. The results will vary depending on the tool you use.
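To make the discipline-variation point concrete: a common response in bibliometrics is to normalize a paper's citation count against the average for its field and publication year. The sketch below is a minimal illustration of that idea with hypothetical baseline values, not the formula used by any particular tool:

```python
def normalized_citation_score(citations, field_baseline):
    """Divide a paper's citation count by the average citation count
    of papers from the same field and publication year (the baseline).
    A score above 1.0 means more citations than the field average."""
    return citations / field_baseline

# Hypothetical baselines: an average of 40 citations in medicine
# versus 4 in mathematics. Both papers end up 50% above their own
# field's average, even though their raw counts differ tenfold.
print(normalized_citation_score(60, 40))  # medicine paper    -> 1.5
print(normalized_citation_score(6, 4))    # mathematics paper -> 1.5
```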

Leiden Manifesto

The Leiden Manifesto, published in Nature on April 23, 2015, was developed by Diana Hicks, professor of public policy at the Georgia Institute of Technology, Atlanta, Georgia, USA; Paul Wouters, professor of scientometrics, Ludo Waltman, researcher, and Sarah de Rijcke, assistant professor, all at the Centre for Science and Technology Studies, Leiden University, the Netherlands; and Ismael Rafols, science-policy researcher at the Spanish National Research Council and the Polytechnic University of Valencia, Spain.

The Leiden Manifesto's 10 principles for evaluating research are:

  1. Quantitative evaluation should support qualitative, expert assessment.
  2. Measure performance against the research missions of the institution, group, or researcher.
  3. Protect excellence in locally relevant research.
  4. Keep data collection and analytical processes open, transparent, and simple.
  5. Allow those evaluated to verify data and analysis.
  6. Account for variation by field in publication and citation practices.
  7. Base assessment of individual researchers on qualitative judgement of their portfolio.
  8. Avoid misplaced concreteness and false precision.
  9. Recognize the systemic effects of assessment and indicators.
  10. Scrutinize indicators regularly and update them.