Let’s move beyond the rhetoric: it’s time to change how we judge research
Five years ago, the Declaration on Research Assessment was a rallying point. It must now become a tool for fair evaluation, urges Stephen Curry in his call to begin changing research management practice:
There have been many calls for something better, including the Leiden Manifesto and the UK report "The Metric Tide", both released in 2015. Like DORA, these have changed the tenor of discussions around researcher assessment and paved the way for change.
It's time to shift from making declarations to finding solutions. . . .
Stephen Curry, writing in Nature, 8 February 2018, vol. 554, p. 147
At the recent Nordic Workshop for Bibliometrics and Research Evaluation in Helsinki, Heidi Holst Madsen from the Royal Danish Library, together with Marianne Gauffriau and Lorna Wildgaard from the University of Copenhagen, presented an intriguing method of conveying the limits of bibliometric analysis to clients. They propose a "consumer label" explaining how a bibliometric analysis fulfills or falls short on each of the 10 principles of the Leiden Manifesto. Here is their explanation with two sample consumer labels.
In Denmark, bibliometric evaluation has only very recently become a formal parameter of evaluation at universities. The newly implemented model is inspired by the REF system, with the bibliometric evaluation informing departmental reflection as well as institutional evaluation. Unlike the universities of Bath or Indiana Bloomington, the University of Copenhagen has no institutional principles for bibliometric analyses. We therefore wondered whether the Leiden Manifesto could be implemented "bottom-up" by bibliometricians and librarians when doing bibliometric analyses. From experience we know that what we are asked for by university researchers and administration will not conform to the Leiden Manifesto principles, and while in an ideal world one would never deliver a bibliometric analysis that does not conform to all ten principles, in practice that can be difficult.
We developed two small case studies within health sciences at the Copenhagen University Library Bibliometric Service, and investigated whether the Leiden Manifesto can be 1) used as a checklist for ourselves to double-check just how responsible our analyses are, and 2) given to the consumer as a "consumer label" indicating the "quality of the ingredients" of the analysis. "Consumer labels" were developed for both cases and discussed with the department head and the grant applicant, respectively. Please see the attached slides, which show the results of the Leiden Manifesto as a consumer label for both cases.
Summary of the main results
In summary, in evaluating our own practice we found that bibliometricians could do more to ensure that those evaluated verify and legitimize the analysis (LM Principle 5). Further, in interpretation of the analysis, the consumer label conveyed to the consumer in a very simple manner that not all research activities and publications are covered, and how this can affect the results (Principle 3). This is a great benefit for both us and the consumer. We often suspect that the consumer skims the part of a bibliometric report concerning the limitations and restraints on how to interpret the bibliometric analysis. The consumer label was a useful tool when meeting with the consumer, as it helped us create a dialog about the division of responsibilities. It became clear to both consumer and bibliometrician that it is the responsibility of the client to supply the research mission and of the bibliometrician to select appropriate indicators (Principle 2). Both clients were knowledgeable about bibliometric indicators, but systematically discussing responsible use of the indicators was unfamiliar to them both.
We acknowledge the potential of the "consumer labels", yet we also find that the implementation of the Leiden Manifesto becomes subjective. Specifically, the division of responsibilities is not described for any of the ten principles, and the standard for fulfilling a principle is not clear.
As a consumer label it is not intuitive. For example, it may not be obvious to a consumer that the principle headlines come from the Leiden Manifesto while the text is our evaluation. The Leiden Manifesto principles themselves are not self-explanatory, and it is important that consumers fully understand them. Further, we have to question the reliability of subjective evaluations against the Leiden Manifesto. If the evaluation is performed by a single bibliometrician, is the unreliability introduced by this subjective interpretation too great? We do not have an answer yet. When we evaluated the analyses together, we did not immediately agree on the interpretation of all the principles or on how the analyses conformed to them, pointing to a need for clarification of the principles.
Thus our next steps are to analyze more use cases and investigate whether the Leiden Manifesto needs updating or validating to increase its longevity and usefulness. We are particularly concerned with the division of responsibilities, and with how the concepts within the principles can be interpreted and operationalized.
So the Leiden Manifesto is very much alive and kicking in Copenhagen and we look forward to working more with the application of the Leiden Manifesto over the coming months.
A paper has recently been published in Digital Library Perspectives addressing concerns with applying the Leiden Manifesto. Find it here
The paper provides a critical discussion of the Leiden Manifesto for libraries already engaged in bibliometric practices. Full compliance with the Manifesto is time-consuming, expensive and requires a significant increase in bibliometric expertise with respect to both staffing and skill level. Despite these apparent disadvantages, it is recommended that all libraries embrace the Manifesto’s principles. The paper offers practical recommendations based on the work of the European Association for Research Libraries (LIBER) Working Group on Metrics. This work is in the beginning phase and summarizes literature on the topic, as well as the experiences of the members of the Working Group. The discussion reflects today's growing popularity of (quantitative) research assessment which is seen in enthusiasts introducing new metrics (i.e. altmetrics) and by critics demanding responsible metrics that increase objectivity and equity in evaluations.
Carey Ming-Li Chen and Wen-Yau Cathy Lin from Taiwan have just published an article reflecting on lessons to be learned from DORA and the Leiden Manifesto. Although the article is written in Chinese, it includes a long summary in English, which can be found at the end. The article compares the Leiden Manifesto and DORA: their frameworks, citation impact, and channels of dissemination. Comparing the document types and journals of citing works, the authors find that DORA has been emphasized by the editorial boards of journals, whereas the Leiden Manifesto has been cited by articles in scientometrics-related fields or interdisciplinary mega-journals. Since the Leiden Manifesto was published in Nature, it disseminated quickly and rapidly accrued citations. DORA, by contrast, was designed to be signed, enabling institutions to show that they are working against the impact factor. As researchers in the scientometrics and research policy fields, the authors agree that indicators themselves are not evil, but that we must treat them correctly. The authors see the role of the Leiden Manifesto as not limited to discussing the misuse of the impact factor or other indicators; it also aims to facilitate consensus between practitioners and stakeholders in their understanding of indicators and research assessment.
The paper is available here.
Ludo Waltman, Paul Wouters and Nees Jan van Eck have developed 10 principles for the responsible use of university rankings. This blog post coincides with the release of the latest version of their CWTS Leiden university ranking. The principles were inspired by their work on the Leiden Manifesto and bear some similarity to it. The 10 principles are:
1. A generic concept of university performance should not be used
2. A clear distinction should be made between size-dependent and size-independent indicators of university performance
3. Universities should be defined in a consistent way
4. University rankings should be sufficiently transparent
5. Comparisons between universities should be made keeping in mind the differences between universities
6. Uncertainty in university rankings should be acknowledged
7. An exclusive focus on the ranks of universities in a university ranking should be avoided; the values of the underlying indicators should be taken into account
8. Dimensions of university performance not covered by university rankings should not be overlooked
9. Performance criteria relevant at the university level should not automatically be assumed to have the same relevance at the department or research group level
10. University rankings should be handled cautiously, but they should not be dismissed as being completely useless
See the full post here
Frederik Verleysen & Ronald Rousseau examine how well the Flanders research information system aligns with the Leiden Manifesto in: How the Existence of a Regional Bibliographic Information System can Help Evaluators to Conform to the Principles of the Leiden Manifesto. Journal of Educational Media & Library Sciences, 2017, 54(1).
It is shown that the use of Flanders’ regional bibliographic information system in a performance-based research funding system corresponds to a large extent with the principles of the Leiden Manifesto. Yet, it is argued that there is still room for improvement. We offer this Flemish perspective on the Leiden Manifesto as a suggestion to colleagues worldwide to compare their local bibliographic information systems with the principles set forth in the Leiden Manifesto.
Two years ago, on 23 April 2015, Nature published the Leiden Manifesto for research metrics offering 10 principles to guide use of metrics in research evaluation. In 2016, the Manifesto received the Ziman award of the European Association for the Study of Science and Technology (EASST) for collaborative promotion of public interaction with science and technology. The award citation noted that the Leiden Manifesto is an initiative to engage with the rise of metrics based research assessment by articulating a set of principles which draw on the insights of science and technology studies on the nature of knowledge.
The manifesto represents a serious and successful public-facing, comprehensible interpretation of the technical area of metrics. Much research evaluation practice and discourse is quite narrowly national in nature; in contrast, the manifesto is shared by a wide audience, generating a broader European and global conversation. Its global relevance is shown by the number of translations. Volunteers have translated the manifesto into 15 languages: simplified and traditional Chinese, Russian, Korean, Spanish, French, German, Brazilian Portuguese, Japanese, Swedish, Finnish, Persian, Slovak, Basque and Catalan. The translations are mounted on this website, which is seen by over 200 unique visitors every day. A video version of the manifesto is mounted on Vimeo and has been played over 2,300 times.
The Leiden Manifesto draws on state-of-the-art knowledge of research metrics and is linked to an extensive range of international projects, publications, conferences, workshops and networks. The article has been viewed over 60,000 times on Nature's website, and has accumulated 259 citations in Google Scholar, 128 in Scopus and 55 in Web of Science.
The manifesto addresses a broad audience tasked with assessing research performance with the ultimate goal of assuring public accountability. The initiative is designed to influence evaluation practice and to take specialized knowledge into a wide policy arena. The university senates of Ghent, Loughborough, Bath and Indiana Bloomington have developed principles for application of research metrics in their institutions that are explicitly based on the Leiden Manifesto. The principles have guided research policy discussions in Brazil, Panama and Portugal, and have been promoted by Thomson Reuters, then owner of the Web of Science database.
The authors will continue to track developments here.
Paula Stephan, Reinhilde Veugelers and Jian Wang have this week published in Nature a comment entitled "Blinkered by bibliometrics" that references the Leiden Manifesto in arguing that reviewers in funding, promotion and hiring decisions continue the bad practice of using inadequate indicators to inform their judgements. They provide empirical evidence of the harm this does by documenting the very long time periods needed for citations to accumulate to truly novel work; citations to work that is not novel come much earlier.
In the same issue, Nature announces it has signed up to DORA and is improving the metrics it provides.
On Wednesday March 29, 2017 Diana Hicks and Cassidy Sugimoto spoke at the faculty plenary of the University of Pittsburgh on the role of metrics in faculty evaluation. Hicks presented an analysis of the impact of the Leiden Manifesto focusing on the 4 sets of principles devised by different university faculties that build on the manifesto. Coverage of the event is found here.