Leiden Manifesto for Research Metrics
  • Home
  • Video version
  • Translations
  • Blog

Serbian translation posted

9/14/2017

 

LM authors post 10 principles for responsible use of university rankings

5/18/2017

 
Ludo Waltman, Paul Wouters and Nees Jan van Eck have developed 10 principles for the responsible use of university rankings. This blog post coincides with the release of the latest version of their CWTS Leiden Ranking. The principles were inspired by their work on the Leiden Manifesto and bear some similarity to it. The 10 principles are:

1. A generic concept of university performance should not be used
2. A clear distinction should be made between size-dependent and size-independent indicators of university performance
3. Universities should be defined in a consistent way
4. University rankings should be sufficiently transparent
5. Comparisons between universities should be made keeping in mind the differences between universities
6. Uncertainty in university rankings should be acknowledged
7. An exclusive focus on the ranks of universities in a university ranking should be avoided; the values of the underlying indicators should be taken into account
8. Dimensions of university performance not covered by university rankings should not be overlooked
9. Performance criteria relevant at the university level should not automatically be assumed to have the same relevance at the department or research group level
10. University rankings should be handled cautiously, but they should not be dismissed as being completely useless
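Principle 2's distinction between size-dependent and size-independent indicators can be illustrated with a small sketch (a hypothetical example with invented names and numbers, not data from the ranking itself):

```python
# Hypothetical data for two universities (all numbers invented).
universities = {
    "Alpha": {"publications": 5000, "citations": 60000},
    "Beta": {"publications": 800, "citations": 12800},
}

for name, u in universities.items():
    # Size-dependent indicator: total citations grow with the
    # university's overall output.
    total_citations = u["citations"]
    # Size-independent indicator: citations per publication
    # corrects for the size of the university.
    per_publication = u["citations"] / u["publications"]
    print(f"{name}: {total_citations} total, {per_publication:.1f} per publication")
```

Here the larger university Alpha leads on the size-dependent indicator (60,000 vs 12,800 total citations), while the smaller Beta leads on the size-independent one (16.0 vs 12.0 citations per publication), which is why the principle insists the two kinds of indicators not be conflated.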

See the full post here

Aligning regional research information systems with the LM

5/15/2017

 

Frederik Verleysen & Ronald Rousseau examine how well the Flanders research information system aligns with the Leiden Manifesto in: 
How the Existence of a Regional Bibliographic Information System can Help Evaluators to Conform to the Principles of the Leiden Manifesto. Journal of Educational Media & Library Sciences, 2017, 54(1).

Abstract
It is shown that the use of Flanders’ regional bibliographic information system in a performance-based research funding system corresponds to a large extent with the principles of the Leiden Manifesto. Yet, it is argued that there is still room for improvement. We offer this Flemish perspective on the Leiden Manifesto as a suggestion to colleagues worldwide to compare their local bibliographic information systems with the principles set forth in the Leiden Manifesto. 

PDF here

Impact at 2 years

5/8/2017

 
Two years ago, on 23 April 2015, Nature published the Leiden Manifesto for research metrics, offering 10 principles to guide the use of metrics in research evaluation. In 2016, the Manifesto received the Ziman award of the European Association for the Study of Science and Technology (EASST) for collaborative promotion of public interaction with science and technology. The award citation noted that the Leiden Manifesto is an initiative to engage with the rise of metrics-based research assessment by articulating a set of principles that draw on the insights of science and technology studies on the nature of knowledge.

The manifesto represents a serious and successful public-facing and comprehensible interpretation of the technical area of metrics. Much research evaluation practice and discourse is quite narrowly national in nature; in contrast, the manifesto is shared by a wide audience, generating a wider European and global conversation. Its global relevance is shown by the number of translations. Volunteers have translated the manifesto into 15 languages: simplified & traditional Chinese, Russian, Korean, Spanish, French, German, Brazilian Portuguese, Japanese, Swedish, Finnish, Persian, Slovak, Basque and Catalan. The translations are mounted on this website, which is seen by over 200 unique visitors every day. A video version of the manifesto is mounted on Vimeo and has been played over 2,300 times.

The Leiden Manifesto draws on state-of-the-art knowledge of research metrics and is linked to an extensive range of international projects, publications, conferences, workshops and networks. The article has been viewed over 60,000 times on Nature’s website and has accumulated 259 citations in Google Scholar, 128 in Scopus and 55 in Web of Science.
 
The manifesto addresses a broad audience tasked with assessing research performance with the ultimate goal of assuring public accountability.  The initiative is designed to influence evaluation practice and to take specialized knowledge into a wide policy arena.  The university senates of Ghent, Loughborough, Bath and Indiana Bloomington have developed principles for application of research metrics in their institutions that are explicitly based on the Leiden Manifesto.  The principles have guided research policy discussions in Brazil, Panama and Portugal, and have been promoted by Thomson Reuters, then owner of the Web of Science database. 
The authors will continue to track developments here. 

Lament on continuing use of short term bibliometrics

4/28/2017

 
Paula Stephan, Reinhilde Veugelers and Jian Wang have this week published in Nature a comment entitled "Blinkered by bibliometrics" that references the Leiden Manifesto in arguing that reviewers in funding, promotion and hiring decisions continue the bad practice of using inadequate indicators to inform their judgements. They provide empirical evidence of the harm this does by documenting the very long time needed for citations to accumulate to truly novel work; citations to work that is not novel come much earlier.

In the same issue, Nature announces it has signed up to DORA and is improving the metrics it provides.  

Impact of LM presented at University of Pittsburgh

3/31/2017

 
On Wednesday March 29, 2017 Diana Hicks and Cassidy Sugimoto spoke at the faculty plenary of the University of Pittsburgh on the role of metrics in faculty evaluation.  Hicks presented an analysis of the impact of the Leiden Manifesto focusing on the 4 sets of principles devised by different university faculties that build on the manifesto. Coverage of the event is found here.

Indiana University Bloomington faculty council adopts policy on responsible metrics

3/13/2017

 
Arguing that the proliferation of new data sources and tools makes it imperative that institutions develop policies that promote responsible use of metrics, the Bloomington Faculty Council approved such a policy on April 27, 2016. Its five foundational principles are informed by the Leiden Manifesto:

  1. Systems used by faculty and administrators should acknowledge and take into account the heterogeneity of disciplines by making coverage transparent and including field normalized indicators.
  2. Quantitative indicators generated within these systems should be used to supplement rather than supplant other forms of review, such as peer review.
  3. The structure, data, and use of the system should align with the values of the institution and not incentivize behavior incompatible with these values.
  4. Systems should provide data that are accurate and can be made available for validation.
  5. Data about faculty members should be made available to those faculty members.
The policy intends to promote principled use of metrics, not restrict use of data.  
Find out more here, see the full policy here.
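Field normalization, called for in the first principle above, can be sketched in a few lines (a hypothetical illustration with invented baselines and papers, not Indiana's actual method):

```python
# Invented field baselines: average citations per paper in each field.
field_baseline = {"biology": 20.0, "history": 3.0}

papers = [
    {"field": "biology", "citations": 30},
    {"field": "history", "citations": 6},
]

# A field-normalized indicator divides raw citations by the field
# average, so papers are compared against their own discipline.
for p in papers:
    normalized = p["citations"] / field_baseline[p["field"]]
    print(f"{p['field']}: raw {p['citations']}, normalized {normalized:.1f}")
```

On raw counts the biology paper dominates (30 vs 6 citations); normalized, the history paper scores higher (2.0 vs 1.5), reflecting its field's much lower citation density.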

Bath University builds on the LM

3/3/2017

 
The University of Bath has developed a set of principles of research assessment and management that builds on the Leiden Manifesto and Metric Tide.  The principles state that all research assessment and management at the University will be centered on expert judgement, set in the broader environment, supported by reliable data, tailored and transparent.  Thank you Bath!  

Find their statement here.

Loughborough University posts a great implementation of the LM

1/27/2017

 
Loughborough University, in line with its Building Excellence strategy, seeks to improve the visibility of its research in an environment where bibliometric indicators are widely used. The University recognises the importance of using bibliometrics responsibly. To this end, Loughborough University's Statement on the Responsible Use of Metrics was approved by Senate on 16 November 2016. The main statement can be found here.

Update: on February 23, 2017 Times Higher Education wrote about the Loughborough statement in an article titled: UK university launches responsible metrics guide.

LM provides inspiration for Ghent University vision for research evaluation

1/11/2017

 
While the Leiden Manifesto is widely respected as a statement of best practice in applying metrics to research evaluation, many struggle with its application.  Insight into how to design an evaluation system inspired by the Leiden Manifesto can be sought in Ghent University’s vision for evaluation of its research.  The LM provided a reference and source of inspiration for Ghent’s deliberation. 

The discussions at Ghent surfaced these strengths of the LM:
  • The recommendations are concrete and immediately applicable
  • There is no polarization of quality versus quantity, nor of metrics versus peer review methods
  • Emphasis on the research mission of the individual/group/institution
  • Importance of differentiation between disciplines
  • Multiple indicators rather than focus on one single (or one composite) indicator

In addition, these limits were noted:
  • The Leiden Manifesto is a good start but other elements also play a part when evaluating research
  • We should attend to the process of research when we evaluate (e.g. data management, leadership, team work, scientific integrity) rather than merely focus on the product
  • We also need standards/methodologies for peer review, not just for metrics. Something is not “excellent” because an expert says it is “excellent”
  • Institutions do not work in isolation, nor are they entirely autonomous. If national funding mechanisms or allocation models reward particular performances or favour certain metrics such as impact factors, it is impossible to develop internal recommendations against this. At best, you can set up a ‘safe haven’ within your institution where you try to limit but cannot entirely wipe out these pressures. This is particularly difficult in Belgium where funding is heavily dependent on output-based performance indicators.
 
Ghent University's Board of Governors agreed on eight building blocks for a quality evaluation of research:
  1. The choice of an appropriate evaluation method for research is in line with the objective of the evaluation.
  2. The evaluation takes into account the intended impact of the research; strictly academic, economic, societal, or a combination of these.
  3. The evaluation takes into account the diversity between disciplines.
  4. For each chosen evaluation method, the simplicity of the procedure is weighed up against the complexity of the research.
  5. The evaluation criteria are drawn up and communicated to all stakeholders in advance.
  6. There are sufficient experts on the evaluation committee who are in a position to adequately assess the quality of the research.
  7. The above principles are implemented by means of a smart choice of evaluation indicators and by adopting a holistic approach to peer review.
  8. Any committee or policy measure evaluating research makes a best-effort commitment to translate the above principles into practice.
A translation of the full vision statement is available here.