Loughborough University, in line with its Building Excellence strategy, seeks to improve the visibility of its research in an environment where bibliometric indicators are widely used. The University recognises the importance of using bibliometrics responsibly. To this end, Loughborough University's Statement on the Responsible Use of Metrics was approved by Senate on 16 November 2016. The statement can be found here.
Update: on February 23, 2017, Times Higher Education wrote about the Loughborough statement in an article titled "UK university launches responsible metrics guide".
While the Leiden Manifesto is widely respected as a statement of best practice in applying metrics to research evaluation, many struggle to put it into practice. Insight into how to design an evaluation system inspired by the Leiden Manifesto can be found in Ghent University's vision for the evaluation of its research. The Manifesto provided a reference point and source of inspiration for Ghent's deliberations.
The discussions at Ghent surfaced these strengths of the Manifesto:
In addition, these limitations were noted:
Ghent University's Board of Governors agreed on eight building blocks for a quality evaluation of research:
Many thanks to the research support network of the Council for Finnish University Libraries for providing translations into Finnish and Swedish. That makes 14 translations posted.
In a plenary presentation at the September 2016 OECD Blue Sky III meeting in Ghent, Belgium, Manuel Heitor, Portugal's Minister for Science, Technology and Higher Education, cited the Leiden Manifesto as furthering a central debate in research assessment.
Since the last Blue Sky Forum in Ottawa, 2006, a set of major international declarations and movements have been promoted worldwide to call for new policy actions about the need to give priority to changes in research assessment practices and, above all, in scientific and academic career development paths. . . .
Although this debate started in the nineties, only in recent years the need to promote further reflection has been effectively recognized, as a result of excessive proliferation of ill-informed and misapplied metrics. This has been clearly addressed in a set of major international reports and declarations, including the San Francisco Declaration of 2012, the Commission Recommendations on Self-Regulation in Professional Science of the German DFG, in 2013, and the Leiden Manifesto of April 2015. The principles set out in these documents emphasize the importance of peer review and best practices based on an integrated and responsible vision of research contents.
Source: Heitor, M. (2016). What do we need to measure to foster "Knowledge as Our Common Future"? A position paper presented at OECD Blue Sky III, Ghent, Belgium, September 2016.
Thanks to Natsuo ONODERA & Masatsura IGAMI of the National Institute of Science and Technology Policy, Japan, we now have a Japanese translation available. That makes 12 translations and one extended commentary. Thank you to all our volunteer translators!
On September 3, 2016, in Barcelona at the annual meeting of the European Association for the Study of Science and Technology, the 2016 EASST Ziman Award for a 'collaborative promotion of public interaction with science and technology' was presented to
The Leiden Manifesto ‐ declaration, website and international network
Diana Hicks (Georgia Institute of Technology); Paul Wouters, Ludo Waltman, and Sarah de Rijcke (CWTS, Leiden); and Ismael Rafols (Ingenio, Valencia)
by Fred Steward, president of EASST, who said:
The Leiden Manifesto is an initiative to engage with the rise of metrics based research assessment by articulating a set of principles which draw on the insights of science and technology studies on the nature of knowledge.
It has a distinctive European dimension as a partnership between Dutch and Spanish centres in science, technology and innovation studies along with a US scholar and arises from the European hosting of an international scientometrics conference.
It addresses a broad audience of 'evaluators' who are often tasked with a role of assessing research performance with the ultimate goal of reassuring 'public' accountability. The manifesto represents a serious and successful public-facing and comprehensible interpretation of the technical area of metrics which is understandable by a wide audience. It draws on state of the art knowledge on research metrics and is linked to an extensive range of international projects, publications, conferences, workshops and networks.
Presented as a distillation of best practice it is at the same time informed by core STS concepts about knowledge. It emphasises situatedness both in terms of different cognitive domains and research missions as well as the wider socioeconomic, national and regional context. It also engages with performativity and the way in which indicators can change the knowledge system itself.
The initiative is designed to influence evaluation practice rather than simply to critique it. This is an impressive effort to take specialised scientometric knowledge into a wide policy arena. Much research evaluation practice and discourse is quite narrowly national in nature. This collaboration has turned it into a wider international European and global conversation. Its relevance to widely diverse national contexts is shown by the number of translations from Catalan to Chinese. It is generating a significant 'impact' through the creation of an extensive international network. Research evaluation is often treated in a technocratic and managerial fashion. This initiative promotes a more reflexive approach and recommends a coevolution approach.
John Ziman, President of EASST 1983-86, made contributions to 'public interaction' through a number of interventions on contemporary political aspects of science: the social responsibility of scientists, expert conflict and innovation, the freedom of scientists in the Soviet Union, and careers within the science system. These could be described as public actions aimed at politicians and scientists.
This initiative resonates with the Ziman tradition in being addressed to a broad interdisciplinary professional audience of evaluators and scientists on a visible public issue of research accountability.
Yu, So-Young, Jae Yun Lee, EunKyung Chung, and Boram Lee (2015). A Review of Declarations on Appropriate Research Evaluation for Exploring Their Applications to Research Evaluation System of Korea. Journal of the Korean Society for Information Management, 32(4): 249-272. DOI: 10.3743/KOSIM.2015.32.4.249
ResearchGate URL: https://goo.gl/HYDYzb
Inappropriate applications of bibliometric approaches and misinterpretation of analyses in research evaluation have been found and recognized nationally and internationally, as bibliometrics has been rapidly adopted across research evaluation systems and research funding agencies. The flood of misuse led to several declarations and statements on appropriate research evaluation, including the Leiden Manifesto, DORA, and the IEEE Statement. This study reviewed and meta-analyzed the similar recommendations from five declarations (the Leiden Manifesto, the IEEE Statement, DORA, Institut de France, and the Thomson Reuters white paper) and revealed that most of them emphasize evaluating quality in various aspects with multiple indicators. Research evaluation that assesses multiple aspects of individual research, based on an understanding of its purpose and pertinent subject area, was the recommendation most frequently advised in the declarations, and it can be regarded as the one most needed in the national research evaluation system. For future study, interviews with relevant stakeholders of the national research evaluation system are needed to confirm the findings of this review and to explore its application.
Thanks to the team from Ewha Womans University, Myongji University and Hannam University.