Many thanks to the research support network of the Council for Finnish University Libraries for providing translations into Finnish and Swedish. That makes 14 translations posted.
In a plenary presentation at the September 2016 OECD Blue Sky III meeting in Ghent, Belgium, Manuel Heitor, Minister for Science, Technology and Higher Education, Portugal, cited the Leiden Manifesto as furthering a central debate in research assessment.
Since the last Blue Sky Forum in Ottawa, 2006, a set of major international declarations and movements have been promoted worldwide to call for new policy actions about the need to give priority to changes in research assessment practices and, above all, in scientific and academic career development paths. . . .
Although this debate started in the nineties, only in recent years the need to promote further reflection has been effectively recognized, as a result of excessive proliferation of ill-informed and misapplied metrics. This has been clearly addressed in a set of major international reports and declarations, including the San Francisco Declaration of 2012, the Commission Recommendations on Self-Regulation in Professional Science of the German DFG, in 2013, and the Leiden Manifesto of April 2015. The principles set out in these documents emphasize the importance of peer review and best practices based on an integrated and responsible vision of research contents.
Source: Heitor, M. (2016) What do we need to measure to foster “Knowledge as Our Common Future”? A Position Paper, presented at OECD Blue Sky III, Ghent, Belgium, September 2016.
Thanks to Natsuo ONODERA & Masatsura IGAMI of the National Institute of Science and Technology Policy, Japan, we now have a Japanese translation available. That makes 12 translations and one extended commentary. Thank you to all our volunteer translators!
On September 3, 2016, in Barcelona, at the annual meeting of the European Association for the Study of Science and Technology, the 2016 EASST Ziman Award for a ‘collaborative promotion of public interaction with science and technology’ was presented to
The Leiden Manifesto ‐ declaration, website and international network
Diana Hicks (Georgia Institute of Technology); Paul Wouters, Ludo Waltman, and Sarah de Rijcke (CWTS, Leiden); and Ismael Rafols (Ingenio, Valencia).
The award was presented by Fred Steward, president of EASST, who said:
The Leiden Manifesto is an initiative to engage with the rise of metrics based research assessment by articulating a set of principles which draw on the insights of science and technology studies on the nature of knowledge.
It has a distinctive European dimension as a partnership between Dutch and Spanish centres in science, technology and innovation studies along with a US scholar and arises from the European hosting of an international scientometrics conference.
It addresses a broad audience of 'evaluators' who are often tasked with a role of assessing research performance with the ultimate goal of reassuring 'public' accountability. The manifesto represents a serious and successful public-facing and comprehensible interpretation of the technical area of metrics which is understandable by a wide audience. It draws on state of the art knowledge on research metrics and is linked to an extensive range of international projects, publications, conferences, workshops and networks.
Presented as a distillation of best practice it is at the same time informed by core STS concepts about knowledge. It emphasises situatedness both in terms of different cognitive domains and research missions as well as the wider socioeconomic, national and regional context. It also engages with performativity and the way in which indicators can change the knowledge system itself.
The initiative is designed to influence evaluation practice rather than simply to critique it. This is an impressive effort to take specialised scientometric knowledge into a wide policy arena. Much research evaluation practice and discourse is quite narrowly national in nature. This collaboration has turned it into a wider international European and global conversation. Its relevance to widely diverse national contexts is shown by the number of translations from Catalan to Chinese. It is generating a significant 'impact' through the creation of an extensive international network. Research evaluation is often treated in a technocratic and managerial fashion. This initiative promotes a more reflexive approach and recommends a coevolution approach.
John Ziman’s (President of EASST 1983–86) contributions to 'public interaction' involved a number of interventions on contemporary political aspects of science - social responsibility of scientists, expert conflict and innovation, freedom of scientists in the Soviet Union, and careers within the science system. They could be described as public actions aimed at politicians and scientists.
This initiative resonates with the Ziman tradition in being addressed to a broad interdisciplinary professional audience of evaluators and scientists on a visible public issue of research accountability.
Yu, So-Young, Jae Yun Lee, EunKyung Chung, and Boram Lee. (2015). A Review of Declarations on Appropriate Research Evaluation for Exploring Their Applications to Research Evaluation System of Korea, Journal of the Korean Society for Information Management 32(4): 249–272. DOI: 10.3743/KOSIM.2015.32.4.249
ResearchGate URL: https://goo.gl/HYDYzb
Inappropriate applications of bibliometric approaches and misinterpretations of analysis in research evaluation have been found and recognized nationally and internationally as the use of bibliometrics has been rapidly adopted across research evaluation systems and research funding agencies. The flood of misuse has led to several declarations and statements on appropriate research evaluation, including the Leiden Manifesto, DORA, and the IEEE Statement. This study reviewed and meta-analyzed the similar recommendations of five different declarations - the Leiden Manifesto, the IEEE Statement, DORA, the Institut de France statement, and the Thomson Reuters white paper - and found that most of them emphasize evaluating quality in various aspects with multiple indicators. The declarations most commonly advise research evaluation that assesses multiple aspects of individual research based on an understanding of its purpose and pertinent subject area, and this recommendation can be regarded as most needed in the national research evaluation system. Future work should include interviews with relevant stakeholders of the national research evaluation system in order to explore its application and confirm the findings of this review.
Thanks to the team from Ewha Womans University, Myongji University and Hannam University.
The Leiden Manifesto's Altmetric score stands at 1125, placing it in the top 5% of all articles tracked, the 99th percentile of articles of the same age, and the 98th percentile of Nature articles of the same age.
The Leiden Manifesto aligns with the conclusions of the just-released independent review of the role of metrics in research assessment and management, The Metric Tide, commissioned by HEFCE in the UK.
The concluding section, entitled Responsible Metrics, states:
Drawing on discussions over RRI, we propose the notion of responsible metrics as a way of framing appropriate uses of quantitative indicators in the governance, management and assessment of research. The notion of responsible metrics distils the essence of other important contributions to these debates, including the Leiden Manifesto and DORA. Responsible metrics can be understood in terms of a number of dimensions:
• Robustness: basing metrics on the best possible data in terms of accuracy and scope;
• Humility: recognising that quantitative evaluation should support – but not supplant – qualitative, expert assessment;
• Transparency: keeping data collection and analytical processes open and transparent, so that those being evaluated can test and verify the results;
• Diversity: accounting for variation by field, and using a range of indicators to reflect and support a plurality of research and researcher career paths across the system;
• Reflexivity: recognising and anticipating the systemic and potential effects of indicators, and updating them in response.
As stated in the Leiden Manifesto “research metrics can provide crucial information that would be difficult to gather or understand by means of individual expertise. But this quantitative information must not be allowed to morph from an instrument into the goal.” (pp. 134-135)
Check out their website at: https://responsiblemetrics.org