This invited lecture at the ESA conference in Prague drew on insights from the Leiden Manifesto and from two recent research projects carried out at our institute in the Evaluation Practices in Context research group. These projects show how indicators influence knowledge production in the life sciences and the social sciences, and how inclusion and exclusion mechanisms become built into the scientific system through certain uses of evaluative metrics. Our findings point to a rather self-referential focus on metrics and a lack of space for responsible, relevant research in the scientific practices under study. On the basis of these findings, I argued that we need an alternative moral discourse in research assessment, centered on the need to address growing inequalities in the science system. The talk also considered which issues raised in the Leiden Manifesto for research metrics (Hicks, Wouters, Waltman, De Rijcke & Rafols, Nature, 23 April 2015) are most pertinent to the community of sociologists.
Rushforth, A.D. & De Rijcke, S. (2015). Accounting for Impact? The Journal Impact Factor and the making of biomedical research in the Netherlands. Minerva, 53(2), 117-139.
De Rijcke, S. & Rushforth, A.D. (2015). To intervene, or not to intervene, is that the question? On the role of scientometrics in research evaluation. Journal of the Association for Information Science and Technology, 66(9), 1954-1958.
Hicks, D., Wouters, P.F., Rafols, I., De Rijcke, S. & Waltman, L. (2015). The Leiden Manifesto for Research Metrics. Nature, 23 April 2015.
Hammarfelt, B. & De Rijcke, S. (2015). Accountability in Context: Effects of research evaluation systems on publication practices, disciplinary norms, and individual working routines in the faculty of Arts at Uppsala University. Research Evaluation, 24(1), 63-77.