Update Crafting Your Career (CYC)

Crafting Your Career (the event co-organised by CWTS and the Rathenau Instituut, 30 October 2013) is attracting a lot of attention. With only two weeks to go, 173 people have registered (we’re aiming for 200) and over 1,000 people have taken our researcher motivation test. CYC will facilitate a balanced discussion about the pros and cons of recent trends in research evaluation and their effects on scientific research and scientific careers. While we are busy putting together a program leaflet, our moderators are contacting speakers about the details of the interviews and the panel debate. Our Rathenau colleagues are working out the details of the ‘fair’ that will take place during the extended break, and Laurens Hessels and I will have a short meeting tomorrow with KNAW president Prof. Hans Clevers to discuss his opening address.

One of our speakers, Dr. Ruth Müller, was interviewed by ScienceGuide on the occasion of our event. Here is what she has to say about how postdoc positions structure academic careers in the life sciences, and about the pressures postdocs are experiencing.


Stick to Your Ribs: Interview with Paula Stephan — Economics, Science, and Doing Better

A good interview about what is wrong with the current incentive system in science and scholarship.

Science in Transition Conference

Over the next few years, science will have to make a number of important transitions. There is deeply felt uncertainty and discontent about a number of aspects of the scientific system: the tools for measuring scientific output, the publish-or-perish culture, the level of academic teaching, the scarcity of career opportunities for young scholars, the impact of science on policy, and the relationship between science, society and industry.

The checks and balances of our scientific system are in need of revision. To accomplish this, science should be evaluated on the basis of its added value to society. The public should be given better insight into the process of knowledge production: which parties play a role and which issues are at stake? Stakeholders from society should become more involved in this process and have a bigger say in the allocation of research funding. This is the view of the Science in Transition initiators Huub Dijstelbloem (WRR/UvA), Frank Huisman (UU/UM), Frank Miedema (UMC Utrecht), Jerry Ravetz (Oxford) and Wijnand Mijnhardt (Descartes Centre, UU).

Date: 7 and 8 November 2013 

Location: KNAW, Kloveniersburgwal 29, 1011 JV Amsterdam

Free registration: https://www.knaw.nl/nl/actueel/agenda/science-in-transition-conference


Bibliometrics of individual researchers – the debate in Berlin

The lively debate we had at the ISSI conference in Vienna continued at the STI2013 conference, “Translational twists and turns: science as a socio-economic endeavour”, held 4–6 September in Berlin. A full plenary, chaired by Ben Martin (SPRU), was devoted to the challenge of, and the dilemmas in, the application of bibliometrics to the (self-)evaluation of individual researchers. Martin opened the session with the tale of the rise and fall of a star researcher in economics in Germany. Based on a single dataset created in his PhD project, the economist produced an impressive number of publications. Because he was so productive, he was able to attract more external research funding. When a German university was seeking to increase its chances of getting one of the Excellence Initiative grants, he seemed the perfect person to hire. A few members of the hiring committee then started to actually read his publications. They were all rather similar; not very surprising, given that the research was all based on a single dataset from his PhD project. It turned out that he had published a large number of variations of basically the same article in different journals without anyone noticing these duplications. It was the beginning of the end: a number of journals began retracting these publications, although without the cooperation of the researcher, and this process is still ongoing. A sobering tale, according to Martin, who told it at the start of the debate to warn against the misuse of performance indicators such as the number of publications. For a recent overview of cases of fraud and Martin’s experiences as editor of Research Policy, see Martin (2013).

The plenary had a series of presentations, ranging from the state of the debate, to examples of a portfolio approach to individual evaluation, to tensions in science policy with respect to indicator-based assessments, to the ethics of the evaluation of individual researchers. A report of the meeting will be published in the ISSI Newsletter shortly (Wouters et al., 2013). Here I wish to highlight the ethical questions that were the focus of Jochen Gläser’s presentation. Currently, there is no agreement on this in the field. It was even questioned whether we actually have an ethical problem. According to Peter van den Besselaar, we may have more of a knowledge problem than an ethical problem: often it is not clear what the different patterns in the indicator measurements mean. This is partly because scientometricians often use only a very limited set of databases, such as Scopus or the Web of Science, which, according to Van den Besselaar, makes it harder to make the measurements robust. I agree that combining a variety of databases and other data sources (such as surveys, interviews or national statistical materials) is the way to go; the strongest studies in science studies have often used a diversity of materials.
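To make the idea of combining sources a bit more concrete, here is a minimal sketch of linking publication records from two database exports by DOI, keeping the citation counts from each source side by side so that discrepancies stay visible instead of being averaged away. The file names, column names and CSV layout are assumptions made purely for illustration; real Scopus and Web of Science exports use their own formats.

```python
import csv

def load_records(path):
    """Read a (hypothetical) CSV export and index its rows by DOI."""
    with open(path, newline="", encoding="utf-8") as f:
        return {
            row["doi"].strip().lower(): row
            for row in csv.DictReader(f)
            if row.get("doi")
        }

# Assumed file names; real database exports differ.
scopus = load_records("scopus_export.csv")
wos = load_records("wos_export.csv")

# Take the union of DOIs so publications indexed in only one database
# are not silently dropped, and keep both citation counts so the two
# sources can be compared rather than blended into one number.
merged = {
    doi: {
        "scopus_citations": scopus.get(doi, {}).get("citations"),
        "wos_citations": wos.get(doi, {}).get("citations"),
    }
    for doi in scopus.keys() | wos.keys()
}
```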

Nevertheless, I don’t think this absolves us from facing ethical dilemmas, in particular whenever individual researchers are assessed with the help of metrics. In his presentation, Gläser discussed whether we need more explicit ethical guidelines. After all, the bibliometric centres have developed guidelines and include extensive explanations of the limits of their indicator reports, and the details of the performance indicators are published in the bibliometric literature. Still, he argued in favour of more attention to the ethics of bibliometrics because the position of bibliometrics has changed over the years. He identified three relevant developments: an increased demand for bibliometric services in research management; the emergence of “amateur bibliometrics” thanks to the wider availability of data and indicators; and an increased effectiveness of bibliometrics due to more advanced indicators and the increased availability of datasets (including web data). The scope of bibliometric practices is therefore widening, and this requires a more explicit set of guidelines for how to apply bibliometric analyses. This holds for scientometric evaluation in general, but it is particularly pertinent when individual researchers are assessed. Two indicators play an important role in these assessments, the h-index and the Journal Impact Factor, and neither of them is fit for this role (see Bornmann, 2013, on the h-index).

Gläser put forward a number of proposals. In the short term, he proposed to start collecting experiences and descriptions of cases in which research assessments seem to go wrong. In the medium term, he proposed that we develop, as an expert community, a set of guidelines to be made available to research directors, managers, science policy officials and deans, in which the field reaches some consensus with respect to the state of the art. He also supported a suggestion I had made in a parallel session in Berlin to create an Ombudsoffice for research evaluation, which should be able to look into complaints about the use of bibliometrics by universities and institutes in research management.
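For readers less familiar with it, the h-index is the largest number h such that a researcher has at least h publications with at least h citations each. The short sketch below, purely illustrative and not part of the conference material, shows how the indicator compresses a whole publication record into a single number, and how very different records can receive the same score, which is part of why it is such a blunt instrument for judging individuals.

```python
def h_index(citations):
    """Largest h such that at least h publications have >= h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Two very different publication records, same h-index:
print(h_index([50, 40, 30, 3, 2]))  # 3: three highly cited papers
print(h_index([3, 3, 3, 1, 0]))     # 3: three modestly cited papers
```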

We can expect that this debate will continue at the next indicator conferences.

References:

Bornmann, L. (2013). A better alternative to the h index. Journal of Informetrics, 7(1), 100. doi:10.1016/j.joi.2012.09.004

Martin, B. R. (2013). Whither research integrity? Plagiarism, self-plagiarism and coercive citation in an age of research assessment. Research Policy, 42(5), 1005–1014. doi:10.1016/j.respol.2013.03.011

Wouters, P. F., Glänzel, W., Gläser, J., & Rafols, I. (2013). The dilemmas of performance indicators of individual researchers: An urgent debate in bibliometrics. ISSI Newsletter, forthcoming.
