Rankings under Groninger fire

Rafael Wittek, director of the Interuniversity Center for Social Science Theory and Methodology (ICS), based at the University of Groningen, recently attacked Dutch university policies on the occasion of the 25th anniversary of his famous graduate school. One of his targets was “the hype around rankings”. Accredited in 1986, the ICS was the first national social science graduate school in the Netherlands. The school emerged from Dutch networks of PhD students funded by the Ministry of Education and Science. According to Wittek, the universities are now trying to score high in the global rankings (such as the Times Higher Education ranking, the Shanghai ranking and, of course, the Leiden ranking), and he argued that this is a wrongheaded approach. “Rankings as an indicator of quality are a hype. To adopt them is merely a policy reflex.”

I think the sociologist puts his finger on a sore spot in Dutch science policy and management. This is particularly true of his critique of the policies around PhD training and the national graduate schools. According to Wittek, “The Hague” has been too eager to follow new European guidelines and has promoted competition, rather than cooperation, among universities. “In the last couple of years, many national Graduate Schools have been dismantled and new local Graduate Schools have been created in their stead. Dutch universities increasingly claim the results of ‘their’ researchers and give them fewer opportunities to collaborate with colleagues from other universities.” His remarks will strike a chord with everyone who, like me, was trained in the national research schools in which all, or almost all, universities worked together. It is indeed a loss that the Dutch ministry discouraged national graduate schools and switched completely to stimulating local ones, although happily a few nationwide schools are still alive and kicking (such as the Graduate School Science, Technology and Modern Culture).

Still, although his remarks are to the point, I do not think he is completely right. For example, it is simply not true that the Dutch universities are involved in ruthless competition with one another. On the contrary, the new trend is the emergence of regional clusters of universities, a form of close collaboration aimed at competing globally with American and Asian universities. Increasing collaboration is, moreover, the trend in scientific publishing, as demonstrated recently by a study by my colleagues at CWTS and by the recent Royal Society report on scientific networks. The share of multi-authored, multi-institutional, international publications is still rising in all fields of research, and their average citation impact is greater than that of single-author or national publications. We should not overestimate the power of university boards to limit the scale of scientific collaboration.

Nonetheless, Wittek’s criticism of rankings should certainly be taken seriously. The sociologist sees a danger in the “policy reflex” for the quality of research, particularly in areas of high-risk fundamental research. He thinks that researchers who are forced to score high in the rankings will be reluctant to take on big, important questions and will tend to develop a more limited and less risky research agenda. I agree. This is indeed the most important risk of rankings running wild, disconnected from the context of fundamental or applied research. But I think there may be a bit more at play than just policy reflexes. The universities are confronted with an accelerating process of global competition in which new scientific centres are emerging, among them China, India, Brazil, Turkey and Iran. In these countries, researchers tend to face much stricter performance criteria than is usual in the Netherlands, which makes it difficult, perhaps even impossible, for Dutch university boards to ignore the rankings. In the Netherlands the problem is particularly acute because the recent xenophobic hype around immigration already makes it difficult enough to attract talented young researchers from non-European countries. Does this mean that an obsession with rankings is inevitable? I think not. I can imagine a number of alternative, more imaginative strategies to counter this race for the highest position in the rankings.

I do think Wittek is right that recognition by peers is the strongest motivator for researchers. He even thinks that scientists do not need any other stimulus. That last idea may be a bit over the top, but he has a good point. Rankings can and should therefore be used in direct connection with this peer stimulus. Policies focused solely on climbing the global university rankings indeed do not make much sense. But this does not mean that it makes no sense to rank at all. Rankings can very well be used to gain a better understanding of one’s strong and weak points, at the level of individual researchers, groups and institutes, as well as universities and countries. This can be done while taking into account the specific characteristics of the relevant disciplines (for different disciplines, different databases may be needed to construct the rankings). Ranking in context: that should be possible, shouldn’t it?
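To make “ranking in context” a bit more concrete, here is a minimal sketch in the spirit of the field-normalized indicators used in bibliometrics. It is my own illustration, not an actual CWTS implementation; all field names, baseline values and citation counts are hypothetical, and a real analysis would draw on a database appropriate to each discipline.

```python
# Toy illustration of "ranking in context": citation counts are normalized
# by the average citation rate of each field, so that groups from
# citation-dense and citation-sparse disciplines can be compared.
# All numbers below are hypothetical.

# Average citations per paper in each field (hypothetical baselines)
field_baselines = {"cell biology": 25.0, "mathematics": 4.0, "history": 2.0}

# A research group's publications: (field, citations received)
publications = [
    ("cell biology", 30),
    ("mathematics", 6),
    ("mathematics", 3),
    ("history", 4),
]

def mean_normalized_citation_score(pubs, baselines):
    """Average of citations / field baseline over all publications.
    A score of 1.0 means the group performs at the world average for
    its particular mix of fields; higher means above average."""
    ratios = [cites / baselines[field] for field, cites in pubs]
    return sum(ratios) / len(ratios)

score = mean_normalized_citation_score(publications, field_baselines)
print(f"Field-normalized impact: {score:.2f}")
# Field-normalized impact: 1.36
```

The arithmetic is trivial; the point is the principle: the same raw citation count means something very different in history than in cell biology, and any ranking worth using has to take that context into account.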


Anxiety about quality may hinder open access

Anxiety about the quality of open access journals hinders the further spread of open access publishing. This conclusion was cited many times during the recent Coordinating Workshop on Open Access to Scientific Information, held in Brussels on 4 May this year. The workshop was attended by about 70 key players in open access and was organized by two EU directorates: Research, and Information Society & Media. The critical role of quality control came to the fore in various ways.

Salvatore Mele (CERN), coordinator of the SOAP project, presented the results of their study (based on a web survey) of researchers’ attitudes towards open access. The results reveal a remarkable gap between strong support for open access on the one hand and a lack of actual open access publishing on the other: 89% of researchers say they are in favour of open access publishing, yet only 8 to 10% of published articles are open access. According to the SOAP study, two factors are mainly responsible for this gap: the problem of financing open access publications and the perceived lack of quality of many open access journals. The Journal Impact Factor was also mentioned as a reason not to publish in existing open access journals.

The weight of these factors varies by field. For example, in chemistry 60% of researchers mention financial reasons as a barrier to open access, whereas only 16% of astronomers see finance as problematic. In astronomy, worries about the quality of journals are mentioned most often (by more than half of the astronomers), whereas only about one-fifth of chemists see this as a problem. This result points, by the way, to the need to develop specific open access policies for different scientific and scholarly fields. In the humanities, for example, open access books will be an important issue.

The quality of journals was also central to a new initiative made public at the workshop by the delegation of SURF, the ICT organization of the Dutch universities: Clearing the Gate. This initiative is aimed at funding organizations such as the Dutch research council NWO. It calls upon them to develop a preference for open access publication of the research they fund: they should give priority to publication in high-quality open access journals as a condition for funding. SURF is convinced that once this priority is in place, we will witness strong growth in the number of available open access journals of high to very high quality. The representative of NWO endorsed this initiative and made clear that his organization already supports new open access journals in the social sciences and humanities. This spring, NWO will publish a call aimed at the other disciplines. NWO also supports the OAPEN initiative for open access books in the humanities. An important motivation for the organization is financial: “we do not want to pay twice for the same research”. For evaluators and scientometricians, this development poses an interesting challenge as well: how to evaluate open access activities in research?

Note: My Dutch-language report on the EU Open Access workshop was published in the journal Onderzoek Nederland, nr. 277, 7 May 2011, p. 8.

My presentation at the EU workshop is available here.

Teaching in Madrid

Started my visiting professorship at the Faculty of Library and Information Science, Complutense University in Madrid today with a nice class discussion about research evaluation. Here is the presentation I gave about the role of information science in research evaluation.

Open Access in European Science Policy

Attended an interesting workshop today on Coordinating Open Access in European Science Policy. We presented our ACUMEN project there, making the connection with the current Open Access debates. Here is the full presentation: http://prezi.com/_bfvsoqb7pvp/acumen-academic-careers-understood-through-measurement-and-norms/

Evaluating e-research

We had a very interesting discussion last week at the e-Humanities Group of the Royal Netherlands Academy of Arts and Sciences. The problem I presented is how to evaluate e-research, the newly emerging style of scientific and scholarly research that makes heavy use of, and contributes to, web-based resources and analytical methods. The puzzle is that current evaluation practices are strongly biased towards one particular mode of scientific output: peer-reviewed journal articles, and within that set especially articles published in journals that serve as source material for the Web of Science, published by ISI/Thomson Reuters. If scholars in the sciences, social sciences and humanities are expected to contribute to e-science and e-research, it is vital that the reward and accounting systems in the universities honour work in this area. Here is the link to the presentation "Evaluating e-Research".
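As a toy illustration of that bias (my own sketch, with entirely hypothetical numbers), consider how little of a typical e-researcher's output would even be visible to an evaluation that counts only Web of Science journal articles:

```python
# Hypothetical output profile of an e-researcher; only the first
# category is visible to an evaluation based solely on journal
# articles indexed in the Web of Science.
outputs = {
    "journal articles (WoS-indexed)": 4,
    "journal articles (not indexed)": 3,
    "datasets published online": 5,
    "software tools and web resources": 2,
    "blog posts and project reports": 6,
}

visible = outputs["journal articles (WoS-indexed)"]
total = sum(outputs.values())
print(f"Visible to a WoS-based evaluation: {visible} of {total} outputs "
      f"({100 * visible / total:.0f}%)")
# Visible to a WoS-based evaluation: 4 of 20 outputs (20%)
```

However the categories are drawn, the share of e-research work that such an evaluation rewards will be small, which is exactly the problem the discussion addressed.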
