Rafael Wittek, director of the Interuniversity Center for Social Science Theory and Methodology (ICS), based at the University of Groningen, recently attacked Dutch university policies on the occasion of the 25th anniversary of his renowned graduate school. One of his targets was “the hype around rankings”. Accredited in 1986, the ICS was the first national social science graduate school in the Netherlands. The school emerged from Dutch networks of PhD students funded by the Ministry of Education and Science. According to Wittek, the universities are now trying to score high in the global rankings (such as the Times Higher Education ranking, the Shanghai ranking and, of course, the Leiden ranking), and he argued that this is a wrongheaded approach. “Rankings as an indicator of quality are a hype. To adopt them is merely a policy reflex.”
I think the sociologist puts his finger on a sore spot in Dutch science policy and management. This is particularly true of his critique of the policies around PhD training and the national graduate schools. According to Wittek, “The Hague” has been too eager to follow new European guidelines and has promoted competition, rather than cooperation, among universities. “In the last couple of years, many national graduate schools have been dismantled and new local graduate schools have been created in their stead. Dutch universities increasingly claim the results of ‘their’ researchers and give them fewer opportunities to collaborate with colleagues from other universities.” His remarks will strike a chord with everyone (myself included) who was trained in the national research schools in which all, or almost all, universities worked together. It is indeed a loss that the Dutch ministry discouraged national graduate schools and switched completely to stimulating local ones, although happily a few nationwide schools are still alive and kicking (such as the Graduate School Science, Technology and Modern Culture).
Still, although his remarks are to the point, I do not think he is completely right. For example, it is simply not true that the Dutch universities are locked in ruthless competition with each other. On the contrary, the new trend is the emergence of regional clusters of universities: a form of close collaboration that enables them to compete globally with American and Asian universities. Increasing collaboration is, moreover, the trend in scientific publications, as demonstrated recently by a study by my colleagues at CWTS and by the recent Royal Society report on scientific networks. The share of multi-authored, multi-institutional, international publications is still rising in all fields of research, and their average citation impact is greater than that of single-author or national publications. I do not think we should overestimate the power of university boards to limit the scale of scientific collaboration.
Nonetheless, Wittek’s criticism of ranking should certainly be taken very seriously. The sociologist sees a danger in the “policy reflex” for the quality of research, particularly in areas of high-risk fundamental research. He thinks that researchers who are forced to score high in the rankings will be reluctant to take on big, important questions and will tend to develop a more limited and less risky research agenda. I agree. This is indeed the most important risk of rankings running wild, disconnected from the context of fundamental or applied research. But I think there may be a bit more at play than just policy reflexes. The universities are confronted with an accelerating process of global competition in which new scientific centres are emerging, among them China, India, Brazil, Turkey and Iran. In these countries, researchers tend to face much stricter performance criteria than is usual in the Netherlands, which makes it difficult, perhaps even impossible, for Dutch university boards to ignore the rankings. In the Netherlands this problem is particularly acute, since the recent xenophobic hype around immigration already makes it difficult enough to attract talented young researchers from non-European countries. Does this mean that an obsession with rankings is inevitable? I think not. I can imagine a number of alternative, more imaginative strategies to counter this race for the highest position in the rankings.
I do think Wittek is right that recognition by peers is the strongest motivator for researchers. He even thinks that scientists do not need any other stimulus. This last idea may be a bit over the top, but he has a good point. Rankings can and should therefore be used in direct connection with this peer stimulus. Policies focused solely on climbing the global university rankings indeed do not make much sense. But this does not mean that it makes no sense at all to rank. Rankings can very well be used to gain a better understanding of one’s strong and weak points, at the level of individual researchers, groups and institutes, as well as universities and countries. This can be done while taking into account the specific characteristics of the relevant disciplines (different disciplines may require different databases to construct the rankings). Ranking in context: that should be possible, shouldn’t it?