Ranking universities

http://www.socialsciences.leiden.edu/cwts/products-services/leiden-ranking-2010-cwts/

In the last two weeks, several new university rankings were published. Since universities are facing ever tougher competition, their placement in university rankings becomes increasingly important. So, I’ll spend a couple of blogs on rankings, how the lists are constructed, and what one needs to take into consideration in their interpretation. It struck me that the business of ranking has become more sophisticated over the years. Now that rankings are an instrument for universities in the competition for resources, researchers and students, the competition between the rankings themselves is also increasing. This can work to increase their quality; on the other hand, it might also promote an overly simple interpretation.

Ranking is a complicated business, because it means that a complex phenomenon such as quality, which is by definition composed of many independent dimensions, is reduced to a one-dimensional list. The attraction of rankings is exactly this reduction of reality to an ordered list in which one’s position is unambiguous. This also means that ranking is an inherently problematic business. For example, a university may have high-quality teaching as its core mission. Such a university may not score high in a ranking that does not really take teaching into account. In other words, if one wants to evaluate the performance of an institution, one should take its mission into account. Even then, it would still be a difficult task to squeeze the complex network of performances of institutions into a simple ordered list. And perhaps we should abstain from ordered lists as such, and develop a completely new form of presentation of performance data. The importance of university missions, and the fact that quality is a complex phenomenon with many different aspects, is central in a European research project led by CHEPS in which CWTS also participates. This project may produce a new way of monitoring university performance.

But for now, we are stuck with one-dimensional rankings. There are five university rankings that are commonly used, and I will spend a blog on each of them in the course of this week: the Times Higher Education Supplement ranking, the QS ranking (a spin-off of the THES ranking), the Leiden ranking produced by CWTS, the Shanghai ranking, and the somewhat lesser-known Web of World Universities ranking. In the next blog, I’ll discuss how rankings are being used by universities; then I will discuss each ranking in more detail, to conclude with some ideas about the future of rankings.
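To make the reduction to a single dimension concrete before turning to the individual rankings, here is a minimal sketch with invented institutions, scores and weights (not taken from any actual ranking). It only illustrates how the same multi-dimensional quality profiles produce a different ordered list depending on how the dimensions are weighted.

```python
# Toy illustration of collapsing independent quality dimensions into a
# one-dimensional ranking. All names, scores and weights are invented.
scores = {
    "University A": {"research": 0.9, "teaching": 0.5, "outreach": 0.6},
    "University B": {"research": 0.6, "teaching": 0.9, "outreach": 0.7},
    "University C": {"research": 0.7, "teaching": 0.7, "outreach": 0.9},
}

def rank(weights):
    """Collapse the multi-dimensional scores into one ordered list."""
    composite = {
        name: sum(weights[dim] * value for dim, value in dims.items())
        for name, dims in scores.items()
    }
    return sorted(composite, key=composite.get, reverse=True)

# A research-heavy weighting versus a teaching-heavy weighting.
print(rank({"research": 0.7, "teaching": 0.2, "outreach": 0.1}))
print(rank({"research": 0.2, "teaching": 0.7, "outreach": 0.1}))
```

With the research-heavy weighting the order is A, C, B; with the teaching-heavy weighting it becomes B, C, A. The profiles did not change, only the weights did.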


Limitations of citation analysis

http://www.socialsciences.leiden.edu/cwts/education/graduate-course2-cwts

An observation at the CWTS Graduate Course Measuring Science: in most lectures, the presenters emphasize not only how indicators can be constructed, measured, and used, but also under what circumstances they should not be applied. Thed van Leeuwen, for example, showed on the basis of the coverage data of the Web of Science that citation analysis should not be applied in many fields in the humanities and social sciences, and certainly not for evaluation purposes. If the references in scientific articles in the Web of Science are analyzed, there are strong field differences in the extent to which they cite articles that are themselves covered by the Web of Science. In biochemistry this internal coverage is very high (92%), whereas in the humanities it drops to below 17%. Since citation analysis is almost always based on Web of Science data, most relevant data on communication in the humanities is missed by citation analysis. Of course, this is well known, and it is the usual argument in the humanities and social sciences against the application of citation analysis. However, this has also meant that most scholars principally associate CWTS with any and all use of citation analysis. CWTS does not currently have a strong reputation as a source of critique of citation analysis, although it has systematically criticized the Impact Factor since at least 1995 and has also been very critical of the very popular and equally problematic h-index. An interesting mismatch between practice and reputation?
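For readers unfamiliar with how such coverage figures are computed, here is a minimal sketch. It assumes an invented list of reference records in which each reference of a citing paper carries a flag indicating whether the cited item is itself indexed in the database; the field names and numbers are made up, and the real analysis is of course run on the full Web of Science.

```python
# Minimal sketch of "internal coverage": the share of references in a
# field's papers that point to items which are themselves indexed in the
# citation database. All records below are invented examples.
from collections import defaultdict

references = [
    # (field of the citing paper, cited item is indexed in the database?)
    ("biochemistry", True), ("biochemistry", True), ("biochemistry", False),
    ("humanities", False), ("humanities", True), ("humanities", False),
]

counts = defaultdict(lambda: [0, 0])  # field -> [indexed refs, total refs]
for field, indexed in references:
    counts[field][0] += int(indexed)
    counts[field][1] += 1

for field, (indexed, total) in counts.items():
    print(f"{field}: internal coverage {indexed / total:.0%}")
```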

“Idiocy of impact factors”

http://www.mareonline.nl/

Ron de Kloet, professor of medical pharmacology in Leiden and famous for his research on stress, on the journal impact factor in the university weekly Mare (my translation): "In the past, we did not have this complete idiocy around impact numbers". He thinks that those who have to judge scientists on their performance rely too easily on the journal impact factor. "In this way, the journal rather than the researcher is being assessed. And young researchers know that it is not their individual creativity that counts but the visibility of the journal. This can make people obsessed and take away the pleasure in science." Wise words!

Teaching scientometrics and bibliometrics

http://www.socialsciences.leiden.edu/cwts/education/graduate-course2-cwts

Yesterday, the annual Graduate Course Measuring Science started here at CWTS. Twenty-four PhD students and professionals from the information industry (publishers and software houses) are taking a week-long crash course in bibliometrics and scientometrics. Virtually all researchers at CWTS are teaching one or more slots, which gives the students the unique opportunity to get a firm grip on the field from a variety of angles and perspectives. For me, this is a convenient way of immersing myself in the way scientometrics is being done at CWTS and of looking at the various methodological debates in the field from the perspective of CWTS. First impression yesterday: the students were bombarded with quite a lot of data and empirical findings, which they seemed to take up calmly. No furious debates yet. But it was only the opening day, so who knows? I am going to discuss the work on modelling the peer review system today; let us see how that goes.

Modelling the peer review system à la SimCity

http://www.socialsciences.leiden.edu/cwts/sti-conference2010/

At the STI conference 2010, my colleagues Andrea Scharnhorst and Krzysztof Suchecki from the Virtual Knowledge Studio and I presented our work in progress on modelling the peer review system. The basic idea is simple: is it possible to model the peer review system as if it were a computer game such as SimCity? We followed a strategy in which we try to make the model as simple and stupid as possible. So, initially we are not trying to mimic reality, but to set up an extremely simplified model of how peer review works in science and academia. Our model consists of two populations: researchers and journals. The researchers have two different roles: they are authors of scientific papers, and they are reviewers who judge the quality of scientific papers written by other researchers. Each researcher has her own specific behaviour, and the same holds for the journals. The trick of the model is that we incorporated a simulation of quality control, using multi-dimensional vectors. This is extracted from what we know about how peer review works. Basically, reviewers compare what they perceive of the work in different dimensions (such as the quality of the writing, the images, the statistical reliability, how interesting the questions are, etc.) with what they perceive as the required quality. We assume that this expected quality relates to the quality of the work that the researcher produces herself. The project is in an early stage, and we are now in the process of writing it up for a proper first publication, mainly on the methodology. At the conference we presented the following poster, which contains more details (I posted it on my Facebook account since this blog software system is apparently not able to process images unless they are very small):

http://www.facebook.com/home.php?#!/photo.php?pid=5014709&fbid=440206878395&id=506988395.
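For readers who want to get a feel for the idea, here is a deliberately stripped-down toy sketch of this kind of model. It is not the model from the poster, and all parameters are invented: each researcher carries a quality vector for the work she produces, a paper inherits that quality with some noise, and a reviewer accepts a paper if it meets the expectations derived from her own work in most dimensions.

```python
# Toy sketch of the peer review idea described above, not the actual model
# from the poster: papers and reviewer expectations are multi-dimensional
# quality vectors; a reviewer accepts a paper if it meets her expected
# quality in most of the dimensions. All parameters are invented.
import random

DIMENSIONS = 4          # e.g. writing, figures, statistics, how interesting
N_RESEARCHERS = 50

random.seed(1)

class Researcher:
    def __init__(self):
        # Quality of the work this researcher herself produces.
        self.own_quality = [random.random() for _ in range(DIMENSIONS)]

    def write_paper(self):
        # A paper's quality scatters around the author's own quality.
        return [min(1.0, max(0.0, q + random.gauss(0, 0.1)))
                for q in self.own_quality]

    def review(self, paper):
        # The reviewer's expectation follows the quality of her own work.
        met = sum(p >= q - 0.1 for p, q in zip(paper, self.own_quality))
        return met >= DIMENSIONS // 2 + 1   # accept if most dimensions pass

researchers = [Researcher() for _ in range(N_RESEARCHERS)]
author = researchers[0]
paper = author.write_paper()
reviewers = random.sample(researchers[1:], 3)
votes = [r.review(paper) for r in reviewers]
print("accepted" if sum(votes) >= 2 else "rejected", votes)
```

A fuller version would of course add the journal population with its own behaviour, as in the model described above.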

 

Erasmus University embraces Open Access

http://www.eur.nl/eur/universitaire_plechtigheden/openingacademischjaar/

Erasmus University opened the new academic year last week by embracing Open Access for all its research publications. From 1 January 2011, it will be obligatory for researchers at the university to deposit their publications, after peer review and corrections, in the institutional repository RePub. The repository staff will take care of web-based storage and accessibility in accordance with the specific requirements of the publisher of the research article. According to the Rector Magnificus of Rotterdam, prof. Henk Schmidt, the university aims to make a big leap forward in open access. "Research has made clear that Open Access publications lead to an increase in the number of citations of scientific work." He emphasized that open access is desirable from both a societal and a scientific point of view. The step by Erasmus University also clearly has the potential to make academic work that takes a different form from the traditional journal article more visible and citable.

Measuring the world

http://de.wikipedia.org/wiki/Die_Vermessung_der_Welt

Recently, I read Daniel Kehlmann’s fictional history about Alexander von Humboldt and Carl Friedrich Gauss, Die Vermessung der Welt. It is an intriguing way to write the history of science, because it enables the author to insert internal dialogues which are actually quite plausible, yet by definition unprovable. The two characters are quite different and perhaps symbolize the two basic modalities in quantitative research, recognizable also within the field of scientometrics. Alexander von Humboldt is the outgoing guy, travelling the whole world. He is interested in the particulars of objects, collects huge amounts of birds, stones, insects and plants, and describes their characteristics meticulously. Gauss, on the other hand, wants to stay home and think about the mathematical properties of the universe. He is interested in the fundamentals of mathematical operations and suspects that they can shed light on the structure of reality. In scientometrics, these two attitudes come together, but never without a fight. Building indicators means thinking through the mathematical properties of the indicators, because this directly affects the question of what the indicator is actually supposed to measure: in technical terms, the validity of the indicator. One also needs other types of insight to understand this validity, such as knowledge of what researchers are actually doing in their day-to-day routines, but a firm grip on the mathematical structure of indicators is indispensable. At the same time, the other attitude is also required. Von Humboldt’s interest in statistical description gives insight into the range of phenomena that one can describe with a particular indicator. A good scientometric group, in other words, needs both people like Gauss and people like Von Humboldt. And indeed, both types are present at CWTS. Let us see how the interactions between them will stimulate new fundamental research in scientometrics and indicator building.

The book also has some interesting observations about the key actors’ obsession with measuring the world and the universe. When Alexander von Humboldt travels through South America, he meets a priest, Father Zea, who is sceptical about his expedition. He suspects that space is actually created by the people trying to measure space. He mocks Von Humboldt and reminds him of the time "when the things were not yet used to being measured". In that past, three stones were not yet equal to three leaves, and fifteen grams of earth were not yet the same weight as fifteen grams of peas. It is an interesting idea that things need to get used to being measured, especially now that we are increasingly tagging our natural and social environments with RFID tags, social networking sites and smartphone applications such as Layar, which adds a virtual-reality layer of information to your current location. Later in the book, Gauss adds to this by pondering that his work in surveying (which he did for the money) did not only measure the land, but created a new reality by this act of measuring. Before, there had been only trees, moss, stones, and grass. After his work, a network of lines, angles, and numbers had been added to this. Gauss wondered whether Von Humboldt would be able to understand this.

Scientometric chicken

The beautiful building where CWTS is housed is part of Leiden’s cultural heritage. It was the first physiological laboratory of the university, hence it is named after Willem Einthoven. A couple of proud cocks roam the area, quite beautiful and confident animals.

Until recently, they had not seemed to respond strongly to people. However, the day before yesterday, as I approached, they immediately ran over and clearly expected me to feed them. It was the day of the opening of the academic year, so I guessed someone in a black suit (which I was wearing for the occasion) must have the habit of feeding them. As I left them, they seemed genuinely disappointed.

STI Conference

http://www.socialsciences.leiden.edu/cwts/sti-conference2010/

Next week, we will host the 11th International Conference on Science and Technology Indicators here at CWTS in Leiden. The house will be packed.

For me, it will be a great opportunity to get updated on the latest developments in the field of STI indicator research. I am especially interested in five areas: the role of web-based data and indicators; changes in the process of evaluation; indicators for the humanities and social sciences; indicators for emerging types of scientific and scholarly output; and, last but not least, the constructive roles of science and technology indicators.

It is clear that more researchers are engaged in web-based ways of working. This may mean that web-based indicators are also becoming more relevant. However, this raises new problems with respect to reliability and validity. Another question is whether the web will stimulate "lay scientometrics", where in principle anyone can do pretty sophisticated statistics with the help of software agents and robots. Will this create new challenges for professional scientometricians?

The web also promises to help in another area: the creation of indicators that do justice to the way researchers and scholars in the humanities work. It is a well-known problem that international peer-reviewed journals are not always the predominant outlet for research in these areas. Writing books in languages other than English is often more relevant. New media, moreover, enable forms like films, performances, blogs and wikis. These alternative forms are currently not well covered in STI indicators. This raises the question of how the web can help to develop indicators that do more justice to the actual research work that humanists and social scientists are doing. It also raises a dilemma: should we try to capture all relevant work in indicators? What are the downsides of "too much information"?

This points to the way we are building an increasingly complex society, in which knowledge and social interaction are made measurable in new ways (think about how retailers are monitoring their clients through their customer cards), and in which these measurements are fed back into the cycle of knowledge creation. I am curious how this will play out at the STI conference.

  
