How do the conclusions of our recent report on altmetrics, “Users, narcissism, and control”, relate to the wider discussion of altmetrics? We found that four arguments are regularly put forward in favor of the new metrics over more traditional citation analysis.
Perhaps one of the best representatives of this body of work is the Altmetrics Manifesto (Priem, Taraborelli, Groth, & Neylon, 2010). The manifesto notes that traditional forms of publication in the current system of journals and books are increasingly supplemented by other forms of science communication. These include: the sharing of ‘raw science’ like datasets, code, and experimental designs; new publication formats such as the ‘nanopublication’, basically a format for the publication of data elements (Groth, Gibson, & Velterop, 2010); and widespread self-publishing via blogging, microblogging, and comments or annotations on existing work (Priem et al., 2010).
The first argument in favor of new impact metrics is diversity and filtering. Because web-based publishing and communication have become so diverse, we need an equally diverse set of tools to act upon these traces of communication. Altmetrics tools build on their role as information filters to also measure some forms of impact (often defined differently from citation impact).
The second argument is speed. It takes time for traditional publications to pick up citations, and citation analysis is only reliable after some initial period (which varies by field). The promise of altmetrics is an almost instant measurement window. ‘The speed of altmetrics presents the opportunity to create real-time recommendation and collaborative filtering systems: instead of subscribing to dozens of tables-of-contents, a researcher could get a feed of this week’s most significant work in her field. This becomes especially powerful when combined with quick “alt-publications” like blogs or preprint servers, shrinking the communication cycle from years to weeks or days. Faster, broader impact metrics could also play a role in funding and promotion decisions.’ (Priem et al., 2010).
The third argument is openness. Because the data can be collected through Application Programming Interfaces (APIs), the data coverage is completely transparent to the user. The same holds for the algorithms and code used to calculate the indicators. An advantage often discussed in the literature is the possibility of ending the dependency on commercial databases such as Thomson Reuters’ Web of Science or Elsevier’s Scopus. The difficulties entailed in building a completely new usage, impact, or citation index from the bottom up are, however, usually not mentioned. Still, this promise of a non-commercial index that can be used to measure impact or other dimensions of scientific performance should not be disregarded. In the long term, this may be the direction in which the publication system is moving.
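To make the openness argument concrete, here is a minimal sketch of what API-based data collection looks like in practice. It queries Crossref’s open REST API, just one example of the kind of open endpoint the argument refers to, for the citation count of a single publication; the DOI and the fields read out are illustrative, not a recommendation of any particular indicator.

```python
import json
import urllib.request

# Illustrative DOI: the nanopublication paper cited above (Groth et al., 2010).
DOI = "10.3233/ISU-2010-0613"

# Crossref's open REST API exposes per-work metadata, including a
# citation count, without requiring a commercial database licence.
url = f"https://api.crossref.org/works/{DOI}"

with urllib.request.urlopen(url) as response:
    record = json.loads(response.read().decode("utf-8"))

work = record["message"]
print(work.get("title", ["(no title)"])[0])
print("Times cited (Crossref):", work.get("is-referenced-by-count", 0))
```

Because both the endpoint and the code are open to inspection, anyone can check exactly what is being counted, which is precisely the transparency the openness argument appeals to.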
The fourth argument is that many web-based traces of scientific communication can be used to measure aspects of scientific performance that are not captured by citation analysis or peer review. For example, download data could be used to measure actual use of one’s work. The number of hyperlinks to one’s website might also indicate some form of impact. Indeed, since the 1990s the fields of internet research, webometrics, and scientometrics have developed a body of work comparing the roles of citations and hyperlinks and exploring the possibility of building impact measurements on these analogies (Bar-Ilan & Peritz, 2002; Björneborn & Ingwersen, 2001; Hewson, 2003; Hine, 2005; Rousseau, 1998; Thelwall, 2005).
So, do these four arguments stand up against our empirical results?
References:
Bar-Ilan, J., & Peritz, B. C. (2002). Informetric Theories and Methods for Exploring the Internet: An Analytical Survey of Recent Research Literature. Library Trends, 50(3), 371-392.
Björneborn, L., & Ingwersen, P. (2001). Perspectives of webometrics. Scientometrics, 50(1), 65-82.
Groth, P., Gibson, A., & Velterop, J. (2010). The anatomy of a nanopublication. Information Services & Use, 30, 51-56. doi:10.3233/ISU-2010-0613
Hewson, C. (2003). Internet research methods: a practical guide for the social and behavioural sciences. London: Sage.
Hine, C. (2005). Virtual Methods: Issues in Social Research on the Internet. Oxford: Berg.
Priem, J., Taraborelli, D., Groth, P., & Neylon, C. (2010). altmetrics: a manifesto. Retrieved January 8, 2012, from http://altmetrics.org/manifesto/
Rousseau, R. (1998). Sitations: an exploratory study. Cybermetrics, 1(1), 1. Retrieved from http://www.cindoc.csic.es/cybermetrics/articles/v1i1p1.html
Thelwall, M. (2005). Link Analysis: An Information Science Approach. San Diego: Academic Press.