When do important reorientations or shifts in research agendas come about in scientific fields? A brief brainstorm led us to formulate three possible causes. First, a scarcity of resources can bring about shifts in research agendas, for instance at the institutional level (when research management decides to cut the budgets of ill-performing research units). A second, related cause is the alignment of agendas through strategic (interdisciplinary) alliances, for the purpose of obtaining funding. A third cause for reconsidering research agendas is a situation of crisis, for instance one brought about by large-scale scientific misconduct or by debates on the undesirable consequences of measuring productivity only in terms of the number of articles.
Zooming in on the latter point: anxiety over the consequences of a culture of ‘bean counting’ seems to be growing. Unfortunately, solid analyses that tease out the exact consequences for the knowledge produced are rare. A recent contribution to the European Journal of Social Psychology does, however, offer such an analysis. In the article, appropriating Piet Vroon’s metaphor of the ‘exploded confetti factory’, professor Naomi Ellemers voices her concern over the production of increasing numbers of ever shorter articles in social psychology (a field in crisis), the decreasing number of references to books, and the very short 5-year citation window that researchers tend to stick to (cf. Van Leeuwen 2013). Ellemers laments the drift toward publishing very small isolated effects (robust, but meaningless), which leaves less and less room for ‘connecting the dots’, i.e. cumulative knowledge production. According to Ellemers, the current way of assessing productivity and research standing has the opposite effect: it leads to a narrowing of focus. Concentrating on the number of (preferably first-authored) articles in high-impact journals does not stimulate social psychologists to aim for connection, but instead leads them to focus on ‘novelty’ and difference. A second way to attain more insight, build a solid knowledge base and generate new lines of research is through intra- and interdisciplinary cooperation, she argues. If her field really wants to tackle important problems in their full complexity – including the wider implications of specific findings – methodological plurality is imperative. Ellemers recommends that the field extend its existing collaborations – mainly with the ‘harder’ sciences – to also include other social sciences. A third way to connect the dots, and at least as important for ‘real impact’, is to transfer social-psychological insights to the general public:
“There is a range of real-life concerns we routinely refer to when explaining the focal issues in our discipline for the general public or to motivate the investment of tax payers’ money in our research programs. These include the pervasiveness of discrimination, the development and resolution of intergroup conflict, or the tendency toward suboptimal decision making. A true understanding of these issues requires that we go beyond single study observations, to assess the context-dependence of established findings, explore potential moderations, and examine the combined effect of different variables in more complex research designs, even if this is a difficult and uncertain strategy.” (Ellemers 2013, p. 5)
This also means, Ellemers specifies, that social psychologists should perform more conceptual replications, and always specify how their own research fits in with and complements existing theoretical frameworks. It means that they should not refrain from writing meta-analyses and periodic reviews, or from including references to sources older than 10 years. All this, Ellemers concludes, would contribute to the goal of cumulative knowledge building, and would hopefully put an end to the collecting of unconnected findings, ‘presented in a moving window of references’.
What makes Ellemers’ contribution stand out is not only that she links recent debates about the reliability of social-psychological findings and the ensuing ‘methodological fetishism’ to the current evaluation culture, but also that she doesn’t leave it at that. Ellemers subsequently outlines a research agenda for social psychology, in which she also argues for more methodological leniency, room for creativity and more comprehensive theory formation about psychological processes and their consequences. Though calls for science-informed research management are also voiced in other fields and are certainly much needed, truly content-based evaluation procedures are very difficult to arrive at without substantive, discipline-specific contributions like the one Ellemers provides.