Tuesday, May 30, 2017

Quantity swamps quality

Every now and then I have to review a lot of CVs. What is increasingly striking is the sheer quantity of "line items": papers, grants, citations, talks, seminars, Ph.D. students supervised, public outreach activities, refereeing, committee service, conference organisation, …
Sometimes teaching, particularly of undergraduates, is almost an afterthought.

One concern is the difficulty of evaluating the quality of this hyperactivity.
This is why metrics are so seductive, particularly to the non-expert.
But even if you want to give some weight to numbers of papers, journal "quality", total research funding, ... I think they are quite hard to interpret.
In some research areas, papers often have ten authors, and so it is very difficult to know an individual's contribution, even if they are first or last author. I increasingly encounter statements such as "Since I became a faculty member ten years ago I have attracted $7M of external funding". This sounds very impressive. However, once you look at the details you find a mix of grants with long lists of chief investigators (CIs), such as infrastructure grants. Again, it is not clear whether the individual was really that central to many of the grants.

Another concern is that I am skeptical that these "highly productive" people have the time, energy, and focus needed to think deeply, work on challenging and ambitious projects, and produce much that is scientifically significant. Quantity crowds out quality.

Finally, it worries me that for many of these people the "system" seems to be "working" for them: they are being hired, promoted, and funded.

Now I do concede that in some cases I do not have the expertise to fully appreciate the significance or value of what people are doing. However, I fear that is the exception, not the rule.

If my concerns are legitimate, what is the way forward?
When assessing, it is very important to involve people who have the necessary expertise to critically and fairly evaluate the quality of an individual's scientific contributions. This means asking for and reading letters of reference, and actually reading some of their papers. Neither is infallible, but both are a lot better than bean counting.

It is possible for individual fields to preserve a culture of quality taking precedence over quantity. For example, in pure mathematics and economics, you will find that "publication rates" can be almost an order of magnitude lower than in physics and chemistry.

On the practical side, you might also consider editing and shortening your own CV so the signal to noise ratio is higher.

Finally, I hope you will personally resist uncritically following this rush to mediocrity.

Is my concern legitimate? If so, what are the ways forward?


  1. https://www.theguardian.com/education/2014/apr/15/recruiting-academic-staff-university
    by Jonathan Wolf, professor of philosophy at University College London and dean of arts and humanities.

    He ends the article with: "But if anyone has an App that will short-circuit the process, do please get in touch. We might even pay commission." Finding the right academic for a vacant position has surely become very difficult, with so many dimensions and variables.

  2. Here (chemistry) we do it (tenure-track hiring) the same way as always. Read the papers, read the research proposal, read the letters. Call a couple of letter writers on the phone. Call a couple of *other* people in the "old folks network" on the phone and ask what they remember of the subject from the last conference or two. That last one has a stupefyingly high information content. Of course it has the downside that they then know you are looking at the person and might try to hire them too.

    And all this has worked very well.

    1. Glad to see someone does it the sane and sensible way, i.e. "old school".
      A key point here is that departmental faculty are doing the evaluation and making the decisions. Unfortunately, in Australia this is happening less and less.

    2. Since you have invoked the old school, why not involve emeritus professors in the area in which one is searching, to read the candidates' papers in detail? Ever since the rise of the all-administrative faculty in many universities worldwide, metrics have been used to recruit,
      with the blunder in selection only realised later.

  3. "I’ve just finished reading some of my early papers, and you know, when I’d finished I said to myself, “Rutherford, my boy, you used to be a damned clever fellow.” (1911) "

    Late Ernest Rutherford, you are very right. The more we read the old-timers' papers now, the more we understand how clever they were and how diminished we are now.

  4. Somewhat unrelated (although this is about critical analyses of actual work), I wanted to point to the following club:

    Ross, do you have an opinion on the current effort (trying to use as neutral a word as possible) to apply machine learning in condensed matter physics and materials science (which often have a different focus with respect to machine learning)?
    If so, I'd be interested in a blog post about that.

    1. Thanks for the comment.
      Yes, the condensed matter journal club is an excellent resource because it provides expert opinion, and because people are motivated to select, read, and critique work they are actually interested in.

      I don't have a well-informed opinion about these recent applications of machine learning. I will look into it and see if I have anything useful to say.

  5. http://book.openingscience.org/
    Two articles on the above website are well written:
    Excellence by Nonsense: The Competition for Publications in Modern Science
    Mathias Binswanger

    Science Caught Flat-footed: How Academia Struggles with Open Science Communication
    Alexander Gerber