Friday, May 4, 2018

Metric madness outside the university

Universities are going off the rails because of the blind use of metrics. Unfortunately, this reflects what is also happening in wider society, due to the rise of neoliberalism.
Australia has recently been rocked by scandals involving large banks, leading to the resignations of CEOs, board chairs, lawyers, ...

This stimulated the following column by Ross Gittins, the Economics Editor of the Sydney Morning Herald. It is worth reading in full, but I reproduce a few choice extracts.
Banks' misbehaviour shows power of KPIs
... though the financial services industry must surely be the most egregious instance of the misuse of performance indicators and performance pay, let’s not forget “metrics” is one of the great curses of modern times.
It’s about computers, of course. They’ve made it much easier and cheaper to measure, record and look up the various dimensions of a big organisation’s performance, as well as generating far more measurable data about many dimensions of that performance.
 
Which gave someone the bright idea that all this measurement could be used as an easy and simple way to manage big organisations and motivate people to improve their performance. Setting people targets for particular aspects of their performance does that. And attaching the achievement of those targets to monetary rewards hyper-charges them. Hence all the slogans about “what gets measured gets done” and “anything that can be measured can be improved”. 
Thus have metrics been used to attempt to improve the performance of almost all the major institutions in our lives: not just big businesses, but primary, secondary and higher education, medicine and hospitals, policing, the public service – the Tax Office and Centrelink, for instance. Trouble is, whenever we discover new and exciting ways of minimising mental effort, we run a great risk that, while we’re giving our brains a breather, the show will run off the rails in some unexpected way. .... 
I’ve long harboured doubts about the metric mania, but it’s all laid out in a new book, The Tyranny of Metrics, by Jerry Muller, a history professor at the Catholic University of America, in Washington DC....

9 comments:

  1. https://www.ethz.ch/content/dam/ethz/special-interest/chab/chab-dept/department/images/Emeriti/richard_ernst/Publications/Ernst-Follies-Bibliometrics-Chimia-64-90-2010.pdf

    Richard R. Ernst, a Nobel laureate, gives a clarion call in the article linked above. We must thank Ernst for pointing out that literature and classical music have been spared from metrics. If metrics had ruled, many readers would have missed a good read in Kazuo Ishiguro's slow-paced "The Remains of the Day".

    "And as an ultimate plea, the personal wish of the author remains to send all bibliometrics and its diligent servants to the darkest omnivoric black hole that is known in the entire universe, in order to liberate academia forever from this pestilence. And there is indeed an alternative: Very simply, start reading papers instead of merely rating them by counting citations!"

    "Start reading papers" requires mental effort, as you have mentioned. It takes time to read and understand. Counting citations and computing the "kinship index" (the h-index), a term used by Prof. Geoffrey Bodenhausen, minimises mental effort.
    Prof. Bodenhausen's article, to which Prof. Ernst wrote his "follies" commentary, is below:
    https://www.researchgate.net/publication/49664045_Bibliometrics_as_Weapons_of_Mass_Citation
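
    To see just how little mental effort the counting takes, here is a minimal sketch of the h-index calculation in Python (the function name and the sample citation counts are purely illustrative, not taken from any of the articles above):

    def h_index(citations):
        """Largest h such that h papers have at least h citations each."""
        ranked = sorted(citations, reverse=True)  # most-cited first
        h = 0
        for rank, cites in enumerate(ranked, start=1):
            if cites >= rank:
                h = rank  # the paper at this rank still has >= rank citations
            else:
                break
        return h

    # A whole career reduced to one number, with no reading required:
    print(h_index([50, 18, 6, 5, 4, 4, 1, 0]))  # prints 4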

    Then there is an article in the Guardian; it is very long and requires mental effort.

    https://www.theguardian.com/science/2017/jun/27/profitable-business-scientific-publishing-bad-for-science
    The question is: should the hugely profitable publishing houses give funds to universities to do research? This query relates to the excellent observation from the Guardian article, quoted below.

    "The way to make money from a scientific article looks very similar, except that scientific publishers manage to duck most of the actual costs. Scientists create work under their own direction – funded largely by governments – and give it to publishers for free; the publisher pays scientific editors who judge whether the work is worth publishing and check its grammar, but the bulk of the editorial burden – checking the scientific validity and evaluating the experiments, a process known as peer review – is done by working scientists on a volunteer basis. The publishers then sell the product back to government-funded institutional and university libraries, to be read by scientists – who, in a collective sense, created the product in the first place."

  2. I’m afraid we are really in THE LAST DAYS, on the way to extinction ...

  3. Papers in high impact factor journals cite references from low impact factor journals. If one sits down and reads the reference lists in the luxury journals, one finds many citations to papers in low impact factor journals. This paradox of high-IF journals citing low-IF journals itself reveals that the metric is truly a mismeasurement. One is surprised that the so-called rational scientific community globally accepted this dangerously skewed measure for so many years to promote, select, and fund faculty in universities and research centres. One cannot blame Australia alone for this.
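
    For reference, the standard two-year impact factor is nothing more than an average, as this minimal Python sketch shows (the numbers are purely illustrative). Note that the formula says nothing about which journals a paper's own reference list points to, which is exactly the paradox above:

    def impact_factor(citations_received, citable_items):
        """Two-year journal impact factor: citations received this year
        to items published in the previous two years, divided by the
        number of citable items published in those two years."""
        return citations_received / citable_items

    # Illustrative: 2000 citations in 2018 to the 400 items a journal
    # published in 2016-17 give an impact factor of 5.0.
    print(impact_factor(2000, 400))  # 5.0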

  4. http://blogs.sciencemag.org/books/2018/01/30/the-tyranny-of-metrics/

    A historian explores the dark side of metric-based performance ...

    Next, another review of Muller's book:

    http://timharford.com/2018/02/review-of-the-tyranny-of-metrics-by-jerry-muller/
    Then ABC Radio National, with Phillip Adams:
    http://www.abc.net.au/radionational/programs/latenightlive/the-tyranny-of-metrics/9447000

    Another one:

    https://www.bloomberg.com/view/articles/2018-05-01/measuring-performance-is-not-one-size-fits-all
    One more:
    https://www.themandarin.com.au/83318-nicholas-gruen-evaluation-knowledge-comes-not-numbers-questions/
    His Twitter account:
    https://twitter.com/jerryzmuller?lang=en

  5. https://aeon.co/ideas/against-metrics-how-measuring-performance-by-numbers-backfires

    Jerry Muller's article, "Against metrics: how measuring performance by numbers backfires"

  6. It is easy to criticize the blind use of metrics. The book by Muller, which I have read, offers many examples. But all of us use heuristics of various kinds to reduce the effort associated with complex tasks. I would welcome constructive suggestions from "metric critics" for scenarios like the following:

    1. You are on a panel reviewing proposals for funding of individual PIs. The panel will review 25 proposals, each 15 single-spaced pages long plus ~30 pages of supporting data (CVs etc.). At most 15% of the proposals can be selected for funding. You must prepare for and then run this review panel on top of all your day-to-day commitments.

    2. A department head is required to write annual evaluations for ~40 faculty who wrote ~300 unique papers in the past year. These evaluations are used, in part, to determine salary raises.

    Replies
    1. Granted, I am not a department head, and I don't see a way out of #2. However, I do have experience with #1, and I read (studied!) them all. I feel that if one can't do that, one should politely decline to be on the review panel. In my personal opinion, it is not appropriate to do anything less when evaluating proposals that people have spent so much effort writing (as the funding field is rather competitive...).

  7. https://www.nature.com/articles/423479a.pdf

    https://www.nature.com/articles/423479a
    This is a letter to the journal Nature by David Colquhoun:
    "A useful method for job interviews that has been used in our department is to ask candidates to nominate their best three or four papers, then question them on the content of those papers. This selects against publication of over-condensed reports in high-impact journals (unless it is one of the relatively few genuinely important papers of this type). It also selects against ‘salami slicing’, and is a wonderful way to root out guest authors, another problem of the age. Experience has shown that candidates can have astonishingly little knowledge of the papers on which their names appear."
    David Colquhoun
    Department of Pharmacology, University College
    London, Gower Street, London WC1E 6BT, UK

    The above letter is about job interviews. The REF in the UK now has something like this; unfortunately I am not able to access it. The write-up below is similar to what David Colquhoun recommends for job interviews:

    "4 quality publications over a period of 6 years. One may have published 400, but one can only declare 4 papers which one considers to be of international impactful quality (not impact factor). The papers are “READ” by nominated experts in the field and scored, and a GPA is obtained for the unit of research submission – which determines how much money we get, depending on the score and the size of the submission."

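    A minimal Python sketch of the arithmetic that quote describes (the scores and the funding rule here are illustrative assumptions, not the actual REF formula):

    def ref_gpa(scores):
        """Grade point average of a unit's submitted outputs, each one
        READ and scored by nominated experts (assumed 0-4, 4 = world-leading)."""
        return sum(scores) / len(scores)

    # Illustrative: four declared papers, scored after being read.
    scores = [4, 3, 3, 2]
    gpa = ref_gpa(scores)
    print(gpa)                # 3.0
    print(gpa * len(scores))  # a crude "quality times size" funding proxy
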
    Then there is DORA, below; only 21 out of 96 universities in the UK have signed up for the responsible use of metrics, which is a low percentage.

    https://www.nature.com/articles/d41586-018-01874-w

    Then an article of caution about cheats in the citation game, below:

    https://www.nature.com/news/watch-out-for-cheats-in-citation-game-1.20246?WT.ec_id=NATURE-20160715&spMailingID=51828243&spUserID=MzgzOTg2NTc0MzQS1&spJobID=961919509&spReportId=OTYxOTE5NTA5S0
    The same article as a PDF:
    https://www.nature.com/polopoly_fs/1.20246!/menu/main/topColumns/topLeftColumn/pdf/535201a.pdf

  8. This is probably the website of the UK REF, which lays down the "4 quality publications over a period of 6 years" requirement:
    http://www.ref.ac.uk/
