Tuesday, February 18, 2014

Reviewing grants: a report from the coal face

I recently reviewed a bunch of grant applications from several different countries and so I thought it might be interesting to compare the approach of different funding agencies and offer some general comments.

First, I don't review everything I am asked to. It just takes too much time. But I do try to be a good citizen. I do make a particular effort in two cases: when I really like the work of the person or when I think the proposal is shoddy and should not be funded, but may have a chance because of luck/politics/hype.
Lately I am receiving a lot more proposals. I fear this may be because of my increased profile due to this blog.

Getting international expert reviews for funding agencies is an increasing challenge. Yet it is absolutely crucial to making sure that money is allocated in the best manner, particularly given low success rates, and the level of complexity and specialisation of proposals. This is particularly important for small countries, such as Australia, as there may be few locals who can really evaluate the feasibility and worth of a specialised proposal.
Nevertheless, for some of the reasons discussed below, funding agencies are not helping themselves. Goodwill is wearing thin.

I spend at most 1-2 hours on each proposal, including getting passwords, downloading documents, and writing the report. [Sorry if this offends you]. Hence, every minute counts. If funding agency websites are hard to navigate or the proposal has a lot of statistical fluff and bureaucratic mumbo jumbo to wade through it reduces the time I spend on actually evaluating the science.
I also don't want to spend pages reading about why climate change or Moore's law necessitates new technologies. I want to know what science you are going to do and why you are the best person to do it.

Important point for investigators.
Make sure the first page contains a very clear statement about what you are actually planning to do.

So here are a few random observations. I list the country agencies in decreasing order of ease of evaluation.
I think all required one to read and agree to some ridiculously long statement about confidentiality, conflicts of interest, lack of affiliation with the Nazi party, commitment to diversity, ....

Israel.
The proposal was the shortest, but contained enough information. There was little mumbo jumbo.

Austria.
A program manager sent me the proposal with a form to complete. I did not have to mess with a website. The first few pages of the proposal were unnecessary, containing all sorts of bureaucratic mumbo jumbo in both German and English.

USA. NSF.
Sometimes I have significant problems getting the proposal to print off the website.
Most of the grant money seems to go on overhead and summer salaries. More than $100K per year to support one graduate student!
They seem to have so little money now for simple, old-fashioned, single-investigator, curiosity-driven research. Everything has some strings/target attached [education, outreach, energy, nano, ....]
It is refreshing that there seems to be a recognition of quality over quantity. No discussion of metrics. Publication rates [a few per year] that would be unacceptable in Australia seem o.k. The CV, including publications, was only 2 pages. I felt the 15 pages of science was too long, reading like a review article, with extra pages containing 100-plus references. The one-page Data Management Plan was mumbo jumbo to me.

UK.
I found the EPSRC website hard to navigate. At first I could not even find where the proposal was I needed to evaluate. There was a lot of bureaucratic stuff that meant little to me and so it was not clear to me why I even needed to see it. But again I had to waste time trying to figure out if it was relevant or not.

Australia. ARC.
The admin. part of the proposals is too long. At most 10 per cent of the pages are actually about science. The level of hype, both about claimed commercial applications and the quality of the investigators, is often breathtaking and on average significantly exceeds that of other countries. Repeated hyperbole such as "prestigious, outstanding, cutting edge, world class, ..." quickly becomes tiresome.

Canada. NSERC.
I am glad Australia is not the worst!
I had to download 6 or 7 different documents. There was all this jargon about HQPs [what?]. After a while I figured out these were "Highly Qualified Personnel". The investigator CV had to be in some standard Canadian web form that looked like it would have taken a week for the investigator to enter on the web! The science was discussed in a breathtaking 2? pages!
One feature I found interesting was reading the PI's description of how she/he ran their research group. Given the goal of developing HQPs [!] this seemed appropriate.

A couple of final comments.

Grading on the curve.
Not all countries require a numerical or letter grade, either overall or for different parts.
I think grades are generally problematic, particularly for foreign referees.
You really need to know what grade is necessary to get funding.
For a while in Australia, only proposals ranked in the top 2 per cent were funded. But this was twenty per cent of proposals! [Research is meant to be rational!]
If a rigorous German paid you the compliment of ranking you as "Very Good" and in the top 20 per cent, it was the kiss of death.
Now Australia sends [at least to local assessors] a bunch of proposals to review. Obviously this is an onerous task, but at least proposals get ranked relative to one another.

Letters of support.
Some of the proposals contain letters of support from the university or collaborators. I generally find these meaningless. They are probably drafted by the PI. Levels of commitment are mostly platitudes. They just look like more paperwork for everyone.

So here is my concrete proposal.
Reviewers should just receive minimal information: scientific proposal, a brief CV, budget summary, and past funding.

I welcome discussion and encourage others to share their experiences.
