It's no secret that it's now more difficult than ever to get funded by the NIH, in spite of the fact that Congress doubled the NIH budget between 1998 and 2003. Researchers have been frustrated and wondering, over coffee and in print, where all the money has gone.
NIH director Elias Zerhouni explains what's going on in the November 17th issue of Science. Grant applications have nearly doubled since 1998 - from 24,151 that year to an expected 46,000 in 2006. And this isn't just because individual scientists are applying for more grants: 19,000 scientists applied in 1998, compared with 34,000 in 2006.
Where did all these people come from? (I should note that I'm one of them - I just submitted my first NIH application this summer.) When Congress announced its intention to double the NIH budget, universities started expanding - adding new graduate programs, hiring new faculty, and building new core facilities. All of this happened amazingly fast, and now we're feeling the crunch. It doesn't help that the NIH budget hasn't kept pace with inflation since 2003, but Zerhouni presents the numbers that lay to rest other explanations for the funding crunch that have been tossed around - such as excessive investment in large clinical trials, or in big, Manhattan Project-style science at the expense of smaller, innovative projects initiated by individual researchers.
Is this a good thing? The downside is that with more people we'll get more fraud, more mediocre science, and more fragmentation of the scientific community. It's already barely possible to seriously keep up with the literature in one's own field - which means it will be harder to assemble review committees whose members understand each other's work. It's also much, much harder for a young investigator to get started. In the past, scientists in their 20s and early 30s have been among the most innovative and creative, but young scientists today have their motivation and creativity squashed by high barriers to independence - barriers that are often overcome only after some of their best years are behind them. Sure, older scientists are still damn good researchers, but if the start of your scientific career is creatively stunted, it can hobble your thinking later on.
In spite of these drawbacks, the fact is that there is still a hell of a lot of good scientific work to do, even if not all of it is the most pathbreaking or innovative science. There are a lot of useful details to be worked out - enough to keep people busy for a long time. Money invested in new scientists will be money well spent - far better spent than much of the billions of dollars we lost in the attempted reconstruction of Iraq, money that did more to enrich the already bloated pockets of Dick Cheney's friends than it did to benefit the Iraqis.
The investment in research infrastructure made by US research universities and biotech companies in recent years has helped keep the US at the forefront of an increasingly competitive world-wide scientific community. If we want to stay there, we need to pay for it.
As Zerhouni put it:
"Since 1945, United States success in scientific research and development has been the result of the implicit partnership that exists among academia, the federal government, and industry. In this model, research institutions take the risk of building and developing our national scientific capacity; the federal government, through a competitive peer-review process, funds the best science; and industry plays the critical role of bringing new, safe, and effective products to the public. This strategy is the keystone to sustaining American competitiveness, and must be preserved."