
Peer review of grant applications in biology and medicine. Reliability, fairness, and validity

Published in Scientometrics.

Abstract

This paper examines the peer review procedure of a national science funding organization (the Swiss National Science Foundation) by means of the three most frequently studied criteria: reliability, fairness, and validity. The analyzed data consist of 496 applications for project-based funding in biology and medicine from the year 1998. Overall reliability is found to be fair, with an intraclass correlation coefficient of 0.41 and sizeable differences between biology (0.45) and medicine (0.20). Multiple logistic regression models reveal only scientific performance indicators as significant predictors of the funding decision, while all potential sources of bias (gender, age, nationality, and academic status of the applicant, requested amount of funding, and institutional environment) are non-significant predictors. Bibliometric analysis provides evidence that the decisions of a public funding organization for basic project-based research are in line with the future publication success of applicants. The paper also argues for an expansion of approaches and methodologies in peer review research, focusing increasingly on process rather than outcome and including a more diverse set of methods (e.g., content analysis). Such an expansion will be necessary to advance peer review research beyond the abundantly treated questions of reliability, fairness, and validity.
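The reliability figures quoted above (an overall intraclass correlation coefficient of 0.41, with 0.45 for biology and 0.20 for medicine) summarize how far referees scoring the same application agree. As an illustrative sketch only (the referee scores below are hypothetical, and the paper's exact ICC variant is not restated here), a one-way random-effects ICC(1,1) can be computed from a subjects-by-raters matrix with NumPy:

```python
import numpy as np

def icc_oneway(ratings):
    """One-way random-effects ICC(1,1) for an (n_subjects, k_raters) array.

    ICC = (MS_between - MS_within) / (MS_between + (k - 1) * MS_within),
    where the mean squares come from a one-way ANOVA over subjects.
    """
    ratings = np.asarray(ratings, dtype=float)
    n, k = ratings.shape
    grand_mean = ratings.mean()
    row_means = ratings.mean(axis=1)
    # Between-subject mean square: spread of per-application mean scores
    ms_between = k * ((row_means - grand_mean) ** 2).sum() / (n - 1)
    # Within-subject mean square: disagreement among referees on one application
    ms_within = ((ratings - row_means[:, None]) ** 2).sum() / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

# Hypothetical referee scores: 4 applications, 2 referees each
scores = [[4, 3], [2, 2], [5, 4], [1, 2]]
print(round(icc_oneway(scores), 2))  # prints 0.82
```

ICC(1,1) is only one of the Shrout–Fleiss variants; when the same fixed panel of referees rates every application, a two-way model (ICC(2,1) or ICC(3,1)) may be the more appropriate choice.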



Author information


Correspondence to Martin Reinhart.

Cite this article

Reinhart, M. Peer review of grant applications in biology and medicine. Reliability, fairness, and validity. Scientometrics 81, 789–809 (2009). https://doi.org/10.1007/s11192-008-2220-7
