Research performance evaluation: the experience of an independent medical research institute
Catherine C. Schapper A, Terence Dwyer A D, Geoffrey W. Tregear B, MaryAnne Aitken A and Moira A. Clay C
A Murdoch Childrens Research Institute, Royal Children’s Hospital, Parkville, VIC 3052, Australia. Email: cathy.schapper@ranzcp.org; maryanne.aitken@mcri.edu.au
B Florey Neuroscience Institutes, Howard Florey Institute, Level 2, Alan Gilbert Building, 161 Barry St, Carlton South, VIC 3053, Australia. Email: geoffrey.tregear@florey.edu.au
C Telethon Institute for Child Health Research, PO Box 855, West Perth, WA 6872, Australia. Email: mclay@ichr.uwa.edu.au
D Corresponding author. Email: terry.dwyer@mcri.edu.au
Submitted: 10 June 2011 Accepted: 22 September 2011 Published: 25 May 2012
Abstract
Background. Evaluation of the social and economic outcomes of health research funding is an area of intense interest and debate. Typically, approaches have sought to assess the impact of research funding by medical charities or regional government bodies. Independent research institutes have a similar need for accountability in investment decisions but have different objectives and funding structures, so the existing approaches are not directly applicable.
Methods. An evaluation methodology using eight indicators was developed to assess research performance across three broad categories: knowledge creation; inputs to research; and commercial, clinical and public health outcomes. The evaluation approach was designed to provide a balanced assessment across laboratory, clinical and public health research.
Results and discussion. With a diverse research agenda supported by a large number of researchers, the Research Performance Evaluation process at the Murdoch Childrens Research Institute has, by necessity, been iterative and responsive to the needs of the Institute and its staff. Since its inception 5 years ago, data collection systems have been refined, the methodology has been adjusted to capture appropriate data, staff awareness and participation have increased, and issues regarding the methodology and scoring have been resolved.
Conclusions. The Research Performance Evaluation methodology described here provides a fair and transparent means of disbursing internal funding. It is also a powerful tool for evaluating the Institute’s progress towards achieving its strategic goals, and is therefore a key driver for research excellence.
What is known about the topic? Increasingly, research funders are seeking to evaluate the impact and outcomes of research spending in order to inform policy decisions and guide funding expenditure. However, in most instances, research evaluation is not undertaken by the organisation that conducts the research, and may therefore not meet that organisation’s practical needs.
What does this paper add? The paper outlines a research performance evaluation methodology tailored specifically to the needs of the medical research institute whose research is being evaluated, used both to drive progress towards strategic goals and to disburse internal funds.
What are the implications for practitioners? This paper provides a clear approach to internal research evaluation using a process that meets the needs of the organisation actually conducting the research, and that generates institutional data for strategic planning activities.