RESEARCH ARTICLE (Open Access)

Setting priorities for publicly funded research: the CSIRO priorities method

Garrett Upstill A * (https://orcid.org/0000-0002-0969-5481) and Thomas H. Spurling A

A Office of the Swinburne Chief Scientist, Swinburne University of Technology, PO Box 218, Hawthorn, Vic 3122, Australia.

* Correspondence to: gupstill@gmail.com

Historical Records of Australian Science 36, HR25003 https://doi.org/10.1071/HR25003
Published online: 7 April 2025

© 2025 The Author(s) (or their employer(s)). Published by CSIRO Publishing on behalf of the Australian Academy of Science. This is an open access article distributed under the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License (CC BY-NC-ND)

Abstract

The CSIRO Priorities Method is a way to rank and display priorities for publicly funded research. This paper describes the development and evolution of the method, which was employed in CSIRO throughout the 1990s and, since that time, in several other research organisations in Asia and Europe. The Method comprises three elements: a framework, a process, and a results screen, and it has been used for priority setting at national, organisational, program and project levels. Its key attributes are its simplicity, robustness, and adaptability. This paper fills a gap in the literature by summarising the Method’s development and use, and by providing online references to previously unavailable documents.

Keywords: national, organisational and program priorities, R&D priorities, robust adaptable method, scientific priorities and resource allocation.

Introduction

How to maximise the return to the nation from publicly funded research is a challenge faced by all public research agencies, and one that receives continuing attention.1 The valuing and comparative ranking of different research areas is difficult because of the inherent risks and uncertainties in research outcomes, and this can be aggravated in public organisations where research may be longer term and directed toward public good and non-commercial outcomes.

The CSIRO Priorities Method (the Method) was developed in 1990 as a response to this challenge. It was designed to set priorities and guide spending in the Commonwealth Scientific and Industrial Research Organisation (CSIRO), Australia’s largest scientific research organisation. CSIRO conducts publicly funded research across a wide spectrum of commercial and non-commercial areas. The Method was used extensively by CSIRO in the 1990s for priority setting at an organisational, sectoral and program level. It has also been adopted for use by other public research agencies, both domestically and internationally. (See ‘Use of the Method’, below.)

The Method is targeted at maximising the national benefits, commercial and/or non-commercial, from a portfolio of publicly funded research. It allows comparison of the Attractiveness and Feasibility of research for different purposes. The Method is robust and adaptable, and has proven effective in various settings and areas of research.

This paper fills a gap in the literature about the Method, which has been poorly documented in the public literature and online, primarily because it was created for in-house use in CSIRO. Here we explain its development, outline its key elements, and describe its various applications since 1990. We also reference previously unavailable documents now archived on the Encyclopedia of Australian Science and Innovation website.2

The next section provides historical context and explains the steps leading to the development of the Method. The section that follows outlines its key elements: the framework, the process, and the results screen. A further section describes how the Method was used by CSIRO for organisational, sectoral and program priority-setting, as well as its use by other research agencies. The final section reviews the development of the Method and discusses its potential for future use.

Background

History

Some background is relevant to the development of the Method. CSIRO, whose origins date back to 1926, has long been a mainstay of Australian government-funded research. In 1990, it accounted for approximately 16% of government spending on research and development, and approximately 10% of total national R&D spending across both the public and private sectors.3

The 1980s were a period of significant economic reform in Australia, and the federal government implemented a series of measures to improve the international competitiveness of the nation’s industries.4 The government looked to CSIRO to contribute to this agenda and to align its research more closely with industry. In 1987 the Minister for Science, Barry Jones, issued guidelines to CSIRO that made it clear that its primary task was ‘the conduct of strategic and applied research in support of national economic, social, and environmental objectives’, rather than solely scientific ends.5 In addition, and to promote closer links with research users, the government set a 30% external earnings target for all of CSIRO’s research.

In 1986 the CSIRO Act was amended.6 A CSIRO Board was established and the Honourable Neville Wran, a former Premier of New South Wales, was appointed as the first Chairman in 1986. Subsequently the organisation was restructured into six industry-focused institutes, namely the Institute of Plant Production & Processing; Institute of Animal Production & Processing; Institute of Industrial Technologies; Institute of Information Science & Engineering; Institute of Minerals, Energy & Construction; and the Institute of Natural Resources & Environment.

Among the early actions of the Board were the establishment of two committees: one on national research priority-setting, and one on research evaluation. Both committees contributed directly to the development of the CSIRO Priorities Method.

The first committee developed a draft priority setting process using criteria grouped under the headings of ‘Benefits’ and ‘Barriers’, and outlined a procedure for scoring and presenting results that was trialled in early 1990 with a senior CSIRO management group.7

The second committee commissioned a consultancy report on assessing the commercial prospects for CSIRO research. This report proposed, among other things, a conceptual framework for use in evaluating small projects ‘where substantial resources were not available for evaluation.’8

The origins of this framework lay in a two-year study by the US Industrial Research Institute in the 1980s into research productivity,9 and specifically, the Research and Development (R&D) Return, Productivity and Yield framework developed by Foster and others,10 as shown in Fig. 1. The figure outlines how investment in R&D contributes to an industrial organisation’s profits. It divides the return on R&D into two subordinate ratios: ‘R&D Productivity’ (technical progress per unit of R&D investment) and ‘R&D Yield’ (profits per unit of technical progress). Multiplying R&D Productivity by R&D Yield gives the ‘Return on R&D’ (profits per unit of R&D investment). While this is essentially an accounting identity, it allows separation of two measurable components.11
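Written out (our notation; the decomposition parallels the Brown-style return-on-investment identity noted in footnote 11):

```latex
\[
\underbrace{\frac{\text{profits}}{\text{R\&D investment}}}_{\text{Return on R\&D}}
\;=\;
\underbrace{\frac{\text{technical progress}}{\text{R\&D investment}}}_{\text{R\&D Productivity}}
\;\times\;
\underbrace{\frac{\text{profits}}{\text{technical progress}}}_{\text{R\&D Yield}}
\]
```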

Fig. 1. The R&D Return, Productivity and Yield framework (Foster and others 1985a, 1985b).

R&D productivity and R&D yield are further divided into sub-components. R&D productivity is divided into ‘Potential Productivity’ (the maximum possible productivity improvement within the limits of the technology) and ‘Technology Development Efficiency’ (the efficiency of the R&D organisation compared to the maximum possible). R&D yield is divided into ‘Potential Yield’ (the maximum economic return possible given the structure of the market) and ‘Operating Efficiency’ (the efficiency of the commercialisation effort). This allows the separate consideration of R&D productivity and R&D yield: while R&D productivity relies on inputs related to scientific potential and research management, R&D yield relies on inputs related to market conditions and the potential return on research spending.

In the framework proposed for CSIRO12 the terminology in Fig. 1 was modified: R&D yield was divided into two components, ‘potential value’ and ‘ability to capture benefit for Australia’, and R&D productivity was divided into ‘technical potential’ and ‘research efficiency’.

Creating the CSIRO priorities method

In March 1990 Dr John Stocker took over as CSIRO’s Chief Executive. The Board tasked him with developing a process for establishing national research priorities to guide the organisation’s research expenditure over the 1990–3 funding triennium.13 Stocker secured support for the new process at a senior government level, and marshalled strong top-down and bottom-up support within CSIRO from Institute Directors and Divisional Chiefs as well as CSIRO researchers. A consequence was the closer engagement of scientists with industry research users in building an understanding of the markets involved and the way that technology could contribute.

The process also served as a vehicle for a productive dialogue at executive level with new and potential research users in both government and industry, for example, high-level contacts with BHP and several other large companies. It provided the basis for a more disciplined dialogue and an opportunity for research partners to have input, as well as to explain CSIRO’s changed approach to resource allocation (J. W. Stocker, pers. comm.).

The 1990 exercise centred on a framework, a process, and a results screen that were used to guide funding shifts.14 The framework adopted by CSIRO is set out in the next section. The process used by CSIRO in 1990, developed with the support of institute and corporate planners, is shown in Fig. 2.15 Steps included identifying a set of well-defined research purposes using a national socio-economic research classification scheme, articulating and refining selection criteria, specifying data and information sets, and developing procedures for scoring. The results were displayed on a results screen, devised by CSIRO, which plotted the feasibility and attractiveness of research for each of the national research purposes.

Fig. 2. CSIRO Priorities Method: example of process for priority setting.

The CSIRO Priorities Method

The Method was used by CSIRO for setting organisational priorities during the 1990s and has been adapted for use in other priority-setting exercises, domestically and internationally. (See the next section.)

The Method comprises three elements—a framework, a process, and a results screen.

The framework

The framework (Fig. 3) uses four criteria to determine the attractiveness and the feasibility of investment for a particular research purpose.

Fig. 3. CSIRO Priorities Method: the framework.

Attractiveness
  • Potential benefits (economic, environmental, and social): the maximum commercial or other returns possible from technology improvements attributable to the research; and

  • Ability to capture the benefits: the ability of Australia’s organisations, private or public sector, to convert technical progress into commercial or other returns.

Feasibility
  • R&D potential: the scientific or technological potential of relevant research areas; and

  • R&D capacity: Australia’s ability to conduct the R&D and realise its potential in a timely way.

A score for ‘Attractiveness’ is obtained by multiplying the score (1–10) for ‘potential benefits’ by the score (1–10) for ‘ability to capture the benefits’. This is a measure of the benefit of successful research and is determined by factors over which research organisations have little control.

A score for ‘Feasibility’ is achieved by multiplying the score for ‘R&D potential’ by the score for ‘R&D capacity’, and is a measure of the ability to achieve technical progress in Australia (per unit of R&D investment).

Multiplying the scores in this way can be interpreted as combining ‘scope’ and ‘efficiency’. For Attractiveness, one combines the ‘scope’ or maximum potential benefit with the ‘efficiency’ of existing Australian structures to capture these benefits. In the case of Feasibility, one combines the future ‘scope’ of research (for example, using a technology ‘S-curve’) with the ‘efficiency’ of the organisation to advance this research.
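As a minimal illustration of this arithmetic (our sketch, not CSIRO code; the purposes and scores below are invented):

```python
# Minimal sketch of the framework arithmetic (hypothetical data, not CSIRO code).
# Each criterion is scored 1-10; Attractiveness and Feasibility are products
# of their two component criteria, so each falls in the range 1-100.

purposes = {
    # purpose: (potential_benefits, capture_ability, rd_potential, rd_capacity)
    "Minerals":      (9, 7, 6, 8),
    "Environment":   (8, 5, 7, 6),
    "Manufacturing": (7, 6, 8, 7),
}

for name, (benefits, capture, potential, capacity) in purposes.items():
    attractiveness = benefits * capture   # 'scope' x 'efficiency' of capture
    feasibility = potential * capacity    # 'scope' x 'efficiency' of research
    print(f"{name:14s} attractiveness={attractiveness:3d} feasibility={feasibility:3d}")
```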

The process

Research purposes

Research purposes are the unifying categories that group together related research activities for priority-setting. They may be set at a national, organisational, or program and project level. In 1990 and 1993, 16 national research purposes were identified by CSIRO, using a national socio-economic research classification scheme.16 Later corporate exercises adopted a different classification, based on the organisation’s industry sectors.

Supporting information

Standardised information is critical when comparing different research purposes. For CSIRO’s organisational priority-setting process, two information sets were assembled. The first, prepared in consultation with industry experts, provided, for each research purpose, data on production, exports, imports, value added, and research expenditure in Australia and overseas, together with summary information on the major performers of research, research intensity, and key issues. A second information set, developed in consultation with scientists and research managers, provided a summary analysis of the technology prospects for each research purpose, considering factors such as resource costs, technology progress, and technical risks and uncertainties.
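For illustration only, the two information sets can be thought of as standardised records per research purpose. The field names below are our hypothetical paraphrase of the contents described above, not CSIRO’s actual templates:

```python
# Hypothetical sketch of the two standardised information sets assembled for
# each research purpose (field names are our paraphrase, not CSIRO templates).
from dataclasses import dataclass

@dataclass
class MarketDataSheet:            # prepared in consultation with industry experts
    production: float             # $m
    exports: float                # $m
    imports: float                # $m
    value_added: float            # $m
    rd_spend_australia: float     # $m
    rd_spend_overseas: float      # $m
    major_research_performers: str
    research_intensity: str
    key_issues: str

@dataclass
class TechnologyEvaluationSheet:  # prepared with scientists and research managers
    technology_prospects: str
    resource_costs: str
    technology_progress: str
    technical_risks: str
```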

Discriminant questions

A set of discriminant questions was developed to help analyse background information and to focus discussion. The questions used by CSIRO in 1990 are shown in Table 1.

Table 1. Discriminant questions for the four selection criteria, CSIRO (1990).

Potential benefits
  • Who are the potential users and customers and how will they benefit?

  • What parts of industry and/or the community will benefit from successful research?

  • How will R&D contribute to industry growth and improved competitiveness?

  • What is the size of potential markets in Australia and overseas, in value terms, and what are their growth prospects over the medium to long term?

  • Are there any other important benefits, direct and indirect? e.g. environmental (degradation avoided), social (social amenity, health, safety), employment creation.

  • Are there spillover benefits to other industries?

R&D potential
  • How close are the physical and technical limits in the relevant R&D?

  • Are fields mature or developing? (Where is current technology on the S-curve? Is the rate of change rapid, moderate, or slow?)

  • What are the prospects for developing commercially valuable intellectual property, scientific breakthroughs, or major improvements in mature technologies and fields?

Ability to capture benefits
  • How will successful research be captured in Australia; what is Australia’s ability to exploit the results?

  • Are there potential commercial partners?

  • Can the benefits from the research output be protected?

  • What are the incentives/imperatives for adoption by commercial or public sectors?

  • What is the industry’s and/or community’s commitment to R&D and technical innovation?

  • Can Australian users compete internationally?

  • Are there factors and conditions likely to promote or impede uptake, such as regulations, industry structure, physical conditions, ethical, cultural/social, environmental or political factors?

R&D capacity
  • Would the proposed research effort (in terms of critical mass and quality of researchers) be internationally/nationally competitive in the research field?

  • What is the competitive advantage of Australia’s (CSIRO’s) research effort?

  • Who are the major international (national) research competitors?

  • Does Australia/CSIRO have the capacity to deliver the research, in terms of adequate skills, facilities, and time frame for effective application?

Workshop scoring

Finally, a standardised procedure is needed for scoring by the decision-making group. The process used in 1990 and 1993 by the CSIRO executive group, which combined individual judgement and consensus building, is shown in Table 2. After the first round of scoring, the results were reviewed and discussed by the group in a workshop; participants then had the opportunity to adjust their scores before the final group scores were averaged (a small sketch of this aggregation step follows Table 2).

Table 2. Procedure adopted by CSIRO for the scoring of research purposes.

Prior to the workshop
  • All research purposes are scored prior to the workshop and scores recorded on a summary score sheet.

  • Key discriminant questions are used as a guide when making assessments based on the Data and Evaluation Sheets and other relevant input material provided.

  • Each research purpose is assessed in order, and each criterion separately. For each criterion a score of between 1 and 10 is assigned.

  • The scores are reviewed, using the summary score sheet as a guide, and checked for consistency within each criterion, with the highest-scoring research purposes at or near 10 and the lowest at or near 1.

During the workshop
  • The pre-workshop scores are collected from participants and entered in a spreadsheet to generate the preliminary attractiveness-feasibility plot.

  • Taking each criterion in turn, an expert for each research purpose gives an overview.

  • Pre-workshop scores are surveyed to locate outliers within the group, that is, those whose scores deviate most from the group mean.

  • Following discussion and debate, participants are invited to rescore if they consider it necessary.

  • Participants complete score checks; the revised scores are entered into a spreadsheet and the revised screens are produced (Attractiveness, Feasibility, R&D Return).

  • The group reviews the screens to check that the relative positions properly depict the outcome of the discussions.
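A minimal sketch of that aggregation step (our illustration; the participants, scores, and outlier threshold are invented, not CSIRO’s):

```python
# Sketch of workshop score aggregation for one criterion of one research
# purpose (hypothetical data, not CSIRO code).

scores = {"P1": 8, "P2": 7, "P3": 3, "P4": 8, "P5": 6}  # participant -> score (1-10)

mean = sum(scores.values()) / len(scores)
# Flag participants whose scores deviate most from the group mean for discussion.
outliers = [p for p, s in scores.items() if abs(s - mean) >= 2]
print(f"mean={mean:.1f}, outliers to discuss: {outliers}")

scores["P3"] = 5  # a participant rescores after discussion and debate
final = sum(scores.values()) / len(scores)
print(f"final group score: {final:.1f}")
```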

The results screen

The scores for each research purpose are displayed on a results screen, which plots scores for Feasibility against scores for Attractiveness. The feasibility vs attractiveness screen allows research purposes to be grouped by selectivity (Fig. 4). The closer the research purposes are to the top right-hand corner, the more deserving they are of support. Conversely, research purposes that are closer to the origin are less deserving of support and require more selectivity in funding.

Fig. 4. CSIRO Priorities Method: the results screen.

The results screen was adopted in preference to multiplying the feasibility and attractiveness scores together to obtain a single value for each research purpose. (A single value would mask valuable information by blending internal factors, which the organisation can largely control, with external factors, which it cannot.)
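A minimal plotting sketch of such a screen (ours, using matplotlib; the purposes and scores are invented):

```python
# Sketch of a feasibility vs attractiveness results screen (hypothetical data).
import matplotlib.pyplot as plt

results = {  # purpose: (attractiveness, feasibility), each on a 1-100 scale
    "Minerals": (63, 48),
    "Environment": (40, 42),
    "Manufacturing": (42, 56),
}

fig, ax = plt.subplots()
for name, (attract, feas) in results.items():
    ax.scatter(attract, feas)
    ax.annotate(name, (attract, feas), textcoords="offset points", xytext=(5, 5))
ax.set_xlabel("Attractiveness")
ax.set_ylabel("Feasibility")
ax.set_title("Purposes nearer the top right merit most support")
plt.show()
```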

Use of the Method

CSIRO organisational priorities

1990

The Method was first applied by CSIRO in 1990 to determine the organisation’s research priorities for the 1990–3 triennium.17 It sought to identify research priorities from the national standpoint, and thereby guide the allocation of CSIRO research funds: the exercise involved assessing 16 research purposes (defined in terms of national socio-economic objectives) against the four framework criteria and then plotting the results on a feasibility vs attractiveness screen.

The research purposes were cross-disciplinary: Plant Production & Primary Products; Animal Production & Primary Products; Rural-based Manufacturing; Minerals; Energy Resource; Energy Supply; Manufacturing; Information & Communication Industries; Environmental Aspects of Economic Development; Environment; Transport; Construction; Commercial Services; Health; Defence; and Social Development & Community Services. The results of the 1990 exercise are shown in Fig. 5. The resource shifts that followed included the funding of a set of new research projects aligned with the revealed priorities, financed by a 1.5% annual priorities levy applied across the organisation.

Fig. 5. Results screen for the 1990 CSIRO priority setting exercise.

1993

The priority setting exercise for the 1993–6 funding triennium followed a similar approach, involving 16 national research purposes and culminating in a feasibility-attractiveness plot of the final results. The 1.5% levy was used to fund selected multi-divisional programs in high-priority areas over the triennium 1993–6.18

1996

The priority setting arrangements for the CSIRO 1996–9 funding triennium were changed, reflecting the arrival of a new chief executive, Dr Malcolm McIntosh, and the adoption of a new matrix-management structure with 22 industry-based sectors and 27 discipline-based research divisions. CSIRO planning was now sector-based, and the (industry and government) users of research played an increased role as members of the advisory groups established for each sector.19 Feasibility and attractiveness analysis was used by the sectors to develop their triennial priorities.

1999

The 1999–2002 priority exercise was an integrated part of CSIRO’s sector-based corporate planning process. Feasibility and attractiveness analyses were central to priority setting for each sector.20 The inputs for R&D potential and R&D capacity primarily originated from CSIRO, while inputs on ‘potential value’ and the ‘ability to capture’ relied on contributions from the sector advisory groups. The priority-setting process included a final presentation by each sector to the executive group, which included an account of how each would manage with a hypothetical 20% more, or 20% less, funding over the coming triennium.

CSIRO program and project priorities

The Method was also used in CSIRO for priority setting at program and project levels. While the central logic was retained, the process was modified according to the needs of each exercise. Examples included:

  1. Use of the Method to consider research opportunities in different industries, applying the four selection criteria to evaluate the feasibility and attractiveness of potential areas for research. Studies included the Australian polymer and plastics industry, and the areas of waste treatment management and biomaterials.21

  2. Use for prioritising projects within a CSIRO division. The Division of Animal Health, which addressed endemic diseases affecting farm livestock, adapted the process for assessment of its projects as defined by industry research purpose (for example, sheep, beef cattle, dairy), and by problem area (parasites, bacteria, viruses, toxins). Projects were scored and shown on a feasibility-attractiveness plot. This was then combined with a separate assessment of project quality to guide resource allocation decisions within the division.22

  3. A similar approach was taken by the CSIRO Division of Soils, which adapted the Method to identify areas of research opportunity for sustainable and profitable management of soil and land resources.23 A workshop was held to define a set of research purposes and areas of research opportunity (for example, pastures, site rehabilitation, forestry). Projects were scored and displayed on an attractiveness vs feasibility screen. These results were then combined with an analysis of project quality to guide the allocation of research funding within the division.

  4. Project reporting and planning. In 1995 the CSIRO Institute of Industrial Technologies (comprising the Divisions of Applied Physics, Biotechnology, Chemicals & Polymers, Manufacturing Technology, and Materials Science) introduced a one-page project reporting format that was used for the 100-plus research projects in the Institute (Box 1).24

Box 1. Using the Priorities Method for project reporting and planning: guidelines for project data sheets (CSIRO Institute of Industrial Technologies, Upstill 1995)

1. Description
  • Background: Brief account of the nature and purpose of the research, indicating current and planned activities.

  • Objectives: The scientific, technical and commercial objectives of the project, presented in a way that provides a basis for explaining potential benefits.

2. Milestones
  • Major research and commercial objectives and commitments (not continuing phases of a project).

  • Limited to discrete measurable events which unambiguously have or have not happened by the due date, generally 2 or 3. Target dates will be the nearest quarter in most cases.

  • Date and brief details of review process (most recent, next major review).

3. Commercial benefit (potential benefit)
  • What industry will benefit from the successful completion of the project? How?

  • What is the size of the potential market in Australia and overseas?

  • Are there any additional benefits? e.g. other economic, environmental or social benefits.

4. Capture (ability to capture benefits for Australia)
  • What is the current IP and patent position and strategy? How does this fit with major international competitors in this field?

  • How will successful research be captured in Australia?

  • Which companies are current commercial partners?

  • Which are potential commercial partners?

  • Would benefits to Australia by way of licence fees or royalties make an off-shore partner attractive?

5. R&D potential
  • A technical appraisal of the likely scientific return on research efforts in this field.

  • What are the prospects for success? What new developments are possible or likely?

  • The section should address the maturity/predictability of this research field worldwide.

  • Is it rapidly changing? Will rewards come in the form of major breakthroughs or improvements in mature technologies?

  • How far are current applications from physical limits? (‘S’ curve analysis may contribute.)

6. Research competitiveness (R&D capacity)
  • Is current research, including research collaboration arrangements, of critical size and internationally competitive? If not, what is being done about it?

  • What is the competitive edge for research in this field?

  • What is the current (and anticipated) intellectual property position?

  • Where does research in the project rank internationally? Who are the major research competitors?

7. Resources
  • To cover the past year, current year, and next three financial years.

  • Project cost (including overheads); total staffing; external income [assured plus projected (>80% probability)].

8. Research classification
  • Socioeconomic Objective Classification: name of code, percentage of research attributed to this code.

9. Update
  • Month and year of update.

Use by other countries

The CSIRO Priorities Method has also been employed in other countries.

  1. The Ministry of Research, Science and Technology in New Zealand adapted the Method in 1991 for a national research priority setting exercise.25 The socio-economic objective-determined groupings were economic (primary, secondary, tertiary), environment, social/cultural, Antarctica, health, space and defence. The criteria used for scoring were research potential, research capacity, ability to capture benefits and (as written) strategic importance. A weighted combination of the scores for each of these four criteria yielded a priority index, which was used to guide the allocation of NZ$260 m of public good science funds for the year 1991–2 (see the sketch after this list).

  2. The Method was introduced to LIPI (an Indonesian public research agency now part of BRIN, the National Research and Innovation Agency) during the course of the World Bank-funded Management Systems Strengthening (MSS-LIPI) project. It was used both within the LIPI discipline-based research centres to prioritise projects, and in a national research priority setting exercise.26

  3. The Method was an input in the development of the UK Technology Foresight exercise,27 and the results of this exercise were ranked and displayed on a feasibility vs attractiveness screen,28 which grouped technologies under headings of emerging, intermediate, or key priority areas (Fig. 6).

  4. In 1999 the International Center for Living Aquatic Resources Management (ICLARM) applied the Method to set research priorities in aquatic resource systems such as ponds, coastal waters, and coral reefs. Using the CSIRO approach ‘each resource system was scored for attractiveness (a measure of the potential benefits from conducting research, and the ability to utilise the outcomes in the region) against feasibility (a measure of the scientific potential to provide solutions to aquatic resource issues and ICLARM’s research capacity to undertake the research alone or in partnership)’.29

  5. In 2001 the Method was used in a foresight exercise in the Czech Republic to identify priorities for the National Research Programme (to be launched in 2004). The exercise considered the most important technologies for the national industry and service sectors, and prioritised them. The Method’s criteria were modified. The two criteria for ‘feasibility’ were research & technology importance and absorption potential of the application sector, and the two criteria for ‘attractiveness’, renamed ‘importance’, were economic, social & environmental importance and research & technology opportunities. The results of the national exercise were displayed on a screen in which feasibility was plotted against importance.30

  6. More recently, the Method has been adapted by the UN Environment Programme Evaluation Office for use in prioritising evaluations conducted by the Office and ranking opportunities for evaluation.31 The report established a set of criteria and indicators, and used these to determine the relative attractiveness and feasibility of a set of potential evaluations. The report noted that the return would be high when both attractiveness and feasibility measures were high, but that increased selectivity would be needed when attractiveness and/or feasibility declined and the likely return was lower.
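The New Zealand priority index mentioned in item 1 can be written as a weighted sum (our notation; the actual weights used in the exercise are not specified in our sources):

```latex
\[
\text{Priority index} \;=\; w_{1}\,S_{\text{research potential}}
+ w_{2}\,S_{\text{research capacity}}
+ w_{3}\,S_{\text{ability to capture benefits}}
+ w_{4}\,S_{\text{strategic importance}}
\]
```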

Fig. 6. Generic priorities in science and technology—assessment of attractiveness and feasibility: UK Technology Foresight Exercise, 1995.

Discussion and summary

The impetus for this paper came from the realisation that, while the CSIRO Priorities Method had been successfully used since 1990 for research priority setting, there was little information about it available in the public literature or online. We have endeavoured to fill this gap by outlining its evolution, citing key documentation, and describing its use.

The context

Working out how to set priorities for publicly funded research for the greatest public benefit is an issue that needs to be continuously addressed. Are current priorities misplaced? Would a shift in resource allocation yield greater benefit? These questions can be posed for all research agency budgets and all science and technology budget decisions by government. The CSIRO Priorities Method is just one of multiple approaches that can be applied to addressing these questions.

The Method also falls within a broader priority-setting frame. In his 1997 report, Dr John Stocker, as Australia’s Chief Scientist, identified three types of priorities for government: ‘thematic’ priorities, focusing on the relative importance of scientific disciplines, socioeconomic objectives, or broad fields encompassing a range of disciplines or objectives; ‘structural or goal-oriented’ priorities, focusing on the relative importance of different broad goals for science and technology (for example, improving science and technology capabilities); and ‘mechanism’ priorities, focusing on the different mechanisms to support science and technology (for example, grant schemes vs research agency funding).32 Stocker also noted the essentially different nature of (i) high-level, whole-of-government decisions on priorities for mechanisms and structures; (ii) priority setting within a Ministerial portfolio; and (iii) decisions on priorities at an agency or organisational level, something for which the CSIRO Priorities Method is well suited. In the event, the Australian government issued a set of national research priorities in 2002. This followed the Batterham report, ‘A Chance to Change’, which recommended integrated priority setting across the federal government.33 The 2002 research priority areas were environmental sustainability, promoting and maintaining health, developing new technologies, and safeguarding the nation.34

The Method has been used at various levels. In New Zealand, the Czech Republic, and Indonesia, it was used for national priority setting. It has not been used at a national level in Australia, although in 1990 and 1993 CSIRO adopted a national setting to guide organisational priorities and internal resource allocation. (This did not affect the funding or conduct of any research outside CSIRO.) In this paper we have shown how the Method has been used at a number of levels, such as industry sector, divisional, program, and even research-project reporting. The attractiveness vs feasibility screen has been widely adopted as a way of displaying the rankings of a variety of research purposes.

Historical development

The Method has its origins in the late 1980s, as CSIRO sought to align its research more closely with the needs of the users of its research in industry and the public sector. First introduced in 1990, it served effectively through the following decade and beyond.

It provided a reliable method for comparing commercial and non-commercial research across the agency, under a common goal of maximising national benefits. It offered a transparent and accountable approach, which was useful for justifying government funding. In addition, it fostered productive dialogue between scientists and the users of CSIRO research in industry and government.

The Method also encouraged a cultural shift in CSIRO during the 1990s. Beyond their role in formal priority-setting, the principles of feasibility and attractiveness became ingrained in CSIRO thinking. This encouraged a shared language as scientists applied the principles in their research planning and their interactions with customers, and it eased their acceptance of the change process led from above.

Over time, the Method was adapted for other CSIRO contexts, such as divisional priority-setting, ranking research opportunities in emerging technology fields, and project reporting. It was also adopted by several other countries and modified to suit their specific needs.

Summary

The CSIRO Priorities Method is a means to rank and display priorities for publicly funded research, using two orthogonal measures, attractiveness and feasibility. The Method comprises three elements: a framework, a process, and a results screen. It has been used for priority setting at national, organisational, program and project levels in CSIRO and overseas.

It is a simple, robust and adaptable method, and a tool for building a shared, well-informed expert view of an organisation’s research priorities. The scoring process delivers comparative, rather than absolute, judgements based on analysis of data and discussion of the arguments for a set of different research purposes.

It offers a relatively simple route to incorporating qualitative and semi-quantitative information in decision making, and complements more formal approaches such as cost-benefit analysis and business planning.35 While the latter may be appropriate for well-defined projects, they can be difficult and expensive to apply with rigour.

It should be noted that the Method does not specify how the results of priority exercises are translated into resource shifts. This is a matter left to the organisations conducting the exercise, and, as the ‘Use of the Method’ section shows, a variety of approaches have been used. We note the need to design the scoring process so as to minimise any subjectivity in scoring, something that has been commented on.36 Subjective factors are, to a degree, unavoidable in a process directed toward developing a consensus expert view. They can be minimised by incorporating interactive sessions to allow initial scores to be discussed and challenged before rescoring, as well as by specifying the research purposes so as to discourage ‘silo’ thinking (for example, by using research purposes that are cross-disciplinary or cross-organisational).

In closing, we note that while there are multiple ways to set research agendas, the CSIRO Priorities Method can be a useful addition to the mix. It offers a simple, logically sound approach to ranking and displaying priorities for public funding. Its flexibility, and adaptability to different organisational contexts, warrant its consideration for public research agencies seeking to prioritise their research.

Data availability

Data shown in the paper are available from public sources, as listed in footnotes and references within this text.

Conflicts of interest

The authors declare no conflicts of interest.

Declaration of funding

This research did not receive any specific funding.

References

Australian Bureau of Statistics (1993) No. 1297 Australian Standard Research Classification, Canberra.

Batterham, R. (2000) A Chance to Change: A report by the Chief Scientist, Canberra, ACT.

Best, M. (1990) The New Competition, Polity Press, Cambridge, UK, pp. 63–65.

Blyth, M. and Upstill, G. (1994) Effective priority setting for public sector research: CSIRO’s experience, International Science and Technology Policy Seminar, October 1994, Wellington, New Zealand, https://eoas.info/bib/ASBS15631.htm, viewed January 2025.

Brattström, E. (2021) Facilitating collaborative priority-setting for research and innovation: a case from the food sector, Technology Analysis & Strategic Management, 33(7), 742-754.
| Crossref | Google Scholar |

Bureau of Industry Economics (1992) ‘CSIRO’s Priorities Assessment Framework’, in Economic Evaluation of CSIRO Industrial Research, BIE Research Report 39, Canberra, pp. 24–36.

Commonwealth of Australia (1986) Science and Industry Research Legislation Amendment Act 1986, Canberra.

Commonwealth of Australia (1990) Science and Technology Budget Statement 1990-91. https://www.industry.gov.au/sites/default/files/1990-91-science-technology-budget-statement.pdf, viewed January 2025.

CSIRO (1990) Board Sub Committee on National Research Priorities, chaired by Ralph Ward Ambler, CSIRO Board Meeting 32, 20 February 1990, Agenda item 10.

CSIRO (1991a) CSIRO Priority Determination, 1990: Methodology and Results Overview, (Kretschmer Report). CSIRO Corporate Planning Office, Canberra, https://www.eoas.info/bib-pdf/ASBS15622.pdf, viewed January 2025.

CSIRO (1991b) CSIRO priority determination 1990 Role Statements. CSIRO Corporate Planning Office, Canberra, https://www.eoas.info/bib-pdf/ASBS15624.pdf, viewed January 2025.

CSIRO (1991c) ‘Corporate Development: Planning’ in CSIRO Annual Report 1990–91, pp. 55–58.

CSIRO (1993a) Setting priorities for research purposes and research projects: CSIRO Division of Animal Health. CSIRO Corporate Planning Office, Canberra, https://eoas.info/bib-pdf/ASBS15628.pdf, viewed January 2025.

CSIRO (1993b) Setting priorities for research purposes and research projects: CSIRO Division of Soils, CSIRO Corporate Planning Office, Canberra. https://eoas.info/bib-pdf/ASBS15629.pdf, viewed January 2025.

CSIRO (1993c) CSIRO research priorities 1994-95 to 1996-97: A progress report. CSIRO Corporate Planning Office, Canberra, https://eoas.info/bib-pdf/ASBS15626.pdf, viewed January 2025.

CSIRO (1997) CSIRO Strategic Research Plan 1997–98 to 1999–2000.

CSIRO (2000) CSIRO Strategic Plan 2000-01 to 2002-03, pp. 133–139.

Dennis, C. (2002) Australia sets priorities for future research, Nature, 420, 597.
| Crossref | Google Scholar | PubMed |

Foster, R., Linden, L., Whitely, R., and Kantrow, A. (1985a) Improving the Return on R&D-I, Research Management, 28(1), 12-17 https://www.jstor.org/stable/24120647.
| Google Scholar |

Foster, R., Linden, L., Whitely, R., and Kantrow, A. (1985b) Improving the Return on R&D-II, Research Management, 28(2), 13-22 https://www.jstor.org/stable/24120755.
| Google Scholar |

Georghiou, L. (1996) The UK Technology Foresight Programme, Futures, 28(4), 359-377.
| Google Scholar |

Industry Commission (1995) ‘CSIRO’s priority-setting system in research and development’, in Research and Development, Report No 44, Vol. 3: Appendices, pp. B1–B16, Australian Government Publishing Service, Canberra.

International Center for Living Aquatic Resources Management (ICLARM) (1999) Supplement to the ICLARM Strategic Plan 2000–2020, Aquatic Resources Research in Developing Countries, Data and Evaluation by region and resource system. ICLARM Working Paper No. 4. https://digitalarchive.worldfishcenter.org/items/93d59d6e-da24-4938-a8bf-24ac6279b5fa

International Energy Agency (2014) Modelling and Analyses in R&D Priority-Setting and Innovation Workshop, Paris.

Jones, B.O. (1987) Guidelines to CSIRO, CSIRO Annual Report, 1987-88, p. 10.

Klusacek, K. (2001) Selection of Research Priorities Method of Critical Technologies, Technology Centre of the Academy of Sciences, CR, Prague, Czech Republic

Martin, B. (1996) Technology Foresight: capturing the benefits from science-related technologies, Research Evaluation, 6(2), 156-168.
| Google Scholar |

Martin, B., and Johnson, R. (1994) Technology foresight for wiring up the national innovation system: experiences in Britain, Australia, and New Zealand, Technological Forecasting and Social Change, 60(1), 37-54.
| Crossref | Google Scholar |

McKinsey and Company (1987) Assessing the Commercial Prospects for Research. Report to CSIRO Executive, https://eoas.info/bib-pdf/ASBS15627.pdf, viewed January 2025.

Montorzi, G., de Haan, S., and IJsselmuiden, C. (2010) Priority Setting for Research for Health: A management process for countries, Council on Health Research for Development, Geneva.

Robertson, M. (1990) Interim Australian Standard Research Classification: Socio-Economic Objectives, in CSIRO (1991a), Appendix 1.

Rys, G. (1991) Setting National Science Priorities for New Zealand. Presentation to national meeting February 1991. https://www.researchgate.net/publication/323268238_Setting_National_Science_Priorities_for_New_Zealand

Spilsbury, M.J., Norgbey, S., and Battaglino, C. (2014) Priority setting for evaluation: Developing a strategic evaluation portfolio, Evaluation and Program Planning, 46, 47-57.
| Crossref | Google Scholar | PubMed |

Spurling, T. H., Upstill, G., and Jordan, M. (1988) Research opportunities in the polymer and plastics industry, CSIRO Institute of Industrial Technologies, Canberra, https://eoas.info/bib-pdf/ASBS15632.pdf, viewed January 2025.

Spurling, T. H., Allison, G., Bond, W., Bateup, B., Robinson, G., Sutherland, D., Hall, J., Upstill, G., and Kariotoglou, N. (1991) Report of CSIRO Waste Treatment Research Taskforce, CSIRO, Canberra, https://www.eoas.info/bib-pdf/ASBS15700.pdf, viewed January 2025.

Spurling, T.H., Redhead, T., and Blyth, M. (2001) Management and Systems Strengthening - Lembaga Ilmu Pengetahuan Indonesia (LIPI): Final report June 2001, CSIRO Canberra and LIPI Jakarta, https://www.eoas.info/bib-pdf/ASBS15699.pdf

Stocker, J.W. (1990) ‘CSIRO on the move’, in Science and Technology: Creating Wealth for Australia NSTAG Forum Report, Canberra, pp. 31–35. https://eoas.info/bib-pdf/ASBS15630.pdf, accessed 22 November 2024.

Stocker, J. W. (1997) Priority matters: a report to the Minister for Science and Technology on arrangements for Commonwealth science and technology, Department of Industry, Science and Tourism, Canberra.

Tinsley, S. (1984) Improving the Return on Research and Development, Industrial Research Institute (Report by the Research-on-Research Committee on R&D Productivity).

Upstill, G. (1995) The CSIRO priorities method, 1990-1995, CSIRO Industrial Technologies, Canberra, https://www.eoas.info/bib-pdf/ASBS15625.pdf, viewed January 2025.

Upstill, G., and Spurling, T. H. (2020) Engaging with Australian industry: CSIRO in the late twentieth century, Historical Records of Australian Science, 31(1), 1-16.
| Crossref | Google Scholar |

Upstill, G., Steele, J., and Wright, J. (1990) Biomaterials: Research directions for CSIRO, CSIRO, Canberra, https://www.eoas.info/bib-pdf/ASBS15701.pdf, viewed January 2025.

World Health Organisation (2025) Methods with a focus on health R&D: Priority setting methods. https://www.who.int/observatories/global-observatory-on-health-research-and-development/resources/methods/priority-setting-methods

Footnotes

2 The Encyclopedia of Australian Science and Innovation (EOAS) is a gateway to the history and archives of science, technology and innovation in Australia. It is accessible at https://www.eoas.info/.

11 It has its roots in the work of Donaldson Brown at DuPont and General Motors in the 1920s. Referring to the formula that R (return on investment) equals P (ratio of net profit to sales) times T (ratio of sales to investment), Best (1990) notes ‘P was not new. Measuring earnings as a percentage of sales was as old as bookkeeping: it is the information that constitutes the income or profit and loss account of a business enterprise. Likewise, T was not new in that it uses the data found in the balance sheet. But defining turnover as the ratio of output to investment, breaking it down by department, and linking it to the cost accounts was new’.

12 McKinsey and Company (1987). This approach was used in 1988 to examine research opportunities in the Australian polymer and plastics industry (Spurling and others 1988).