Setting priorities for publicly funded research: the CSIRO priorities method
Garrett Upstill
Abstract
The CSIRO Priorities Method is a way to rank and display research priorities for publicly funded research. This paper describes the development and evolution of the method that was employed in CSIRO throughout the 1990s and, since that time, in several other research organisations in Asia and Europe. The Method comprises three elements, a framework, a process, and a results screen, and has been used for priority setting at national, organisational, program and project levels. Its key attributes are its simplicity, robustness, and adaptability. This paper fills a gap in the literature by summarising the Method’s development and use, and by providing online references to previously unavailable documents.
Keywords: national, organisational and program priorities, R&D priorities, robust adaptable method, scientific priorities and resource allocation.
Introduction
How to maximise the return to the nation from publicly funded research is a challenge faced by all public research agencies, and one that receives continuing attention.1 The valuing and comparative ranking of different research areas is difficult because of the inherent risks and uncertainties in research outcomes, and this can be aggravated in public organisations where research may be longer term and directed toward public good and non-commercial outcomes.
The CSIRO Priorities Method (the Method) was developed in 1990 as a response to this challenge. It was designed to set priorities and guide spending in the Commonwealth Scientific and Industrial Research Organisation (CSIRO), Australia’s largest scientific research organisation. CSIRO conducts publicly funded research across a wide spectrum of commercial and non-commercial areas. The Method was used extensively by CSIRO in the 1990s for priority setting at an organisational, sectoral and program level. It has also been adopted for use by other public research agencies, both domestically and internationally. (See ‘Use of the Method’, below.)
The Method is targeted at maximising the national benefits, commercial and/or non-commercial, from a portfolio of publicly funded research. It allows comparison of the Attractiveness and Feasibility of research for different purposes. The Method is robust and adaptable, and has proven effective in various settings and areas of research.
This paper fills a gap in the literature about the Method, which has been poorly documented in the public literature and online, primarily because it was created for in-house use in CSIRO. Here we explain its development, outline its key elements, and describe its various applications since 1990. We also reference previously unavailable documents now archived on the Encyclopedia of Australian Science and Innovation website.2
The next section provides historical context and explains the steps leading to the development of the Method. The section that follows outlines its key elements: the framework, the process, and the results screen. We then describe how the Method was used by CSIRO for organisational, sectoral and program priority-setting, as well as its use by other research agencies. The final section reviews the development of the Method and discusses its potential for future use.
Background
History
Some background is relevant to the development of the Method. CSIRO, whose origins date back to 1926, has long been a mainstay of Australian government-funded research. In 1990, it accounted for approximately 16% of government spending on research and development, and approximately 10% of total national spending, including both public and private sectors.3
The 1980s were a period of significant economic reform in Australia, and the federal government implemented a series of measures to improve the international competitiveness of the nation’s industries.4 The government looked to CSIRO to contribute to this agenda and to align its research more closely with industry. In 1987 the Minister for Science, Barry Jones, issued guidelines to CSIRO that made it clear that its primary task was ‘the conduct of strategic and applied research in support of national economic, social, and environmental objectives’, rather than solely scientific ends.5 In addition, and to promote closer links with research users, the government set a 30% external earnings target for all of CSIRO’s research.
In 1986 the CSIRO Act was amended.6 A CSIRO Board was established and the Honourable Neville Wran, a former Premier of New South Wales, was appointed as the first Chairman in 1986. Subsequently the organisation was restructured into six industry-focused institutes, namely the Institute of Plant Production & Processing; Institute of Animal Production & Processing; Institute of Industrial Technologies; Institute of Information Science & Engineering; Institute of Minerals, Energy & Construction; and the Institute of Natural Resources & Environment.
Among the early actions of the Board were the establishment of two committees: one on national research priority-setting, and one on research evaluation. Both committees contributed directly to the development of the CSIRO Priorities Method.
The first committee developed a draft priority setting process using criteria grouped under the headings of ‘Benefits’ and ‘Barriers’, and outlined a procedure for scoring and presenting results that was trialled in early 1990 with a senior CSIRO management group.7
The second committee commissioned a consultancy report on assessing the commercial prospects for CSIRO research. This report proposed, among other things, a conceptual framework for use in evaluating small projects ‘where substantial resources were not available for evaluation.’8
The origins of this framework lay in a two-year study by the US Industrial Research Institute in the 1980s into research productivity,9 and specifically, the Research and Development (R&D) Return, Productivity and Yield framework developed by Foster and others,10 as shown in Fig. 1. The figure outlines how investment in R&D contributes to an industrial organisation’s profits. It divides the return on R&D into two subordinate ratios: ‘R&D Productivity’ (technical progress per unit of R&D investment) and ‘R&D Yield’ (profits per unit of technical progress). Multiplying R&D Productivity by R&D Yield equals the ‘Return on R&D’ (profits per unit of R&D investment). While this is essentially an accounting identity, it allows separation of two measurable components.11
R&D productivity and R&D yield are further divided into sub-components. R&D productivity is divided into ‘Potential Productivity’ (the maximum possible productivity improvement within the limits of the technology) and ‘Technology Development Efficiency’ (the efficiency of the R&D organisation compared to the maximum possible). R&D yield is divided into ‘Potential Yield’ (the maximum economic return possible given the structure of the market) and ‘Operating Efficiency’ (the efficiency of the commercialisation effort). This allows the separate consideration of R&D productivity and R&D yield: while R&D productivity relies on inputs related to scientific potential and research management, R&D yield relies on inputs related to market conditions and the potential return on research spending.
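Written out as a chain of ratios, the identity takes the following form (a reconstruction consistent with the definitions above; the labels follow Fig. 1):

```latex
% Foster's decomposition of the return on R&D, written as a chain of ratios
\frac{\text{profits}}{\text{R\&D investment}}
  = \underbrace{\frac{\text{technical progress}}{\text{R\&D investment}}}_{\text{R\&D productivity}}
  \times
  \underbrace{\frac{\text{profits}}{\text{technical progress}}}_{\text{R\&D yield}}
```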
In the framework proposed for CSIRO12 the terminology in Fig. 1 was modified: R&D yield was divided into two components, ‘potential value’ and ‘ability to capture benefit for Australia’, and R&D productivity was divided into ‘technical potential’ and ‘research efficiency’.
In March 1990 Dr John Stocker took over as CSIRO’s Chief Executive. The Board tasked him with developing a process for establishing national research priorities to guide the organisation’s research expenditure over the 1990–3 funding triennium.13 Stocker secured support for the new process at a senior government level, and marshalled strong top-down and bottom-up support within CSIRO from Institute Directors and Divisional Chiefs as well as CSIRO researchers. A consequence was the closer engagement of scientists with industry research users in building an understanding of the markets involved and the way that technology could contribute.
The process also served as a vehicle for a productive dialogue at executive level with new and potential research users in both government and industry, for example, high level contacts with BHP and several other large companies. It provided the basis for a more disciplined dialogue and an opportunity for research partners to have input, as well as to explain CSIRO’s changed approach to resource allocation (J. W. Stocker, pers. comm.).
The 1990 exercise centred on a framework, a process, and a results screen that were used to guide funding shifts.14 The framework adopted by CSIRO is set out in the next section. The process used by CSIRO in 1990, developed with the support of institute and corporate planners, is shown in Fig. 2.15 Steps included identification of a set of well-defined research purposes, using a national socio-economic research classification scheme, articulating and refining selection criteria, specifying data and information sets, and developing procedures for scoring. The results were displayed on a results screen, devised by CSIRO, which plotted the feasibility and attractiveness of research for each of the national research purposes.
The CSIRO Priorities Method
The Method was used by CSIRO for setting organisational priorities during the 1990s and has been adapted for use in other priority setting exercises, domestically and internationally. (See next section).
The Method comprises three elements—a framework, a process, and a results screen.
The framework
The framework (Fig. 3) uses four criteria to determine the attractiveness and the feasibility of investment for a particular research purpose.
Potential benefits: the maximum economic, environmental, or social returns, commercial or otherwise, possible from technology improvements attributable to the research; and
Ability to capture the benefits: the ability of Australia’s organisations, private or public sector, to convert technical progress into commercial or other returns.
R&D potential: the scientific or technological potential of relevant research areas; and
R&D capacity: Australia’s ability to conduct the R&D and realise its potential in a timely way.
A score for ‘Attractiveness’ is obtained by multiplying the score (1–10) for ‘potential benefits’ by the score for ‘ability to capture benefits’. Attractiveness is a measure of the benefit of successful research and is determined by factors over which research organisations have little control.
A score for ‘Feasibility’ is obtained by multiplying the score for ‘R&D potential’ by the score for ‘R&D capacity’. Feasibility is a measure of the ability to achieve technical progress in Australia (per unit of R&D investment).
Multiplying the scores in this way can be interpreted as combining ‘scope’ and ‘efficiency’. For Attractiveness, one combines the ‘scope’ or maximum potential benefit with the ‘efficiency’ of existing Australian structures to capture these benefits. In the case of Feasibility, one combines the future ‘scope’ of research (for example, using a technology ‘S-curve’) with the ‘efficiency’ of the organisation to advance this research.
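As an illustration of the scoring arithmetic, the sketch below computes the two composite scores (a minimal illustration only; the 1–10 scale and criterion names follow the framework above, while the function names and example scores are hypothetical):

```python
def attractiveness(potential_benefits: float, ability_to_capture: float) -> float:
    """Attractiveness = potential benefits x ability to capture benefits (each 1-10)."""
    return potential_benefits * ability_to_capture

def feasibility(rd_potential: float, rd_capacity: float) -> float:
    """Feasibility = R&D potential x R&D capacity (each 1-10)."""
    return rd_potential * rd_capacity

# Hypothetical scores for one research purpose
print(attractiveness(8, 5))  # 40: large potential benefit, modest capture ability
print(feasibility(7, 6))     # 42: good scientific scope, reasonable R&D capacity
```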
The process
Research purposes are the unifying categories that group together related research activities for priority-setting. They may be set at a national, organisational, or program and project level. In 1990 and 1993, 16 national research purposes were identified by CSIRO, using a national socio-economic research classification scheme.16 Later corporate exercises adopted a different classification, based on the organisation’s industry sectors.
Standardised information is critical when comparing different research purposes. For CSIRO’s organisational priority-setting process two information sets were assembled. The first, prepared in consultation with industry experts, provided, for each research purpose, data on production, exports, imports, value added, and research expenditure in Australia and overseas, together with summary information on the major performers of research, research intensity, and key issues. A second information set, developed in consultation with scientists and research managers, provided a summary analysis of the technology prospects for each research purpose, considering factors such as resource costs, technology progress, and technical risks and uncertainties.
A set of discriminant questions was developed to help analyse background information and to focus discussion. The questions used by CSIRO in 1990 are shown in Table 1.
Finally, a standardised procedure is needed for scoring by the decision-making group. The process used in 1990 and 1993 by the CSIRO executive group, which combined individual judgement and consensus building, is shown in Table 2. After the first round of scoring, the results were reviewed and discussed by the group in a workshop; participants then had the opportunity to adjust their scores before the final group scores were averaged.
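The two-round procedure can be sketched as follows (a hypothetical illustration; Table 2 gives the actual procedure, and the workshop discussion is represented here only by the revised scores):

```python
from statistics import mean

# Hypothetical example: five assessors score 'R&D potential' (1-10) for one
# research purpose, review the results in a workshop, then rescore.
first_round = [6, 8, 5, 9, 7]   # initial individual scores, tabled for discussion
revised = [7, 8, 6, 8, 7]       # scores after discussion; the spread narrows
group_score = mean(revised)     # the final group score is the average
print(group_score)              # 7.2
```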
The results screen
The scores for each research purpose are displayed on a results screen, which plots scores for Feasibility against scores for Attractiveness. The feasibility vs attractiveness screen allows research purposes to be grouped by selectivity (Fig. 4). The closer a research purpose lies to the top right-hand corner, the more deserving it is of support. Conversely, research purposes closer to the origin are less deserving of support and require greater selectivity in funding.
The results screen was adopted as a superior method to one that multiplied the feasibility and attractiveness scores together to obtain a single value for each research purpose. (A single value would mask valuable information by blending internal factors, which the organisation can largely control, with external factors, which it cannot.)
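A screen of this kind is straightforward to reproduce; the sketch below plots three invented research purposes on attractiveness vs feasibility axes (the quadrant reading follows Fig. 4, but the names and scores are hypothetical):

```python
import matplotlib.pyplot as plt

# Hypothetical (attractiveness, feasibility) scores, each on the 1-100 scale
# produced by multiplying two 1-10 criterion scores
purposes = {"Purpose A": (72, 56), "Purpose B": (40, 81), "Purpose C": (20, 25)}

fig, ax = plt.subplots()
for name, (attract, feas) in purposes.items():
    ax.scatter(attract, feas)
    ax.annotate(name, (attract, feas), textcoords="offset points", xytext=(5, 5))
ax.set_xlabel("Attractiveness")
ax.set_ylabel("Feasibility")
ax.set_xlim(0, 100)
ax.set_ylim(0, 100)
ax.set_title("Results screen: purposes nearest the top right merit most support")
plt.show()
```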
Use of the Method
CSIRO organisational priorities
The Method was first applied by CSIRO in 1990 to determine the organisation’s research priorities for the 1990–3 triennium.17 It sought to identify research priorities from the national standpoint, and thereby guide the allocation of CSIRO research funds: the exercise involved assessing 16 research purposes (defined in terms of national socio-economic objectives) against the four framework criteria and then plotting the results on a feasibility vs attractiveness screen.
The research purposes were cross-disciplinary: Plant Production & Primary Products; Animal Production & Primary Products; Rural-based Manufacturing; Minerals; Energy Resource; Energy Supply; Manufacturing; Information & Communication Industries; Environmental Aspects of Economic Development; Environment; Transport; Construction; Commercial Services; Health; Defence; and Social Development & Community Services. The results of the 1990 exercise are shown in Fig. 5. The resource shifts that followed included the funding of a set of new research projects aligned with the revealed priorities, financed by a 1.5% annual priorities levy applied across the organisation.
The priority setting exercise for the 1993–6 funding triennium followed a similar approach, involving 16 national research purposes and culminating in a feasibility-attractiveness plot of the final results. The 1.5% levy was used to fund selected multi-divisional programs in high-priority areas over the triennium 1993–6.18
The priority setting arrangements for the CSIRO 1996–9 funding triennium were changed, reflecting the arrival of a new chief executive, Dr Malcolm McIntosh, and the adoption of a new matrix-management structure with 22 industry-based sectors and 27 discipline-based research divisions. CSIRO planning was now sector-based, and the (industry and government) users of research played an increased role as members of the advisory groups established for each sector.19 Feasibility and attractiveness analysis was used by the sectors to develop their triennial priorities.
The 1999–2002 priority exercise was an integrated part of CSIRO’s sector-based corporate planning process. Feasibility and attractiveness analyses were central to priority setting for each sector.20 The inputs for R&D potential and R&D capacity primarily originated from CSIRO, while inputs on ‘potential value’ and the ‘ability to capture’ relied on contributions from the sector advisory groups. The priority-setting process included a final presentation by each sector to the executive group, which included an account of how each would manage with a hypothetical 20% more, or 20% less, funding over the coming triennium.
CSIRO program and project priorities
The Method was also used in CSIRO for priority setting at program and project levels. While the central logic was retained, the process was modified according to the needs of each exercise. Examples included:
Use of the Method to consider research opportunities in different industries, applying the four selection criteria to evaluate the feasibility and attractiveness of potential areas for research. Studies included the Australian polymer and plastics industry, and the areas of waste treatment management and biomaterials.21
Use for prioritising projects within a CSIRO division. The Division of Animal Health, which addresses endemic diseases affecting farm livestock, adapted the process for assessment of its projects as defined by industry research purpose (for example, sheep, beef cattle, dairy) and by problem area (parasites, bacteria, viruses, toxins). Projects were scored and shown on a feasibility-attractiveness plot. This was then combined with a separate assessment of project quality to guide resource-allocation decisions within the division.22
A similar approach was taken by the CSIRO Division of Soils, which adapted the Method to identify areas of research opportunity for sustainable and profitable management of soil and land resources.23 A workshop was held to define a set of research purposes and areas of research opportunity (for example, pastures, site rehabilitation, forestry). Projects were scored and displayed on an attractiveness vs feasibility screen. These results were then combined with an analysis of project quality to guide the allocation of research funding within the division.
Project reporting and planning. In 1995 the CSIRO Institute of Industrial Technologies (comprising the Divisions of Applied Physics, Biotechnology, Chemicals & Polymers, Manufacturing Technology, and Materials Science) introduced a one-page project reporting format that was used for the 100-plus research projects in the Institute (Box 1).24
Box 1. Using the Priorities Method for project reporting and planning: guidelines for project data sheets (CSIRO Institute of Industrial Technologies; Upstill 1995).
Use by other countries
The CSIRO Priorities Method has also been employed in other countries.
The Ministry of Research, Science and Technology in New Zealand adapted the Method in 1991 for a national research priority-setting exercise.25 The groupings, determined by socio-economic objective, were economic (primary, secondary, tertiary), environment, social/cultural, Antarctica, health, space and defence. The criteria used for scoring were research potential, research capacity, ability to capture benefits, and strategic importance. A weighted combination of the scores for these four criteria yielded a priority index, which was used to guide the allocation of NZ$260 million of public good science funds for the year 1991–2.
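A weighted index of this kind reduces to a few lines of arithmetic (a sketch only; the criterion names follow the New Zealand exercise, but the weights and scores here are invented, as the actual 1991 weights are not reproduced in this paper):

```python
# Invented weights for the four New Zealand criteria (the 1991 weights are not
# reproduced here); scores are on a 1-10 scale as in the CSIRO framework
weights = {"research_potential": 0.3, "research_capacity": 0.2,
           "ability_to_capture": 0.3, "strategic_importance": 0.2}
scores = {"research_potential": 8, "research_capacity": 6,
          "ability_to_capture": 7, "strategic_importance": 9}

# The priority index is the weight-by-score sum across the four criteria
priority_index = sum(weights[c] * scores[c] for c in weights)
print(priority_index)  # 7.5
```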
The Method was introduced to LIPI (an Indonesian public research agency now part of BRIN, the National Research and Innovation Agency) during the course of the World Bank-funded Management Systems Strengthening (MSS-LIPI) project. It was used both within the LIPI discipline-based research centres to prioritise projects, and in a national research priority setting exercise.26
The Method was an input in the development of the UK Technology Foresight exercise,27 and the results of this exercise were ranked and displayed on a feasibility vs attractiveness screen,28 which grouped technologies under headings of emerging, intermediate, or key priority areas (Fig. 6).
In 1999 the International Center for Living Aquatic Resources Management (ICLARM) applied the Method to set research priorities in aquatic resource systems such as ponds, coastal waters, and coral reefs. Using the CSIRO approach ‘each resource system was scored for attractiveness (a measure of the potential benefits from conducting research, and the ability to utilise the outcomes in the region) against feasibility (a measure of the scientific potential to provide solutions to aquatic resource issues and ICLARM’s research capacity to undertake the research alone or in partnership)’.29
In 2001 the Method was used in a foresight exercise in the Czech Republic to identify priorities for the National Research Programme (to be launched in 2004). The exercise identified and prioritised the most important technologies for the national industry and service sectors, modifying the Method’s criteria in the process. The two criteria for ‘feasibility’ were research & technology importance and absorption potential of the application sector; the two criteria for ‘attractiveness’, renamed ‘importance’, were economic, social & environmental importance and research & technology opportunities. The results of the national exercise were displayed on a screen in which feasibility was plotted against importance.30
More recently, the Method has been adapted by the UN Environment Programme Evaluation Office for prioritising evaluations conducted by the Office and ranking opportunities for evaluation.31 The report established a set of criteria and indicators, and used these to determine the relative attractiveness and feasibility of a set of potential evaluations. It noted that the return would be high when both attractiveness and feasibility were high, but that increased selectivity would be needed where attractiveness and/or feasibility declined and the likely return was lower.
Discussion and summary
The impetus for this paper came from the realisation that, while the CSIRO Priorities Method had been successfully used since 1990 for research priority setting, there was little information about it available in the public literature or online. We have endeavoured to fill this gap by outlining its evolution, citing key documentation, and describing its use.
The context
Working out how to set priorities for publicly funded research for the greatest public benefit is an issue that needs to be continuously addressed. Are current priorities misplaced? Would a shift in resource allocation yield greater benefit? These are questions that can be posed for all research agency budgets and all science and technology budget decisions by government. The CSIRO Priorities Method is just one of multiple approaches that can be applied to addressing these questions.
The Method also falls within a broader priority setting frame. In his 1997 report Dr John Stocker, as the Australian chief scientist, identified three types of priorities for government: ‘thematic’ priorities, focusing on the relative importance of scientific disciplines, socio-economic objectives, or broad fields encompassing a range of disciplines or objectives; ‘structural or goal-oriented’ priorities, focusing on the relative importance of different broad goals for science and technology (for example, improving science and technology capabilities); and ‘mechanism’ priorities, focusing on the different mechanisms to support science and technology (for example, grant schemes vs research agency funding).32 Stocker also noted the essentially different nature of (i) high-level, whole-of-government decisions on priorities for mechanisms and structures; (ii) priority setting within a Ministerial portfolio; and (iii) decisions on priorities at an agency or organisational level, something for which the CSIRO Priorities Method is well suited. In the event, the Australian government issued a set of national research priorities in 2002. This followed the Batterham report, ‘A Chance to Change’, which recommended integrated priority setting across the federal government.33 The 2002 research priority areas were environmental sustainability, promoting and maintaining health, developing new technologies, and safeguarding the nation.34
The Method has been used at various levels. In New Zealand, the Czech Republic, and Indonesia, it was used for national priority setting. It has not been used at a national level in Australia, although in 1990 and 1993 CSIRO adopted a national setting to guide organisational priorities and internal resource allocation. (This did not affect the funding or conduct of any research outside CSIRO.) In this paper we have shown how the Method has been used at a number of levels, such as industry sector, divisional, program, and even research-project reporting. The attractiveness vs feasibility screen has been widely adopted as a way of displaying the rankings of a variety of research purposes.
Historical development
The Method has its origins in the late 1980s, as CSIRO sought to align its research more closely with the needs of the users of its research in industry and the public sector. First introduced in 1990, it served effectively through the following decade and beyond.
It provided a reliable method for comparing commercial and non-commercial research across the agency, under a common goal of maximising national benefits. It offered a transparent and accountable approach, which was useful for justifying government funding. In addition, it fostered productive dialogue between scientists and the users of CSIRO research in industry and government.
The Method also encouraged a cultural shift in CSIRO during the 1990s. Beyond their role in formal priority-setting, the principles of feasibility and attractiveness became ingrained in CSIRO thinking. They provided a shared language as scientists applied the principles in their research planning and their interactions with customers, and in how they engaged with the change process driven from above.
Over time, the Method was adapted for other CSIRO contexts, such as divisional priority-setting, ranking research opportunities in emerging technology fields, and project reporting. It was also adopted by several other countries and modified to suit their specific needs.
Summary
The CSIRO Priorities Method is a means to rank and display priorities for publicly funded research, using two orthogonal measures, attractiveness and feasibility. The Method comprises three elements: a framework, a process, and a results screen. It has been used for priority setting at national, organisational, program and project levels in CSIRO and overseas.
It is a simple, robust and adaptable method. It is a tool for building consensus and achieving a shared, well-informed expert view of an organisation’s research priorities. The scoring process delivers comparative, rather than absolute, judgements based on analysis of data and discussion of the arguments for a set of different research purposes.
It offers a relatively simple route to incorporating qualitative and semi-quantitative information in decision making, and complements more formal approaches such as cost-benefit analysis and business planning.35 While the latter may be appropriate for well-defined projects, they can be difficult and expensive to apply with rigour.
It should be noted that the Method does not specify how the results of priority exercises are translated into resource shifts. This is a matter left to the organisations conducting the exercise, and as the ‘Use of the Method’ section shows, a variety of approaches have been used. We note the need to design the scoring process so as to minimise any subjectivity in scoring, something that has been commented on.36 Subjective factors are, to a degree, unavoidable in a process directed toward developing a consensus expert view. They can be minimised by incorporating interactive sessions that allow initial scores to be discussed and challenged before rescoring, and by specifying the research purposes so as to discourage ‘silo’ thinking (for example, by using research purposes that are cross-disciplinary or cross-organisational).
In closing, we note that while there are multiple ways to set research agendas, the CSIRO Priorities Method can be a useful addition to the mix. It offers a simple, logically sound approach to ranking and displaying priorities for public funding. Its flexibility and adaptability to different organisational contexts warrant its consideration by public research agencies seeking to prioritise their research.
Data availability
Data shown in the paper are available from public sources, as listed in footnotes and references within this text.
References
Blyth, M. and Upstill, G. (1994) Effective priority setting for public sector research: CSIRO’s experience, International Science and Technology Policy Seminar, October 1994, Wellington, New Zealand, https://eoas.info/bib/ASBS15631.htm, viewed January 2025.
Brattström, E. (2021) Facilitating collaborative priority-setting for research and innovation: a case from the food sector, Technology Analysis & Strategic Management, 33(7), 742-754.
Commonwealth of Australia (1990) Science and Technology Budget Statement 1990-91. https://www.industry.gov.au/sites/default/files/1990-91-science-technology-budget-statement.pdf, viewed January 2025.
CSIRO (1991a) CSIRO Priority Determination, 1990: Methodology and Results Overview, (Kretschmer Report). CSIRO Corporate Planning Office, Canberra, https://www.eoas.info/bib-pdf/ASBS15622.pdf, viewed January 2025.
CSIRO (1991b) CSIRO priority determination 1990 Role Statements. CSIRO Corporate Planning Office, Canberra, https://www.eoas.info/bib-pdf/ASBS15624.pdf, viewed January 2025.
CSIRO (1993a) Setting priorities for research purposes and research projects: CSIRO Division of Animal Health. CSIRO Corporate Planning Office, Canberra, https://eoas.info/bib-pdf/ASBS15628.pdf, viewed January 2025.
CSIRO (1993b) Setting priorities for research purposes and research projects: CSIRO Division of Soils, CSIRO Corporate Planning Office, Canberra. https://eoas.info/bib-pdf/ASBS15629.pdf, viewed January 2025.
CSIRO (1993c) CSIRO research priorities 1994-95 to 1996-97: A progress report. CSIRO Corporate Planning Office, Canberra, https://eoas.info/bib-pdf/ASBS15626.pdf, viewed January 2025.
Dennis, C. (2002) Australia sets priorities for future research, Nature, 420, 597.
Foster, R., Linden, L., Whitely, R., and Kantrow, A. (1985a) Improving the Return on R&D-I, Research Management, 28(1), 12-17, https://www.jstor.org/stable/24120647.
Foster, R., Linden, L., Whitely, R., and Kantrow, A. (1985b) Improving the Return on R&D-II, Research Management, 28(2), 13-22, https://www.jstor.org/stable/24120755.
Georghiou, L. (1996) The UK Technology Foresight Programme, Futures, 28(4), 359-377.
International Center for Living Aquatic Resources Management (ICLARM) (1999) Supplement to the ICLARM Strategic Plan 2000–2020, Aquatic Resources Research in Developing Countries, Data and Evaluation by region and resource system. ICLARM Working Paper No. 4. https://digitalarchive.worldfishcenter.org/items/93d59d6e-da24-4938-a8bf-24ac6279b5fa
Martin, B. (1996) Technology Foresight: capturing the benefits from science-related technologies, Research Evaluation, 6(2), 156-168.
Martin, B., and Johnson, R. (1994) Technology foresight for wiring up the national innovation system: experiences in Britain, Australia, and New Zealand, Technological Forecasting and Social Change, 60(1), 37-54.
McKinsey and Company (1987) Assessing the Commercial Prospects for Research. Report to CSIRO Executive, https://eoas.info/bib-pdf/ASBS15627.pdf, viewed January 2025.
Rys, G. (1991) Setting National Science Priorities for New Zealand. Presentation to national meeting February 1991. https://www.researchgate.net/publication/323268238_Setting_National_Science_Priorities_for_New_Zealand
Spilsbury, M.J., Norgbey, S., and Battaglino, C. (2014) Priority setting for evaluation: Developing a strategic evaluation portfolio, Evaluation and Program Planning, 46, 47-57.
Spurling, T. H., Upstill, G., and Jordan, M. (1988) Research opportunities in the polymer and plastics industry, CSIRO Institute of Industrial Technologies, Canberra, https://eoas.info/bib-pdf/ASBS15632.pdf, viewed January 2025.
Spurling, T.H., Allison, G., Bond, W., Bateup, B., Robinson, G., Sutherland, D., Hall, J., Upstill, G., and Kariotoglou, N. (1991) Report of CSIRO Waste Treatment Research Taskforce, CSIRO, Canberra, https://www.eoas.info/bib-pdf/ASBS15700.pdf, viewed January 2025.
Spurling, T.H., Redhead, T., and Blyth, M. (2001) Management and Systems Strengthening - Lembaga Ilmu Pengetahuan Indonesia (LIPI): Final report June 2001, CSIRO Canberra and LIPI Jakarta, https://www.eoas.info/bib-pdf/ASBS15699.pdf
Stocker, J.W. (1990) ‘CSIRO on the move’, in Science and Technology: Creating Wealth for Australia NSTAG Forum Report, Canberra, pp. 31–35, https://eoas.info/bib-pdf/ASBS15630.pdf, viewed November 2024.
Upstill, G. (1995) The CSIRO priorities method, 1990-1995, CSIRO Industrial Technologies, Canberra, https://www.eoas.info/bib-pdf/ASBS15625.pdf, viewed January 2025.
Upstill, G., and Spurling, T. H. (2020) Engaging with Australian industry: CSIRO in the late twentieth century, Historical Records of Australian Science, 31(1), 1-16.
Upstill, G., Steele, J., and Wright, J. (1990) Biomaterials: Research directions for CSIRO, CSIRO, Canberra, https://www.eoas.info/bib-pdf/ASBS15701.pdf, viewed January 2025.
World Health Organisation (2025) Methods with a focus on health R&D: Priority setting methods. https://www.who.int/observatories/global-observatory-on-health-research-and-development/resources/methods/priority-setting-methods
Footnotes
1 World Health Organisation (2025). Montorzi and others (2010). International Energy Agency (2014). Brattström (2021).
2 The Encyclopedia of Australian Science and Innovation (EOAS) is a gateway to the history and archives of science, technology and innovation in Australia. It is accessible at https://www.eoas.info/.
5 Jones (1987).
7 CSIRO (1990).
8 McKinsey and Company (1987).
10 Foster and others (1985a, 1985b).
11 It has its roots in the work of Donaldson Brown at DuPont and General Motors in the 1920s. Referring to the formula that R (return on investment) equals P (ratio of net profit to sales) times T (ratio of sales to investment), Best (1990) notes ‘P was not new. Measuring earnings as a percentage of sales was as old as bookkeeping: it is the information that constitutes the income or profit and loss account of a business enterprise. Likewise, T was not new in that it uses the data found in the balance sheet. But defining turnover as the ratio of output to investment, breaking it down by department, and linking it to the cost accounts was new’.
12 McKinsey and Company (1987). This approach was used in 1988 to examine research opportunities in the Australian polymer and plastics industry (Spurling and others 1988).
13 Stocker (1990).
14 CSIRO (1991a).
17 CSIRO (1991a, 1991b, 1991c).
18 CSIRO (1993c).
19 CSIRO (1997).
20 CSIRO (2000).
21 Spurling and others (1988); Spurling and others (1991); Upstill and others (1990).
22 CSIRO (1993a).
23 CSIRO (1993b).
24 Upstill (1995).
25 Rys (1991).
26 Spurling and others (2001).
27 Martin (1996).
28 Georghiou (1996).
29 ICLARM (1999).
30 Klusacek (2001).
31 Spilsbury and others (2014).
32 Stocker (1997).
33 Batterham (2000).
34 Dennis (2002).