RESEARCH ARTICLE

Evaluation for farming systems improvement: looking backwards, thinking forwards

J. Dart

Clear Horizon, PO Box 341, Hastings, Vic. 3915, Australia. Email: Jess@clearhorizon.com.au

Australian Journal of Experimental Agriculture 45(6) 627-633 https://doi.org/10.1071/EA03249
Submitted: 21 November 2003  Accepted: 20 August 2004   Published: 29 June 2005

Abstract

Based on personal reflection, this paper presents program evaluation as a vehicle for achieving better project results and perhaps even a better world. New paradigm approaches to farming systems improvement feature multiple collaborators in work that is increasingly participatory, process-oriented and diverse in outcome, and this is often accompanied by pressure for rapid feedback and dialogue. Conventional objectives-based evaluation methods are insufficient to capture the range of unanticipated outcomes that this work may produce, and may be incompatible with a participatory ethos. In the contemporary farming systems improvement context, evaluation is most valuable when it has short cycles and fosters reflection. Two contrasting techniques that offer promise for meeting these needs are the most significant change technique and participatory approaches to program logic. In a radical departure from conventional monitoring against predetermined quantitative indicators, the most significant change technique involves the regular collection and participatory interpretation of ‘stories’ about change. Program logic is the rationale behind a program or project: the understood cause and effect relationships between project activities, outputs, intermediate outcomes and ultimate outcomes. When developed in group situations, program logic enables participants to question these cause and effect assumptions and so improve project design. These techniques can supplement traditional approaches, closing some of the information gaps identified in contemporary farming systems improvement work.


Acknowledgments

I would like to acknowledge that, while the views presented are my own, a considerable amount of the work behind these ideas was developed in conjunction with the Evaluation Support team, Department of Primary Industries, Victoria. I was fortunate enough to work with this team and I thank them for allowing me to present these ideas. More information about the work of this team can be found in McDonald et al. (2003).


References


Bennett CF (1975) Up the hierarchy. Journal of Extension 1, 6–12.

Dart JJ (1999b) The tale behind the performance story approach. Evaluation News and Comment 8, 12–13.

Davies RJ (1996) ‘An evolutionary approach to facilitating organisational learning: an experiment by the Christian Commission for Development in Bangladesh.’ (Centre for Development Studies: Swansea, UK) Available online at: http://www.swan.ac.uk/cds/rd/ccdb.htm (verified 6 June 2005).

Farrington J, Nelson J (1997) Using logframes to monitor and review farmer participatory research. Network Paper No. 73, Overseas Development Institute, London.

Fetterman DM, Kaftarian SJ, Wandersman A (1996) ‘Empowerment evaluation: knowledge and tools for self-assessment and accountability.’ (Sage: Thousand Oaks)

Guerin LJ, Guerin TF (1994) Constraints to the adoption of innovation in agricultural research and environmental management: a review. Australian Journal of Experimental Agriculture 34, 549–571.

McDonald B, Rogers P, Kefford B (2003) Teaching people to fish? Building the evaluation capability of public sector organisations. Evaluation 9, 9–30.

Mark MM, Henry GT, Julnes G (2000) ‘Evaluation: an integrated framework for understanding, guiding, and improving public and nonprofit policies and programs.’ (Jossey-Bass: San Francisco)

Mosse D, Ekande T, Sodhi PS, Jones S, Mehta M, Moitra U (1995) ‘Approaches to participatory planning: a review of the KRIBP experience.’ (Centre for Development Studies, University of Wales: Swansea)

Oakley P, Pratt B, Clayton A (1998) ‘Outcomes and impact: evaluating change in social development.’ (INTRAC: Oxford)

Owen JM (1993) ‘Program evaluation, forms and approaches.’ (Allen and Unwin: St Leonards)

Murray P (2000) Evaluating participatory extension programs: challenges and problems. Australian Journal of Experimental Agriculture 40, 519–526.

Patton MQ (1997) ‘Utilization-focused evaluation.’ (Sage: Thousand Oaks)

Pawson R, Tilley N (1997) ‘Realistic evaluation.’ (Sage: London)

Pretty JN (1995) Participatory learning for sustainable agriculture. World Development 23, 1247–1263.

Tyler RW (1967) Changing the concepts of educational evaluation. In ‘Perspectives of curriculum evaluation. Vol. 1’. (Ed. RE Stake) (Rand McNally: New York)

Worthen BR, Sanders JR, Fitzpatrick JL (1997) ‘Program evaluation: alternative approaches and practical guidelines.’ (Longman: New York)