RESEARCH ARTICLE (Open Access)

The role of boards in clinical governance: activities and attitudes among members of public health service boards in Victoria

Marie M. Bismark A B , Simon J. Walter A and David M. Studdert A

A University of Melbourne, 207 Bouverie Street, Carlton, Vic. 3010, Australia. Email: swalter@unimelb.edu.au, d.studdert@unimelb.edu.au

B Corresponding author. Email: mbismark@unimelb.edu.au

Australian Health Review 37(5) 682-687 https://doi.org/10.1071/AH13125
Submitted: 18 June 2013  Accepted: 5 September 2013   Published: 4 November 2013

Journal Compilation © AHHA 2013

Abstract

Objectives To determine the nature and extent of governance activities by health service boards in relation to quality and safety of care and to gauge the expertise and perspectives of board members in this area.

Methods This study used an online and postal survey of the Board Chair, Quality Committee Chair and two randomly selected members from the boards of all 85 health services in Victoria. Seventy percent (233/332) of members surveyed responded and 96% (82/85) of boards had at least one member respond.

Results Most boards had quality performance as a standing item on meeting agendas (79%) and reviewed data on medication errors and hospital-acquired infections at least quarterly (77%). Fewer boards benchmarked their service’s quality performance against external comparators (50%) or offered board members formal training on quality (53%). Eighty-two percent of board members identified quality as a top priority for board oversight, yet members generally considered their boards to be a relatively minor force in shaping the quality of care. There was a positive correlation between the size of health services (total budget, inpatient separations) and their board’s level of engagement in quality-related activities. Ninety percent of board members indicated that additional training in quality and safety would be ‘moderately useful’ or ‘very useful’. Almost every respondent believed the overall quality of care their service delivered was as good as, or better than, the typical Victorian health service.

Conclusions Collectively, health service boards are engaged in an impressive range of clinical governance activities. However, the extent of engagement is uneven across boards, certain knowledge deficits are evident and there was wide agreement among board members that further training in quality-related issues would be useful.

What is known about the topic? There is an emerging international consensus that effective board leadership is a vital element of high-quality healthcare. In Australia, new National Health Standards require all public health service boards to have a ‘system of governance that actively manages patient safety and quality risks’.

What does this paper add? Our survey of all public health service Boards in Victoria found that, overall, boards are engaged in an impressive range of clinical governance activities. However, tensions are evident. First, whereas some boards are strongly engaged in clinical governance, others report relatively little activity. Second, despite 8 in 10 members rating quality as a top board priority, few members regarded boards as influential players in determining it. Third, although members regarded their boards as having strong expertise in quality, there were signs of knowledge limitations, including: near consensus that (additional) training would be useful; unfamiliarity with key national quality documents; and overly optimistic beliefs about quality performance.

What are the implications for practitioners? There is scope to improve board expertise in clinical governance through tailored training programs. Better board reporting would help to address the concern of some board members that they are drowning in data yet thirsty for meaningful information. Finally, standardised frameworks for benchmarking internal quality data against external measures would help boards to assess the performance of their own health service and identify opportunities for improvement.

Introduction

National health policy reforms place local governing boards at the centre of a drive toward improved quality of health care.1 Public health service boards have been established in each state and territory, and new National Health Standards require a ‘system of governance that actively manages patient safety and quality risks’.2 These moves respond to an emerging international consensus that board leadership is a vital element of high-quality care.3–10

A growing body of international research suggests that hospitals with boards that are actively engaged in quality issues are more likely to have quality-improvement programs in place and to perform better on indicators such as risk-adjusted mortality rates.9,11 Early evidence from hospital systems overseas suggests that whereas some boards perform strongly, others lack understanding of patient safety problems and receive inadequate information for sound decision making.11 Little is known about Australian boards’ engagement in clinical governance.

We surveyed members of boards of all public hospitals, public health services and multipurpose services in Victoria (‘health services’). Our aims were: (1) to ascertain the nature and extent of the boards’ current activities in overseeing quality; and (2) to describe the expertise and perspectives of board members in this area.


Methods

Setting

Victoria has 85 public health services, ranging from metropolitan services with more than 500 acute care beds to rural services with fewer than five beds. Sixteen health services are located in metropolitan areas; the rest are in regional and rural communities.12 Each is overseen by a board appointed by the Minister for Health.

All boards in Victoria are required to have a quality committee, and to publish an Annual Quality of Care report.

At the time of the study, health services were preparing for the introduction of the National Safety and Quality Health Service Standards, which form the basis for accreditation of public health services from 1 January 2013.2 These standards stipulate that boards are ‘responsible for governing all organisational domains of activity including … safety and quality’. Standard 1 sets forth five criteria in this area (Table 1).


Table 1.  Standard 1: governance for safety and quality in health service organisations2

Study sample and instrument

We sampled four members from each board: the Chair, the Chair of the Quality Committee and two other members selected at random from among those who had served on the board for at least 12 months.

We adapted for the Australian context a survey developed by Jha and Epstein in the United States.11 To ensure the appropriateness of the questions for the Victorian context, we sought feedback from the Victorian Department of Health, the Victorian Healthcare Association and the Victorian Managed Insurance Authority and piloted the survey with three board members and a former board Chair who were not part of the study sample.

The instrument defined ‘quality’ as referring to four dimensions of healthcare: appropriateness, effectiveness, acceptability and safety. It then asked respondents which quality-related activities, from a specified list of possible activities, their boards were undertaking. The activities were derived from a similar list developed by Jha and Epstein,11 the new National Standards,2 and a review of the international literature on clinical governance.

Survey questions also addressed four other domains: board members’ training and perceived expertise; perceptions of their health service’s quality performance; board priorities; and perceptions of the board’s influence over quality.

The Department of Health provided data on the total annual budget and number of inpatient separations for each health service.

Survey administration

A survey research company (Strategic Data) administered the survey in March 2012. Participants could complete the survey online, on paper or by telephone. Non-responders were followed up by email and telephone.

Analysis

We analysed the response data by computing simple counts and cross-tabulations. We constructed a ‘quality activity score’ by assigning one point for each activity a board was undertaking, from a list of 15 activities that members were specifically queried about. A score of 15 points indicated engagement in all 15 quality-related activities.

Responses to questions about expertise, knowledge and attitudes were analysed at the board member level; responses about board activities were analysed at the board level using the Chair’s response if members’ responses were divergent. We tested for statistically significant associations between boards’ quality activity scores and their size − as measured by their total annual budget and total annual inpatient separations, respectively − using Kendall’s rank correlation.
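
To make the scoring and correlation analysis concrete, the sketch below shows how a quality activity score of this kind, and its Kendall rank correlation with measures of service size, could be computed in R. It is not the study’s actual code: the data are simulated and the variable names (activity_1 to activity_15, annual_budget, separations) are hypothetical stand-ins for the survey responses and Department of Health figures.

# Simulated board-level data: one row per board (82 responding boards in the study)
set.seed(1)
boards <- data.frame(
  annual_budget = rlnorm(82, meanlog = 17, sdlog = 1),  # total annual budget ($)
  separations   = rpois(82, lambda = 5000)              # annual inpatient separations
)

# 15 yes/no indicators, one per quality-related activity the board undertakes
activities <- matrix(rbinom(82 * 15, 1, 0.6), nrow = 82,
                     dimnames = list(NULL, paste0("activity_", 1:15)))

# Quality activity score: one point per activity undertaken (range 0-15)
boards$quality_activity_score <- rowSums(activities)

# Kendall's rank correlation between the score and each measure of service size
cor.test(boards$quality_activity_score, boards$annual_budget, method = "kendall")
cor.test(boards$quality_activity_score, boards$separations, method = "kendall")

A rank-based measure such as Kendall’s tau is a natural choice here because it makes no assumptions about the distribution of budgets or separations and copes with the bounded, tie-heavy activity score.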

All analyses were conducted using R version 2.15.1.13 The Human Research Ethics Committee of the University of Melbourne approved the study.


Results

Characteristics of respondents

Of the 332 members surveyed, 70% (233) responded and 96% (82/85) of boards had at least one member respond. Of the three boards that declined to participate, two were in rural areas and one was in a metropolitan area.

Forty-six percent of surveyed board members served on a quality committee of the board. Respondents had an average tenure of 7 years (range 1−36 years). One-third of members had no governance experience before their appointments, and one in five board members had served on the board for 10 or more years. Table 2 describes other characteristics of the respondents.


Table 2.  Characteristics of board members (n = 233 members)

Engagement in quality-related activities

Table 3 reports the proportion of boards undertaking each of 15 quality-related activities. A majority of boards had established quality goals (84%) and regularly monitored progress toward the board’s quality of care plan (77%). By contrast, only half of boards assessed the organisation’s quality against external benchmarks, just over half provided members with formal training on quality-related issues and fewer than one-quarter provided members with training on healthcare disparities. The quality activity score, which is based on the sum of activities each board was undertaking, indicated substantial variation in clinical governance activities across boards (Fig. 1). For example, whereas 25 boards were engaged in fewer than half of the 15 specified activities, 19 boards were undertaking 12 or more of them.


Table 3.  Quality-related activities undertaken by boards (n = 82 boards)
ATSI, Aboriginal and Torres Strait Islander


Fig. 1.  Distribution of boards by total number of quality-related activities they were engaged in, from among a specified list of 15 activities.

The amount of time boards spent on quality of care issues also varied. Seven boards reported spending 10% or less of their time on quality of healthcare issues, whereas nine boards spent more than 30%.

There was a significant positive correlation between the quality activity score at board level and both the annual budget (tau = 0.28, P < 0.0001) and annual inpatient separations (tau = 0.28, P < 0.0001) of the health service. In other words, the boards of larger organisations were more likely to be highly engaged in the quality-related activities covered in our survey than were the boards of smaller organisations. However, this was not uniformly true: some rural boards, for example, had very high quality activity scores, whereas some metropolitan boards reported undertaking fewer than half of the specified activities.

Besides the specific activities they were queried about, some respondents mentioned other quality-related activities in which their board was engaged, including annual quality of care retreats, regular literature reviews, public forums, partnerships with indigenous communities, presentations by patients and families, leadership walk-arounds and quality of care awards.

Perceived expertise and knowledge

Ninety percent (208/233) of board members believed that their board had ‘moderate’ or ‘very significant’ expertise in quality of care issues. Nearly two-thirds (138/231) said that expertise in quality of care issues was important when recommending new board appointees. Nonetheless, 90% of board members (218/233) indicated that additional training would be ‘moderately useful’ or ‘very useful’.

Board members’ familiarity with key quality-related policies, indicators and standards was uneven. Most members were familiar with major Victorian documents, including the Department of Health’s Quality of Care Report guidelines14 (94% of members ‘somewhat familiar’ or ‘very familiar’) and the Patient Satisfaction Monitor15 (91%). There was less familiarity with major national documents: 46% of members were ‘not familiar’ with the Open Disclosure Standard16 and 37% were ‘not familiar’ with the Australian Charter of Healthcare Rights.17

Members of quality committees were more familiar with the Quality of Care Report guidelines than other board members (52% v. 33% ‘very familiar’, P = 0.003), but were no more familiar with the Open Disclosure Standard or the Charter.

Board priorities and influence

From a list of six possible priorities for board oversight − including financial performance, business strategy and operations − most members (82%) identified quality of care as one of the top two priorities. Yet members generally considered their boards to be a relatively minor force in shaping the quality of care. Fewer than 10% of members named the board or the board chair as the first or second most influential actor in determining quality, although 21% named the board’s Quality Committee (Fig. 2). Members rated the Chief Executive Officer as the most influential actor, followed by the Director of Nursing.


Fig. 2.  Perceptions of board members regarding who most influences quality of healthcare.

Perceived performance

Almost every respondent (225/231) believed the overall quality of care delivered by their health service was the same as, or better than, that of the typical Victorian health service. None rated it as worse overall, although six members said they did not know how their health service compared and a small fraction (<1%) rated it as worse on particular dimensions of performance (e.g. having a safe and skilled workforce) (Fig. 3).


Fig. 3.  Board members’ perception of the quality of their health service compared with the typical Victorian health service. Numbers add to less than 100% because some respondents (<5%) selected ‘didn’t know’ in response to questions about how their health service compared with others.


Discussion

Overall, Victorian health service boards were engaged in an impressive range of clinical governance activities. However, tensions were evident. First, whereas some boards appeared to be strongly engaged in quality-related activities, others reported relatively little activity. Organisational size was positively correlated with intensity of engagement. Second, despite 8 in 10 members rating quality as a top board priority, few regarded boards as influential players in determining it. Third, although members regarded their boards as having strong expertise in quality, there were signs of knowledge limitations, including: wide agreement that (additional) training in this area would be useful; unfamiliarity with key national quality documents; and overly optimistic beliefs about quality performance.

Our findings highlight four inter-related challenges for clinical governance by health service boards in Victoria. First, for boards to become active and enthusiastic about quality governance, two elements seem essential: (1) a belief that this is a core part of their mission; and (2) confidence that doing so will drive better outcomes for patients. Our findings suggest that boards have embraced the first element but not the second. Board members felt that they played a relatively modest role in influencing quality, rating their contribution well below that of senior management and clinical leaders.

Second, there is scope to improve boards’ understanding of quality issues. Nearly half of the boards did not offer formal training in this area, and members signalled a strong appetite for it. This finding resonates with results from a 2000 survey of 47 Australian hospital board chairs, which suggested underinvestment in the professional development of board members.18 Further qualitative research is needed to understand the nature and form that such training should take for optimum impact. Options include in-house training by staff with expertise in this area, the expansion of tailored programs offered by organisations such as the Australian Centre for Healthcare Governance, or the addition of a clinical governance module to the governance training offered by the Australian Institute of Company Directors.

Third, our survey pointed to several gaps in measurement. An oft-quoted adage in management circles is that ‘you can’t manage what you don’t measure’. Nearly one-third of the boards did not monitor quality through simplified composite sets of quality indicators, such as dashboards and scorecards.19 A lack of effective reporting structures is a recurring theme in inquiries into quality breakdowns in healthcare institutions.20

Finally, half of the boards did not routinely compare internal quality data against external measures. The absence of standardised frameworks for making such comparisons is likely to be a retarding factor here.21 Although the Department of Health collects volumes of data from health services, few outcome measures are consistently made available to health services to support benchmarking activities in the field (Bismark MM, Studdert DM unpub. data). Some health services have taken the initiative, entering into data-sharing collaborations with peers (e.g. through the Health Roundtable22), but the benefits of such initiatives are confined to member organisations. The lack of benchmarking may go some way toward explaining the uniform belief among respondents that the quality of care delivered by their health service was as good as, or better than, that of others. Jha and Epstein11 found similar overoptimism among hospital board chairs in the USA, and similar misperceptions have been observed in other studies of performance self-assessment by drivers, students, educators and others.23,24 A recognised cause of these so-called ‘Lake Wobegon effects’ (named after Garrison Keillor’s fictional community, in which ‘all the women are strong, all the men are good looking, all the children are above average’) is unavailability or underuse of reliable information on peer performance.

Our study has limitations. The generalisability of our findings outside Victoria is unknown. However, because the new national governance framework borrowed heavily from Victoria’s model, our findings have relevance for boards elsewhere. Additionally, we relied on self-reported measures of knowledge and performance. Despite assurances of anonymity, a degree of social desirability bias is likely, and its effect would be to inflate the picture of activity and engagement. Finally, our study is descriptive: it provides a valuable snapshot of board members’ attitudes and activities in this area. Further work is required to understand whether a causal relationship exists between effective clinical governance and improved patient outcomes in public health services.

Historically, health service boards focussed on financial issues and chief executive performance.25 Quality of care was assumed, its oversight was left to clinical leaders and it tended to be poorly measured.26 That approach is being rewritten today, spurred by mounting evidence that organisational factors, including high-level leadership, influence quality of care.7,9,11,27 Findings from the present study point to several steps that may assist health service boards in Australia to enhance the depth and value of their contributions to ensuring that patients receive better care.


Competing interests

The authors declare there are no competing interests.



Acknowledgements

This study was funded by the Victorian Healthcare Association, the Victorian Managed Insurance Authority, and an ARC Laureate Fellowship (FL110100102 to Dr Studdert) from the Australian Research Council. The research was conducted independently from the funders. The funders had no role in the collection, analysis, or interpretation of data, writing of the report, or the decision to submit the article for publication.


We thank the health service boards that took part in this study. We also thank our project advisory group, led by Trevor Carr, Alison Brown and Liz Cox, and the Council of Board Chairs for their helpful comments and suggestions.


References

[1]  Rudd K, Swan W, Roxon N. A national health and hospitals network for Australia’s future. Canberra: Commonwealth of Australia; 2010.

[2]  Australian Commission on Safety and Quality in Health Care. Safety and quality improvement guide standard 1: governance for safety and quality in health service organisations. Sydney: ACSQHC; 2012. Available at http://www.safetyandquality.gov.au/wp-content/uploads/2012/10/Standard1_Oct_2012_WEB1.pdf [verified March 2013]

[3]  National Quality Forum. Hospital governing boards and quality of care: a call to responsibility. Washington, DC: National Quality Forum; 2004.

[4]  Conway J. Getting boards on board: engaging governing boards in quality and safety. Jt Comm J Qual Patient Saf 2008; 34 214–20.

[5]  Braithwaite J, Travaglia JF. An overview of clinical governance policies, practices and initiatives. Aust Health Rev 2008; 32 10–22.

[6]  Baker GR, Denis JL, Pomey MP, MacIntosh-Murray A. Designing effective governance for quality and safety in Canadian healthcare. Healthc Q 2010; 13 38–45.

[7]  Weiner BJ, Shortell SM, Alexander J. Promoting clinical involvement in hospital quality improvement efforts: the effects of top management, board, and physician leadership. Health Serv Res 1997; 32 491–510.

[8]  Vaughn T, Koepke M, Kroch E, Lehrman W, Sinha S, Levey S. Engagement of leadership in quality improvement initiatives: executive quality improvement survey results. J Patient Saf 2006; 2 2–9.

[9]  Jiang HJ, Lockee C, Bass K, Fraser I. Board oversight of quality: any differences in process of care and mortality? J Healthc Manag 2009; 54 15–30.

[10]  Joshi MS, Hines SC. Getting the board on board: engaging hospital boards in quality and patient safety. Jt Comm J Qual Patient Saf 2006; 32 179–87.

[11]  Jha A, Epstein A. Hospital governance and the quality of care. Health Aff 2010; 29 182–7.

[12]  Department of Health. Annual report 2011–2012. Melbourne: Department of Health; 2012. Available at http://docs.health.vic.gov.au/docs/doc/Department-of-Health-Annual-Report-2011-12 [verified September 2013]

[13]  R Core Team. R: A language and environment for statistical computing. Vienna: R Foundation for Statistical Computing; 2012.

[14]  Department of Health. Quality of care reports 2011–12 recommended reporting. Melbourne: Department of Health; 2012. Available at http://docs.health.vic.gov.au/docs/doc/27224D7FD4D839F3CA257A17000A28E8/$FILE/Quality%20of%20care%20reports%20-%20recommended%20reporting%202011-12.pdf [verified March 2013]

[15]  Department of Health. Victorian patient satisfaction monitor year 11 annual report July 2011 to June 2012. Melbourne: State Government of Victoria; 2012. Available at http://www.vpsm.com.au/dnld/VPSM%20Annual%20Report%202011-2012.pdf [verified March 2013]

[16]  Australian Council for Safety and Quality in Healthcare. Open disclosure standard: a national standard for open communication in public and private hospitals following an adverse event in healthcare. Canberra: ACSQHC; 2003. Available at http://www.safetyandquality.gov.au/our-work/open-disclosure/the-open-disclosure-standard/ [verified March 2013]

[17]  Australian Commission on Safety and Quality in Healthcare. Australian charter of healthcare rights. Sydney: ACSQHC; 2008. Available at http://www.safetyandquality.gov.au/wp-content/uploads/2012/01/Charter-PDf.pdf [verified March 2013]

[18]  Tregoning S. Hospital board structure: changing form and changing issues. Aust Health Rev 2000; 23 28–37.

[19]  Eckerson W. Performance dashboards: measuring, monitoring and managing your business. New Jersey: John Wiley and Sons; 2011.

[20]  Walshe K. Understanding and learning from organisational failure. Qual Saf Health Care 2003; 12 81–2.

[21]  Healy J, Braithwaite J. Designing safer health care through responsive regulation. Med J Aust 2006; 184 S56–9.

[22]  The Health Roundtable. Annual report 2012. Terrigal, NSW: The Health Roundtable; 2012.

[23]  Cannell JJ. Nationally normed elementary achievement testing in America’s public schools. How all fifty states are above the national average. Educ Meas 1988; 7 5–9.

[24]  Svenson O. Are we all less risky and more skillful than our fellow drivers? Acta Psychol 1981; 47 143–8.

[25]  Margolin FS, Hawkins S, Alexander JA, Prybil L. Hospital governance: initial summary report of 2005 survey of CEOs and Board Chairs. Chicago: Health Research and Educational Trust; 2006.

[26]  Braithwaite J, Healy J, Dwan K. The governance of health safety and quality. Canberra: Australian Council for Safety and Quality in Health Care; 2005.

[27]  Davies HT, Nutley SM, Mannion R. Organisational culture and quality of health care. Qual Health Care 2000; 9 111–9.