Time to establish comprehensive long-term monitoring of Australian medical graduates?
Christine Jorm A B, Jane Bleasel A and Inam Haq A
A The University of Sydney, Edward Ford Building, NSW 2006, Australia. Email: jane.bleasel@sydney.edu.au; inam.haq@sydney.edu.au
B Corresponding author. Email: christine.jorm@sydney.edu.au
Australian Health Review 42(6) 635-639 https://doi.org/10.1071/AH16292
Submitted: 20 December 2016 Accepted: 6 June 2017 Published: 7 December 2017
Journal Compilation © AHHA 2018 Open Access CC BY-NC-ND
Abstract
We believe that the well being of our medical students (and medical staff throughout the continuum of practice) matters too much not to ask, ‘How do they feel?’ Society, and students themselves, have invested too much in their education not to query ‘How well are they performing in the workplace?’. Our accountability to the community demands we ask, ‘How are their patients going?’ This article presents a schema for building long-term monitoring in Australia, using linked and reliable data, that will enable these questions to be answered. Although the answers will be of interest to many, medical schools will then be well placed to alter their programs and processes based on these three domains of graduate well being, workplace performance and patient outcomes.
Introduction
Australia’s 20 medical schools take quite different approaches to selection and education. Some offer graduate entry, others accept undergraduates, and the duration of their courses varies: 4, 5 or 6 years. The effect of variation between schools on graduate ability and doctors’ long-term performance is unexplored, despite much strongly held opinion. Medical education has always been expensive for the community, and students themselves now face significant higher-education debts. The cost of the differences in course duration alone would suggest the need for some investigation, but there are more urgent issues.
The 2015 Australian Review of Medical Intern Training1 recently acknowledged that medical graduates ‘…enter the health system highly qualified from a variety of…medical programs but with often limited experience in actual patient care and no baseline of work-ready capabilities.’
Other data also suggest all is not right longer term with the graduates of our medical schools: some are depressed and burnt out,2,3 some are a source of patient complaints4 and many fail to practise evidence-based care.5,6 This is despite Australian Health Practitioner regulatory requirements, both those associated with registration (self- and peer disclosure) and the professional practice standards described within Good Medical Practice.7
Some have advocated for a national licensing examination to ensure a more consistent standard of graduate. Australia has no such examination (unlike the US, and soon the UK); instead, there is a collaborative approach to assessing competency through shared examination questions and examination stations.8 However, any examination system is limited because ‘…focus on measurable behaviours [may] ignore the kinds of higher-order thinking and acting that constitute competence in demanding work, such as medicine.’9 Skills rather than knowledge are the modern priority: clinicians no longer need to remember vast numbers of facts, and crucial skills now include translation of evidence, coordination of care and collaboration. The right attitudes and attributes (e.g. self-reflection, motivation) are also required to excel in the use of such skills. Compared with knowledge, examination of skills is difficult and resource intensive, and much more useful for students when formative (hence the call to move to ‘assessment for learning, not of learning’10).
Since 1985, medical schools have been supported and guided by the Australian Medical Council (AMC).11 The AMC medical school accreditation process is one of self- and peer assessment against a set of broad standards. The AMC has encouraged diversity of approaches to medical education and its expert assessors provide an improvement-focused program assessment. US commentators recently criticised the effectiveness of this style of medical program accreditation as a stand-alone assurance of quality.12 Yet, the AMC Standards for Assessment and Accreditation of Primary Medical Programs specifically require Australian schools to also track their graduates:
6.2.1 The medical education provider analyses the performance of cohorts of students and graduates in relation to the outcomes of the medical program.13
Providers currently have few tools available to assist them to do this. We propose that an approach to monitoring graduates long term be considered. The aim of such monitoring would be to improve the education students receive, tailoring it accurately based on Australian experiences and thus improve patient care. The proposed system has three elements: (1) ‘How do they feel?’; (2) ‘How are they doing?’; and (3) ‘How are their patients going?’ Explanation of these three elements and possibilities for monitoring follows.
Well being, mental health and career satisfaction: ‘How do they feel?’
The issue of poor doctor mental health, and its effects, has been publicised in Australian media recently with the tragic report of the suicide of three junior doctors.14 Issues cited include workplace culture, competition for training places, the financial cost of postgraduate examinations and the inability of the profession to look objectively at balancing career and life outside work.14 A recent Lancet editorial cited the fragmentation of clinical teams, shift working and loss of locus of control as contributors to stress and to the erosion of mechanisms that previously maintained resilience.15
The demonstrable extent of the problem of reduced well being and mental illness in students and doctors2,3 supports long-term monitoring. Other countries do this. The Longitudinal Study of Norwegian Medical Students and Doctors16 has a major focus on well being and involves all Norwegian medical schools. It commenced in 1993 and is now reporting on 15-year follow-up work17. The UK has a National Training Survey (http://www.gmc-uk.org/education/surveys.asp accessed 3 August 2017); this compulsory survey is completed by those in post-primary medical qualification training and captures data on trainee satisfaction with their programs, compliance with General Medical Council standards and Royal College speciality curricula, data from speciality trainee annual progression reports and data from employers. The data are freely available online, allowing comparison of outcomes according to several domains, including qualifying medical school, geographical region, medical and surgical speciality, etc.
Australia has a Medical Schools Outcomes Database (MSOD)18, a national data collection established in 2004 by Medical Deans Australia and New Zealand (MDANZ) and recruiting over 90% of medical students (with over 30 000 participants now in the dataset). There are unique identifiers for participants and a linkage is currently being sought across the MSOD, medical registration and national health workforce datasets. The focus of the future linkage work is on career choice (e.g. intentions vs what happens). The questionnaire contains several interesting demographic and attitudinal items (including documented student concern about educational debt and satisfaction with medical schooling). The first cohort is now in their 5th year after graduation. The scope of the survey could be enhanced with the addition of items on career satisfaction, well being and mental health to create a more detailed and curriculum-relevant picture than is currently available. This work could even develop into a longitudinal study with the addition of items from the MSOD survey into the existing surveys that form part of Australian Health Practitioner Regulation Agency (AHPRA) registration processes. This could also provide an opportunity to look at issues around equity with regard to gender and career choices, time out of work, return to work etc. This would require agreements to share data and an understanding from students, doctors, health regulators and workforce planners around how the data will be used.
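The linkage being sought across the MSOD, registration and workforce datasets is, at its simplest, deterministic matching on the participants’ unique identifiers. The following sketch is purely illustrative: the dataset extracts, column names and identifier values are invented for this example and do not reflect the real MSOD or AHPRA schemas.

```python
import pandas as pd

# Hypothetical extracts: all column names and values are illustrative only,
# not the real MSOD or AHPRA data structures.
msod = pd.DataFrame({
    "student_id": ["S001", "S002", "S003"],
    "medical_school": ["Sydney", "Monash", "UQ"],
    "grad_year": [2014, 2015, 2015],
    "career_intention": ["surgery", "general practice", "psychiatry"],
})
registration = pd.DataFrame({
    "student_id": ["S001", "S002"],
    "ahpra_number": ["MED0001", "MED0002"],
    "registered_specialty": ["surgery", "paediatrics"],
})

# Deterministic linkage on the shared unique identifier; a left join keeps
# graduates absent from the registration extract (e.g. those working overseas),
# and the indicator column records which rows actually matched.
linked = msod.merge(registration, on="student_id", how="left", indicator=True)

# One question such a linkage supports: did stated career intention match
# the eventual registered specialty? (Unmatched graduates evaluate to False.)
linked["intention_matched"] = (
    linked["career_intention"] == linked["registered_specialty"]
)
print(linked[["student_id", "medical_school", "intention_matched"]])
```

A real linkage would also need probabilistic matching where identifiers are missing or inconsistent, and the data-sharing agreements discussed above.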
Actions are possible. The culture of performance and achievement that extends from high school to university and throughout a medical career is relentless, with the added stigma of perceived weakness if individuals ask for help or admit they are not coping.19 In theory, educational interventions that increase engagement,20 improve students’ efficacy and resilience (see, for instance, Wald et al.21) and help them develop a positive sense of self22 can be protective against burn-out and depression. Griffith University investigated the association between training stress, coping skills and thoughts of dropping out of medicine in students and junior doctors.23 Those serious about dropping out used avoidance and risky behaviour as coping mechanisms. In the UK, a tool based on established occupational health risk models has been developed to look at medical student well being.24 When results of six medical schools were compared, schools did well in development of skills and knowledge, but scored more variably in areas such as work–life balance, academic demands and health. This formative tool can be used by medical schools to review their curricula and learning environment for quality improvement, and could be used to compare medical schools.
Discussions we have had with medical students and young doctors have highlighted the importance of building support networks and of self-monitoring of mental health and stress. Work clarifying the precise value of the many strategies that could be promoted, such as mindfulness, peer support groups, meditation, exercise and easy access to psychological services, is also needed. See Box 1 for more details.
Box 1. Proposed actions to monitor and improve well being, mental health and career satisfaction
Behaviour and performance: ‘How are they doing?’
Currently, Australian medical schools do not systematically and regularly engage in ‘post-market surveillance’ with regard to the competence and professional behaviours of their students. If they did, this would enable schools to improve their management of conduct and health issues, their approach to progression and potentially reconsider entry criteria.
There is much discussion of monitoring and influencing professional behaviours within medical school, which is beyond the scope of this paper (certainly, the disorganised and those who resent direction often stay that way; hence the noted association between failure to bring vaccination certificates on entry and later professional behaviour problems25). To date, there has been little Australian research26,27 on medical graduate performance in the workplace. However, it has been a recent focus in the UK, with studies of the ‘Foundation Year’.28–30
In an ideal world, reliable junior doctor assessments could be linked with medical school entry and assessment data. Unfortunately, New South Wales prevocational training assessments have been found to be superficial31 and Western Australian workplace assessments of limited reliability.32 AHPRA is apparently concerned about the limited utility of current assessments for detecting problem doctors,32 and improved instruments and supervisor training seem likely. However, limited long-term investment in trainees by health services and supervisors (due to short-term rotations and contracts) will limit the effectiveness of any instrument.33
Having a valid workplace assessment instrument alone will not be sufficient; it should be combined with study of successful graduates and enquiry into any perceived skill and capability deficits.34,35 Medical schools may have significant support structures in place, in addition to policies and procedures that allow students with health issues to complete the course with reasonable adjustments. Students may expect these same supports in the workplace, and the realisation that this is often not the case is a shock. Medical schools need to work with students and health service employers to narrow the expectation–reality gap.
Managers of junior medical officers are keen for a feed-forward approach, with identification of issues that may mean junior doctors need more support. Cooperation and communication between employers, medical schools and possibly regulators could enable both feed-forward and feedback on doctors and trainees at risk (e.g. those with prior complaints or a lack of response to feedback). There are privacy issues in both directions, but these are not insurmountable when the overriding concerns are patient and doctor safety, and meetings allowing for frank two-way discussion could be organised. The creation of positive case studies could also help improve workplace support for doctors’ health and well being. Finally, such linkage would ensure appropriate employer feedback into curriculum redesign.35
There could also be behaviour and performance follow-up past the junior doctor years. Medical board sanctions are not a good measure, because these are rare and typically occur after years of harm or underperformance. Formal complaints (to healthcare complaints commissioners or AHPRA) are far more frequent and are often associated with poor communication by practitioners (skills that are teachable and assessable). Practitioner characteristics (e.g. age, sex, type of practice) form part of a recently developed predictive score for complaints,4 and the medical school attended could form part of such future analyses. See Box 2 for more details.
Box 2. Proposed actions to monitor and improve behaviour and performance
Measurement of patient outcomes: ‘How are their patients going?’
Patient outcomes are a key long-term measure. Many apparently competent graduates go on to provide suboptimal care to patients during their careers, especially by failing to practise evidence-based medicine. In one US study, up to 35% of end-of-life Medicare expenditures, and 12% of overall Medicare expenditures, were explained by physician beliefs justified neither by patient preferences nor by evidence of clinical effectiveness.36 The Australian Atlases of Healthcare Variation (see https://www.safetyandquality.gov.au/atlas/atlas-2015/ and https://www.safetyandquality.gov.au/atlas/atlas-2017/, both accessed 3 August 2017) have demonstrated substantial variation in clinical practice, much of it unexplained. Doctors generate unnecessary tests and procedures5 and financially exploit patients.6 It is thought that ‘physician decision-making skills and perceptions of what constitutes evidence-based practice are influenced by the training they received in medical school’.37 Regardless of what teaching may be provided on current concepts of evidence-based medicine, curriculum practices that encourage self-directed learning have the potential to increase the ability of students to continue learning well once employed, when they must identify and fill their own educational deficits.38
The effect of medical schooling on patient outcomes is too complex an intervention to study easily; many factors that determine the care the patient receives are not under the control of an individual provider, but affected by availability, patient preference, funding models, actions of other members of the health care team, etc. (the ‘dilution effect’).39 This is particularly so for junior medical officers; thus, for this group, process measures such as teamwork competency offer more promise.
However, we do now have the resources and skills to undertake meaningful big data outcomes research on independent practitioners. Variation in patient outcomes related to education has been demonstrated for US speciality training programs. A study reflecting nearly 5 million deliveries, more than 4000 obstetricians and more than 100 residency programs found programs were associated with substantial variation in maternal complication rates, such that women treated by obstetricians who trained in programs with the highest complication rate had a complication rate one-third higher than those treated by obstetricians trained in the better-performing programs.40 Other linkage work with a billing (cost) focus revealed physician spending patterns were associated with regional spending patterns during their residency training.41 The term ‘imprinting’ has been used for these long-lasting training effects.40
Although graduates spend more time in training programs than they do in medical school, a recent US study42 linked a physician dataset (877 000 practitioners) with Medicare procedures and payment and showed geographic distribution of practitioners and their billing practices correlated with medical school tuition costs. The authors of that study also found other divergences suggesting ‘early influences during medical school may have lasting impacts on a physician’s future clinical decisions’.42 To understand more, a snapshot study such as this needs to be developed with longitudinal work. See Box 3 for more details.
Box 3. Proposed action to monitor and improve patient outcomes
Research is funded into variation in provider-sensitive interventions by medical school and speciality training program. After approval from the Department of Health, linkage with Medicare Benefits Schedule (MBS) claims data is undertaken. A focus could be on clinical areas identified by the Australian Commission on Safety and Quality in Health Care (ACSQHC) as priority areas for action, or on high-cost, low-value procedures. Possible sponsors: ACSQHC, Australian Institute of Health and Welfare (AIHW), Australian Health Practitioner Regulation Agency (AHPRA), National Health and Medical Research Council (NHMRC).
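In its simplest form, the analysis proposed in Box 3 reduces to a grouped rate calculation over linked claims: for a given provider-sensitive intervention, compare the rate at which it is billed across graduates of different medical schools. The sketch below uses invented toy data (all identifiers, school labels and figures are hypothetical) to show the shape of such a calculation.

```python
import pandas as pd

# Invented toy data standing in for linked MBS claims: each row is one
# episode of care, flagged if a low-value procedure was billed.
claims = pd.DataFrame({
    "provider_id": ["P1", "P1", "P2", "P2", "P3", "P3", "P3"],
    "medical_school": ["A", "A", "A", "A", "B", "B", "B"],
    "low_value_procedure": [1, 0, 1, 1, 0, 0, 1],
})

# Crude school-level rate of low-value procedures. A real analysis would
# risk-adjust for casemix and model providers as nested within schools
# (the 'dilution effect' discussed above) rather than pooling episodes.
rates = claims.groupby("medical_school")["low_value_procedure"].mean()
print(rates)
```

Even this toy example shows why the multilevel structure matters: differences between schools can be driven by a few high-volume providers unless provider-level clustering is modelled.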
Conclusion
The barriers to establishing the kind of long-term follow-up of medical students we have described are significant. However, such barriers exist in other countries too, and some have nevertheless gained stakeholder cooperation for national data collections, or support for research projects in these areas.
Data of the kind we describe will enable prioritisation of improvements to our costly medical programs. These improvements are likely to include changes to entry processes, to management of conduct and progression, and to the balance of curriculum content (e.g. communication skills and health information literacy skills to support evidence-based practice). However, overemphasis on preparedness for the first year of work should be avoided: it risks biasing the curriculum towards the instrumental skills required of a junior doctor, at the expense of the skills for self-directed learning that will enable graduates to remain continuously competent in evidence-based care throughout their careers. Ideally, a spiral curriculum43 would follow students into the workplace (the prevocational years) and even influence advanced training.
Such improvements to medical schooling become ‘preventative measures’, helping ensure that future practitioners have better well being, attract fewer patient complaints and practise a more reliable standard of evidence-based care.
Competing interests
The authors declare no competing interests.
References
[1] Wilson A, Feyer A. Review of medical intern training – final report. Canberra: Council of Australian Governments; 2015. Available at: http://www.coaghealthcouncil.gov.au/portals/0/review of medical intern training final report publication version.pdf [verified 5 October 2016].
[2] beyondblue. National mental health survey of doctors and medical students. Melbourne: beyondblue; 2013. Available at: https://www.beyondblue.org.au/docs/default-source/research-project-files/bl1132-report---nmhdmss-full-report_web [verified 12 May 2016].
[3] Rogers ME, Creed PA, Searle J. Emotional labour, training stress, burnout, and depressive symptoms in junior doctors. J Vocat Educ Train 2014; 66: 232–48.
[4] Spittal MJ, Bismark MM, Studdert DM. The PRONE score: an algorithm for predicting doctors’ risks of formal patient complaints using routinely collected administrative data. BMJ Qual Saf 2015; 24: 360–8.
[5] Duckett S, Breadon P, Romanes D. Questionable care: avoiding ineffective treatment. Melbourne: Grattan Institute; 2015.
[6] Medibank Private and Royal Australasian College of Surgeons. Surgical variance report: general surgery. 2016. Available at: https://www.surgeons.org/media/24091469/Surgical-Variance-Report-General-Surgery.pdf [verified 22 June 2016].
[7] Medical Board of Australia. Good medical practice: a code of conduct for doctors in Australia. 2014. Available at: http://www.medicalboard.gov.au/Codes-Guidelines-Policies/Code-of-conduct.aspx [verified 6 July 2015].
[8] Wilkinson D. A new paradigm for assessment of learning outcomes among Australian medical students: in the best interest of all medical students. Aust Med Student J 2014; 4: 45–7.
[9] ten Cate O, Billett S. Competency-based medical education: origins, perspectives and potentialities. Med Educ 2014; 48: 325–32.
[10] Harrison C, Wass V. The challenge of changing to an assessment for learning culture. Med Educ 2016; 50: 704–6.
[11] Geffen L. A brief history of medical education and training in Australia. Med J Aust 2014; 201: S19–22.
[12] Holmboe ES, Batalden P. Achieving the desired transformation: thoughts on next steps for outcomes-based medical education. Acad Med 2015; 90: 1215–23.
[13] Australian Medical Council (AMC). Standards for assessment and accreditation of primary medical programs. Canberra: AMC; 2012.
[14] Anonymous. Three of my colleagues have killed themselves. Medicine’s dark secret can’t go on. Sydney Morning Herald 10 February 2017. Available at: http://www.smh.com.au/comment/three-of-my-colleagues-have-killed-themselves-medicines-dark-secret-cant-be-allowed-to-go-on-20170209-gu9crd.html [verified 18 July 2017].
[15] NSW Health Department. The clinician’s toolkit for improving patient care. Sydney: NSW Health Department; 2001. Available at: https://wdhb.org.nz/contented/clientfiles/whanganui-district-health-board/files/rttc_clinician-s-toolkit-for-improving-patient-care-nsw.pdf [verified 25 September 2017].
[16] Nylenna M, Gulbrandsen P, Førde R, Aasland OG. Unhappy doctors? A longitudinal study of life and job satisfaction among Norwegian doctors 1994–2002. BMC Health Serv Res 2005; 5: 44.
[17] Mahmood JI, Grotmol KS, Tesli M, Vaglum P, Tyssen R. Contextual factors and mental distress as possible predictors of hazardous drinking in Norwegian medical doctors: a 15-year longitudinal, nationwide study. Eur Addict Res 2017; 23: 19–27.
[18] Kaur B, Carberry A, Hogan N, Roberton D, Beilby J. The medical schools outcomes database project: Australian medical student characteristics. BMC Med Educ 2014; 14: 180.
[19] Muller D. Kathryn. N Engl J Med 2017; 376: 1101–3.
[20] Billett S. Developing students’ personal epistemologies. Integrating practice-based experiences into higher education. Haarlem: Springer; 2015.
[21] Wald HS, Anthony D, Hutchinson TA, Liben S, Smilovitch M, Donato AA. Professional identity formation in medical education for humanistic, resilient physicians: pedagogic strategies for bridging theory to practice. Acad Med 2015; 90: 753–60.
[22] Richards J, Sweet LP, Billett S. Preparing medical students as agentic learners through enhancing student engagement in clinical education. Asia-Pac J Coop Educ 2013; 14: 251–63.
[23] Rogers ME, Creed PA, Searle J, Nicholls SL. Coping with medical training demands: thinking of dropping out, or in it for the long haul. Stud High Educ 2015; 41: 1715–32.
[24] Lumley S, Ward P, Roberts L, Mann JP. Self-reported extracurricular activity, academic success, and quality of life in UK medical students. Int J Med Educ 2015; 6: 111–17.
[25] Stern DT, Frohna A, Gruppen L. The prediction of professional behaviour. Med Educ 2005; 39: 75–82.
[26] Carr SE, Celenza A, Puddey IB, Lake F. Relationships between academic performance of medical students and their workplace performance as junior doctors. BMC Med Educ 2014; 14: 157.
[27] Dean SJ, Barratt AL, Hendry GD, Lyon PM. Preparedness for hospital practice among graduates of a problem-based, graduate-entry medical program. Med J Aust 2003; 178: 163–6.
[28] Burford B, Whittle V, Vance GH. The relationship between medical student learning opportunities and preparedness for practice: a questionnaire study. BMC Med Educ 2014; 14: 223.
[29] Goodyear HM. First year doctors experience of work related wellbeing and implications for educational provision. Int J Med Educ 2014; 5: 103–9.
[30] Illing JC, Morrow GM, Rothwell nee Kergon CR, Burford BC, Baldauf BK, Davies CL, Peile EB, Spencer JA, Johnson N, Allen M, Morrison J. Perceptions of UK medical graduates’ preparedness for practice: a multi-centre qualitative study reflecting the importance of learning on the job. BMC Med Educ 2013; 13: 34.
[31] Bingham CM, Crampton R. A review of prevocational medical trainee assessment in New South Wales. Med J Aust 2011; 195: 410–12.
[32] Carr SE, Celenza T, Lake FR. Descriptive analysis of junior doctor assessment in the first postgraduate year. Med Teach 2014; 36: 983–90.
[33] Katelaris AG, Jorm C. Improved assessment needed for young doctors. Med J Aust 2011; 195: 369.
[34] Scott G, Chang E, Grebennikov L. Using successful graduates to improve the quality of undergraduate nursing programs. J Teach Learn Grad Employ 2010; 1: 26–44.
[35] Shah M, Grebennikov L, Nair CS. A decade of study on employer feedback on the quality of university graduates. Qual Assur Educ 2015; 23: 262–78.
[36] Cutler D, Skinner J, Stern AD, Wennberg D. Physician beliefs and patient preferences: a new look at regional variation in health care spending. Cambridge, MA: National Bureau of Economic Research; 2013.
[37] Reschovsky JD, Rich EC, Lake TK. Factors contributing to variations in physicians’ use of evidence at the point of care: a conceptual model. J Gen Intern Med 2015; 30: 555–61.
[38] Lambert DR, Lurie SJ, Lyness JM, Ward DS. Standardizing and personalizing science in medical education. Acad Med 2010; 85: 356–62.
[39] Cook DA, West CP. Perspective: reconsidering the focus on ‘outcomes research’ in medical education: a cautionary note. Acad Med 2013; 88: 162–7.
[40] Asch DA, Nicholson S, Srinivas S, Herrin J, Epstein AJ. Evaluating obstetrical residency programs using patient outcomes. JAMA 2009; 302: 1277–83.
[41] Chen C, Petterson S, Phillips R, Bazemore A, Mullan F. Spending patterns in region of residency training and subsequent expenditures for care provided by practicing physicians for Medicare beneficiaries. JAMA 2014; 312: 2385–93.
[42] Feldman K, Chawla NV. Does medical school training relate to practice? Evidence from big data. Big Data 2015; 3: 103–13.
[43] Harden RM. What is a spiral curriculum? Med Teach 1999; 21: 141–3.