Residency training, in addition to developing clinical competence among trainees, is charged with improving resident teaching skills. The Liaison Committee on Medical Education and the Accreditation Council for Graduate Medical Education require that residents be provided with training or resources to develop their teaching skills.[1, 2] A variety of resident‐as‐teacher (RaT) programs have been described; however, the optimal format of such programs remains in question.[3] High‐fidelity medical simulation using mannequins has been shown to be an effective teaching tool in various medical specialties[4, 5, 6, 7] and may prove to be useful in teacher training.[8] Teaching in a simulation‐based environment can give participants the opportunity to apply their teaching skills in a clinical environment, as they would on the wards, but in a more controlled, predictable setting and without compromising patient safety. In addition, simulation offers the opportunity to engage in deliberate practice by allowing teachers to facilitate the same case on multiple occasions with different learners. Deliberate practice, which involves task repetition with feedback aimed at improving performance, has been shown to be important in developing expertise.[9]
We previously described the first use of a high‐fidelity simulation curriculum for internal medicine (IM) interns focused on clinical decision‐making skills, in which second‐ and third‐year residents served as facilitators.[10, 11] Herein, we describe a RaT program in which residents participated in a workshop, then served as facilitators in the intern curriculum and received feedback from faculty. We hypothesized that such a program would improve residents' teaching and feedback skills, both in the simulation environment and on the wards.
METHODS
We conducted a single‐group study evaluating teaching and feedback skills among upper‐level resident facilitators before and after participation in the RaT program. We measured residents' teaching skills using pre‐ and post‐program self‐assessments as well as evaluations completed by the intern learners after each session and at the completion of the curriculum.
Setting and Participants
We embedded the RaT program within a simulation curriculum administered July to October of 2013 for all IM interns at Massachusetts General Hospital (n = 52; interns in the preliminary program who planned to pursue another field after completion of the intern year were excluded). We invited postgraduate year (PGY) II and III residents (n = 102) via email to serve as facilitators in the IM simulation program. The curriculum consisted of 8 cases focusing on acute clinical scenarios encountered on the general medicine wards. The cases were administered during 1‐hour sessions 4 mornings per week from 7 AM to 8 AM, prior to clinical duties. Interns completed the curriculum over 4 sessions during their outpatient rotation. The case topics were (1) hypertensive emergency, (2) post‐procedure bleed, (3) congestive heart failure, (4) atrial fibrillation with rapid ventricular response, (5) altered mental status/alcohol withdrawal, (6) nonsustained ventricular tachycardia heralding acute coronary syndrome, (7) cardiac tamponade, and (8) anaphylaxis. During each session, groups of 2 to 3 interns worked through 2 cases using a high‐fidelity mannequin (Laerdal 3G, Wappingers Falls, NY) with 2 resident facilitators. One facilitator operated the mannequin, while the other served as a nurse. Each case was followed by a structured debriefing led by 1 of the resident facilitators (facilitators switched roles for the second case). The number of sessions facilitated varied for each resident based on individual schedules and preferences.
Four senior residents who were appointed as simulation leaders (G.A.A., J.K.H., R.K., Z.S.) and 2 faculty advisors (P.F.C., E.M.M.) administered the program. Simulation resident leaders scheduled facilitators and interns and participated in a portion of simulation sessions as facilitators, but they were not analyzed as participants for the purposes of this study. The curriculum was administered without interfering with clinical duties, and no additional time was protected for interns or residents participating in the curriculum.
Resident‐as‐Teacher Program Structure
We invited participating resident facilitators to attend a 1‐hour interactive workshop prior to serving as facilitators. The workshop focused on building learner‐centered and small‐group teaching skills, as well as introducing residents to a 5‐stage debriefing framework developed by the authors and based on simulation debriefing best practices (Table 1).[12, 13, 14]
Stage of Debriefing | Action | Rationale |
---|---|---|
Emotional response | Elicit learners' emotions about the case | It is important to acknowledge and address both positive and negative emotions that arise during the case before debriefing the specific medical and communications aspects of the case. Unaddressed emotional responses may hinder subsequent debriefing. |
Objectives* | Elicit learners' objectives and combine them with the stated learning objectives of the case to determine debriefing objectives | The limited amount of time allocated for debriefing (15–20 minutes) does not allow the facilitator to cover all aspects of medical management and communication skills in a particular case. Focusing on the most salient objectives, including those identified by the learners, allows the facilitator to engage in learner‐centered debriefing. |
Analysis | Analyze the learners' approach to the case | Analyzing the learners' approach to the case using the advocacy‐inquiry method[11] seeks to uncover the learners' assumptions and the frameworks behind the decisions made during the case. This approach allows the facilitator to understand the learners' thought process and target teaching points to more precisely address the learners' needs. |
Teaching | Address knowledge gaps and incorrect assumptions | Learner‐centered debriefing within a limited timeframe requires teaching to be brief and targeted toward the defined objectives. It should also address the knowledge gaps and incorrect assumptions uncovered during the analysis phase. |
Summary | Summarize key takeaways | Summarizing highlights the key points of the debriefing and can be used to suggest further exploration of topics through self‐study (if necessary). |
Resident facilitators were observed by simulation faculty and simulation resident leaders throughout the intern curriculum and given structured feedback either in‐person immediately after completion of the simulation session or via a detailed same‐day e‐mail if the time allotted for feedback was not sufficient. Feedback was structured by the 5 stages of debriefing described in Table 1, and included soliciting residents' observations on the teaching experience and specific behaviors observed by faculty during the scenarios. E‐mail feedback (also structured by stages of debriefing and including observed behaviors) was typically followed by verbal feedback during the next simulation session.
The RaT program was composed of 3 elements: the workshop, case facilitation, and direct observation with feedback. Because we felt that the opportunity for directly observed teaching and feedback in a ward‐like controlled environment was a unique advantage offered by the simulation setting, we included all residents who served as facilitators in the analysis, regardless of whether or not they had attended the workshop.
Evaluation Instruments
Survey instruments were developed by the investigators, reviewed by several experts in simulation, pilot tested among residents not participating in the simulation program, and revised by the investigators.
Pre‐program Facilitator Survey
Prior to the RaT workshop, resident facilitators completed a baseline survey evaluating their preparedness to teach and give feedback on the wards and in a simulation‐based setting on a 5‐point scale (see Supporting Information, Appendix I, in the online version of this article).
Post‐program Facilitator Survey
Approximately 3 weeks after completion of the intern simulation curriculum, resident facilitators were asked to complete an online post‐program survey, which remained open for 1 month (residents completed this survey anywhere from 3 weeks to 4 months after their participation in the RaT program depending on the timing of their facilitation). The survey asked residents to evaluate their comfort with their current post‐program teaching skills as well as their pre‐program skills in retrospect, as previous research demonstrated that learners may overestimate their skills prior to training programs.[15] Resident facilitators could complete the surveys nonanonymously to allow for matched‐pairs analysis of the change in teaching skills over the course of the program (see Supporting Information, Appendix II, in the online version of this article).
Intern Evaluation of Facilitator Debriefing Skills
After each case, intern learners were asked to anonymously evaluate the teaching effectiveness of the lead resident facilitator using the adapted Debriefing Assessment for Simulation in Healthcare (DASH) instrument.[16] The DASH instrument evaluated the following domains: (1) instructor maintained an engaging context for learning, (2) instructor structured the debriefing in an organized way, (3) instructor provoked in‐depth discussions that led me to reflect on my performance, (4) instructor identified what I did well or poorly and why, (5) instructor helped me see how to improve or how to sustain good performance, (6) overall effectiveness of the simulation session (see Supporting Information, Appendix III, in the online version of this article).
Post‐program Intern Survey
Two months following the completion of the simulation curriculum, intern learners received an anonymous online post‐program evaluation assessing program efficacy and resident facilitator teaching (see Supporting Information, Appendix IV, in the online version of this article).
Statistical Analysis
Teaching skills and learners' DASH ratings were compared using the Student t test, Pearson χ² test, and Fisher exact test, as appropriate. Pre‐ and post‐program ratings of teaching skills were analyzed both in aggregate and as matched pairs.
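As an illustration, the matched‐pairs comparison described above can be sketched with standard statistical tooling. The ratings below are hypothetical placeholders (not the study data), and availability of `scipy` is assumed:

```python
# Sketch of a matched-pairs (paired) Student t test, as used to compare
# pre- vs. post-program self-ratings. The ratings are HYPOTHETICAL
# placeholders, not data from the study.
from scipy import stats

# One (pre, post) pair of 5-point self-ratings per resident
pre = [3, 4, 3, 2, 4, 3, 3, 2, 4, 3]
post = [4, 4, 4, 3, 5, 4, 4, 3, 4, 4]

# Paired t test: is the mean within-resident change different from zero?
t_stat, p_value = stats.ttest_rel(post, pre)

mean_change = sum(b - a for a, b in zip(pre, post)) / len(pre)
print(f"mean change = {mean_change:.2f}, t = {t_stat:.2f}, P = {p_value:.4f}")
```

For categorical outcomes (e.g., survey response counts), `stats.fisher_exact` or `stats.chi2_contingency` would be substituted, matching the Fisher exact and Pearson χ² tests the authors name.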
The study was approved by the Partners Institutional Review Board.
RESULTS
Forty‐one resident facilitators participated in 118 individual simulation sessions encompassing 236 case scenarios. Thirty‐four residents completed the post‐program facilitator survey and were included in the analysis. Of these, 26 (76%) participated in the workshop and completed the pre‐program survey. Twenty‐three of the 34 residents (68%) completed the post‐program evaluation nonanonymously (13 PGY‐II, 10 PGY‐III). Of these, 16 completed the pre‐program survey nonanonymously. The average number of sessions facilitated by each resident was 3.9 (range, 1–12).
Pre‐ and Post‐program Self‐Assessment of Residents' Teaching Skills
Participation in the simulation RaT program led to improvements in resident facilitators' self‐reported teaching skills across multiple domains (Table 2). These results were consistent when using the retrospective pre‐program assessment in matched‐pairs analysis (n = 34) and when comparing the true pre‐program preparedness to post‐program comfort with teaching skills in both non‐matched‐pairs (n = 26) and matched‐pairs (n = 16) fashion. We report P values for the more conservative estimates from the retrospective pre‐program matched‐pairs analysis. The most significant improvements occurred in residents' ability to teach in a simulated environment (2.81 to 4.16, P < 0.001 [5‐point scale]) and to give feedback (3.35 to 3.77, P < 0.001).
Domain | Pre‐program Rating (n = 34) | Post‐program Rating (n = 34) | P Value |
---|---|---|---|
Teaching on rounds | 3.75 | 4.03 | 0.005 |
Teaching on wards outside rounds | 3.83 | 4.07 | 0.007 |
Teaching in simulation | 2.81 | 4.16 | <0.001 |
Giving feedback | 3.35 | 3.77 | <0.001 |
Resident facilitators reported that participation in the RaT program had a significant impact on their teaching skills both within and outside of the simulation environment (Table 3), with the greatest gains in the domain of teaching in simulation. Residents also reported that participation in the program improved their medical knowledge.
Category | Not at All | Slightly Improved | Moderately Improved | Greatly Improved | Not Sure |
---|---|---|---|---|---|
Teaching on rounds, n = 34 | 4 (12%) | 12 (35%) | 13 (38%) | 4 (12%) | 1 (3%) |
Teaching on wards outside rounds, n = 34 | 3 (9%) | 13 (38%) | 12 (35%) | 5 (15%) | 1 (3%) |
Teaching in simulation, n = 34 | 0 (0%) | 4 (12%) | 7 (21%) | 23 (68%) | 0 (0%) |
Giving feedback, n = 34 | 4 (12%) | 10 (29%) | 12 (35%) | 6 (18%) | 2 (6%) |
Medical knowledge, n = 34 | 2 (6%) | 11 (32%) | 18 (53%) | 3 (9%) | 0 (0%) |
Subgroup analyses were performed comparing the perceived improvement in teaching and feedback skills among those who did or did not attend the facilitator workshop, those who facilitated 5 or more versus less than 5 sessions, and those who received or did not receive direct observation and feedback from faculty. Although numerically greater gains were seen across all 4 domains among those who attended the workshop, facilitated 5 or more sessions, or received feedback from faculty, only teaching on rounds and on the wards outside rounds reached statistical significance (Table 4). It should be noted that all residents who facilitated 5 or more sessions also attended the workshop and received feedback from faculty. We also compared perceived improvement among PGY‐II and PGY‐III residents. In contrast to PGY‐II residents, who demonstrated an improvement in all 4 domains, PGY‐III residents only demonstrated improvement in simulation‐based teaching.
Domain | Pre‐program | Post‐program | P Value | Pre‐program | Post‐program | P Value |
---|---|---|---|---|---|---|
 | Facilitated Less Than 5 Sessions (n = 18) | | | Facilitated 5 or More Sessions (n = 11) | | |
Teaching on rounds | 3.68 | 3.79 | 0.16 | 3.85 | 4.38 | 0.01 |
Teaching on wards outside rounds | 3.82 | 4.00 | 0.08 | 3.85 | 4.15 | 0.04 |
Teaching in simulation | 2.89 | 4.06 | <0.01 | 2.69 | 4.31 | <0.01 |
Giving feedback | 3.33 | 3.67 | 0.01 | 3.38 | 3.92 | 0.01 |
 | Did Not Attend Workshop (n = 10) | | | Attended Workshop (n = 22) | | |
Teaching on rounds | 4.00 | 4.10 | 0.34 | 3.64 | 4.00 | <0.01 |
Teaching on wards outside rounds | 4.00 | 4.00 | 1.00 | 3.76 | 4.10 | <0.01 |
Teaching in simulation | 2.89 | 4.11 | <0.01 | 2.77 | 4.18 | <0.01 |
Giving feedback | 3.56 | 3.78 | 0.17 | 3.27 | 3.77 | <0.01 |
 | Received Feedback From Resident Leaders Only (n = 11) | | | Received Faculty Feedback (n = 21) | | |
Teaching on rounds | 3.55 | 3.82 | 0.19 | 3.86 | 4.14 | 0.01 |
Teaching on wards outside rounds | 4.00 | 4.00 | 1.00 | 3.75 | 4.10 | <0.01 |
Teaching in simulation | 2.70 | 3.80 | <0.01 | 2.86 | 4.33 | <0.01 |
Giving feedback | 3.20 | 3.60 | 0.04 | 3.43 | 3.86 | <0.01 |
 | PGY‐II (n = 13) | | | PGY‐III (n = 9) | | |
Teaching on rounds | 3.38 | 3.85 | 0.03 | 4.22 | 4.22 | 1.00 |
Teaching on wards outside rounds | 3.54 | 3.85 | 0.04 | 4.14 | 4.14 | 1.00 |
Teaching in simulation | 2.46 | 4.15 | <0.01 | 3.13 | 4.13 | <0.01 |
Giving feedback | 3.23 | 3.62 | 0.02 | 3.50 | 3.88 | 0.08 |
Intern Learners' Assessment of Resident Facilitators and the Program Overall
During the course of the program, intern learners completed 166 DASH ratings evaluating 34 resident facilitators (see Supporting Information, Appendix V, in the online version of this article). Ratings for the 6 DASH items ranged from 6.49 to 6.73 (7‐point scale), demonstrating a high level of facilitator efficacy across multiple domains. No differences in DASH scores were noted among subgroups of resident facilitators described in the previous paragraph.
Thirty‐eight of 52 intern learners (73%) completed the post‐program survey.
Facilitator Behaviors | Very Often, >75% | Often, >50% | Sometimes, 25%–50% | Rarely, <25% | Never |
---|---|---|---|---|---|
Elicited emotional reactions, n = 38 | 18 (47%) | 16 (42%) | 4 (11%) | 0 (0%) | 0 (0%) |
Elicited objectives from learner, n = 37 | 26 (69%) | 8 (22%) | 2 (6%) | 1 (3%) | 0 (0%) |
Asked to share clinical reasoning, n = 38 | 21 (56%) | 13 (33%) | 4 (11%) | 0 (0%) | 0 (0%) |
Summarized learning points, n = 38 | 31 (81%) | 7 (19%) | 0 (0%) | 0 (0%) | 0 (0%) |
Spoke for less than half of the session, n = 38 | 8 (22%) | 17 (44%) | 11 (28%) | 2 (6%) | 0 (0%) |
All intern learners rated the overall simulation experience as either excellent (81%) or good (19%) on the post‐program evaluation (5 or 4 on a 5‐point Likert scale, respectively). All interns strongly agreed (72%) or agreed (28%) that the simulation sessions improved their ability to manage acute clinical scenarios. Interns reported that resident facilitators frequently utilized the specific debriefing techniques covered in the RaT curriculum during the debriefing sessions (Table 5).
DISCUSSION
We describe a unique RaT program embedded within a high‐fidelity medical simulation curriculum for IM interns. Our study demonstrates that resident facilitators noted an improvement in their teaching and feedback skills, both in the simulation setting and on the wards. Intern learners rated residents' teaching skills and the overall simulation curriculum highly, suggesting that residents were effective teachers.
The use of simulation in trainee‐as‐teacher curricula holds promise because it can provide an opportunity to teach in an environment closely approximating the wards, where trainees have the most opportunities to teach. However, in contrast to true ward‐based teaching, simulation can provide predictable scenarios in a controlled environment, which eliminates the distractions and unpredictability that exist on the wards, without compromising patient safety. Recently, Tofil et al. described the first use of simulation in a trainee‐as‐teacher program.[17] The investigators utilized a 1‐time simulation‐based teaching session, during which pediatric fellows completed a teacher‐training workshop, developed and served as facilitators in a simulated case, and received feedback. The use of simulation allowed fellows an opportunity to apply newly acquired skills in a controlled environment and receive feedback, which has been shown to improve teaching skills.[18]
The experience from our program expands on that of Tofil et al., as well as previously described trainee‐as‐teacher curricula, by introducing a component of deliberate practice that is unique to the simulation setting and has been absent from most previously described RaT programs.[3] Most residents had the opportunity to facilitate the same case on multiple occasions, allowing them to receive feedback and make adjustments. Residents who facilitated 5 or more sessions demonstrated more improvement, particularly in teaching outside of simulation, than residents who facilitated fewer sessions. It is notable that PGY‐II resident facilitators reported an improvement in their teaching skills on the wards, though less pronounced than the improvement in simulation‐based teaching, suggesting that benefits of the program may extend to nonsimulation‐based settings. Additional studies focusing on objective evaluation of ward‐based teaching are needed to further explore this phenomenon. Finally, the self‐reported improvements in medical knowledge by resident facilitators may serve as another benefit of our program.
Analysis of learner‐level data collected in the postcurriculum intern survey and DASH ratings provides additional support for the effectiveness of the RaT program. The majority of intern learners reported that resident facilitators used the techniques covered in our program frequently during debriefings. In addition, DASH scores clustered around maximum efficacy for all facilitators, suggesting that residents were effective teachers. Although we cannot directly assess whether the differences demonstrated in resident facilitators' self‐assessments translated to their teaching or were significant from the learners' perspective, these results support the hypothesis that self‐assessed improvements in teaching and feedback skills were significant.
In addition to improving resident teaching skills, our program had a positive impact on intern learners as evidenced by intern evaluations of the simulation curriculum. While utilizing relatively few faculty resources, our program was able to deliver an extensive and well‐received simulation curriculum to over 50 interns. The fact that 40% of second‐ and third‐year residents volunteered to teach in the program despite the early morning timing of the sessions speaks to the interest that trainees have in teaching in this setting. This model can serve as an important and efficient learning platform in residency training programs. It may be particularly salient to IM training programs where implementation of simulation curricula is challenging due to large numbers of residents and limited faculty resources. The barriers to and lessons learned from our experience with implementing the simulation curriculum have been previously described.[10, 11]
Our study has several limitations. Changes in residents' teaching skills were self‐assessed, which may be inaccurate because learners may overestimate their abilities.[19] Although we collected data on the experiences of intern learners that supported residents' self‐assessments, further studies using more objective measures (such as the Objective Structured Teaching Exercise[20]) should be undertaken. In particular, residents' teaching skills on the wards were assessed only by self‐report. Due to the timing of survey administration, some residents had as little as 1 month between completion of the curriculum and responding to the post‐curriculum survey, limiting their ability to evaluate their teaching skills on the wards. The transferability of skills gained in simulation‐based teaching to teaching on the wards deserves further study. Without a control group, we cannot definitively attribute perceived improvement of teaching skills to the RaT program. However, the frequent use of recommended debriefing techniques, which are not typically taught in other settings, supports the efficacy of the RaT program.
Our study did not allow us to determine which of the 3 components of the RaT program (workshop, facilitation practice, or direct observation and feedback) had the greatest impact on teaching skills or DASH ratings, as those who facilitated more sessions also completed the other components of the program. Furthermore, there may have been a selection bias among facilitators who facilitated more sessions. Because only 16 of 34 participants completed both the pre‐program and post‐program self‐assessments in a non‐anonymous fashion, we were not able to analyze the effect of pre‐program factors, such as prior teaching experience, on program outcomes. It should also be noted that allowing resident facilitators the option to complete the survey non‐anonymously could have biased our results. The simulation curriculum was conducted in a single center, and resident facilitators were self‐selecting; therefore, our results may not be generalizable. Finally, the DASH instrument was only administered after the RaT workshop and was likely limited further by the ceiling effect created by the learners' high satisfaction with the simulation program overall.
In summary, our simulation‐based RaT program improved resident facilitators' self‐reported teaching and feedback skills. Simulation‐based training provided an opportunity for deliberate practice of teaching skills in a controlled environment, which was a unique component of the program. The impact of deliberate practice on resident teaching skills and optimal methods to incorporate deliberate practice in RaT programs deserves further study. Our curriculum design may serve as a model for the development of simulation programs that can be employed to improve both intern learning and resident teaching skills.
Acknowledgements
The authors acknowledge Deborah Navedo, PhD, Assistant Professor, Massachusetts General Hospital Institute of Health Professions, and Emily M. Hayden, MD, Assistant Professor, Department of Emergency Medicine, Massachusetts General Hospital and Harvard Medical School, for their assistance with development of the RaT curriculum. The authors thank Jenny Rudolph, PhD, Senior Director, Institute for Medical Simulation at the Center for Medical Simulation, for her help in teaching us to use the DASH instrument. The authors also thank Daniel Hunt, MD, Associate Professor, Department of Medicine, Massachusetts General Hospital and Harvard Medical School, for his thoughtful review of this manuscript.
Disclosure: Nothing to report.
1. Liaison Committee on Medical Education. Functions and structure of a medical school: standards for accreditation of medical education programs leading to the M.D. degree. Washington, DC, and Chicago, IL: Association of American Medical Colleges and American Medical Association; 2000.
2. Accreditation Council for Graduate Medical Education. ACGME program requirements for graduate medical education in pediatrics. Available at: http://www.acgme.org/acgmeweb/Portals/0/PFAssets/2013-PR-FAQ-PIF/320_pediatrics_07012013.pdf. Accessed June 18, 2014.
3. A systematic review of resident‐as‐teacher programmes. Med Educ. 2009;43(12):1129–1140.
4. The utility of simulation in medical education: what is the evidence? Mt Sinai J Med. 2009;76:330–343.
5. National growth in simulation training within emergency medicine residency programs, 2003–2008. Acad Emerg Med. 2008;15:1113–1116.
6. Simulation center accreditation and programmatic benchmarks: a review for emergency medicine. Acad Emerg Med. 2010;17(10):1093–1103.
7. How much evidence does it take? A cumulative meta‐analysis of outcomes of simulation‐based education. Med Educ. 2014;48(8):750–760.
8. Resident‐as‐teacher: a suggested curriculum for emergency medicine. Acad Emerg Med. 2006;13(6):677–679.
9. Deliberate practice and the acquisition and maintenance of expert performance in medicine and related domains. Acad Med. 2004;79(10 suppl):S70–S81.
10. Pilot program utilizing medical simulation in clinical decision making training for internal medicine interns. J Grad Med Educ. 2012;4:490–495.
11. How we implemented a resident‐led medical simulation curriculum in a large internal medicine residency program. Med Teach. 2014;36(4):279–283.
12. There's no such thing as "nonjudgmental" debriefing: a theory and method for debriefing with good judgment. Simul Healthc. 2006;1:49–55.
13. Improving faculty feedback to resident trainees during a simulated case: a randomized, controlled trial of an educational intervention. Anesthesiology. 2014;120(1):160–171.
14. Introduction to debriefing. Semin Perinatol. 2013;37(3):166–174.
15. Response‐shift bias: a source of contamination of self‐report measures. J Appl Psychol. 1979;4:93–106.
16. Center for Medical Simulation. Debriefing assessment for simulation in healthcare. Available at: http://www.harvardmedsim.org/debriefing‐assesment‐simulation‐healthcare.php. Accessed June 18, 2014.
17. A novel iterative‐learner simulation model: fellows as teachers. J Grad Med Educ. 2014;6(1):127–132.
18. Direct observation of faculty with feedback: an effective means of improving patient‐centered and learner‐centered teaching skills. Teach Learn Med. 2007;19(3):278–286.
19. Unskilled and unaware of it: how difficulties in recognizing one's own incompetence lead to inflated self‐assessments. J Pers Soc Psychol. 1999;77:1121–1134.
20. Reliability and validity of an objective structured teaching examination for generalist resident teachers. Acad Med. 2002;77(10 suppl):S29–S32.
Residency training, in addition to developing clinical competence among trainees, is charged with improving resident teaching skills. The Liaison Committee on Medical Education and the Accreditation Council for Graduate Medical Education require that residents be provided with training or resources to develop their teaching skills.[1, 2] A variety of resident‐as‐teacher (RaT) programs have been described; however, the optimal format of such programs remains in question.[3] High‐fidelity medical simulation using mannequins has been shown to be an effective teaching tool in various medical specialties[4, 5, 6, 7] and may prove to be useful in teacher training.[8] Teaching in a simulation‐based environment can give participants the opportunity to apply their teaching skills in a clinical environment, as they would on the wards, but in a more controlled, predictable setting and without compromising patient safety. In addition, simulation offers the opportunity to engage in deliberate practice by allowing teachers to facilitate the same case on multiple occasions with different learners. Deliberate practice, which involves task repetition with feedback aimed at improving performance, has been shown to be important in developing expertise.[9]
We previously described the first use of a high‐fidelity simulation curriculum for internal medicine (IM) interns focused on clinical decision‐making skills, in which second‐ and third‐year residents served as facilitators.[10, 11] Herein, we describe a RaT program in which residents participated in a workshop, then served as facilitators in the intern curriculum and received feedback from faculty. We hypothesized that such a program would improve residents' teaching and feedback skills, both in the simulation environment and on the wards.
METHODS
We conducted a single‐group study evaluating teaching and feedback skills among upper‐level resident facilitators before and after participation in the RaT program. We measured residents' teaching skills using pre‐ and post‐program self‐assessments as well as evaluations completed by the intern learners after each session and at the completion of the curriculum.
Setting and Participants
We embedded the RaT program within a simulation curriculum administered July to October of 2013 for all IM interns at Massachusetts General Hospital (interns in the preliminary program who planned to pursue another field after the completion of the intern year were excluded) (n = 52). We invited postgraduate year (PGY) II and III residents (n = 102) to participate in the IM simulation program as facilitators via email. The curriculum consisted of 8 cases focusing on acute clinical scenarios encountered on the general medicine wards. The cases were administered during 1‐hour sessions 4 mornings per week from 7 AM to 8 AM prior to clinical duties. Interns completed the curriculum over 4 sessions during their outpatient rotation. The case topics were (1) hypertensive emergency, (2) post‐procedure bleed, (3) congestive heart failure, (4) atrial fibrillation with rapid ventricular response, (5) altered mental status/alcohol withdrawal, (6) nonsustained ventricular tachycardia heralding acute coronary syndrome, (7) cardiac tamponade, and (8) anaphylaxis. During each session, groups of 2 to 3 interns worked through 2 cases using a high‐fidelity mannequin (Laerdal 3G, Wappingers Falls, NY) with 2 resident facilitators. One facilitator operated the mannequin, while the other served as a nurse. Each case was followed by a structured debriefing led by 1 of the resident facilitators (facilitators switched roles for the second case). The number of sessions facilitated varied for each resident based on individual schedules and preferences.
Four senior residents who were appointed as simulation leaders (G.A.A., J.K.H., R.K., Z.S.) and 2 faculty advisors (P.F.C., E.M.M.) administered the program. Simulation resident leaders scheduled facilitators and interns and participated in a portion of simulation sessions as facilitators, but they were not analyzed as participants for the purposes of this study. The curriculum was administered without interfering with clinical duties, and no additional time was protected for interns or residents participating in the curriculum.
Resident‐as‐Teacher Program Structure
We invited participating resident facilitators to attend a 1‐hour interactive workshop prior to serving as facilitators. The workshop focused on building learner‐centered and small‐group teaching skills, as well as introducing residents to a 5‐stage debriefing framework developed by the authors and based on simulation debriefing best practices (Table 1).[12, 13, 14]
Stage of Debriefing | Action | Rationale |
---|---|---|
Emotional response | Elicit learners' emotions about the case | It is important to acknowledge and address both positive and negative emotions that arise during the case before debriefing the specific medical and communication aspects of the case. Unaddressed emotional responses may hinder subsequent debriefing. |
Objectives* | Elicit learners' objectives and combine them with the stated learning objectives of the case to determine debriefing objectives | The limited amount of time allocated for debriefing (15–20 minutes) does not allow the facilitator to cover all aspects of medical management and communication skills in a particular case. Focusing on the most salient objectives, including those identified by the learners, allows the facilitator to engage in learner‐centered debriefing. |
Analysis | Analyze the learners' approach to the case | Analyzing the learners' approach to the case using the advocacy‐inquiry method[11] seeks to uncover the learners' assumptions/frameworks behind the decisions made during the case. This approach allows the facilitator to understand the learners' thought process and target teaching points to more precisely address the learners' needs. |
Teaching | Address knowledge gaps and incorrect assumptions | Learner‐centered debriefing within a limited timeframe requires teaching to be brief and targeted toward the defined objectives. It should also address the knowledge gaps and incorrect assumptions uncovered during the analysis phase. |
Summary | Summarize key takeaways | Summarizing highlights the key points of the debriefing and can be used to suggest further exploration of topics through self‐study (if necessary). |
Resident facilitators were observed by simulation faculty and simulation resident leaders throughout the intern curriculum and given structured feedback either in person immediately after completion of the simulation session or via a detailed same‐day e‐mail if the time allotted for feedback was not sufficient. Feedback was structured by the 5 stages of debriefing described in Table 1, and included soliciting residents' observations on the teaching experience and specific behaviors observed by faculty during the scenarios. E‐mail feedback (also structured by stages of debriefing and including observed behaviors) was typically followed by verbal feedback during the next simulation session.
The RaT program was composed of 3 elements: the workshop, case facilitation, and direct observation with feedback. Because we felt that the opportunity for directly observed teaching and feedback in a ward‐like controlled environment was a unique advantage offered by the simulation setting, we included all residents who served as facilitators in the analysis, regardless of whether or not they had attended the workshop.
Evaluation Instruments
Survey instruments were developed by the investigators, reviewed by several experts in simulation, pilot tested among residents not participating in the simulation program, and revised by the investigators.
Pre‐program Facilitator Survey
Prior to the RaT workshop, resident facilitators completed a baseline survey evaluating their preparedness to teach and give feedback on the wards and in a simulation‐based setting on a 5‐point scale (see Supporting Information, Appendix I, in the online version of this article).
Post‐program Facilitator Survey
Approximately 3 weeks after completion of the intern simulation curriculum, resident facilitators were asked to complete an online post‐program survey, which remained open for 1 month (residents completed this survey anywhere from 3 weeks to 4 months after their participation in the RaT program depending on the timing of their facilitation). The survey asked residents to evaluate their comfort with their current post‐program teaching skills as well as their pre‐program skills in retrospect, as previous research demonstrated that learners may overestimate their skills prior to training programs.[15] Resident facilitators could complete the surveys nonanonymously to allow for matched‐pairs analysis of the change in teaching skills over the course of the program (see Supporting Information, Appendix II, in the online version of this article).
Intern Evaluation of Facilitator Debriefing Skills
After each case, intern learners were asked to anonymously evaluate the teaching effectiveness of the lead resident facilitator using the adapted Debriefing Assessment for Simulation in Healthcare (DASH) instrument.[16] The DASH instrument evaluated the following domains: (1) instructor maintained an engaging context for learning, (2) instructor structured the debriefing in an organized way, (3) instructor provoked in‐depth discussions that led me to reflect on my performance, (4) instructor identified what I did well or poorly and why, (5) instructor helped me see how to improve or how to sustain good performance, (6) overall effectiveness of the simulation session (see Supporting Information, Appendix III, in the online version of this article).
Post‐program Intern Survey
Two months following the completion of the simulation curriculum, intern learners received an anonymous online post‐program evaluation assessing program efficacy and resident facilitator teaching (see Supporting Information, Appendix IV, in the online version of this article).
Statistical Analysis
Teaching skills and learners' DASH ratings were compared using the Student t test, Pearson χ2 test, and Fisher exact test as appropriate. Comparison of pre‐ and post‐program ratings of teaching skills was undertaken both in aggregate and as a matched‐pairs analysis.
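To make the matched‐pairs comparison concrete, the sketch below runs a paired t test on invented pre‐ and post‐program self‐ratings (the numbers are hypothetical, not the study's data), using only the Python standard library:

```python
# Matched-pairs (paired) t test on pre- vs. post-program self-ratings.
# All ratings below are hypothetical, for illustration only.
import math
import statistics

pre = [3, 2, 3, 4, 2, 3, 3, 2, 4, 3]    # hypothetical pre-program ratings (5-point scale)
post = [4, 4, 4, 5, 3, 4, 4, 3, 5, 4]   # hypothetical post-program ratings, same residents

diffs = [b - a for a, b in zip(pre, post)]   # per-resident change
n = len(diffs)
mean_d = statistics.mean(diffs)              # average improvement
sd_d = statistics.stdev(diffs)               # sample SD of the differences
t_stat = mean_d / (sd_d / math.sqrt(n))      # paired t statistic, df = n - 1
print(f"mean change = {mean_d:.2f}, t = {t_stat:.2f}, df = {n - 1}")
```

The pairing matters: each resident serves as their own control, so only the within‐resident differences enter the test, which is why nonanonymous survey completion was needed for this analysis.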
The study was approved by the Partners Institutional Review Board.
RESULTS
Forty‐one resident facilitators participated in 118 individual simulation sessions encompassing 236 case scenarios. Thirty‐four residents completed the post‐program facilitator survey and were included in the analysis. Of these, 26 (76%) participated in the workshop and completed the pre‐program survey. Twenty‐three of the 34 residents (68%) completed the post‐program evaluation nonanonymously (13 PGY‐II, 10 PGY‐III). Of these, 16 completed the pre‐program survey nonanonymously. The average number of sessions facilitated by each resident was 3.9 (range, 1–12).
Pre‐ and Post‐program Self‐Assessment of Residents' Teaching Skills
Participation in the simulation RaT program led to improvements in resident facilitators' self‐reported teaching skills across multiple domains (Table 2). These results were consistent when using the retrospective pre‐program assessment in matched‐pairs analysis (n = 34) and when performing the analysis using the true pre‐program preparedness compared to post‐program comfort with teaching skills in a non‐matched‐pairs fashion (n = 26) and matched‐pairs fashion (n = 16). We report P values for the more conservative estimates using the retrospective pre‐program assessment matched‐pairs analysis. The greatest improvements occurred in residents' ability to teach in a simulated environment (2.81 to 4.16, P < 0.001 [5‐point scale]) and give feedback (3.35 to 3.77, P < 0.001).
| Pre‐program Rating (n = 34) | Post‐program Rating (n = 34) | P Value |
---|---|---|---|
Teaching on rounds | 3.75 | 4.03 | 0.005 |
Teaching on wards outside rounds | 3.83 | 4.07 | 0.007 |
Teaching in simulation | 2.81 | 4.16 | <0.001 |
Giving feedback | 3.35 | 3.77 | <0.001 |
Resident facilitators reported that participation in the RaT program had a significant impact on their teaching skills both within and outside of the simulation environment (Table 3). However, the greatest gains were seen in the domain of teaching in simulation. Residents also reported that participation in the program improved their medical knowledge.
Category | Not at All | Slightly Improved | Moderately Improved | Greatly Improved | Not Sure |
---|---|---|---|---|---|
Teaching on rounds, n = 34 | 4 (12%) | 12 (35%) | 13 (38%) | 4 (12%) | 1 (3%) |
Teaching on wards outside rounds, n = 34 | 3 (9%) | 13 (38%) | 12 (35%) | 5 (15%) | 1 (3%) |
Teaching in simulation, n = 34 | 0 (0%) | 4 (12%) | 7 (21%) | 23 (68%) | 0 (0%) |
Giving feedback, n = 34 | 4 (12%) | 10 (29%) | 12 (35%) | 6 (18%) | 2 (6%) |
Medical knowledge, n = 34 | 2 (6%) | 11 (32%) | 18 (53%) | 3 (9%) | 0 (0%) |
Subgroup analyses were performed comparing the perceived improvement in teaching and feedback skills among those who did or did not attend the facilitator workshop, those who facilitated 5 or more versus fewer than 5 sessions, and those who received or did not receive direct observation and feedback from faculty. Although numerically greater gains were seen across all 4 domains among those who attended the workshop, facilitated 5 or more sessions, or received feedback from faculty, only teaching on rounds and on the wards outside rounds reached statistical significance (Table 4). It should be noted that all residents who facilitated 5 or more sessions also attended the workshop and received feedback from faculty. We also compared perceived improvement among PGY‐II and PGY‐III residents. In contrast to PGY‐II residents, who demonstrated an improvement in all 4 domains, PGY‐III residents only demonstrated improvement in simulation‐based teaching.
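For categorical subgroup comparisons, the Methods name the Fisher exact test. The sketch below (with hypothetical counts, not the study's data) implements the two‐sided test for a 2 × 2 table from first principles via hypergeometric probabilities, stdlib only:

```python
# Two-sided Fisher exact test for a 2x2 table, standard library only.
# The example counts are hypothetical, for illustration only.
import math

def _table_prob(x, r1, r2, c1):
    """Hypergeometric probability of a 2x2 table with fixed margins and top-left cell x."""
    return math.comb(r1, x) * math.comb(r2, c1 - x) / math.comb(r1 + r2, c1)

def fisher_exact_two_sided(a, b, c, d):
    """Sum the probabilities of all tables (same margins) no more likely than the observed one."""
    r1, r2, c1 = a + b, c + d, a + c           # row and column margins
    p_obs = _table_prob(a, r1, r2, c1)
    lo, hi = max(0, c1 - r2), min(r1, c1)      # feasible range of the top-left cell
    return sum(prob for x in range(lo, hi + 1)
               if (prob := _table_prob(x, r1, r2, c1)) <= p_obs * (1 + 1e-9))

# Hypothetical 2x2 table: improved vs. not improved, by workshop attendance
p = fisher_exact_two_sided(18, 4, 5, 5)   # attended: 18/22 improved; did not attend: 5/10
print(f"Fisher exact p = {p:.3f}")
```

A library routine such as scipy.stats.fisher_exact uses the same "sum all tables at most as likely as the observed one" definition of the two‐sided p value; the manual version is shown only to make the computation explicit.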
| Pre‐program | Post‐program | P Value | Pre‐program | Post‐program | P Value |
---|---|---|---|---|---|---|
Facilitated less than 5 sessions (n = 18), left columns; facilitated 5 or more sessions (n = 11), right columns: | | | | | | |
Teaching on rounds | 3.68 | 3.79 | 0.16 | 3.85 | 4.38 | 0.01 |
Teaching on wards outside rounds | 3.82 | 4.00 | 0.08 | 3.85 | 4.15 | 0.04 |
Teaching in simulation | 2.89 | 4.06 | <0.01 | 2.69 | 4.31 | <0.01 |
Giving feedback | 3.33 | 3.67 | 0.01 | 3.38 | 3.92 | 0.01 |
Did not attend workshop (n = 10), left columns; attended workshop (n = 22), right columns: | | | | | | |
Teaching on rounds | 4.00 | 4.10 | 0.34 | 3.64 | 4.00 | <0.01 |
Teaching on wards outside rounds | 4.00 | 4.00 | 1.00 | 3.76 | 4.10 | <0.01 |
Teaching in simulation | 2.89 | 4.11 | <0.01 | 2.77 | 4.18 | <0.01 |
Giving feedback | 3.56 | 3.78 | 0.17 | 3.27 | 3.77 | <0.01 |
Received feedback from resident leaders only (n = 11), left columns; received faculty feedback (n = 21), right columns: | | | | | | |
Teaching on rounds | 3.55 | 3.82 | 0.19 | 3.86 | 4.14 | 0.01 |
Teaching on wards outside rounds | 4.00 | 4.00 | 1.00 | 3.75 | 4.10 | <0.01 |
Teaching in simulation | 2.70 | 3.80 | <0.01 | 2.86 | 4.33 | <0.01 |
Giving feedback | 3.20 | 3.60 | 0.04 | 3.43 | 3.86 | <0.01 |
PGY‐II (n = 13), left columns; PGY‐III (n = 9), right columns: | | | | | | |
Teaching on rounds | 3.38 | 3.85 | 0.03 | 4.22 | 4.22 | 1.00 |
Teaching on wards outside rounds | 3.54 | 3.85 | 0.04 | 4.14 | 4.14 | 1.00 |
Teaching in simulation | 2.46 | 4.15 | <0.01 | 3.13 | 4.13 | <0.01 |
Giving feedback | 3.23 | 3.62 | 0.02 | 3.50 | 3.88 | 0.08 |
Intern Learners' Assessment of Resident Facilitators and the Program Overall
During the course of the program, intern learners completed 166 DASH ratings evaluating 34 resident facilitators (see Supporting Information, Appendix V, in the online version of this article). Ratings for the 6 DASH items ranged from 6.49 to 6.73 (7‐point scale), demonstrating a high level of facilitator efficacy across multiple domains. No differences in DASH scores were noted among subgroups of resident facilitators described in the previous paragraph.
Thirty‐eight of 52 intern learners (73%) completed the post‐program survey.
Facilitator Behaviors | Very Often, >75% | Often, >50% | Sometimes, 25%–50% | Rarely, <25% | Never |
---|---|---|---|---|---|
Elicited emotional reactions, n = 38 | 18 (47%) | 16 (42%) | 4 (11%) | 0 (0%) | 0 (0%) |
Elicited objectives from learner, n = 37 | 26 (69%) | 8 (22%) | 2 (6%) | 1 (3%) | 0 (0%) |
Asked to share clinical reasoning, n = 38 | 21 (56%) | 13 (33%) | 4 (11%) | 0 (0%) | 0 (0%) |
Summarized learning points, n = 38 | 31 (81%) | 7 (19%) | 0 (0%) | 0 (0%) | 0 (0%) |
Spoke for less than half of the session, n = 38 | 8 (22%) | 17 (44%) | 11 (28%) | 2 (6%) | 0 (0%) |
All intern learners rated the overall simulation experience as either excellent (81%) or good (19%) on the post‐program evaluation (5 or 4 on a 5‐point Likert scale, respectively). All interns strongly agreed (72%) or agreed (28%) that the simulation sessions improved their ability to manage acute clinical scenarios. Interns reported that resident facilitators frequently utilized specific debriefing techniques covered in the RaT curriculum during the debriefing sessions (Table 5).
DISCUSSION
We describe a unique RaT program embedded within a high‐fidelity medical simulation curriculum for IM interns. Our study demonstrates that resident facilitators noted an improvement in their teaching and feedback skills, both in the simulation setting and on the wards. Intern learners rated residents' teaching skills and the overall simulation curriculum highly, suggesting that residents were effective teachers.
The use of simulation in trainee‐as‐teacher curricula holds promise because it can provide an opportunity to teach in an environment closely approximating the wards, where trainees have the most opportunities to teach. However, in contrast to true ward‐based teaching, simulation can provide predictable scenarios in a controlled environment, which eliminates the distractions and unpredictability that exist on the wards, without compromising patient safety. Recently, Tofil et al. described the first use of simulation in a trainee‐as‐teacher program.[17] The investigators utilized a 1‐time simulation‐based teaching session, during which pediatric fellows completed a teacher‐training workshop, developed and served as facilitators in a simulated case, and received feedback. The use of simulation allowed fellows an opportunity to apply newly acquired skills in a controlled environment and receive feedback, which has been shown to improve teaching skills.[18]
The experience from our program expands on that of Tofil et al., as well as previously described trainee‐as‐teacher curricula, by introducing a component of deliberate practice that is unique to the simulation setting and has been absent from most previously described RaT programs.[3] Most residents had the opportunity to facilitate the same case on multiple occasions, allowing them to receive feedback and make adjustments. Residents who facilitated 5 or more sessions demonstrated more improvement, particularly in teaching outside of simulation, than residents who facilitated fewer sessions. It is notable that PGY‐II resident facilitators reported an improvement in their teaching skills on the wards, though less pronounced than their improvement in simulation‐based teaching, suggesting that the benefits of the program may extend to nonsimulation‐based settings. Additional studies focusing on objective evaluation of ward‐based teaching are needed to further explore this phenomenon. Finally, the self‐reported improvements in medical knowledge may represent another benefit of our program.
Analysis of learner‐level data collected in the postcurriculum intern survey and DASH ratings provides additional support for the effectiveness of the RaT program. The majority of intern learners reported that resident facilitators frequently used the techniques covered in our program during debriefings. In addition, DASH scores clustered near the maximum for all facilitators, suggesting that residents were effective teachers. Although we cannot directly assess whether the differences demonstrated in resident facilitators' self‐assessments translated to their teaching or were meaningful from the learners' perspective, these results support the hypothesis that the self‐assessed improvements in teaching and feedback skills were real.
In addition to improving resident teaching skills, our program had a positive impact on intern learners as evidenced by intern evaluations of the simulation curriculum. While utilizing relatively few faculty resources, our program was able to deliver an extensive and well‐received simulation curriculum to over 50 interns. The fact that 40% of second‐ and third‐year residents volunteered to teach in the program despite the early morning timing of the sessions speaks to the interest that trainees have in teaching in this setting. This model can serve as an important and efficient learning platform in residency training programs. It may be particularly salient to IM training programs where implementation of simulation curricula is challenging due to large numbers of residents and limited faculty resources. The barriers to and lessons learned from our experience with implementing the simulation curriculum have been previously described.[10, 11]
Our study has several limitations. Changes in residents' teaching skills were self‐assessed, which may be inaccurate, as learners may overestimate their abilities.[19] Although we collected data on the experiences of intern learners that supported residents' self‐assessments, further studies using more objective measures (such as the Objective Structured Teaching Exercise[20]) should be undertaken. Beyond self‐assessment, we did not objectively measure improvement of residents' teaching skills on the wards. Due to the timing of survey administration, some residents had as little as 1 month between completion of the curriculum and responding to the post‐curriculum survey, limiting their ability to evaluate their teaching skills on the wards. The transferability of the skills gained in simulation‐based teaching to teaching on the wards deserves further study. Without a control group, we cannot definitively attribute the perceived improvement in teaching skills to the RaT program. However, the frequent use of recommended techniques during debriefing, which are not typically taught in other settings, supports the efficacy of the RaT program.
Our study did not allow us to determine which of the 3 components of the RaT program (workshop, facilitation practice, or direct observation and feedback) had the greatest impact on teaching skills or DASH ratings, as those who facilitated more sessions also completed the other components of the program. Furthermore, there may have been a selection bias among facilitators who facilitated more sessions. Because only 16 of 34 participants completed both the pre‐program and post‐program self‐assessments nonanonymously, we were not able to analyze the effect of pre‐program factors, such as prior teaching experience, on program outcomes. It should also be noted that allowing resident facilitators the option to complete the survey nonanonymously could have biased our results. The simulation curriculum was conducted in a single center, and resident facilitators were self‐selecting; therefore, our results may not be generalizable. Finally, the DASH instrument was only administered after the RaT workshop and was likely limited further by the ceiling effect created by the learners' high satisfaction with the simulation program overall.
In summary, our simulation‐based RaT program improved resident facilitators' self‐reported teaching and feedback skills. Simulation‐based training provided an opportunity for deliberate practice of teaching skills in a controlled environment, which was a unique component of the program. The impact of deliberate practice on resident teaching skills and optimal methods to incorporate deliberate practice in RaT programs deserves further study. Our curriculum design may serve as a model for the development of simulation programs that can be employed to improve both intern learning and resident teaching skills.
Acknowledgements
The authors acknowledge Deborah Navedo, PhD, Assistant Professor, Massachusetts General Hospital Institute of Health Professions, and Emily M. Hayden, MD, Assistant Professor, Department of Emergency Medicine, Massachusetts General Hospital and Harvard Medical School, for their assistance with development of the RaT curriculum. The authors thank Jenny Rudolph, PhD, Senior Director, Institute for Medical Simulation at the Center for Medical Simulation, for her help in teaching us to use the DASH instrument. The authors also thank Daniel Hunt, MD, Associate Professor, Department of Medicine, Massachusetts General Hospital and Harvard Medical School, for his thoughtful review of this manuscript.
Disclosure: Nothing to report.
Residency training, in addition to developing clinical competence among trainees, is charged with improving resident teaching skills. The Liaison Committee on Medical Education and the Accreditation Council for Graduate Medical Education require that residents be provided with training or resources to develop their teaching skills.[1, 2] A variety of resident‐as‐teacher (RaT) programs have been described; however, the optimal format of such programs remains in question.[3] High‐fidelity medical simulation using mannequins has been shown to be an effective teaching tool in various medical specialties[4, 5, 6, 7] and may prove to be useful in teacher training.[8] Teaching in a simulation‐based environment can give participants the opportunity to apply their teaching skills in a clinical environment, as they would on the wards, but in a more controlled, predictable setting and without compromising patient safety. In addition, simulation offers the opportunity to engage in deliberate practice by allowing teachers to facilitate the same case on multiple occasions with different learners. Deliberate practice, which involves task repetition with feedback aimed at improving performance, has been shown to be important in developing expertise.[9]
We previously described the first use of a high‐fidelity simulation curriculum for internal medicine (IM) interns focused on clinical decision‐making skills, in which second‐ and third‐year residents served as facilitators.[10, 11] Herein, we describe a RaT program in which residents participated in a workshop, then served as facilitators in the intern curriculum and received feedback from faculty. We hypothesized that such a program would improve residents' teaching and feedback skills, both in the simulation environment and on the wards.
METHODS
We conducted a single‐group study evaluating teaching and feedback skills among upper‐level resident facilitators before and after participation in the RaT program. We measured residents' teaching skills using pre‐ and post‐program self‐assessments as well as evaluations completed by the intern learners after each session and at the completion of the curriculum.
Setting and Participants
We embedded the RaT program within a simulation curriculum administered July to October of 2013 for all IM interns at Massachusetts General Hospital (interns in the preliminary program who planned to pursue another field after the completion of the intern year were excluded) (n = 52). We invited postgraduate year (PGY) II and III residents (n = 102) to participate in the IM simulation program as facilitators via email. The curriculum consisted of 8 cases focusing on acute clinical scenarios encountered on the general medicine wards. The cases were administered during 1‐hour sessions 4 mornings per week from 7 AM to 8 AM prior to clinical duties. Interns completed the curriculum over 4 sessions during their outpatient rotation. The case topics were (1) hypertensive emergency, (2) post‐procedure bleed, (3) congestive heart failure, (4) atrial fibrillation with rapid ventricular response, (5) altered mental status/alcohol withdrawal, (6) nonsustained ventricular tachycardia heralding acute coronary syndrome, (7) cardiac tamponade, and (8) anaphylaxis. During each session, groups of 2 to 3 interns worked through 2 cases using a high‐fidelity mannequin (Laerdal 3G, Wappingers Falls, NY) with 2 resident facilitators. One facilitator operated the mannequin, while the other served as a nurse. Each case was followed by a structured debriefing led by 1 of the resident facilitators (facilitators switched roles for the second case). The number of sessions facilitated varied for each resident based on individual schedules and preferences.
Four senior residents who were appointed as simulation leaders (G.A.A., J.K.H., R.K., Z.S.) and 2 faculty advisors (P.F.C., E.M.M.) administered the program. Simulation resident leaders scheduled facilitators and interns and participated in a portion of simulation sessions as facilitators, but they were not analyzed as participants for the purposes of this study. The curriculum was administered without interfering with clinical duties, and no additional time was protected for interns or residents participating in the curriculum.
Resident‐as‐Teacher Program Structure
We invited participating resident facilitators to attend a 1‐hour interactive workshop prior to serving as facilitators. The workshop focused on building learner‐centered and small‐group teaching skills, as well as introducing residents to a 5‐stage debriefing framework developed by the authors and based on simulation debriefing best practices (Table 1).[12, 13, 14]
Stage of Debriefing | Action | Rationale |
---|---|---|
| ||
Emotional response | Elicit learners' emotions about the case | It is important to acknowledge and address both positive and negative emotions that arise during the case before debriefing the specific medical and communications aspects of the case. Unaddressed emotional responses may hinder subsequent debriefing. |
Objectives* | Elicit learners' objectives and combine them with the stated learning objectives of the case to determine debriefing objectives | The limited amount of time allocated for debriefing (1520 minutes) does not allow the facilitator to cover all aspects of medical management and communication skills in a particular case. Focusing on the most salient objectives, including those identified by the learners, allows the facilitator to engage in learner‐centered debriefing. |
Analysis | Analyze the learners' approach to the case | Analyzing the learners' approach to the case using the advocacy‐inquiry method[11] seeks to uncover the learner's assumptions/frameworks behind the decision made during the case. This approach allows the facilitator to understand the learners' thought process and target teaching points to more precisely address the learners' needs. |
Teaching | Address knowledge gaps and incorrect assumptions | Learner‐centered debriefing within a limited timeframe requires teaching to be brief and targeted toward the defined objectives. It should also address the knowledge gaps and incorrect assumptions uncovered during the analysis phase. |
Summary | Summarize key takeaways | Summarizing highlights the key points of the debriefing and can be used to suggest further exploration of topics through self‐study (if necessary). |
Resident facilitators were observed by simulation faculty and simulation resident leaders throughout the intern curriculum and given structured feedback either in‐person immediately after completion of the simulation session or via a detailed same‐day e‐mail if the time allotted for feedback was not sufficient. Feedback was structured by the 5 stages of debriefing described in Table 1, and included soliciting residents' observations on the teaching experience and specific behaviors observed by faculty during the scenarios. E‐mail feedback (also structured by stages of debriefing and including observed behaviors) was typically followed by verbal feedback during the next simulation session.
The RaT program was composed of 3 elements: the workshop, case facilitation, and direct observation with feedback. Because we felt that the opportunity for directly observed teaching and feedback in a ward‐like controlled environment was a unique advantage offered by the simulation setting, we included all residents who served as facilitators in the analysis, regardless of whether or not they had attended the workshop.
Evaluation Instruments
Survey instruments were developed by the investigators, reviewed by several experts in simulation, pilot tested among residents not participating in the simulation program, and revised by the investigators.
Pre‐program Facilitator Survey
Prior to the RaT workshop, resident facilitators completed a baseline survey evaluating their preparedness to teach and give feedback on the wards and in a simulation‐based setting on a 5‐point scale (see Supporting Information, Appendix I, in the online version of this article).
Post‐program Facilitator Survey
Approximately 3 weeks after completion of the intern simulation curriculum, resident facilitators were asked to complete an online post‐program survey, which remained open for 1 month (residents completed this survey anywhere from 3 weeks to 4 months after their participation in the RaT program depending on the timing of their facilitation). The survey asked residents to evaluate their comfort with their current post‐program teaching skills as well as their pre‐program skills in retrospect, as previous research demonstrated that learners may overestimate their skills prior to training programs.[15] Resident facilitators could complete the surveys nonanonymously to allow for matched‐pairs analysis of the change in teaching skills over the course of the program (see Supporting Information, Appendix II, in the online version of this article).
Intern Evaluation of Facilitator Debriefing Skills
After each case, intern learners were asked to anonymously evaluate the teaching effectiveness of the lead resident facilitator using the adapted Debriefing Assessment for Simulation in Healthcare (DASH) instrument.[16] The DASH instrument evaluated the following domains: (1) instructor maintained an engaging context for learning, (2) instructor structured the debriefing in an organized way, (3) instructor provoked in‐depth discussions that led me to reflect on my performance, (4) instructor identified what I did well or poorly and why, (5) instructor helped me see how to improve or how to sustain good performance, (6) overall effectiveness of the simulation session (see Supporting Information, Appendix III, in the online version of this article).
Post‐program Intern Survey
Two months following the completion of the simulation curriculum, intern learners received an anonymous online post‐program evaluation assessing program efficacy and resident facilitator teaching (see Supporting Information, Appendix IV, in the online version of this article).
Statistical Analysis
Teaching skills and learners' DASH ratings were compared using the Student t test, Pearson 2 test, and Fisher exact test as appropriate. Pre‐ and post‐program rating of teaching skills was undertaken in aggregate and as a matched‐pairs analysis.
The study was approved by the Partners Institutional Review Board.
RESULTS
Forty‐one resident facilitators participated in 118 individual simulation sessions encompassing 236 case scenarios. Thirty‐four residents completed the post‐program facilitator survey and were included in the analysis. Of these, 26 (76%) participated in the workshop and completed the pre‐program survey. Twenty‐three of the 34 residents (68%) completed the post‐program evaluation nonanonymously (13 PGY‐II, 10 PGY‐III). Of these, 16 completed the pre‐program survey nonanonymously. The average number of sessions facilitated by each resident was 3.9 (range, 112).
Pre‐ and Post‐program Self‐Assessment of Residents' Teaching Skills
Participation in the simulation RaT program led to improvements in resident facilitators' self‐reported teaching skills across multiple domains (Table 2). These results were consistent when using the retrospective pre‐program assessment in matched‐pairs analysis (n=34) and when performing the analysis using the true pre‐program preparedness compared to post‐program comfort with teaching skills in a non‐matched‐pairs fashion (n = 26) and matched‐pairs fashion (n = 16). We report P values for the more conservative estimates using the retrospective pre‐program assessment matched‐pairs analysis. The most significant improvements occurred in residents' ability to teach in a simulated environment (2.81 to 4.16, P < 0.001 [5‐point scale]) and give feedback (3.35 to 3.77, P < 0.001).
| Teaching Skill | Pre‐program Rating (n = 34) | Post‐program Rating (n = 34) | P Value |
|---|---|---|---|
| Teaching on rounds | 3.75 | 4.03 | 0.005 |
| Teaching on wards outside rounds | 3.83 | 4.07 | 0.007 |
| Teaching in simulation | 2.81 | 4.16 | <0.001 |
| Giving feedback | 3.35 | 3.77 | <0.001 |
Resident facilitators reported that participation in the RaT program had a significant impact on their teaching skills both within and outside of the simulation environment (Table 3), with the greatest gains in the domain of teaching in simulation. Residents also reported that participation in the program improved their medical knowledge.
| Category | Not at All | Slightly Improved | Moderately Improved | Greatly Improved | Not Sure |
|---|---|---|---|---|---|
| Teaching on rounds, n = 34 | 4 (12%) | 12 (35%) | 13 (38%) | 4 (12%) | 1 (3%) |
| Teaching on wards outside rounds, n = 34 | 3 (9%) | 13 (38%) | 12 (35%) | 5 (15%) | 1 (3%) |
| Teaching in simulation, n = 34 | 0 (0%) | 4 (12%) | 7 (21%) | 23 (68%) | 0 (0%) |
| Giving feedback, n = 34 | 4 (12%) | 10 (29%) | 12 (35%) | 6 (18%) | 2 (6%) |
| Medical knowledge, n = 34 | 2 (6%) | 11 (32%) | 18 (53%) | 3 (9%) | 0 (0%) |
Subgroup analyses were performed comparing the perceived improvement in teaching and feedback skills among those who did or did not attend the facilitator workshop, those who facilitated 5 or more versus fewer than 5 sessions, and those who did or did not receive direct observation and feedback from faculty. Although numerically greater gains were seen across all 4 domains among those who attended the workshop, facilitated 5 or more sessions, or received feedback from faculty, only teaching on rounds and on the wards outside rounds reached statistical significance (Table 4). It should be noted that all residents who facilitated 5 or more sessions also attended the workshop and received feedback from faculty. We also compared perceived improvement among PGY‐II and PGY‐III residents. In contrast to PGY‐II residents, who demonstrated an improvement in all 4 domains, PGY‐III residents demonstrated improvement only in simulation‐based teaching.
*Facilitated Fewer Than 5 Sessions (n = 18), left; Facilitated 5 or More Sessions (n = 11), right*

| Teaching Skill | Pre‐program | Post‐program | P Value | Pre‐program | Post‐program | P Value |
|---|---|---|---|---|---|---|
| Teaching on rounds | 3.68 | 3.79 | 0.16 | 3.85 | 4.38 | 0.01 |
| Teaching on wards outside rounds | 3.82 | 4 | 0.08 | 3.85 | 4.15 | 0.04 |
| Teaching in simulation | 2.89 | 4.06 | <0.01 | 2.69 | 4.31 | <0.01 |
| Giving feedback | 3.33 | 3.67 | 0.01 | 3.38 | 3.92 | 0.01 |

*Did Not Attend Workshop (n = 10), left; Attended Workshop (n = 22), right*

| Teaching Skill | Pre‐program | Post‐program | P Value | Pre‐program | Post‐program | P Value |
|---|---|---|---|---|---|---|
| Teaching on rounds | 4 | 4.1 | 0.34 | 3.64 | 4 | <0.01 |
| Teaching on wards outside rounds | 4 | 4 | 1.00 | 3.76 | 4.1 | <0.01 |
| Teaching in simulation | 2.89 | 4.11 | <0.01 | 2.77 | 4.18 | <0.01 |
| Giving feedback | 3.56 | 3.78 | 0.17 | 3.27 | 3.77 | <0.01 |

*Received Feedback From Resident Leaders Only (n = 11), left; Received Faculty Feedback (n = 21), right*

| Teaching Skill | Pre‐program | Post‐program | P Value | Pre‐program | Post‐program | P Value |
|---|---|---|---|---|---|---|
| Teaching on rounds | 3.55 | 3.82 | 0.19 | 3.86 | 4.14 | 0.01 |
| Teaching on wards outside rounds | 4 | 4 | 1.00 | 3.75 | 4.1 | <0.01 |
| Teaching in simulation | 2.7 | 3.8 | <0.01 | 2.86 | 4.33 | <0.01 |
| Giving feedback | 3.2 | 3.6 | 0.04 | 3.43 | 3.86 | <0.01 |

*PGY‐II (n = 13), left; PGY‐III (n = 9), right*

| Teaching Skill | Pre‐program | Post‐program | P Value | Pre‐program | Post‐program | P Value |
|---|---|---|---|---|---|---|
| Teaching on rounds | 3.38 | 3.85 | 0.03 | 4.22 | 4.22 | 1 |
| Teaching on wards outside rounds | 3.54 | 3.85 | 0.04 | 4.14 | 4.14 | 1 |
| Teaching in simulation | 2.46 | 4.15 | <0.01 | 3.13 | 4.13 | <0.01 |
| Giving feedback | 3.23 | 3.62 | 0.02 | 3.5 | 3.88 | 0.08 |
Intern Learners' Assessment of Resident Facilitators and the Program Overall
During the course of the program, intern learners completed 166 DASH ratings evaluating 34 resident facilitators (see Supporting Information, Appendix V, in the online version of this article). Ratings for the 6 DASH items ranged from 6.49 to 6.73 (7‐point scale), demonstrating a high level of facilitator efficacy across multiple domains. No differences in DASH scores were noted among subgroups of resident facilitators described in the previous paragraph.
Thirty‐eight of 52 intern learners (73%) completed the post‐program survey.
| Facilitator Behaviors | Very Often, >75% | Often, >50% | Sometimes, 25%–50% | Rarely, <25% | Never |
|---|---|---|---|---|---|
| Elicited emotional reactions, n = 38 | 18 (47%) | 16 (42%) | 4 (11%) | 0 (0%) | 0 (0%) |
| Elicited objectives from learner, n = 37 | 26 (69%) | 8 (22%) | 2 (6%) | 1 (3%) | 0 (0%) |
| Asked to share clinical reasoning, n = 38 | 21 (56%) | 13 (33%) | 4 (11%) | 0 (0%) | 0 (0%) |
| Summarized learning points, n = 38 | 31 (81%) | 7 (19%) | 0 (0%) | 0 (0%) | 0 (0%) |
| Spoke for less than half of the session, n = 38 | 8 (22%) | 17 (44%) | 11 (28%) | 2 (6%) | 0 (0%) |
All intern learners rated the overall simulation experience as either excellent (81%) or good (19%) on the post‐program evaluation (5 or 4 on a 5‐point Likert scale, respectively). All interns strongly agreed (72%) or agreed (28%) that the simulation sessions improved their ability to manage acute clinical scenarios. Interns reported that resident facilitators frequently utilized specific debriefing techniques covered in the RaT curriculum during the debriefing sessions (Table 5).
DISCUSSION
We describe a unique RaT program embedded within a high‐fidelity medical simulation curriculum for IM interns. Our study demonstrates that resident facilitators noted an improvement in their teaching and feedback skills, both in the simulation setting and on the wards. Intern learners rated residents' teaching skills and the overall simulation curriculum highly, suggesting that residents were effective teachers.
The use of simulation in trainee‐as‐teacher curricula holds promise because it can provide an opportunity to teach in an environment closely approximating the wards, where trainees have the most opportunities to teach. However, in contrast to true ward‐based teaching, simulation can provide predictable scenarios in a controlled environment, which eliminates the distractions and unpredictability that exist on the wards, without compromising patient safety. Recently, Tofil et al. described the first use of simulation in a trainee‐as‐teacher program.[17] The investigators utilized a 1‐time simulation‐based teaching session, during which pediatric fellows completed a teacher‐training workshop, developed and served as facilitators in a simulated case, and received feedback. The use of simulation allowed fellows an opportunity to apply newly acquired skills in a controlled environment and receive feedback, which has been shown to improve teaching skills.[18]
The experience from our program expands on that of Tofil et al., as well as previously described trainee‐as‐teacher curricula, by introducing a component of deliberate practice that is unique to the simulation setting and has been absent from most previously described RaT programs.[3] Most residents had the opportunity to facilitate the same case on multiple occasions, allowing them to receive feedback and make adjustments. Residents who facilitated 5 or more sessions demonstrated more improvement, particularly in teaching outside of simulation, than residents who facilitated fewer sessions. It is notable that PGY‐II resident facilitators reported an improvement in their teaching skills on the wards, though less pronounced than in the simulation‐based environment, suggesting that the benefits of the program may extend to nonsimulation‐based settings. Additional studies focusing on objective evaluation of ward‐based teaching are needed to further explore this phenomenon. Finally, the self‐reported improvements in medical knowledge by resident facilitators may represent another benefit of our program.
Analysis of learner‐level data collected in the postcurriculum intern survey and DASH ratings provides additional support for the effectiveness of the RaT program. The majority of intern learners reported that resident facilitators frequently used the techniques covered in our program during debriefings. In addition, DASH scores clustered around maximum efficacy for all facilitators, suggesting that residents were effective teachers. Although we cannot directly assess whether the differences demonstrated in resident facilitators' self‐assessments translated to their teaching or were apparent from the learners' perspective, these results support the hypothesis that the self‐assessed improvements in teaching and feedback skills were meaningful.
In addition to improving resident teaching skills, our program had a positive impact on intern learners as evidenced by intern evaluations of the simulation curriculum. While utilizing relatively few faculty resources, our program was able to deliver an extensive and well‐received simulation curriculum to over 50 interns. The fact that 40% of second‐ and third‐year residents volunteered to teach in the program despite the early morning timing of the sessions speaks to the interest that trainees have in teaching in this setting. This model can serve as an important and efficient learning platform in residency training programs. It may be particularly salient to IM training programs where implementation of simulation curricula is challenging due to large numbers of residents and limited faculty resources. The barriers to and lessons learned from our experience with implementing the simulation curriculum have been previously described.[10, 11]
Our study has several limitations. Changes in residents' teaching skills were self‐assessed, which may be inaccurate, as learners may overestimate their abilities.[19] Although we collected data on the experiences of intern learners that supported residents' self‐assessment, further studies using more objective measures (such as the Objective Structured Teaching Exercise[20]) should be undertaken. Aside from residents' self‐assessment, we did not assess improvement of residents' teaching skills on the wards. Due to the timing of survey administration, some residents had as little as 1 month between completion of the curriculum and responding to the post‐curriculum survey, limiting their ability to evaluate their teaching skills on the wards. The transferability of the skills gained in simulation‐based teaching to teaching on the wards deserves further study. Without a control group, we cannot definitively attribute perceived improvement of teaching skills to the RaT program. However, the frequent use of recommended techniques during debriefing, which are not typically taught in other settings, supports the efficacy of the RaT program.
Our study did not allow us to determine which of the 3 components of the RaT program (workshop, facilitation practice, or direct observation and feedback) had the greatest impact on teaching skills or DASH ratings, as those who facilitated more sessions also completed the other components of the program. Furthermore, there may have been a selection bias among facilitators who facilitated more sessions. Because only 16 of 34 participants completed both the pre‐program and post‐program self‐assessments nonanonymously, we were not able to analyze the effect of pre‐program factors, such as prior teaching experience, on program outcomes. It should also be noted that allowing resident facilitators the option to complete the survey nonanonymously could have biased our results. The simulation curriculum was conducted in a single center, and resident facilitators were self‐selecting; therefore, our results may not be generalizable. Finally, the DASH instrument was only administered after the RaT workshop and was likely limited further by the ceiling effect created by the learners' high satisfaction with the simulation program overall.
In summary, our simulation‐based RaT program improved resident facilitators' self‐reported teaching and feedback skills. Simulation‐based training provided an opportunity for deliberate practice of teaching skills in a controlled environment, which was a unique component of the program. The impact of deliberate practice on resident teaching skills and optimal methods to incorporate deliberate practice in RaT programs deserves further study. Our curriculum design may serve as a model for the development of simulation programs that can be employed to improve both intern learning and resident teaching skills.
Acknowledgements
The authors acknowledge Deborah Navedo, PhD, Assistant Professor, Massachusetts General Hospital Institute of Health Professions, and Emily M. Hayden, MD, Assistant Professor, Department of Emergency Medicine, Massachusetts General Hospital and Harvard Medical School, for their assistance with development of the RaT curriculum. The authors thank Dr. Jenny Rudolph, Senior Director, Institute for Medical Simulation at the Center for Medical Simulation, for her help in teaching us to use the DASH instrument. The authors also thank Daniel Hunt, MD, Associate Professor, Department of Medicine, Massachusetts General Hospital and Harvard Medical School, for his thoughtful review of this manuscript.
Disclosure: Nothing to report.
1. Liaison Committee on Medical Education. Functions and structure of a medical school: standards for accreditation of medical education programs leading to the M.D. degree. Washington, DC, and Chicago, IL: Association of American Medical Colleges and American Medical Association; 2000.
2. Accreditation Council for Graduate Medical Education. ACGME program requirements for graduate medical education in pediatrics. Available at: http://www.acgme.org/acgmeweb/Portals/0/PFAssets/2013-PR-FAQ-PIF/320_pediatrics_07012013.pdf. Accessed June 18, 2014.
3. A systematic review of resident‐as‐teacher programmes. Med Educ. 2009;43(12):1129–1140.
4. The utility of simulation in medical education: what is the evidence? Mt Sinai J Med. 2009;76:330–343.
5. National growth in simulation training within emergency medicine residency programs, 2003–2008. Acad Emerg Med. 2008;15:1113–1116.
6. Simulation center accreditation and programmatic benchmarks: a review for emergency medicine. Acad Emerg Med. 2010;17(10):1093–1103.
7. How much evidence does it take? A cumulative meta‐analysis of outcomes of simulation‐based education. Med Educ. 2014;48(8):750–760.
8. Resident‐as‐teacher: a suggested curriculum for emergency medicine. Acad Emerg Med. 2006;13(6):677–679.
9. Deliberate practice and the acquisition and maintenance of expert performance in medicine and related domains. Acad Med. 2004;79(10 suppl):S70–S81.
10. Pilot program utilizing medical simulation in clinical decision making training for internal medicine interns. J Grad Med Educ. 2012;4:490–495.
11. How we implemented a resident‐led medical simulation curriculum in a large internal medicine residency program. Med Teach. 2014;36(4):279–283.
12. There's no such thing as "nonjudgmental" debriefing: a theory and method for debriefing with good judgment. Simul Healthc. 2006;1:49–55.
13. Improving faculty feedback to resident trainees during a simulated case: a randomized, controlled trial of an educational intervention. Anesthesiology. 2014;120(1):160–171.
14. Introduction to debriefing. Semin Perinatol. 2013;37(3):166–174.
15. Response‐shift bias: a source of contamination of self‐report measures. J Appl Psychol. 1979;4:93–106.
16. Center for Medical Simulation. Debriefing assessment for simulation in healthcare. Available at: http://www.harvardmedsim.org/debriefing‐assesment‐simulation‐healthcare.php. Accessed June 18, 2014.
17. A novel iterative‐learner simulation model: fellows as teachers. J Grad Med Educ. 2014;6(1):127–132.
18. Direct observation of faculty with feedback: an effective means of improving patient‐centered and learner‐centered teaching skills. Teach Learn Med. 2007;19(3):278–286.
19. Unskilled and unaware of it: how difficulties in recognizing one's own incompetence lead to inflated self‐assessments. J Pers Soc Psychol. 1999;77:1121–1134.
20. Reliability and validity of an objective structured teaching examination for generalist resident teachers. Acad Med. 2002;77(10 suppl):S29–S32.
© 2015 Society of Hospital Medicine