
Evidence Digest

Evidence to Support the Use of Patient Simulation to Enhance Clinical Practice Skills and Competency in Health Care Professionals and Students

Bernadette Mazurek Melnyk, RN, PhD, CPNP/NPP, FAAN, FNAP

The purpose of Evidence Digest, a recurring column in Worldviews on Evidence-Based Nursing, is to provide concise summaries of well-designed and/or clinically important recent studies along with implications for practice, research, administration, and/or health policy. Articles highlighted in this column may include quantitative and qualitative studies, systematic and integrative reviews, outcomes evaluation studies, as well as consensus statements by expert panels. Along with relevant implications, the level of evidence generated by the studies or reports highlighted in this column (see Figure 1) is included at the end of each summary so that readers can integrate the strength of evidence into their health care decisions.

SIMULATION IMPROVES ACUTE CARE CRITICAL ASSESSMENT AND MANAGEMENT SKILLS

Steadman R.H., Coates W.C., Huang Y.M., Matevosian R., Larmon B.R., McCullough L. & Ariel D. (2007). Critical Care Medicine, 34(1), 151–157.

Purpose: The purpose of this study was to determine whether full-scale simulation is superior to interactive problem-based learning for teaching medical students acute care assessment and management skills.

Design: A randomized controlled trial with two experimental groups (i.e., a simulation group and a problem-based learning group).

Sample/Setting: Thirty-one fourth-year medical students at a school in the western United States.

Method: Medical students were randomly assigned to either a one-week acute care simulation learning group or a problem-based learning group. At baseline, all subjects received a simulator-based initial assessment that tapped their critical care skills, which was conducted by two blinded observers who used a standardized checklist for their assessments. Before the intervention sessions, all students in both groups received didactic lectures on the test topic, dyspnea, and a control topic (i.e., abdominal pain). The simulation group then learned about dyspnea using a simulator, whereas the comparison group used a problem-based learning format (i.e., a case study without the use of simulation or medical equipment). Students were then post-tested on a unique dyspnea scenario using simulation and the standardized checklist. Each checklist comprised assessment (history and physical), diagnostic evaluation, and management items that were scored in a yes/no (i.e., performed/not performed) format. Point values were assigned to each item by a group of experts from emergency medicine and anesthesiology, and higher point values were assigned to items labeled critical actions, generating a weighted score.
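
To make the weighted checklist scoring concrete, here is a minimal illustrative sketch in Python. The item names and point values are hypothetical (the study's actual checklist and weights are not reproduced in this summary); it simply shows how performed/not-performed items, weighted by expert-assigned points, roll up into a single score, with critical actions carrying more weight.

def weighted_score(performed, points):
    """Sum the expert-assigned points for every checklist item marked as performed."""
    return sum(points[item] for item, done in performed.items() if done)

# Hypothetical point values; critical actions carry higher weights.
points = {
    "obtains focused history": 2,
    "auscultates the lungs": 3,
    "orders chest radiograph": 3,
    "administers supplemental oxygen (critical action)": 5,
    "performs needle decompression when indicated (critical action)": 5,
}

# Example: a student completes everything except the final critical action.
performed = {item: True for item in points}
performed["performs needle decompression when indicated (critical action)"] = False

print(weighted_score(performed, points))  # 13 of a possible 18 points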

Results: Baseline assessment scores were similar in both groups. However, post-test scores indicated that the simulation group performed significantly better than the problem-based learning group, in that the mean change scores improved by 25 points in the simulation group versus 8 points in the problem-based learning group.

Commentary with Implications for Action in Clinical Practice, Education, and Future Research. Unlike many other studies of simulation, a major strength of this investigation is that it used a randomized controlled trial to test the efficacy of simulation versus problem-based learning on students' acute care assessment and management skills; the randomized controlled trial is the strongest design for supporting cause and effect relationships as well as for controlling confounding variables. Therefore, more confidence can be placed in the findings that simulation is a more effective teaching methodology than problem-based learning in enhancing students' acute care assessment and management skills. Furthermore, although the sample size was small in this study, the positive effect of simulation learning was large enough for the detection of significant differences between the groups. This study provides strong evidence that health care educational programs should consider routinely incorporating simulation as a key strategy for teaching important assessment and management skills in their curricula. Replication of this study with other health care provider students (e.g., nurses and physician assistants) would enhance the generalizability of the findings to other health care professions.

• Level I: Evidence from a systematic review or meta-analysis of all relevant randomized controlled trials (RCTs), or evidence-based clinical practice guidelines based on systematic reviews of RCTs

• Level II: Evidence obtained from at least one well-designed RCT

• Level III: Evidence obtained from well-designed controlled trials without randomization

• Level IV: Evidence from well-designed case-control and cohort studies

• Level V: Evidence from systematic reviews of descriptive and qualitative studies

• Level VI: Evidence from a single descriptive or qualitative study

• Level VII: Evidence from the opinion of authorities and/or reports of expert committees

Modified from Guyatt & Rennie, 2002; Harris et al., 2001

Figure 1. Rating system for the hierarchy of evidence (from Melnyk & Fineout-Overholt 2005).

Level of Evidence: II

SIMULATION TO REDUCE ERRORS IN OBSTETRICAL EMERGENCIES

Maslovitz S., Barkai G., Lessing J.B., Ziv A. & Many A. (2007). Recurrent obstetric mistakes identified by simulation. Obstetrics and Gynecology, 109(6), 1295–1300.

Purpose: The purpose of this study was to evaluate a four-session simulation-based training program for labor and delivery teams, designed to help them recognize and handle common mistakes made during obstetric emergencies.

Design: Pre- and post-test; one-group pre-experimental design.

Sample/Setting: Sixty residents in obstetrics and gynecology and 88 midwives were enrolled in a simulation course to manage obstetrical emergencies, with 42 teams (i.e., one resident and two midwives) completing all four sessions. The residents were in the first 3 years of a six-year training program. The midwives had a mean delivery room experience of 2.5 years. The mean ages of the residents and midwives were 30.8 and 34.1 years, respectively. The setting was the Israeli Center for Medical Simulation, which is equipped with computerized low-technology simulators and high-tech mannequins.

Methods: The four-session simulation curriculum focused on common obstetrical management errors, including delays in transporting bleeding patients to the operating room, poor cardiopulmonary resuscitation techniques, shoulder dystocia, and delayed administration of blood products to reverse consumption coagulopathy. Tutors observed the simulations and completed checklists that comprised a comprehensive list of actions required of the teams. Trainees were graded on a scale of 0–100 by a scoring system based on the completed checklists. Questionnaires that provided feedback were handed out to the trainers and trainees before and after the sessions to determine the usefulness of and satisfaction with the course. Common and recurring mistakes were identified by reviewing hours of videotape and summing the data from the checklists.
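
As an illustration of how recurring mistakes can be tallied from checklist data of this kind, here is a minimal hypothetical sketch in Python; the error labels and team results are invented for the example and do not come from the study.

from collections import Counter

# Hypothetical per-team results: each list holds the required actions a team
# failed to perform across its simulated scenarios.
missed_actions_by_team = [
    ["delayed blood product administration", "poor CPR technique"],
    ["delayed transport to the operating room", "delayed blood product administration"],
    ["delayed blood product administration"],
]

# Count how often each error recurs across teams to surface the most common ones.
error_counts = Counter(action for misses in missed_actions_by_team for action in misses)
for action, count in error_counts.most_common():
    print(f"{action}: missed by {count} of {len(missed_actions_by_team)} teams")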

Findings: On the pre-assessment questionnaire, 68% of the health care providers reported that they were not trained to take independent action in any of the four selected emergencies. Sixty-four percent of them had never been required to take charge of such a situation in real life. However, 82% of the providers reported that their theoretical knowledge of these situations was satisfactory.

The majority of teams (i.e., 76%) scored in the range of 65–80 on all four scenarios, with the lowest scores for postpartum hemorrhage and eclamptic seizure. There were no differences in scoring among first-, second-, and third-year residents, although grades were significantly higher in more experienced (i.e., more than 6 years of training) midwives than in less experienced midwives. Eighteen residents went through a repeat simulation-based training day 6 months after the first; the scores of these trainees were significantly higher than their first scores. Post-session feedback from the trainees indicated that there were substantial differences between their theoretical knowledge and practice skills. Eighty-four percent of the trainees who said they were confident at baseline in their ability to perform well in emergency obstetrical situations retracted their original statements.

Commentary with Implications for Action in Clinical Practice and Future Research. In today's health environment, major emphasis is being placed upon hospitals to ensure high-quality and safe patient care. Thus, institutions are accelerating their efforts to determine the most effective strategies to decrease adverse events with high-risk patients. This study's findings identified errors and areas for improvement in high-risk obstetrical situations with residents and midwives through the use of patient simulation. Because this research was not a randomized controlled trial and used a one-group design, the internal validity (i.e., being able to conclude that it was the intervention that caused a change in the outcomes) is weak. However, the data generated are compelling in suggesting that simulation can assist in the identification of potential medical errors, as well as improve health care providers' ability to better manage emergency situations. Future rigorous research with randomized controlled trials is needed to continue to document the positive outcomes of simulation versus other education and training on a variety of health care provider and patient outcomes.

Level of Evidence: VI

PATIENT SIMULATION AND COMPETENCY DEVELOPMENT IN A NURSE RESIDENCY PROGRAM

Beyea S.C., von Reyn L. & Slattery M.J. (2007). A nurse residency program for competency development using human patient simulation. Journal for Nurses in Staff Development, 23(2), 77–82.

Purpose: The purpose of this study was to determine the impact of a new graduate registered nurse residency program that used patient simulation on nurses' competency, confidence, and readiness for entry into practice.

Design: Preliminary outcomes evaluation after program implementation.

Sample/Setting: Forty-two nurse residents from a northeastern U.S. medical center participated in the study. No information was provided about the demographics of the new graduate nurse residents.

Methods: A 12-week nurse residency program was created that comprised weekly didactic presentations, weekly structured simulation experiences, and clinical time with a preceptor. The content of the program included professional development, continuous quality improvement, teamwork and collaboration, patient safety, and self-directed learning within the context of health systems, information management, safety, and clinical/functional dimensions.

Scenario-based simulations were used to develop clinical and critical thinking skills in three tracks: (1) medical-surgical, (2) pediatrics/pediatric critical care, and (3) adult critical care. Experts throughout the health care system identified high-risk conditions and situations, as well as commonly occurring processes, that would provide instrumental learning for entry into practice and positively affect failure-to-rescue conditions (e.g., chest tube insertion/care, COPD with pneumonia, deep vein thrombosis, lethal dysrhythmias, seizure, and diabetes). All simulations addressed patient safety, including human factors, communication, resource management, and situational awareness. After each simulation, the nurse residents engaged in a debriefing about their performance with their supervising educators.

Each week, the nurses rated their level of confidence, competence, and readiness to provide independent nursing care to patients with the symptoms they had studied and practiced that week, using a visual analog scale from extremely low to extremely high. At the end of the residency program, the nurses' competencies also were assessed by the educators and unit leadership. The nurse residents also completed the Nurse Resident's Readiness for Entry into Practice Competence Questionnaire at baseline, at the mid-point of the program, and at completion of the program; this is a 53-item scale that taps: (1) nurse/client relationship, (2) illness/injury prevention, and (3) curative/supportive care.

Findings: In total, 95% of the nurses reported that they enjoyed the simulations and that the simulations should be part of the residency program. The nurses' scores for confidence, competence, and readiness increased from the second to the tenth week. Units reported that the nurse residents had developed better skills and a greater understanding of their role as new graduates than those hired previously.

The length of orientation was 14.7 weeks in the medical/surgical track and 22 weeks in adult and pediatric critical care, in comparison to the standard orientation period of 26 weeks, which was typical before the residency program was implemented.

Commentary with Implications for Action in Clinical Practice and Future Research. Although this study was not a randomized controlled trial, which is the strongest design for testing cause and effect relationships, the preliminary findings indicate that the use of patient simulations in a nurse residency program could possibly lead to a higher level of confidence and competence for independent clinical practice in new nurse graduates. In an era of severe nursing shortages in multiple countries across the globe and a multitude of medical errors, the use of simulation in new orientation residency programs could be a promising strategy to reduce the length of orientation and increase competence in the practice of new graduates. Randomized controlled trials are needed to determine whether this type of residency program is more efficacious in promoting competency than other types of orientation programs.

Level of Evidence: VI


SIMULATION TO IMPROVE TRAUMA ASSESSMENT AND MANAGEMENT SKILLS

Ali J., Adam R.U., Sammy I., Ali E. & Williams J.I. (2007). The simulated trauma patient teaching module—does it improve student performance? The Journal of Trauma: Injury, Infection, and Critical Care, 62, 1416–1420.

Purpose: The purpose of this study was to evaluate the effectiveness of simulated Trauma Evaluation and Management (TEAM) program modules in comparison to control learning modules in enhancing student knowledge and skills in assessing and managing trauma patients.

Design: Randomized controlled trial.

Sample/Setting: A total of 70 final-year medical students at the University of the West Indies.

Methods: Students were invited to participate in the old TEAM and new TEAM programs and were randomly assigned to either experimental or control groups in each of the old and new TEAM programs. Two trained standardized live subjects provided the simulation learning experiences in the new TEAM program: one 32-year-old man who sustained a tension pneumothorax and closed head injury from a fall, and a 40-year-old taxi driver who sustained left rib fractures, a hemothorax, and a ruptured spleen from a motor vehicle accident. Both of these patients were in shock. Thirty-two and 34 objective criteria, respectively, were used to standardize the two patient assessment and management learning scenarios, following established Objective Structured Clinical Examination scenarios.

Students completed a 20-item trauma multiple-choice questionnaire examination before and after their assigned programs. All students also completed a five-item questionnaire, using a five-point Likert scale from "strongly disagree" to "strongly agree," to rate: (1) whether the objectives of the program were met, (2) whether trauma knowledge was improved, (3) whether trauma skills were improved, (4) their degree of satisfaction with the course, and (5) whether the course should be made mandatory in the curriculum.

Findings: Post-test scores on the examination were significantly increased after both the old and new TEAM programs, but there was a significantly greater increase after the new TEAM program that used the patient simulations. In the old TEAM program, 51.6% of the students rated improvement in their trauma skills at four or greater compared with 97.3% in the new program. A greater number of students in the old TEAM program requested more hands-on teaching. Eighty-five percent of the students in the new TEAM program scored at a level to receive an honors pass mark compared with none in the old TEAM program.

Commentary with Implications for Action in Clinical Practice, Education, and Future Research. A major strength of this study is the use of a randomized controlled trial design, which enhances the internal validity of the study. However, an important limitation is that the investigators chose to include only self-reported measures of the students' knowledge and perception of clinical skills. Inclusion of an objective measure of the students' performance would have allowed more confidence to be placed in the findings, especially if the observations of performance converged with the students' self-reported data. Therefore, both self-reported and objective measures should be incorporated in similar future studies. Despite this limitation, the findings indicated that students gained more from the new TEAM program that included simulated learning experiences. Incorporation of simulation in health care professional education programs appears to be an effective teaching method for enhancing knowledge and skills in the assessment and management of acute care situations.

Level of Evidence: II

References

Guyatt G. & Rennie D. (2002). Users' guides to the medical literature. Washington, DC: American Medical Association Press.

Harris R.P., Helfand M., Woolf S.H., Lohr K.N., Mulrow C.D., Teutsch S.M. & Atkins D. (2001). Current methods of the U.S. Preventive Services Task Force: A review of the process. American Journal of Preventive Medicine, 20(Suppl 3), 21–35.

Melnyk B.M. & Fineout-Overholt E. (2005). Evidence-based practice in nursing & healthcare: A guide to best practice. Philadelphia: Lippincott, Williams & Wilkins.
