Online Course

Nurs 791 - Instructional Strategies and Assessment

Module 6: Learner Moderated Strategies

Clinical Simulation

Authors - Mary K. Fey, PhD, RN, CHSE and Amy Daniels, MS, RN, CHSE, PhD Candidate

Clinical Simulation: Introduction

Simulation is a technique that creates a situation or environment to allow persons to experience a representation of a real event for the purpose of practice, learning, evaluation, testing, or to gain understanding of systems or human actions (Healthcare Simulation Dictionary, 2016). Various theories inform the practice of simulation. Kolb’s Experiential Learning Theory (1984) is one of the foundational theories that underpin the practice of simulation. Kolb posits that learners engage in a concrete experience that is then followed by a period of reflection on the experience. This reflection allows the learner to consider new perspectives as the concrete experience is integrated with current knowledge. Experience, reflection, and the formation of new perspectives ideally allow the learner to perform better in future similar situations.

Clinical Simulation: Purpose, Method, and Modalities

The various simulation activities that take place in healthcare are classified according to the taxonomy of purpose, method, and modality (Palaganas, Maxworthy, Epps, & Mancini, 2014). Choices made with regard to these elements are primarily driven by the educator’s objectives.

Purpose

Simulation can be used for multiple purposes, including procedural training to teach and test psychomotor skills, team training that focuses on teamwork and communication, and the application of clinical decision-making skills. Other purposes include evaluation of learners, research, testing facilities to identify latent errors, and, in nursing education, replacement of traditional clinical hours.

Method

Method refers to how a simulation will be facilitated. This is generally dictated by the objectives and the level of the learner, but can also be influenced by the resources available in the simulation lab. Facilitation encompasses activities before, during, and after the simulation scenario. Before the scenario, consideration is given to preparing learners, including pre-work and creating a psychologically safe environment. During the scenario, facilitation methods vary based on the purpose (e.g., teaching/learning, evaluation, testing of a facility) and the level of the learner (e.g., novice vs. experienced, one profession vs. interprofessional).

Facilitation after the simulation addresses the method of debriefing and/or feedback (Franklin et al., 2013).

Modalities

Modality refers to the type of equipment used to accomplish the objectives of the simulation session. For example, if the objective of the experience is psychomotor skill training, a partial-body “task trainer” may be the most effective modality. Task trainers are lower in cost and often more anatomically correct than full-body manikins. Simulations for team training often use a full-body manikin. When simulation objectives include communication, Standardized Patients (SPs) are often the best choice. SPs are actors who are trained to portray a specific patient situation in a consistent way. Hybrid simulation involves a combination of modalities, e.g., an SP with a partial task trainer such as the PROMPT birth simulator.

Fidelity and Realism

Fidelity refers to the accuracy with which the simulated environment represents reality. The term is often used simply to refer to the technological capabilities of the manikin, using the labels high-, mid-, and low-fidelity. The higher the fidelity, the more accurately a manikin mimics human functions, e.g., vital signs, the ability to breathe, and other physical findings.

However, fidelity goes beyond the manikin and is a concept that is important to the entire simulation experience. It can be thought of in three ways: physical (how real it looks, feels, smells, etc.), emotional and experiential (whether it recreates the stress and tension of the actual clinical environment), and conceptual (whether all the parts of the story fit together in a sensible way) (Rudolph, Simon, & Raemer, 2007). The three types of fidelity interact with each other during simulated experiences in different ways. Dieckmann, Gaba, and Rall (2007) note that when a simulation “works” for learners, they are able to experience the simulation as relevant to the learning goals despite differences from the real clinical environment.

If the simulation “works,” then participants experience the scenario as relevant to the goal of the session and are able to make semantic sense of it despite its physical differences from the clinical situation. This perception, on the part of the learner, that the simulation “works” is called realism. To enhance this sense of realism, simulation educators establish a “fiction contract” with learners. The fiction contract is an explicit and collaborative agreement in which the educator and learner both acknowledge that the simulated environment is not real, and learners agree to “act as if” it is real to the best of their ability. This agreement may foster learner engagement in the simulation (Rudolph, Raemer, & Simon, 2014).

Features and Best Practices of Simulation-Based Medical Education

Based on two comprehensive reviews of simulation-based medical education research (Issenberg, McGaghie, Petrusa, Gordon, & Scalese, 2005; McGaghie, Issenberg, Petrusa, & Scalese, 2010), the following have been identified as best practices:

  1. Feedback – feedback on performance normally occurs during debriefing. This feedback is most frequently formative (i.e., for the purpose of improving practice), but may be summative in nature (i.e., pass/fail).
  2. Deliberate practice – focused, repetitive practice that is followed by precise, informative feedback. Learners integrate the feedback as they continue to practice, refining the skill as they advance toward mastery.
  3. Curriculum integration – simulation should be integrated into the curriculum carefully along with parallel didactic and clinical education.
  4. Outcome measurement – as with any educational intervention, reliable and valid outcome measures are required.
  5. Simulation fidelity – the degree to which the simulated environment mimics the real environment.
  6. Skill acquisition and maintenance – clinical skill acquisition is the most common learning objective of simulation.
  7. Mastery learning – incorporating deliberate practice, mastery learning’s goal is to ensure that all learners accomplish all educational objectives with little or no outcome variation.
  8. Transfer to practice – demonstrating that skills acquired in simulation transfer to real clinical settings.
  9. Team training – with communication at the root of many medical errors, simulation provides a method to teach teamwork skills.
  10. High stakes testing – testing in which decisions such as passing or failing a course or program are made based on the outcome.
  11. Instructor training – although there is no evidence supporting required training for simulation facilitators, teaching with simulation is not easy or intuitive; clinical experience is not a proxy for simulation instructor effectiveness.
  12. Educational and professional context – contextual factors such as practice setting, faculty expertise in training with simulation, and overall institutional support for simulation programs can impact the success or failure of simulation.

The Role of the Educator in Simulation

In a classic description of simulation-based education, David Gaba (2004) refers to simulation as a “technique, not a technology.” This technique provides a unique opportunity for experiential learning. Creating and facilitating this type of learning experience is usually outside the skill set of the typical health professions educator. Simulation integrates knowledge from diverse fields, including psychology, adult learning, organizational behavior, and education. In recognition of the complexity of facilitating simulation-based education, professional organizations uniformly recognize the need for formal training and competence assessment, especially related to facilitating debriefing discussions (Decker et al., 2013; Society for Simulation in Healthcare, 2014; Alexander et al., 2015).

Preparation and Training

The International Nursing Association for Clinical Simulation and Learning (INACSL) simulation standards provide guidelines for the preparation and training of simulation educators. These guidelines are found in Standard V: Facilitator. The guiding statement of this standard establishes that simulation educators receive focused training in the form of “formal coursework, continuing education offerings, and targeted work with an experienced mentor” (Boese et al., 2013).

Resources for simulation educator training are increasing. Programs providing this training range from one-day workshops to graduate degrees focused on simulation education. Current evidence does not identify any one training length or modality as best practice. The National Council of State Boards of Nursing (NCSBN) published guidelines for simulation educator training and assessment. These guidelines, incorporating elements of the INACSL standards, state that simulation faculty should demonstrate evidence of formal training through attending simulation conferences, completing coursework on simulation instruction, training by a consultant, and/or targeted work with an experienced mentor. The NCSBN guidelines include a detailed checklist for simulation faculty preparation (Alexander et al., 2015).

Standards of Best Practice: Simulation℠

The International Nursing Association for Clinical Simulation and Learning published the Standards of Best Practice: Simulation in 2009, with an updated version published in 2013. The Standards provide a way to assess simulation programs and determine their adherence to best practices. They can also serve as a framework for quality improvement initiatives. There are currently nine Standards:

  1. Terminology
  2. Professional Integrity of Participants
  3. Participant Objectives
  4. Facilitator
  5. Facilitation
  6. Debriefing Process
  7. Participant Assessment and Evaluation
  8. Simulation-enhanced Interprofessional Education
  9. Simulation Design

Each of the Standards is structured in the same way and includes:

Rationale – Justification for the development of a standard.

Outcome – Intended result(s) of adhering to the standard.

Criteria – Factors such as attributes, characteristics, and/or parameters necessary to meet the outcome(s) of the standard.

Guidelines – Procedures or principles that are not mandatory but are used to assist in meeting standards. Guidelines are not necessarily comprehensive; they provide a framework for developing policies and procedures.

Psychological Safety

Originating in the domain of organizational behavior (Edmondson, 1999), psychological safety has become a concept used in simulation education through the work of Rudolph, Raemer, and Simon (2014). Psychological safety in simulation education refers to the learning environment as interpreted by the learners. A psychologically safe learning environment is created and maintained when learners sense that it is safe to take interpersonal risks for the sake of learning without fear of punishment, ridicule, or humiliation (Rudolph, Raemer, & Simon, 2014).

Simulation educators establish a psychologically safe learning environment by setting clear boundaries, expectations, and goals; establishing a fiction contract; attending to logistical details; and conveying respect for learners and a genuine interest in their perspectives. Creating this environment prior to the simulation experience, in what may be termed the briefing or pre-briefing, sets the stage for learner engagement (Simon et al., 2007).

Initiating this safe learning environment is not enough. Once the environment is established, the educator must maintain it by continuing to demonstrate respect for learners and by remaining open to discussing mistakes, thoughts, and emotions from a position of curiosity about learners’ perspectives, frames of reference, and points of view, without shaming or humiliating them (Rudolph, Raemer, & Simon, 2014).

Feedback and Debriefing

Based on Kolb’s Experiential Learning Cycle, and incorporating concepts from reflective (Schon, 1983) and transformative learning theories (Mezirow, 1991), debriefing and feedback are critical elements of experiential learning. In fact, several studies have demonstrated that learning does not occur in simulation in the absence of debriefing (Mahmood & Darzi, 2004; Savoldelli et al., 2006; Shinnick & Woo, 2010).

Although the terms are sometimes used interchangeably, feedback and debriefing are distinctly different. Feedback is a one-way conversation in which the facilitator provides the learner with information about their performance for the purpose of future improvement. Debriefing is a facilitated, bidirectional reflective discussion in which the facilitator seeks to understand the thought processes driving the decisions the learner made during the experience (Sawyer, Eppich, Brett-Fleegler, Grant, & Cheng, 2016). Feedback can be incorporated into larger debriefing conversations.

Several theory-based debriefing methods currently in use include:

  • Debriefing With Good Judgment (Rudolph, Simon, Dufresne & Raemer, 2006)
  • Debriefing for Meaningful Learning (Dreifuerst, 2012)
  • Gather, Analyze, Summarize (GAS) (O’Donnell, et al., 2009)
  • Promoting Excellence And Reflective Learning in Simulation (PEARLS) (Eppich & Cheng, 2015)

Current and Future Issues in Simulation

National Simulation Study

In 2014, the National Council of State Boards of Nursing released the results of the multi-year, multi-site National Simulation Study (Hayden, Smiley, Alexander, Kardong-Edgren, & Jeffries, 2014). In this study, prelicensure nursing student clinical groups at 10 schools of nursing were randomly assigned to replace traditional clinical hours with simulation experiences. Groups were randomized to replace 10%, 25%, or 50% of clinical hours with simulation. The results of this study “provide substantial evidence that substituting high-quality simulation experiences for up to half of traditional clinical hours produces comparable end-of-program educational outcomes and new graduates that are ready for clinical practice” (p. S3). Replacement of clinical hours with simulation must be conducted under conditions similar to those in the study, including faculty trained in the pedagogy of simulation and theory-based debriefing, and simulation programs that adhere to the INACSL Standards of Best Practice.

In follow-up to the study, the NCSBN published guidelines for the use of simulation in prelicensure nursing programs. The major elements of these guidelines include:

  • a commitment on the part of the school to support the simulation program with financial resources, a plan for curriculum integration, and evaluation of the simulation program;
  • appropriate facilities for conducting simulation;
  • the educational and technological resources to meet the intended outcomes; and
  • faculty and personnel who are qualified to conduct simulation (Alexander et al., 2015).

Simulation Research

The Society for Simulation in Healthcare hosted the first Research Consensus Summit in 2011. The goal of the Summit was to provide guidance for simulation-related research, and the meeting resulted in recommendations regarding a variety of topics. A primary recommendation was that the conceptual and theoretical bases of simulation-related research, as well as the methods used, need to be more explicitly described in future publications (Dieckmann et al., 2011). It was acknowledged that a mix of research methods should be used, and that the method must be matched to the goal of each study. Consensus statements regarding specific research-related topics included:

  • simulation for learning and teaching procedural skills;
  • simulation-based team training;
  • simulation research design;
  • human factors research;
  • instructional design and pedagogy;
  • the impact of simulation on translational patient outcomes;
  • assessing learning outcomes;
  • debriefing;
  • simulation based assessment and regulation of healthcare professionals; and
  • reporting inquiry in simulation.

Reporting guidelines for simulation research and recommendations for evaluating the quality of research reports have since been published. In an extension of the CONSORT and STROBE reporting statements, Cheng et al. (2016) published reporting guidelines for healthcare simulation research. The quality of published research reports can be evaluated using the Simulation Research Rubric (Fey, Gloe, & Mariani, 2015).

Simulation Education Outcomes

The Kirkpatrick Training Evaluation Model (1994) identifies four levels for objectively measuring training effectiveness: Reaction, Learning, Behavior, and Results. Published simulation research has addressed the four levels as follows:

  • The Reaction Level measures how the training was received by the audience. Did the trainees view the training as a valuable experience? Was the material presented in a way that was enjoyable and engaging? Current simulation literature is inundated with participant evaluation reports at this level (Adamson, Kardong-Edgren, & Willhaus, 2013).
  • The Learning Level evaluates whether trainees gained knowledge during the training experience. Have the learners met the objectives set forth in the simulation encounter? A recent integrative review of simulation evaluation in undergraduate nursing education identified skills and knowledge as one of five themes in publications focusing on simulation evaluation (Foronda, Liu, & Bauman, 2013).
  • The Behavior Level concentrates on how trainees change their behavior as a result of the instruction they receive. Evaluation of simulation outcomes at this level is limited; the most recent findings at this level come from the NCSBN National Simulation Study (Hayden, Smiley, Alexander, Kardong-Edgren, & Jeffries, 2014).
  • The Results Level provides trainers the opportunity to analyze the final results of their instructional program. In simulation, the strongest approach to assessing the impact of a program is connecting the teaching to patient outcomes. This level of evaluation remains challenging, and the simulation evaluation literature suggests it be the focus of future research in healthcare education (Adamson, Kardong-Edgren, & Willhaus, 2013; Foronda, Liu, & Bauman, 2013).

As the practice of simulation-based education continues to mature, evaluating its impact on patient outcomes will be critical in demonstrating its value.

References

Adamson, K., Kardong-Edgren, S., & Willhaus, J. (2013). An Updated Review of Published Simulation Evaluation Instruments. Clinical Simulation in Nursing, 9(9), e393-e400.

Alexander, M., Durham, C. F., Hooper, J. I., Jeffries, P. R., Goldman, N., Kesten, K. S., ... & Tillman, C. (2015). NCSBN simulation guidelines for prelicensure nursing programs. Journal of Nursing Regulation, 6(3), 39-42.
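
Boese, T., Cato, M., Gonzalez, L., Jones, A., Kennedy, K., Reese, C., Decker, S., Franklin, A. E., Gloe, D., Lioce, L., Sando, C. R., Meakim, C., & Borum, J. C. (2013, June). Standards of Best Practice: Simulation Standard V: Facilitator. Clinical Simulation in Nursing, 9(6S), S22-S25.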

Cheng, A., Kessler, D., Mackinnon, R., Chang, T. P., Nadkarni, V. M., Hunt, E. A., Duval-Arnould, J., Lin, Y., Cook, D. A., Pusic, M., & Hui, J. (2016). Reporting guidelines for health care simulation research: Extensions to the CONSORT and STROBE statements. BMJ Simulation and Technology Enhanced Learning. Advance online publication.

Cheng, A., Morse, K. J., Rudolph, J., Arab, A. A., Runnacles, J., & Eppich, W. (2016). Learner-Centered Debriefing for Health Care Simulation Education: Lessons for Faculty Development. Simulation in Healthcare, 11(1), 32-40.

Cheng, A., Eppich, W., Grant, V., Sherbino, J., Zendejas, B., & Cook, D. A. (2014). Debriefing for technology‐enhanced simulation: a systematic review and meta‐analysis. Medical education, 48(7), 657-666.

Cook, D.A. and Beckman, T.J. (2015). High-value, cost-conscious medical education. JAMA pediatrics, 169(2), 109-111.

Decker, S., Fey, M., Sideras, S., Caballero, S., Rockstraw, L. (R.), Boese, T., Franklin, A. E., Gloe, D., Lioce, L., Sando, C. R., Meakim, C., & Borum, J. C. (2013, June). Standards of Best Practice: Simulation Standard VI: The debriefing process. Clinical Simulation in Nursing, 9(6S), S27-S29. http://dx.doi.org/ 10.1016/j.ecns.2013.04.008.

Dieckmann, P., Gaba, D., & Rall, M. (2007). Deepening the theoretical foundations of patient simulation as social practice. Simulation in Healthcare, 2(3), 183-193.
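
Dieckmann, P., Phero, J. C., Issenberg, S. B., Kardong-Edgren, S., Ostergaard, D., & Ringsted, C. (2011). The first Research Consensus Summit of the Society for Simulation in Healthcare: Conduction and a synthesis of the results. Simulation in Healthcare, 6(7), S1-S9.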

Dreifuerst, K. T. (2012). Using debriefing for meaningful learning to foster development of clinical reasoning in simulation. Journal of Nursing Education, 51(6), 326-333.
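
Edmondson, A. (1999). Psychological safety and learning behavior in work teams. Administrative Science Quarterly, 44(2), 350-383.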

Eppich, W., & Cheng, A. (2015). Promoting Excellence and Reflective Learning in Simulation (PEARLS): development and rationale for a blended approach to health care simulation debriefing. Simulation in Healthcare, 10(2), 106-115.

Fey, M. K., Gloe, D., & Mariani, B. (2015). Assessing the Quality of Simulation-Based Research Articles: A Rating Rubric. Clinical Simulation in Nursing, 11(12), 496-504.

Foronda, C., Liu, S., & Bauman, E. (2013). Evaluation of simulation in undergraduate nurse education: An integrative review. Clinical Simulation in Nursing, 9(10), e409-e416.

Franklin, A. E., Boese, T., Gloe, D., Lioce, L., Decker, S., Sando, C. R., Meakim, C., & Borum, J. C. (2013, June). Standards of Best Practice: Simulation Standard IV: Facilitation. Clinical Simulation in Nursing, 9(6S), S19-S21. http://dx.doi.org/10.1016/j.ecns.2013.04.011.

Gaba, D. M. (2004). The future vision of simulation in healthcare. Quality and Safety in Health Care, 13(Suppl 1), i2-i10.

Gates, M.G., Parr, M.B., and Hughen, J.E. (2012). Enhancing nursing knowledge using high-fidelity simulation. Journal of Nursing Education, 51(1) 9-15.

Hayden, J., Smiley, R., Alexander, M., Kardong-Edgren, S., & Jeffries, P. (2014). The NCSBN National Simulation Study: A longitudinal, randomized, controlled study replacing clinical hours with simulation in prelicensure nursing education. Journal of Nursing Regulation, 5(2), S4-S64.

Healthcare Simulation Dictionary.  (2016).  Downloaded 06/01/2016 from http://www.ssih.org/Dictionary

International Nursing Association for Clinical Simulation and Learning (2013). Standards of Best Practice: Simulation℠. Downloaded 6/09/2016 from http://www.nursingsimulation.org/issue/S1876-1399(13)X0013-1

Issenberg, S. B., McGaghie, W. C., Petrusa, E. R., Gordon, D. L., & Scalese, R. J. (2005). Features and uses of high-fidelity medical simulations that lead to effective learning: A BEME systematic review. Medical Teacher, 27(1), 10-28.

Kirkpatrick, D. L. (1994). Evaluating training programs: The four levels. San Francisco, CA: Berrett-Koehler.

Kolb, D. (1984). Experiential learning: Experience as the source of learning and development. Englewood Cliffs, NJ: Prentice-Hall.

Mahmood, T., & Darzi, A. (2004). The learning curve for a colonoscopy simulator in the absence of any feedback: No feedback, no learning. Surgical Endoscopy, 18(8), 1224-30. doi:10.1007/s00464-003-9143-4

McGaghie, W. C., Issenberg, S. B., Petrusa, E. R., & Scalese, R. J. (2010). A critical review of simulation-based medical education research: 2003-2009. Medical Education, 44(1), 50-63.

Mezirow, J. (1991). Transformative dimensions of adult learning. San Francisco, CA: Jossey-Bass.

O’Donnell, J., Rodgers, D., Lee, W., Edelson, D., Haag, J., Hamilton, M., ... & Meeks, R. (2009). Structured and supported debriefing. Dallas, Tex: American Heart Association.

Palaganas, J. C., Maxworthy, J. C., Epps, C. A., & Mancini, M. E. (2014). Defining excellence in simulation programs. Lippincott Williams & Wilkins.
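
Rudolph, J. W., Raemer, D. B., & Simon, R. (2014). Establishing a safe container for learning in simulation: The role of the presimulation briefing. Simulation in Healthcare, 9(6), 339-349.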

Rudolph, J. W., Simon, R., & Raemer, D. B. (2007). Which reality matters? Questions on the path to high engagement in healthcare simulation. Simulation in Healthcare, 2(3), 161-163.

Rudolph, J., Simon, R., Dufresne, R.L., and Raemer, D. (2006). There’s no such thing as non-judgmental debriefing: A theory and method for debriefing with good judgment. Simulation in Healthcare, 1(1), 49-55.

Savoldelli, G., Naik, V., Park, J., Joo, H., Chow, R., & Hamstra, S. (2006). Value of debriefing during simulated crisis management: Oral versus video-assisted oral feedback. Anesthesiology, 105(2), 279-85. doi:10.1097/00000542-200608000-00010

Sawyer, T., Eppich, W., Brett-Fleegler, M., Grant, V., & Cheng, A. (2016). More than one way to debrief: A critical review of healthcare simulation debriefing methods. Simulation in Healthcare, 11(3), 209-217.

Schon, D. A. (1983). The reflective practitioner: How professionals think in action (1st ed.) Basic Books, Inc.

Shinnick, M. A., & Woo, M. (2010). Debriefing: The most important component in simulation? Communicating Nursing Research, 43, 353-353.
