Development of Distance Simulation Educator Guidelines in Healthcare: A Delphi Method Application

The abrupt disruption of in-person instruction in health care during the COVID-19 pandemic prompted new methods in health professions education.1,2 In the absence of direct patient contact under mandated social distancing guidelines, transitioning non-COVID–related simulation activities to distance teaching became the primary means to train health care providers and students at several institutions.1,2 The use of distance simulation grew rapidly during this sudden transition, even though distance simulation is still being established as a methodology.3

Distance simulation is a learning method closely related to distance learning and eLearning and has been used for several years.4–13 The need for educator development for eLearning and distance learning environments was recognized,14–17 leading to the offering of the first Ph.D. program in eLearning methodology in 2006 by the School of eLearning Science.18 Educator development in the online environment increases educator satisfaction, students' satisfaction, enrollment, and the educators' intrinsic desire for learning.19–21 Although training in distance learning in medical and health care fields has been established,18,22,23 training in distance simulation education has not been formally studied. The literature demonstrates that health care educators' training, as part of their faculty development, significantly improves their capabilities to interact with the newest technologies and associated technological nuances.1,24–26 However, the pandemic revealed that educators were inadequately trained in creating and facilitating online instructional sessions and in providing psychological and technical support for non–in-person instruction, among several other challenges.1,27–29 Buléon et al.1 reported that up to 82% of their respondents would like to use some form of distance simulation on a long-term basis. This raises the question: are simulation educators competent, without additional training, to use the distance simulation modality for the transfer of learning and to measure the growth and development of learners in distance simulation settings?

Distance Simulation Educator

After careful consideration of various distance simulation definitions8,30–32 and simulation modalities, we adapted the definition of distance simulation from the Healthcare Distance Simulation Collaboration1 because it better served the overarching goals of this research by including the types and subtypes of simulations we wished to explore. The Collaboration began in 2020 after the onset of the pandemic to explore and develop innovative approaches to distance simulation education. It includes the modalities of distance, tele, remote, and extended realities (XR) including virtual reality, mixed and augmented reality, digital real-time gamification (but not 2-dimensional screen-based simulations), and any hybrid form of these simulations under the term “distance simulation”.33 For this study, we define distance simulation educator (“distance sim educator”) as a person who uses simulation methods in real time for health care professionals in a virtual, online, or digital environment, using evidence-based practices and strategies to educate participants to the highest standards of care in the skill of patient management.

In the absence of established elements or characteristics of distance sim educators, this study sought to develop guidelines that could serve as the basis for such qualities. The questions to ask at this juncture are: (1) What are the basic competencies needed of a simulation educator who conducts simulation at a distance? (2) How do they differ from competencies identified for in-person simulation educators? (3) How can the basic competencies evolve into advanced competencies? (4) How do advanced competencies compare with the basic competencies? The aim of this study was to develop, through international consensus, and disseminate the first set of competencies required of and unique to effective distance sim educators.

METHODOLOGY

Study Design

This was a multiphasic modified Delphi study that iteratively validated the content of synthesized literature (a proposed set of distance simulation educator guidelines presented in a white paper34) by obtaining consensus from subject matter experts. This project followed the Cumyn and Harris35 model of obtaining Delphi consensus to develop distance sim educator, or faculty development, guidelines. The study required 3 rounds to reach consensus. The researchers were from various health care backgrounds and were experienced in-person and distance simulation educators.

Target Population

This study targeted simulation educators with experience in distance simulation. This included accomplished educators practicing distance simulation prepandemic, novice but early adopters of distance simulation in the current health care landscape, educational leaders, and experts in online learning environments.

Sampling Strategy

Inclusion and exclusion criteria were developed broadly to ensure relevant expertise was included without narrowing the parameters to a specific health care field, thereby increasing the generalizability of the guidelines. Considering the length of the document, it was divided into 4 sections after the 4 domains of the Certified Healthcare Simulation Educator (CHSE) blueprints36 to be validated separately by 4 groups of experts. In sampling and setting up the groups, we ensured that all 4 groups were heterogeneous yet contained field experts for each domain. Although heterogeneity is coveted in a sample, having experts with established credibility was necessary for our Delphi study.37 Therefore, two thirds of the participants in each group were chosen randomly using online randomizing software.38 The remaining one third were placed according to the expertise needed to make informed decisions during that group's discussion. For example, we placed a technology expert in the technology domain's group and a human factors expert in the group whose domain addressed human factors.
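The allocation described above, roughly two thirds of the pool randomized across groups and one third placed purposively by expertise, can be sketched as follows. This is a minimal illustration, not the study's actual procedure (which used online randomizing software38); the function name, data layout, and round-robin balancing are our assumptions.

```python
import random

def allocate_experts(pool, purposive_placements, n_groups=4, seed=None):
    """Allocate experts to Delphi groups.

    pool: experts to be randomized across groups (about two thirds of all
    participants). purposive_placements: dict mapping group number -> experts
    placed there deliberately (e.g. a technology expert in the group
    validating the technology domain).
    """
    rng = random.Random(seed)
    shuffled = pool[:]
    rng.shuffle(shuffled)
    # Seed each group with its purposively placed experts.
    groups = {g: list(purposive_placements.get(g, [])) for g in range(1, n_groups + 1)}
    # Distribute the randomized pool round-robin to keep group sizes even.
    for i, expert in enumerate(shuffled):
        groups[(i % n_groups) + 1].append(expert)
    return groups
```

For example, allocating a randomized pool of 12 experts with one technology specialist placed in group 4 yields 3 randomized experts per group, plus the specialist in group 4.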

Inclusion Criteria

- A working professional in health care simulation in the capacity of: simulation educator, simulation operations specialist, simulation industry expert, simulation administrator, simulation researcher, instructional designer, human factors specialist, or immersive technology (XR) programmer or specialist
- Experience working in in-person and/or distance simulation: in-person simulation for more than 4 years; distance simulation for more than 1 year; having conducted and developed distance simulations
- Simulationists from different countries who could understand and communicate in English
- In-person attendees of the International Meeting of Simulation in Healthcare (IMSH) 2022

Exclusion Criteria

- Participants who could not understand or communicate in English
- Prelicensure students
- Experts unable to complete surveys or attend IMSH 2022 in person

Recruitment Method

Experts were recruited from members of the Distance Simulation Collaboration, the Research and Accreditation Committees of the Society for Simulation in Healthcare (SSiH), and via snowball sampling.

Ethical Consideration

Approval from the institutional review board of Indiana University was sought, and this study was deemed exempt.

Approach to Content Validation

The establishment of content validity is foundational for all steps of content development for any research.39,40 Therefore, it was closely followed during the entire process. The flow diagram of the process followed is shown in Figure 1.

FIGURE 1:

Flow diagram of the steps in the Delphi process. This diagram shows the progress of the study from the structuring of guidelines version 0.0 through version 3.0 of the distance simulation educator guidelines across the 3 rounds of the Delphi process.

Step 1: Synthesis of Distance Sim Educator Guidelines—A White Paper

Before this study, we published proposed distance sim educator guidelines version 0.0 as a white paper.34 These guidelines were constructed after thorough deliberation among the researchers, at the culmination of a rigorous examination of 25 relevant professional fields or roles intersecting distance simulation education, such as simulation educator, operations specialist, educational technologist, telesimulation specialist, XR specialist, and simulation application programmer. The research was performed through job analyses of these professional roles, currently available education (both matriculated, in the form of a degree, and unmatriculated, in the form of short courses), and certifications. Furthermore, field experts were sought and interviewed to inform the individual rubrics for these professional roles. After careful consideration, the concepts of social justice, equity, diversity, and inclusion (JEDI) and human factors, along with more specific technology-based competencies, were included.

Structure of Distance Sim Guidelines Document

With consensus among the principal investigators and accompanying researchers, the CHSE blueprints36 were used as a guide to divide the document into 4 individual sections, or domains, while the National Institute for Health and Care Excellence (NICE) Guidelines Manual41 was used to structure the individual competencies and subcompetencies. The 4 domains, named after the CHSE,36 had 66 competencies with 216 basic and 179 advanced subcompetencies:

Domain 1: Professional Values and Capabilities
Domain 2: Healthcare and Simulation Knowledge and Principles
Domain 3: Educational Principles Applied to Distance Simulation
Domain 4: Simulation Resources and Environments

The CHSE blueprints36 were chosen as the guide because of their preestablished significance as a simulation education standard in the simulation community, nationally and globally. Conscious effort was made to use the same verbiage as the CHSE blueprints to scaffold the distance simulation competencies on the current simulation educator competencies. The guidelines had 2 levels of competency: the basic or competent level, which every distance sim educator should aim to possess when executing any distance simulation activity, and the advanced or expert level, which a distance sim educator may strive for with continuous professional development.

Step 2: Delphi Round 1—Online Survey of Simulation Experts

The newly added concepts of JEDI, human factors, and more comprehensive technology-based competencies were discussed with the attendees of the 2021 Distance Simulation Summit in October 2021.42 The input was incorporated into the draft to create guidelines version 1.0,34 which was used in the online survey during the first round to begin the Delphi process.

Data Collection

Because of the length of the document, it was divided into 4 parts, each consisting of 1 domain. Four questionnaires, in the form of 4 online surveys modeled after each domain, were constructed and distributed. Therefore, 4 Delphi processes were executed concurrently and independently with 4 sets of experts. Each group evaluated 1 questionnaire (1 domain) throughout the entire process, indicating agreement by choosing one of the options "Keep, Modify, or Delete". The data were collected and stored using Qualtrics software.43 Demographic information was obtained from participants, although their identities were kept confidential. The number of participants varied across groups and rounds. Please see Table 1 for the number of attendees in round 1 for all 4 Delphi groups.

TABLE 1 - Number of Attendees in Round 1 for 4 Delphi Groups (N = 67*)

Round 1          Group 1  Group 2  Group 3  Group 4
Experts invited  23       22       22       21
Responded        18       18       17       14
Response rate    78%      82%      77%      67%

*Nine experts were invited to respond to all surveys.


Step 3: Delphi Round 2—an In-Person Meeting

At the conclusion of the first round of the online survey, the feedback was collated, and kappa was calculated individually for all subcompetencies to gauge consensus. Modifications were made in light of the feedback to create guidelines version 2.0, which was presented to the experts at the in-person meeting for round 2 during the IMSH 2022 conference in Los Angeles, CA.

In the second round, the international experts voted anonymously using printed voting ballots with the aim of achieving consensus in a more time-efficient manner. Attendance was significantly affected by the COVID-19 pandemic, with many institutions enforcing travel restrictions. The responses were collected, and kappa calculations were performed on-site to move the process to the third round of Delphi.

Step 4: Delphi Round 3—an In-Person Meeting

The third round of Delphi was conducted immediately after the conclusion of the second round, with experts discussing the undecided competencies in real time and making modifications. Because of the limited time, the third round was conducted as a nominal group technique44; all experts discussed the pros and cons of the subcompetencies for which agreement could not be achieved during rounds 1 and 2. This approach led to a commonly agreed-on statement for each such competency, culminating in a consensus vote. Traditionally, Delphi studies have all rounds executed asynchronously, without dialogue between scholars. The modified technique allowed consensus to be reached in a more timely and cost-effective manner without compromising research quality.5 This meeting lasted 5 hours. Please see Table 2 for the number of attendees in rounds 2 and 3 for all 4 Delphi groups.

TABLE 2 - Number of Attendees in Rounds 2 and 3 for 4 Delphi Groups

Rounds 2 and 3   Group 1  Group 2  Group 3  Group 4
Experts invited  18       18       17       14
Responded        8        9        7        9
Response rate    44%      50%      41%      64%

Step 5: Merging Individual Documents Into Guidelines 3.0

During the development of these guidelines, there were 4 Delphi processes occurring simultaneously, with 4 different groups of scholars working toward consensus. The finalized 4 individual documents were combined sequentially in order of domains 1 through 4 to construct one final set of guidelines. During data analyses, the researchers (M.B., R.A., A.H., M.M., and H.L.) tried to reduce the redundancy through constant comparison while matching the tone of the document without compromising the data integrity. Two researchers (M.B. and R.A.) performed the accuracy check at all stages of merging of data.

Validity and Reliability of Research

The credibility of the synthesized literature was established by repeated analyses during and after construction of the guidelines. During synthesis and consolidation, Frambach's quality criteria for qualitative and quantitative research45 were closely followed to ensure content validity by maintaining credibility (multiple data sources or data triangulation, researcher triangulation, member checking), transferability (meaningful findings described in detail, or thick description, and resonance with existing literature), dependability (collecting until no new data emerged, or saturation, and continuous reexamination of data, or iterative data analysis), and confirmability (looking for evidence to disconfirm the collective findings, peer debriefing, keeping documentation for reflection, and attending to the researchers' own influence, or reflexivity).

Reliability was estimated by calculating the level of interrater agreement for each item; a kappa coefficient was computed as a more robust measure of agreement because it tests whether agreement exceeds chance levels.35,46 An increasing kappa value indicates the stability of the experts' views within a group and between rounds.47 A free-marginal kappa was chosen instead of a fixed-marginal kappa because survey responders were not asked to assign a certain number of items to each category.35,48 A Fleiss kappa (for 3 or more raters) of 0.60 was chosen to reflect sufficient agreement among the experts, although a kappa of 0.70 is more in line with health care literature findings.35,49 The decision to lower the kappa threshold was made because the pandemic caused some attendees to decline to attend the in-person session during IMSH 2022, which was an initial condition of participation. Lowering the threshold was deemed acceptable because this document, although related to health care, was not directly related to patient care.
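To illustrate the agreement statistic, the following is a minimal sketch of Randolph's free-marginal multirater kappa, assuming each item's votes are tallied as counts per category (eg, Keep/Modify/Delete). The function name and data layout are ours, not taken from the study.

```python
def free_marginal_kappa(item_counts, n_categories):
    """Randolph's free-marginal multirater kappa.

    item_counts: one list per survey item giving how many raters chose each
    category, e.g. [[6, 1, 0], [5, 2, 0]] for Keep/Modify/Delete votes.
    The number of raters may vary by item; each item needs >= 2 ratings.
    """
    per_item_agreement = []
    for counts in item_counts:
        n = sum(counts)  # raters who voted on this item
        # Proportion of agreeing rater pairs for this item (Fleiss-style).
        per_item_agreement.append(
            sum(c * (c - 1) for c in counts) / (n * (n - 1))
        )
    p_o = sum(per_item_agreement) / len(per_item_agreement)  # observed agreement
    p_e = 1.0 / n_categories  # chance agreement with free margins
    return (p_o - p_e) / (1 - p_e)
```

For a single item on which 8 of 9 raters choose the same one of 3 options, this yields a kappa of about 0.67, above the 0.60 threshold; unanimous agreement yields 1.00.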

RESULTS

Demographic Information

The participants were from 8 countries and had 2 to more than 30 years of experience conducting health care simulations. The participants identified themselves as physicians and nurses from various specialties, nurse educators, nurse practitioners, health professions educators, fire/emergency medical services (paramedics), simulation industry personnel, distance/online learning specialists, instructional designers, JEDI experts, human factors specialists, simulation researchers, educational technologists, VR programmers and education specialists, simulation technicians and specialists, and simulation administrators. Their experience of constructing or facilitating distance simulation was from 10 to more than 40 sessions in their professional careers. There were 67 experts in round 1 and 33 experts in rounds 2 and 3.

Quantitative Analysis

The compilation of the results of the final round led to guidelines version 3.0, which is being presented through this publication (see Table, Supplemental Digital Content 1, https://links.lww.com/SIH/A912 for comprehensive Healthcare Distance Simulation Educator Guidelines). The first version published34 was rather lengthy: 40+ pages, 66 guidelines, with 395 individual subcompetencies in 4 domains. By the end of the Delphi process, in guidelines version 3.0, the number of competencies changed from 66 to 59, basic subcompetencies from 216 to 196, and advanced subcompetencies from 179 to 182 (see Table 3 for a condensed version of these guidelines).

TABLE 3 - The Condensed Table of Guidelines: 59 Main Competencies Without Subcompetencies for Distance Simulation Educators

Domain 1: Professional Values and Capabilities
1. Demonstrate characteristics of a champion in distance simulation
2. Recognize opportunities to advocate for distance simulation
3. Demonstrate and cultivate respect in relationships with participants, faculty, and the community of distance simulation
4. Demonstrate characteristics of teamwork in distance simulation
5. Recognize ethical principles and personal responsibilities as they apply to distance simulation
6. Distinguish among the various roles of personnel involved in distance simulation
7. Demonstrate compliance with regulatory requirements related to distance simulation
8. Evaluate the credibility of resources in distance simulation education (eg, Web sites, listservs, literature)
9. Use credible resources to inform distance simulation practices
10. Explore the elements of research in distance simulation
11. Define elements of quality management related to distance simulation
12. Engage in professional development in distance simulation
13. Establish diversity and inclusivity in distance simulation

Domain 2: Healthcare and Simulation Knowledge and Principles
14. Leverage knowledge of the factors affecting participant engagement within a distance simulation activity (eg, learner level, realism, suspension of disbelief)
15. Examine opportunities to integrate distance simulation into education, research, and practice
16. Differentiate the phases of a distance simulation activity
17. Align feedback methods with distance learning type
18. Differentiate elements of debriefing in distance simulation (reflection, facilitation, phases)
19. Differentiate among distance simulation modalities
20. Distinguish among applications of distance simulation for individual(s), team(s), and system(s)
21. Differentiate elements of realism for distance simulation
22. Recognize stressors contributing to individual and team performance (eg, cognitive, affective/emotional, psychomotor) when using distance simulation
23. Describe the concept of human factors as it applies to distance simulation
24. Identify roles for distance simulation to improve patient safety

Domain 3: Educational Principles Applied to Distance Simulation
25. Distinguish principles of using distance simulation as an educational tool (eg, learning and digital taxonomies, assessment, learning theories)
26. Integrate instructional design concepts into distance simulation activities
27. Incorporate needs assessment into distance simulation activities to meet the learners' and organizational needs (consider psychomotor, technical, behavioral, and cognitive aspects)
28. Define goals of distance simulation activities
29. Create measurable learning objectives of distance simulation activities
30. Identify and integrate assessment methods pertinent to distance simulation
31. Prepare orientation and prebriefing/briefing for participants and simulation team for distance simulation
32. Plan logistics for distance simulation activities (eg, people, supplies, timing)
33. Balance risks and outcomes in distance simulation (eg, real vs. simulated equipment/supplies/hardware/software)
34. Design the case/scenario applicable to distance simulation
35. Select distance simulation modality/modalities appropriate for the learning objectives and the learners
36. Select the virtual or digital locations to conduct the distance simulation activity
37. Identify required resources in distance simulation (eg, personnel, equipment, supplies, etc) according to the overarching goals of the intended simulation
38. Collaborate within and across the teams to coordinate distance simulation in the light of educational theories, framework, and principles
39. Assemble distance simulation-specific resources (eg, scenario, SP case, teaching script, programming list, any modality-specific requirements)
40. Conduct pilot activity for distance simulations (ie, dress rehearsal, field test, run-through)
41. Plan for evaluation of the distance simulation activity
42. Create and maintain a psychologically safe distance simulation environment
43. Perform distance simulation debriefing
44. Manage evolving simulation needs during the simulation session
45. Manage physical and psychological risks during distance simulation
46. Participate in distance simulation team debriefing and feedback
47. Analyze distance simulation activity evaluations
48. Modify future distance simulation activities based on analyzed evaluations
49. Apply reliability and validity in distance simulation
50. Recognize the unique criteria for developing and implementing distance simulation in interprofessional education (Sim-IPE) activities

Domain 4: Simulation Resources and Environments
51. Identify and use appropriate technologies (technological architecture) in distance simulation
52. Work with SPs in distance simulation
53. Establish relationships with distance simulation technology stakeholders
54. Acquire skills in multimedia in distance simulation in accordance with localized and institutional needs and desires
55. Recommend modifications to distance simulation facility/program to improve outcomes
56. Manage distance simulation technical and material problems (eg, connectivity, video capture, simulator failures, supplies, technical requirements)
57. Recognize and report gaps, needs, and/or opportunities for a distance simulation program (eg, equipment, staffing, policies)
58. Identify how specific factors impact operational changes in distance simulation (eg, purchases, staffing, logistics, policies)
59. Use distance simulation resources effectively and efficiently (eg, money, people, space)

At the conclusion of the in-person meeting, kappa scores for all competencies reached consensus status after 3 rounds and ranged from 0.60 to 1.00. In a few instances, an item was presented to the experts in the next round even though it had achieved consensus in the previous round; this was caused by adding new subitems, moving subitems from one competency to another, or moving an item from basic to advanced or vice versa. For example, for subcompetency 33.1, a kappa of 0.67 was established in the first round. However, in the feedback from that round, the experts raised salient points that led to fact checking within the literature and with other experts. After this critical appraisal, new subcompetencies were added, which is why these guidelines went back for further consensus discussion and subsequent votes (see Table, Supplemental Digital Content 2, https://links.lww.com/SIH/A913 for a detailed table of the kappa values of all 4 groups for all 3 rounds).

Key Findings

A thorough analysis led to several competencies being refined, reworded, or even deleted after discussion during the rounds.

Domain 1: Professional Values and Capabilities

The experts finalized 13 competencies with subcompetencies for basic and advanced-level educators. A new competency was champions in distance simulation (competency number 1, comp #1), encompassing overall professional values for both basic and advanced-level distance simulation educators. We also added 2 crucial competencies to the first domain: professional development (comp #12) and diversity, inclusion, and equity (comp #13), to bring attention to these significant aspects of optimal professional environments. The experts helped refine the concept of diversity and inclusion according to the changing landscape of the learning environment, in light of the social justice and equity issues faced by participants in the distance learning environment.

Domain 2: Health Care and Distance Simulation Knowledge and Principles

In this domain, 11 health care and distance simulation knowledge-based competencies were finalized. The competency of human factors (comp #23), added initially after an extensive literature search and expert input, was thoroughly analyzed during the second and third phases of the Delphi, leaving a generic description as a future research avenue because this construct is still developing as it relates to distance simulation. The competency of modeling was deleted in the final guidelines version 3.0 and was left for future researchers because the experts felt unqualified to write this competency without knowing the depth and breadth of modeling in the realm of distance simulation.

Domain 3: Educational Principles Applied to Distance Simulation

This domain has 26 competencies covering the basics of the simulation process and its phases. The competencies of interprofessional education (comp #50) and debriefing (comp #43 and #46) in distance simulation environments were elaborated. The evaluation and assessment of the activity and participants (comp #30, #41, and #47) have dedicated competencies and subcompetencies to improve the process and learning. After consensus, the principle of ethics (comp #5), originally in the third domain of the original CHSE blueprints,36 was moved into the first domain of "Professional Values and Capabilities" with its basic and advanced-level competencies in distance simulation environments.

Domain 4: Distance Simulation Resources and Environments

This domain was modified greatly and has 9 competencies focused on technology, considering its fundamental role in distance simulation. Identifying and employing appropriate technologies (comp #51), establishing relationships with technology stakeholders (comp #53), and acquiring multimedia skills (comp #54) were among the competencies retained from the original CHSE blueprints36; however, they were expanded in light of distance simulation technologies and environments. After expert consensus, the competency of working with standardized participants (SPs) in the distance simulation environment (comp #52) was combined with telesimulation. The SP competency was added to this domain because this instructional method was quite prevalent during the pandemic; SP methodology, being less costly than other forms of distance simulation, was felt to be within reach of the average low-resource simulation program, warranting its own place.

DISCUSSION

These guidelines shed light on several previously discussed competencies as well as some newer concepts. Because of space limitations, we discuss only a few concepts that we deemed important, such as psychological safety, debriefing, standardized patients in distance settings, and technological challenges. We also discuss what we considered growth from the original CHSE blueprints.36 Participants explored other competencies of the document, but discussing them is beyond the scope of this article.

Growth From the CHSE Blueprint

The work behind structuring distance sim educator guidelines has brought attention to the need for simulation educators to be prepared to deal with the nuances generated by the distance between instructors and learners. This goes beyond CHSE blueprints36 expectations of an average certified simulation educator. The newly introduced constructs include but are not limited to the role of champions in distance simulation, continuous professional development, human factors, and the concepts of JEDI in the distance environment.

As a result of the rapid growth of distance simulation as a solution during the pandemic, the concept of champions of distance sim arose out of a need to identify leaders focused on the development of the training and attributes necessary for the execution of successful distance simulation. Emerging literature3,33 has suggested this is a priority for simulation educators moving forward in the current educational environment.

The need for professional development is supported by the literature1,50 for several reasons, leading us to include this element in the guidelines. Going beyond these benefits, this concept invites educators, institutional leaders, and other stakeholders to recognize and provide opportunities for educators to grow professionally. This could be achieved by integrating these guidelines into onboarding efforts and continuing education programs for educators to climb the institutional promotional ladder.

The significance of the concept of diversity, inclusion, and equity is supported by the literature,51,52 leading to its dedicated competency in the hope that the educators would be cognizant of this construct and cultivate respect, acceptance, and inclusivity of the diverse learners. The concept of human factors was also discussed considering it was a hot topic of debate during the second annual summit of Distance Simulation Collaboration.42 As a result of a dearth of research about human factors in distance simulation, the competency was left with a loosely structured framework to follow the best available practices without much detail of what those best practices would be. This presents a wonderful opportunity for future research and innovation as the field grows.

Psychological Safety and Debriefing

Psychological safety becomes paramount in distance learning given the absence of physical presence and the existence of factors not faced in the in-person environment. In a physical space, learners and educators share one common learning environment, that is, the simulation area, in which the proceedings are visible to everyone most of the time. However, in a distance setting, every participant brings their own environment into the "invisible learning space", leaving other participants unaware of who and what is around in that space with them or, more importantly, how everyone is faring in it as the simulation case progresses.53,54 The potential dangers of failing to achieve psychological safety extend to the critical learning (or lack thereof) that occurs in the debriefing process. Considering this, several competencies are dedicated to psychological safety, debriefing in general, and debriefing in teams to prepare both basic and advanced-level educators to keep participants "safe" during the entire simulation experience in a distance learning environment.

Standardized Participants and Telesimulation

The use of standardized participants in telesimulation, a form of distance simulation, has become prevalent in light of COVID-19–induced instructional changes.1 This is likely for several reasons, including but not limited to its wide applicability to clinical scenarios for preprofessional students (including history taking, communication, and teamwork training), its relative ease of learning and implementation compared with other modalities, and its overall cost-effectiveness.1,55 The widespread use of this submodality calls for educator training to leverage this opportunity to its maximum potential with minimum risk to the psychological or physical safety of participants, which ultimately led to a dedicated competency in this document.

Technology and Environment-Based Competencies

The various combinations in which emerging technologies can be used lead to creative distance simulation solutions and associated challenges.1 The inherent dangers of the "distance" environment can compound these challenges, underscoring the need for educator training in handling emerging technologies. Researchers have identified the lack of educators' technological training as one of the factors contributing to the challenges the simulation community faced during the widespread adoption of distance simulation at the beginning of COVID-19, and have advocated for more faculty development.1,56–59 We hope that establishing these technology-related competencies will lay a foundation for training faculty, in tandem with the rest of the simulation team, to understand and mitigate these challenges so that ultimately all simulation team members gain this new skill set.

Several other constructs, part of the CHSE blueprint,36 were expanded on in these guidelines and will be further discussed with the academic community in the form of other manuscripts and national presentations.

LIMITATIONS

The inability of several experts to join the in-person Delphi session because of a surge in the COVID-19 pandemic was the most significant limitation of this study. This prevented several scholars and simulation experts from contributing their expertise, leading to a smaller number of participants in each group. As a result, each domain had only 7 to 9 participants, with a total of 33 participants providing their expert opinions in the final round. Although this was still within the acceptable range of participants for reliability, more participants would likely have led to richer discussion and deeper insight.60 In addition, this study focuses primarily on the competencies of distance simulation educators; further research focused on specific competencies for simulation operations personnel and simulation administrators is therefore needed. Furthermore, for logistical reasons, we did not include experts who could not communicate in English. This might have introduced a selection bias and prevented us from including several non–English-speaking experts, thus limiting the generalizability of this study.

CONCLUSIONS

This work was the result of research spanning numerous domains of study that culminated in a Delphi content validation process to bring the first distance simulation guidelines to the academic community. This document highlights the current gap in health care simulation educators' proficiency in using distance simulation. We anticipate that these guidelines will pave the way to provide standardized and equitable knowledge, training, and education to those using distance simulation as a modality to reach learning outcomes safely and reliably. This document also provides a foundation for future research on the individual constructs highlighted in this work.

ACKNOWLEDGMENTS

We would like to thank the MGH Institute of Health Professions Simulation Track for providing us with this wonderful opportunity to conduct research on this much-needed aspect of health care simulation. The PhD candidates of the MGH Institute of Health Professions Simulation Track would like to express their gratitude for the excellent guidance of our professors throughout this process, which led to this research work in distance simulation.

We would also like to thank the Fellows Academy of the Society for Simulation in Healthcare, the Research Committee, and the Planning Committee of the society for supporting this endeavor and providing us the opportunity to conduct this study at the International Meeting on Simulation in Healthcare 2022. This thank you note would not be complete without acknowledging the support of The Distance Simulation Collaboration.

Finally, this work would not have been possible without the collective wisdom and expertise of our byline authors. We would like to thank all the byline authors of this study who contributed to the preparation of this document: Wesley Lockhart, Maura Polanski, Brittany J. Daulton, Susan Seibold-Simpson, and Johnny Cartright. We would also like to thank all the byline authors from the Healthcare Distance Collaboration who worked diligently and served as content experts during the Delphi process: Michelle Aebersold, Mariju Baluyot, Matthew Charnetski, Alexander Croft, Ellen S. Deutsch, Yue Dong, Teresa Gore, Edgar Herrera, Peggy P. Hill, Suzan "Suzie" Kardong-Edgren, Jared M. Kutzin, Deborah Lee, Kim Leighton, Wesley Lockhart, Connie M. Lopez, Mary E. Mancini, Sally A. Mitchell, Ivette Motola, Vinay Nadkarni, Nicole Novotny, John M. O'Donnell, Yasuharu Okuda, Rosemary Samia, Mary Ann Shinnick, Mary Kay Smith, and Elizabeth Wells-Beede.

REFERENCES

1. Buléon C, Caton J, Park YS, et al. The state of distance healthcare simulation during the COVID-19 pandemic: results of an international survey. Adv Simul (Lond) 2022;7(1):10.
2. Wagner M, Jaki C, Löllgen RM, et al. Readiness for and response to coronavirus disease 2019 among pediatric healthcare providers: the role of simulation for pandemics and other disasters. Pediatr Crit Care Med 2021;22(6):e333–e338.
3. Duff J, Kardong-Edgren S, Chang TP, et al. Closing the gap: a call for a common blueprint for remote distance telesimulation. BMJ Simul Technol Enhanc Learn 2021;7(4):185–187.
4. Hughes M, Gerstner B, Bona A, et al. Adaptive change in simulation education: comparison of effectiveness of a communication skill curriculum on death notification using in person methods versus a digital communication platform. AEM Educ Train 2021;5(3):e10610.
5. Rosasco J, Hanson Z, Kramer J, et al. A randomized study using telepresence robots for behavioral health in interprofessional practice and education. Telemed J E Health 2021;27(7):755–762.
6. Poland S, Frey JA, Khobrani A, et al. Telepresent focused assessment with sonography for trauma examination training versus traditional training for medical students: a simulation-based pilot study. J Ultrasound Med 2018;37(8):1985–1992.
7. Ciullo A, Yee J, Frey JA, et al. Telepresent mechanical ventilation training versus traditional instruction: a simulation-based pilot study. BMJ Simul Technol Enhanc Learn 2018;5(1):8–14.
8. McCoy CE, Sayegh J, Alrabah R, Yarris LM. Telesimulation: an innovative tool for health professions education. AEM Educ Train 2017;1(2):132–136.
9. Ahmed RA, Atkinson SS, Gable B, Yee J, Gardner AK. Coaching from the sidelines: examining the impact of teledebriefing in simulation-based training. Simul Healthc 2016;11(5):334–339.
10. Prescher H, Grover E, Mosier J, et al. Telepresent intubation supervision is as effective as in-person supervision of procedurally naive operators. Telemed J E Health 2015;21(3):170–175.
11. Ahmed R, King Gardner A, Atkinson SS, Gable B. Teledebriefing: connecting learners to faculty members. Clin Teach 2014;11(4):270–273.