Understanding the Effects of Health Care Distance Simulation: A Systematic Review

Distance simulation and its related technologies have rapidly gained significance since the onset of the COVID-19 pandemic.1 Distance simulation refers to simulations in which learners and facilitators do not all share the same geographic or physical space at the same time.2 Although it has been used for many years in certain contexts, such as overcoming geographical distance (including provision of simulation or content expertise over a distance), distance simulation uptake accelerated sharply during the COVID-19 pandemic, when institutional and governmental restrictions limited access to simulation facilities and traditional in-person simulation activities. This shift placed many institutions in a reactive mode, working to address the immediate problem of learners who were midway through their programs and unable to continue the essential experiential components of their education.3 Other organizations needed the ability to reach learners who could no longer take part in in-person learning.4 As a result, the innovation and adoption of distance education technologies in simulation increased dramatically.1

This forced paradigm shift led to a corresponding increase in publications sharing potential solutions and studying their use and effectiveness,2 layered on top of a decades-old literature that focused on more specific aspects or applications of distance simulation. With such a sudden and significant increase, examining the current body of published works related to distance simulation and its role within the larger paradigm of simulation offerings became essential.2 An earlier scoping review5 began the work of mapping and understanding the state of the distance simulation literature but did not capture more recently published scholarship. The present systematic review represents the planned continuation of this earlier work.

In this systematic review, we aimed to describe the demographic and methodologic characteristics, as well as reported outcomes, of the distance simulation literature, specifically focusing on work conducted within a health care setting.

METHODS

This systematic review followed the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines.6 The specific methods and details of this systematic review were registered in PROSPERO, the International Prospective Register of Systematic Reviews (ID: 274837). Sources included bibliographic databases, reference lists of eligible studies and review articles, works that cited the eligible studies, simulation journals, conference proceedings, and Web sites of simulation organizations.

Search Strategy

The groundwork for this study came from a scoping review born during the onset of the COVID-19 pandemic out of the Healthcare Distance Simulation Collaborative, which screened and performed risk-of-bias assessment on 6969 articles. The intent of the current study was to systematically review the distance simulation literature for the 2023 Society for Simulation in Healthcare (SSH) Research Summit.7 Using the same inclusion and exclusion criteria as the scoping review, we performed 2 additional searches (September 2021 and January 2022) to keep our dataset as current as possible.

With the assistance of a research librarian, a search string was established (see document, Supplemental Digital Content 1, https://links.lww.com/SIH/A982, which shows search strings formatted for each database) and adapted for use in MEDLINE via PubMed, APA PsycINFO via OvidSP, CINAHL Complete via EBSCOhost, ERIC via EBSCOhost, Scopus, EMBASE, The Cochrane Database of Systematic Reviews, and Google Scholar. CitationChaser8 was used for forward and backward citation searching of articles that met the inclusion criteria. The search results were entered into Covidence Systematic Review Software.9 In addition, reviewers performed ancestry searches of reference lists and hand searches of simulation journals during the review.
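The review itself used the R package citationchaser and Covidence for this step. Purely as an illustration of the underlying idea of forward and backward citation chasing, the following minimal Python sketch queries the public OpenAlex API; this is an assumed, hypothetical substitute for illustration only, not a tool used in this review.

```python
# Illustrative only: the review used the R package citationchaser. This sketch
# shows the same idea (forward and backward citation searching) against the
# public OpenAlex API, an assumption for illustration.
import requests

OPENALEX = "https://api.openalex.org/works"


def backward_citations(work_id: str) -> list[str]:
    """Return IDs of works cited BY the given work (its reference list)."""
    work = requests.get(f"{OPENALEX}/{work_id}", timeout=30).json()
    return work.get("referenced_works", [])


def forward_citations(work_id: str, per_page: int = 200) -> list[str]:
    """Return IDs of works that cite the given work, following cursor pagination."""
    citing = []
    cursor = "*"
    while cursor:
        resp = requests.get(
            OPENALEX,
            params={"filter": f"cites:{work_id}", "per-page": per_page, "cursor": cursor},
            timeout=30,
        ).json()
        citing.extend(result["id"] for result in resp.get("results", []))
        cursor = resp.get("meta", {}).get("next_cursor")
    return citing


if __name__ == "__main__":
    # W2741809807 is the example work ID used in the OpenAlex documentation.
    seed = "W2741809807"
    print(len(backward_citations(seed)), "references;",
          len(forward_citations(seed)), "citing works")
```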

Inclusion/Exclusion Criteria

Criteria for inclusion were peer-reviewed, primary research articles that studied educational methods affecting learning outcomes through synchronous distance simulation activities. There were no limits on study design, language, publication date, or geographical location. Gray literature, books, and studies that did not test distance simulation in a health care education setting were excluded. In addition, asynchronous distance simulation, non–simulation-based education, and studies with no learning outcomes or not based on research (ie, no data measures or analyzed observations) were excluded.

Screening Process

During the screening process, researchers identified a distinct difference between learning environments in which all participants (learners and educators) were at different geographic locations (hereafter, "distance-only" setups) and environments in which some participants shared the same geographic location while others were in separate locations (hereafter, "mixed distance sim" setups). Consequently, an additional screening step was performed to separate articles that used distance-only setups from those that used mixed distance sim setups. This article presents the analysis of the distance-only dataset.

In the first step, titles and abstracts were reviewed against the inclusion and exclusion criteria. Each publication was reviewed by 2 reviewers and was excluded or included only with agreement from both; in the case of disagreement, the final determination was made by consensus during weekly research meetings. The full texts of publications retained after title and abstract review were then assessed against the inclusion and exclusion criteria. Again, 2 reviewers evaluated each publication, with the final determination for inclusion or exclusion made only with consensus between the 2 reviewers; disagreements were resolved by consensus during weekly research meetings.

Data Extraction Process

Before data extraction, the reviewers conducted rater training for the extraction table by discussing as a group what was expected for each extraction item. Items were revised accordingly, with instructions added for items that tended to cause confusion within the group. Interrater reliability testing was then performed: each reviewer independently extracted data from the same article, and the group then met to address differences in item interpretation and to revise item language or instructions for clarity. Only after all reviewers reached consensus were data for each publication extracted using the extraction table (see document, Supplemental Digital Content 2, https://links.lww.com/SIH/A983, which includes links to important extraction data). Key variables included demographics of researchers, faculty, and learners; country of origin for both researchers and participants; study objectives; training faculty received to implement the research or the simulation activity; distance technology used (eg, Zoom, Skype,10,11 Second Life12); modalities (eg, computer-based, standardized patients, clinical immersion) or presentation (eg, type of simulator)13; and study outcomes. Study outcomes were further categorized within the Kirkpatrick framework14 to better understand the patterns in the types of outcomes being evaluated in the distance simulation literature to date.
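To make the extraction and categorization step concrete, the following is a minimal, hypothetical Python sketch of an extraction record and a tally of reported outcomes by Kirkpatrick level; the field names and example values are illustrative assumptions, not the authors' actual extraction table.

```python
# Illustrative sketch only: a minimal extraction record and a tally of reported
# outcomes by Kirkpatrick level. Field names and values are assumptions for
# illustration, not the authors' actual extraction table.
from collections import Counter
from dataclasses import dataclass, field


@dataclass
class ExtractionRecord:
    study_id: str
    country: str
    modality: str                 # eg, "standardized patient", "manikin"
    distance_technology: str      # eg, "Zoom", "Second Life"
    kirkpatrick_levels: list[int] = field(default_factory=list)  # levels 1-4 reported


def tally_kirkpatrick(records: list[ExtractionRecord]) -> dict[int, int]:
    """Count how many studies reported at least one outcome at each Kirkpatrick level."""
    counts = Counter()
    for record in records:
        for level in set(record.kirkpatrick_levels):
            counts[level] += 1
    return dict(counts)


# Example usage with two hypothetical studies:
records = [
    ExtractionRecord("study-01", "US", "standardized patient", "Zoom", [1, 2]),
    ExtractionRecord("study-02", "UK", "manikin", "videoconference", [1]),
]
print(tally_kirkpatrick(records))  # {1: 2, 2: 1}
```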

In addition, as part of the extraction, each included study was assessed for risk of bias using the corresponding Critical Appraisal Skills Programme (CASP) checklist.15 CASP, which is based on the medical literature and endorsed by the Cochrane Qualitative and Implementation Methods Group, was chosen with the guidance of our expert librarians; after reliability testing of 4 different risk-of-bias tools, it was the best fit for our study and produced the highest reliability among our reviewers. Because several included studies used a pre-post design and CASP does not have a corresponding risk-of-bias tool, we also used the National Heart, Lung, and Blood Institute (NHLBI, NIH) assessment tool for pre-post studies with no control group.16 For each study, the tool appropriate to the study design was selected.
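As a concrete illustration of the tool-selection rule described above, the following minimal Python sketch maps a simplified study-design label to a risk-of-bias tool; the design labels and checklist names are assumptions for illustration, not the authors' exact categories.

```python
# Illustrative decision rule only: design labels and checklist names are
# assumptions, not the authors' exact categories. Pre-post studies with no
# control group go to the NHLBI tool; other designs go to a matching CASP checklist.
CASP_CHECKLISTS = {
    "randomized controlled trial": "CASP Randomised Controlled Trial Checklist",
    "cohort": "CASP Cohort Study Checklist",
    "qualitative": "CASP Qualitative Studies Checklist",
}


def select_risk_of_bias_tool(design: str) -> str:
    """Pick a risk-of-bias tool based on a (simplified) study design label."""
    if design == "pre-post, no control group":
        return "NHLBI assessment tool for pre-post studies with no control group"
    return CASP_CHECKLISTS.get(design, "CASP checklist appropriate to the design")


print(select_risk_of_bias_tool("pre-post, no control group"))
print(select_risk_of_bias_tool("randomized controlled trial"))
```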

A single author extracted data and completed the risk-of-bias assessment for each study, with an additional author reviewing the extraction for accuracy and consensus. Conflicts were brought forward to the group in weekly meetings and were resolved via consensus.

RESULTS

The PRISMA flow diagram (Fig. 1) depicts the screening and extraction process. Across 3 searches, 12,554 articles were screened, and a total of 54 studies met the inclusion criteria for the current review.

FIGURE 1: Output of PRISMA flow process.

Study Characteristics

Seventy-five percent of the included articles were published from 2020 to 2022 (n = 46), during the COVID-19 pandemic. Many of these articles captured a period during which education had to be delivered online. Conversions of traditionally in-person simulation activities to what was described as telehealth (n = 4, 7%), telemedicine (n = 5, 9%), telementoring (n = 2, 4%), or telesimulation (n = 5, 9%) made up 25% of those studies. Aside from these terms, other terminology used to describe distance simulation activities included virtual (n = 21, 39%), remote (n = 13, 24%), and online (n = 20, 37%).

Included studies represented 14 different countries. There was a North American predominance in authorship, with 30 studies featuring authors located in the United States17–46 and 5 studies including authors from multiple countries.23,27,47–49 Figure 2 highlights the reported origins of all included studies. One study reported learners in distributed locations but provided only continents, not individual countries.

FIGURE 2: Heat map of geographical origin of included studies.

Most included articles were published in medical or surgical journals (n = 31; 57%), followed by nursing (n = 5; 9%), education or technology (n = 5; 9%), and other journal types (n = 21; 39%). Only 2 were published in a simulation journal.

Theoretical frameworks were infrequently reported and were described in 8 of the 54 studies.21,38,40,46,50–53

Most studies were descriptive in nature. The most common methodologies were pretest/posttest and survey studies (n = 8; 15%), quasi-experimental designs (n = 7; 13%), and randomized controlled trials (n = 6; 11%). The most common topics addressed in the simulations were knowledge or nontechnical skills.

There was limited reporting of faculty simulation or research training (n = 11, 20%). Facilitator training in distance simulation was reported in very few of the included studies.17,30,45 Descriptions of debriefing within the context of these simulation activities were similarly limited. Sixty-one percent of the studies (n = 33) specifically reported that a debrief occurred in the simulation activity; of these, 4 articulated a specific debriefing method.

Outcomes and Objectives

Many studies focused on simulation (ie, research about simulation; n = 34 [63%]), and 16 (30%) studies researched a topic using simulation (ie, research examining behavior or patient care using simulation). The remaining 4 studies included elements of both.28,32,36,46

Reviewers evaluated and categorized the outcomes in each study within the context of the Kirkpatrick framework.14 Forty-three (80%) studies included Kirkpatrick level 1 outcomes, whereas 30 (56%) studies included Kirkpatrick level 2 outcomes. Notably, no studies reported outcomes higher than Kirkpatrick level 2.

Quality and Evidence

Overall, half of the included studies (n = 27/54, 50%) were found to have a low risk of bias, as seen in Appendix B (Table, Supplemental Digital Content 2, https://links.lww.com/SIH/A983). Of the 16 studies evaluated with the NHLBI (NIH) assessment tool for pre-post studies with no control group,18,19,21,22,26,29,33,34,36,38,47,49,53–56 more than half were assessed to be at low risk of bias (n = 9/16, 56%).21,26,29,33,34,36,38,47,54 Twenty-eight studies were evaluated using the appropriate CASP tools.17,20,23,25,27,30–32,35,37,39,40,43,44,46,48,50–52,55,57–64 Eleven of these studies (11/28; 39%) were found to have a low risk of bias.17,23,31,32,37,40,43,46,48,51,55 Of the remaining 10 studies,24,28,41,42,45,65–69 3 had some concerns with regard to their overall quality and risk of bias.28,41,67

DISCUSSION

The last few years have seen a dramatic rise in the use of distance simulation to address different needs for delivering health professions education. From this systematic review of the literature, 3 trends of particular interest emerged.

THERE HAS BEEN A SIGNIFICANT INCREASE IN THE USE OF TECHNOLOGIES NOT ROUTINELY FOUND IN HEALTH CARE SIMULATION

The included studies encompassed a wide array of simulation modalities, representing most of what many would consider "traditional" simulation activities (manikins, standardized patients, task trainers, and others). A broad collection of technologies and platforms was used to adapt and deliver these simulations to learners who were physically distant from one another, from commonly used videoconferencing tools to gaming or virtual reality platforms. Some modalities served a more reactive function, delivering traditional simulation modalities at a distance, such as the use of Zoom10 to connect learners to standardized patients. In other studies, more novel technologies were used, such as multiplayer or immersive screen-based or virtual reality platforms in which learners could engage in their simulation activities. Each of these techniques has its strengths and weaknesses and may be better suited to certain simulation modalities and topics. However, because this research is still in its infancy, simulation researchers and educators have little empirical evidence on which to base decisions about which technology to use. Furthermore, early work has begun to highlight some of the important differences or considerations in distance simulation; it is likely not sufficient to take an in-person experience and deliver it over a distance. For example, Cheng and colleagues provided suggestions on how debriefing techniques may need to be adapted when used in distanced settings.70 More work needs to be done to examine these technologies and best practices for creating effective distance simulation activities to ensure we are choosing the appropriate technology, tools, and structure for the learning objectives of the session.

THERE IS INCONSISTENT REPORTING OF DETAILS OF SIMULATION INTERVENTIONS, MAKING COMPARISONS BETWEEN INTERVENTIONS VERY DIFFICULT

Many studies provided insufficient details about the structure of the simulation, including specifics such as scenario flow; relative location of learners, educators, and equipment; debriefing strategies; and technical details of how distancing was accomplished. This lack of detail makes it difficult to understand precisely how the activity or curriculum was developed and implemented. In addition, many studies either failed to mention or provided very little detail regarding faculty development for educators participating in the simulation, including specific training for distance approaches. This lack of detail has important implications for the replicability of reported work as well as for the ability to fully evaluate the methodological rigor of distance simulation scholarship. The addition of distance simulation-specific considerations to existing simulation reporting guidelines71 may prove helpful in this regard.72

MANY INCLUDED STUDIES REPORTED ON RELATIVELY LOW-LEVEL OUTCOMES (SATISFACTION AND IMMEDIATE LEARNING)

Consistent with the work of our earlier scoping review,5 we found that many studies limited their focus to early-stage outcomes (Kirkpatrick levels 1 and 2). This result is not surprising given the young age of this body of research, consistent with a discipline that is still in its formative years.72 Although such outcomes can capture learner attitudes or satisfaction or demonstrate immediate learning, they limit our ability to assess the potential farther-reaching impacts of distance simulation; importantly, these include whether knowledge and skills obtained during distance simulation are retained or transferred outside the simulation space to the clinical arena, as well as effects on patient-level outcomes.

Limitations

This systematic review has 4 main limitations. First, given the design of our systematic review, we only included studies published through January 2022. An ongoing distance simulation systematic review task force will help incorporate articles published later, allowing us to continue to understand the current state of distance simulation research. Second, given the challenges of publishing novel research in medical education, the published articles may be skewed by publication bias; it could be challenging to publish results showing that a novel approach (distance simulation) is inferior to more traditional approaches (in-person simulation). Third, the dataset was hampered by a lack of detail in the presentation of simulation interventions, which has been reported in other large simulation systematic reviews.73,74 Distance simulation complicates this reporting because multiple components of the simulation experience (instructors, learners, manikin operators, equipment) can all be in different locations; this results in a myriad of possibilities, and describing this level of detail can be complicated. Finally, given that a significant proportion of included studies had some concerns with regard to quality and risk of bias, it is difficult to make broad generalizations about next steps for distance simulation.

Future Directions

As a discipline, distance simulation is still in its infancy, which is reflected by the types of research studies being published. This work was important during the seismic shift in simulation strategies during the COVID-19 pandemic. However, to understand when and how distance simulation is best used, higher quality studies are required, ideally looking at immediate learning, learning retention, and system- and patient-level outcomes. Larger studies with better reporting of study methods and outcomes are needed. A better understanding of how best to deploy distance simulation approaches is also required. During the pandemic, many educators selected technology that was readily available and familiar (ie, videoconferencing systems), not necessarily the best technology for their specific need. Each technology used for distance simulation has its pros and cons, and selecting the appropriate technology for the context, learning objectives, and feasibility is important; however, empirical data to support faculty in making these choices are currently lacking.

In addition to technological considerations, other aspects of simulation education may need to be modified for the distance environment.75 This may include changes to the simulation prebrief to ensure learners understand the technology being used, modifications to debriefing techniques,70 and unique considerations with regard to how assessments are performed. Notably, balancing each of these considerations with the technical facility and expertise needed to support the distance element of these activities may bring with it a significant increase in cognitive load for faculty, and additional faculty may be required—particularly for those newer to distance simulation—to help distribute this load. A better understanding of each of these facets of distance simulation will be key in providing effective faculty development to distance simulation educators.

Distance simulation, although in use in specific contexts for many years preceding the pandemic, saw a sharp increase in uptake when COVID-19 restrictions were implemented; as these restrictions continue to relax, it remains to be seen what role distance simulation will assume in the simulation toolbox. In a previous survey of distance simulation practices, more than 80% of simulationists around the world articulated an intention to continue using distance simulation in some form even as restrictions ease.1 To help inform these decisions for simulationists, several key questions remain to be answered. Are distance simulations effective and equivalent to traditional curricular activities in preparing learners for practice? Can these modalities be used in other contexts to reach learners who might not typically have access to high-quality simulation-based education? Alternatively, there may be specific contexts or use cases where distance simulation is preferable or superior to in-person simulation, and vice versa. For these questions to be answered, high-quality comparative studies will be required. A specific focus on simulation assessment strategies that are inclusive of distance modalities and have reported validity and reliability evidence will help to create a mechanism for meaningful comparisons between different curricula. Cost-benefit analyses are also needed; although the technology associated with distance simulation can be expensive, it may be a much more feasible approach than bringing remote learners to a simulation center.

Overall, our findings align with those of the many other systematic reviews in health care simulation. These include lack of rigor, lack of detail for replication, lack of studies ON the efficacy of various simulation designs and strategies (rather than studies USING simulation as an investigative methodology),76 and the need for robust work leveraging rigorous experimental, mixed methods, multisite, or longitudinal approaches, as well as research funding supporting this work. Perhaps the most impactful takeaway from this study was our identification of areas for future research, which can form a starting point for a research agenda. We highlight these areas in Table 1.

TABLE 1 - Suggested Areas of Future Research in Distance Simulation

General | ON Simulation | USING Simulation
Outcomes-focused research (Kirkpatrick 3 and 4) | Study efficacy and relative efficacy of different modalities | When to use different modalities to achieve research goals
Accrue validity and reliability evidence for assessment tools within the context of distance simulation | Examine validity and reliability arguments for tools evaluating different modalities of distance simulation | Expand validity and reliability arguments for assessment tools in distance simulation evaluating learners and behaviors
More mixed methodology research | Return on investment/expectations (ROI/ROE) | Examine the needs of debriefing in a distance simulation context
Increased research rigor (eg, RCT) | Assess whether distance methodologies have continued in use as pandemic restrictions lighten | Tracking changes in technology related to health care and educational delivery through simulation activities
More longitudinal studies | Focus on faculty development and experience that improves development and consistency of distance simulation experiences |
More multisite studies, especially international and multinational studies | Evaluate which professions beyond learner populations may need to be brought into simulation development and research |
Continue to standardize terminology and reporting guidelines to improve comparison | |

CONCLUSION

In conclusion, the growing use of distance simulation in health professions education has brought about significant advancements and increased flexibility in simulation practices, while also raising important considerations. While highlighting the increasing use of nontraditional technologies in health care simulation, this systematic literature review also revealed continued challenges with inconsistent or incomplete reporting of simulation details and faculty training, as well as a predominance of early-stage outcomes. With a focus on consistent and comprehensive reporting to facilitate replicability, an emphasis on higher-level outcomes and measures of impact, and a clearer understanding of the contexts in which distance simulation best fits within the simulation armamentarium, the field of distance simulation can continue to advance in a rigorous and evidence-based manner, leading to the improvement of health professions education and the delivery of quality care.

ACKNOWLEDGMENTS

The authors would like to thank the SSH Research Summit Leaders, staff, and committees who helped inform the direction of this study.

The authors would like to thank the Healthcare Distance Simulation Collaboration for their thoughts and feedback on our work. The Healthcare Distance Simulation Collaboration members who were part of the SSH Research Summit effort include Diego Olmo-Ferrer, Richard Friedland, Jill Sanko, Susan Eller, Marc Lazarovici, Alex Morton, Amanda Tarbet, Patrick Hughes, Cynthia Mosher, Stephanie Stapleton, Rami Ahmed, Mandy Kirkpatrick, and Jared Kutzin.

REFERENCES

1. Buléon C, Caton J, Park YS, et al. The state of distance healthcare simulation during the COVID-19 pandemic: results of an international survey. Adv Simul 2022;7(1):10.
2. Gross IT, Clapper TC, Ramachandra G, et al. Setting an agenda: results of a consensus process on research directions in distance simulation. Simul Healthc 2023;18(2):100–107.
3. Shea KL, Rovera EJ. Preparing for the COVID-19 pandemic and its impact on a nursing simulation curriculum. J Nurs Educ 2021;60(1):52–55.
4. Rabe A, Sy M, Cheung WYW, Lucero-Prisno DE. COVID-19 and health professions education: a 360° view of the impact of a global health emergency. MedEdPublish 2020;9(148):148.
5. Elkin R; on behalf of the Distance Simulation Scoping Review Core Team. Scoping Review Update. Second Annual Healthcare Distance Simulation Summit, 10/2021 (meeting held virtually on Zoom).
6. Page MJ, et al. The PRISMA 2020 statement: an updated guideline for reporting systematic reviews. BMJ 2021;372:n71.
7. Society for Simulation in Healthcare. 2023 IMSH Research Summit. In: International Meeting on Simulation in Healthcare. Orlando, Florida; 2023.
8. Haddaway NR, Grainger MJ, Gray CT. Citationchaser: an R package and Shiny app for forward and backward citations chasing in academic searching. Zenodo 2022; doi:10.5281/zenodo.
9. Veritas Health Innovation. Covidence Systematic Review Software [computer software]. Melbourne, Australia. Available at: www.covidence.org. Accessed August 4, 2023.
10. Zoom Video Communications Inc. Security guide. Available at: https://d24cgw3uvb9a9h.cloudfront.net/static/81625/doc/Zoom-Security-White-Paper.pdf. Accessed August 4, 2023.
11. Skype Technologies. Skype. Available at: https://www.skype.com. Accessed August 4, 2023.
12. Linden Labs. Second Life (6.6.13.580918) [computer software]. Available at: https://secondlife.com. Accessed August 4, 2023.
13. Chiniara G, Cole G, Brisbin K, et al; Canadian Network For Simulation In Healthcare, Guidelines Working Group. Simulation in healthcare: a taxonomy and a conceptual framework for instructional design and media selection. Med Teach 2013;35(8):e1380–e1395.
14. Kirkpatrick DL, Kirkpatrick JD. Evaluating Training Programs: The Four Levels. Oakland, CA: Berrett-Koehler; 2012.
15. CASP. CASP checklist. Available at: https://casp-uk.net/casp-tools-checklists/. Accessed August 4, 2023.
16. U.S. Department of Health and Human Services. Study Quality Assessment Tools. Bethesda, MD: National Heart, Lung, and Blood Institute. Available at: https://www.nhlbi.nih.gov/health-topics/study-quality-assessment-tools. Accessed August 4, 2023.
17. Afonso N, Kelekar A, Alangaden A. I have a cough: an interactive virtual respiratory case-based module. MedEdPORTAL 2020;16:11058.
18. Andersen D, Popescu V, Cabrera ME, et al. Medical telementoring using an augmented reality transparent display. Surgery 2016;159(6):1646–1653.
19. Arciaga PL, Calmes D, Windokun A, et al. Distance learning during COVID-19 mitigates learning loss for interprofessional education. Simul Healthc 2022;17(1):68–69.
20. Berg BW, Beamis EK, Murray WB, Boedeker BH. Remote videolaryngoscopy skills training for pre-hospital personnel. Stud Health Technol Inform 2009;142:31–33.
21. Cornes S, Gelfand JM, Calton B. Foundational telemedicine workshop for first-year medical students developed during a pandemic. MedEdPORTAL 2021;17:11171.
22. Cowart K, Updike WH. Pharmacy student perception of a remote hypertension and drug information simulation-based learning experience in response to the SARS-CoV-2 pandemic. J Am Coll Clin Pharm 2021;4(1):53–59.
23. Haginoya S, Yamamoto S, Pompedda F, Naka M, Antfolk J, Santtila P. Online simulation training of child sexual abuse interviews with feedback improves interview quality in Japanese university students. Front Psychol 2020;11:998.
24. Langenau E, Kachur E, Horber D. Web-based objective structured clinical examination with remote standardized patients and Skype: resident experience. Patient Educ Couns 2014;96(1):55–62.
25. Luke S, Petitt E, Tombrella J, McGoff E. Virtual evaluation of clinical competence in nurse practitioner students. Med Sci Educ 2021;31(4):1267–1271.
26. Matthiesen MI, Hiserodt J, Naureckas Li C, Frey-Vogel AS, Johnson JH. Going virtual: objective structured teaching exercises as an innovative method for formative resident education. Acad Pediatr 2022;22(1):12–16.
27. McCoy CE, Alrabah R, Weichmann W, et al. Feasibility of telesimulation and Google Glass for mass casualty triage education and training. West J Emerg Med 2019;20(3):512–519.
28. McRoy C, Patel L, Gaddam DS, et al. Radiology education in the time of COVID-19: a novel distance learning workstation experience for residents. Acad Radiol 2020;27(10):1467–1474.
29. Miles MS, Donnellan N. Learning fundamentals of laparoscopic surgery manual skills: an institutional experience with remote coaching and assessment. Military Med 2021;usab170.
30. Montgomery EE, Thomas A, Abulebda K, et al. Development and implementation of a pediatric telesimulation intervention for nurses in community emergency departments. J Emerg Nurs 2021;47(5):818–823.e1.
31. Musa D, Gonzalez L, Penney H, Daher S. Interactive video simulation for remote healthcare learning. Front Surg 2021;8:713119.
32. Newcomb AB, Duval M, Bachman SL, Mohess D, Dort J, Kapadia MR. Building rapport and earning the surgical patient's trust in the era of social distancing: teaching patient-centered communication during video conference encounters to medical students. J Surg Educ 2021;78(1):336–341.
33. Nix K, Liu EL, Oh L, et al. A distance-learning approach to point-of-care ultrasound training (ADAPT): a multi-institutional educational response during the COVID-19 pandemic. Acad Med 2021;96(12):1711–1716.
34. Pang JH, Finlay E, Fortner S, Pickett B, Wang ML. Teaching effective informed consent communication skills in the virtual surgical clerkship. J Am Coll Surg 2021;233(1):64–72.e2.
35. Patel S, Chawla A, Unruh M, et al. A proposed model for a comprehensive virtual subinternship in vascular surgery. J Vasc Surg 2021;74(6):2064–2071.e5.
36. Quaranto BR, Lamb M, Traversone J, et al. Development of an interactive remote basic surgical skills mini-curriculum for medical students during the COVID-19 pandemic. Surg Innov 2021;28(2):220–225.
37. Quinlin L, Clark Graham M, Nikolai C, Teall AM. Development and implementation of an e-visit objective structured clinical examination to evaluate student ability to provide care by telehealth. J Am Assoc Nurse Pract 2020;33(5):359–365.
38. Raynor P, Eisbach S, Murillo C, Polyakova-Norwood V, Baliko B. Building psychiatric advanced practice student nurse competency to conduct comprehensive diagnostic interviews using two types of online simulation methods. J Prof Nurs 2021;37(5):866–874.
39. Reid HW, Branford K, Reynolds T, Baldwin M, Dotters-Katz S. It's getting hot in here: piloting a telemedicine OSCE addressing menopausal concerns for obstetrics and gynecology clerkship students. MedEdPORTAL 2021;17:11146.
40. Sanseau E, Lavoie M, Tay KY, et al. TeleSimBox: a perceived effective alternative for experiential learning for medical student education with social distancing requirements. AEM Educ Train 2021;5(2):e10590.
41. Scoular S, Huntsberry A, Patel T, Wettergreen S, Brunner JM. Transitioning competency-based communication assessments to the online platform: examples and student outcomes. Pharmacy (Basel) 2021;9(1):52.
42. Swerdlow B, Soelberg J, Osborne-Smith L. Synchronous screen-based simulation in anesthesia distance education. Adv Med Educ Pract 2021;12:945–956.
43. Tellez J, Abdelfattah K, Farr D. In-person versus virtual suturing and knot-tying curricula: skills training during the COVID-19 era. Surgery 2021;170(2021):1665–1669.
44. Winkelmann Z, Eberman LE. The confidence and abilities to assess a simulated patient using telemedicine. Athl Train Educ J 2020;15(2):132–147.
45. Yang T, Buck S, Evans L, Auerbach M. A telesimulation elective to provide medical students with pediatric patient care experiences during the COVID pandemic. Pediatr Emerg Care 2021;37(2):119–122.
46. Youngblood P, Harter PM, Srivastava S, Moffett S, Heinrichs WL, Dev P. Design, development, and evaluation of an online virtual emergency department for training trauma teams. Simul Healthc 2008;3(3):146–153.
47. Gupta B, Jain G, Mishra P, Pathak S. Preparedness to combat COVID-19 via structured online training program regarding specific airway management: a prospective observational study. Indian J Anaesth 2020;64(9):796–799.
48. Haginoya S, Yamamoto S, Santtila P. The combination of feedback and modeling in online simulation training of child sexual abuse interviews improves interview quality in clinical psychologists. Child Abuse Negl 2021;115:105013.
49. Morgan G, Melson E, Davitadze M, et al. Utility of Simulation via Instant Messaging—Birmingham Advance (SIMBA) in medical education during COVID-19 pandemic. J R Coll Physicians Edinb 2021;51(2):168–172.
50. Budrionis A, Hasvold P, Hartvigsen G, Bellika JG. Assessing the impact of telestration on surgical telementoring: a randomized controlled trial. J Telemed Telecare 2016;22(1):12–17.
51. Conradi E, Kavia S, Burden D, et al. Virtual patients in a virtual world: training paramedic students for practice. Med Teach 2009;31(8):713–720.
52. Rogers L. Developing simulations in multi-user virtual environments to enhance healthcare education. Br J Educ Technol 2011;42(4):608–615.
53. Trujillo Loli Y, D'Carlo Trejo Huamán M, Campos Medina S. Telementoring of in-home real-time laparoscopy using WhatsApp Messenger: an innovative teaching tool during the COVID-19 pandemic. A cohort study. Ann Med Surg (Lond) 2021;62:481–484.
54. Harendza S, Gärtner J, Zelesniack E, Prediger S. Evaluation of a telemedicine-based training for final-year medical students including simulated patient consultations, documentation, and case presentation. GMS J Med Educ 2020;37(7):Doc94.
55. Harrison N, Sharma S, Heppenstall-Harris G, et al. A virtual hour on-call: creating a novel teaching programme for final-year undergraduates during the COVID-19 pandemic. Future Healthc J 2021;8(Suppl 1):2–2.
56. Melson E, Davitadze M, Aftab M, et al. Simulation via Instant Messaging-Birmingham Advance (SIMBA) model helped improve clinicians' confidence to manage cases in diabetes and endocrinology. BMC Med Educ 2020;20(1):274.
57. Bay U, Maghidman M, Waugh J, Shlonsky A. Guidelines for using simulation for online teaching and learning of clinical social work practice in the time of COVID. Clin Soc Work J 2021;49(2):128–135.
58. De Ponti R, Marazzato J, Maresca AM, Rovera F, Carcano G, Ferrario MM. Pre-graduation medical training including virtual reality during COVID-19 pandemic: a report on students' perception. BMC Med Educ 2020;20(1):332.
59. Hartmann L, Kaden JJ, Strohmer R. Authentic SP-based teaching in spite of COVID-19—is that possible? GMS J Med Educ 2021;38(1):Doc21.
60. Kaliyadan F, ElZorkany K, Al Wadani F. An online dermatology teaching module for undergraduate medical students amidst the COVID-19 pandemic: an experience and suggestions for the future. Indian Dermatol Online J 2020;11(6):944–947.
61. Kasai H, Shikino K, Saito G, et al. Alternative approaches for clinical clerkship during the COVID-19 pandemic: online simulated clinical practice for inpatients and outpatients—a mixed method. BMC Med Educ 2021;21(1):149.
62. Knie K, Schwarz L, Frehle C, Schulte H, Taetz-Harrer A, Kiessling C. To zoom or not to zoom—the training of communicative competencies in times of Covid 19 at Witten/Herdecke University illustrated by the example of "sharing information". GMS J Med Educ 2020;37(7):Doc83.
63. McCallum J, Ness V, Price T. Exploring nursing students' decision-making skills whilst in a Second Life clinical simulation laboratory. Nurse Educ Today 2011;31(7):699–704.
64. Patel M, Hui J, Ho C, Mak CK, Simpson A, Sockalingam S. Tutors' perceptions of the transition to video and simulated patients in pre-clinical psychiatry training. Acad Psychiatry 2021;45(5):593–597.
65. Lenes A, Klasen M, Adelt A, et al. Crisis as a chance. A digital training of social competencies with simulated persons at the Medical Faculty of RWTH Aachen, due to the lack of attendance teaching in the SARS-Cov-2 pandemic. GMS J Med Educ 2020;37(7):Doc82.
66. Nelson DL, Blenkin C. The power of online role-play simulations: technology in nursing education. Int J Nurs Educ Scholarsh 2007;4:Article1.
67. Rauch C, Utz J, Rauch M, Kornhuber J, Spitzer P. E-learning is not inferior to on-site teaching in a psychiatric examination course. Front Psych 2021;12:624005.
68. Takahashi K, Tanaka C, Numaguchi R, et al; Committee of the Japanese Society for Cardiovascular Surgery Under Forty. Remote simulator training of coronary artery bypass grafting during the coronavirus disease 2019 pandemic. JTCVS Open 2021;8:524–533.
69. Venail F, Akkari M, Merklen F, et al. Evaluation of otoscopy simulation as a training tool for real-time remote otoscopy. Int J Audiol 2018;57(3):194–200.
70. Cheng A, Kolbe M, Grant V, et al. A practical guide to virtual debriefings: communities of inquiry perspective. Adv Simul 2020;5:18.
71. Cheng A, Kessler D, Mackinnon R, et al. Reporting guidelines for health care simulation research: extensions to the CONSORT and STROBE statements. Adv Simul 2016;1:25.
72. Duff J, Kardong-Edgren S, Chang TP, et al. Closing the gap: a call for a common blueprint for remote distance telesimulation. BMJ Simul Technol Enhanc Learn 2021;7(4):185–187.
73. Cook DA, Hatala R, Brydges R, et al. Technology-enhanced simulation for health professions education: a systematic review and meta-analysis. JAMA 2011;306(9):978–988.
74. Cheng A, Eppich W, Grant V, Sherbino J, Zendejas B, Cook DA. Debriefing for technology-enhanced simulation: a systematic review and meta-analysis. Med Educ 2014;48(7):657–666.
75. Bajwa M, Ahmed
