Seeing Yourself Through the Learner's Eyes: Incorporating Smart Glasses Into Objective Structured Teaching Exercises for Faculty Development

The Liaison Committee on Medical Education (LCME) and the Accreditation Council for Graduate Medical Education (ACGME) require institutions to provide effective faculty development opportunities. Many medical education organizations recognize faculty development as an essential framework that helps faculty members respond to the challenges of their multiple roles and evolving responsibilities.1 Faculty development initiatives can improve knowledge, behaviors, and skills and may ignite faculty members' motivation to change and embrace effective learning strategies.2

Physicians, whether or not they have an academic role, teach every day.3 Although physicians gain the clinical knowledge needed to become board certified in their specialty, this does not necessarily mean they receive adequate instruction in how to teach learners effectively in an academic environment. In particular, skills related to teaching, providing learner feedback, and communication often receive little attention.4 One way to assist faculty in these domains is the Objective Structured Teaching Exercise (OSTE).

An OSTE, initially described by Simpson in 1992, uses a standardized encounter to assess performance and enhance teaching skills.5 The OSTE consists of a simulated teaching scenario involving a standardized learner with immediate objective feedback given to the teacher based on a predetermined behavior-based scale or checklist to assess teaching performance.6 To properly give feedback to the participant, their performance is either observed directly by the standardized learner or recorded for video-based assessment.

Video-based assessment rests on the premise that reviewing and analyzing one's own performance helps people grow and improve.7 However, traditional mounted wall cameras (MWCs), used to record video for faculty and student feedback and evaluation, provide a limited view of key nonverbal communication behaviors during clinical encounters. In addition, the utility of MWC recordings depends heavily on the technical quality of the recording systems and the proximity of the device to the student being observed.8 One way to harness technology for indirect recording is smart glasses (SG). SG are a fairly new technology, with growing interest among health care professionals in medical education.9 They are worn as glasses and contain a computerized communicator with a transparent screen and a video camera. Compared with MWCs, SG uniquely provide video-based assessment from a first-person perspective. Although SG seem to be finding more use in medical education, very few studies have examined their utility, and we found none on their use in faculty development efforts. This gap in the continuing medical education literature was the nidus for our investigation into applying SG within an OSTE for faculty development related to delivering feedback. The theoretical framework for this approach is anchored in Kolb's experiential learning theory, which comprises four tenets: concrete learning, reflective observation, abstract conceptualization, and active experimentation. We used Kolb's theory to explore our primary objective and to examine the experience of faculty using SG within an OSTE as a tool for faculty feedback and development.

METHODOLOGY

Design

This is a descriptive study of physician feedback skills. Physicians at the assistant professor level were offered the opportunity to participate in a 6-session continuing medical education–bearing certificate course titled “Developing Clinical Educators in Academic Medicine.” The theoretical framework for this faculty development experience was Kolb's experiential learning theory, and we drew on his four stages of the learning cycle for the course. The physicians who chose to participate were enrolled as research participants.

As part of this course, course instructors designed an OSTE as a concrete experience to give participants feedback on their approach to providing medical students feedback during a clinical scenario involving a patient with abdominal pain. During the OSTE, participants were recorded simultaneously by MWCs and SG, and immediately afterward they received feedback on their performance. After this concrete experience, participants engaged in reflective observation as they individually reviewed the recorded content from the MWC and the SG to identify areas for improvement. Through abstract conceptualization, participants could alter their preconceived perception of how they deliver feedback by comparing the information obtained from the two recordings. Participants were then empowered to engage in Kolb's active experimentation, testing the new ideas and lessons gathered from the concrete experience in future opportunities to deliver feedback to learners. This study was reviewed by our institutional review board at Loyola's Stritch School of Medicine and was judged exempt. The authors did not receive grants or other financial support for this research.

Setting and Population

The experience occurred in the virtual hospital in the Center for Simulation Education at Loyola University Chicago, Stritch School of Medicine. Seventeen assistant professors across 12 different medical specialties participated in the experience.

Educational Instruction

Course instructors acted as standardized students (SSs) in two prerecorded clinical scenarios. Participants watched the video of an SS obtaining a history and physical from a patient experiencing abdominal pain and were then given the opportunity to provide the SS with in-person feedback on their performance. This interaction was recorded with MWCs and with SG worn by the SSs. The SG were battery-powered Hereta Spy Camera Glasses with 1080p resolution; the MWC was a Hikvision DS-2DE4425IW. Immediately after the interaction between the participant and the SS, the participant received formative feedback from the SS and from peer participant-observers using a feedback checklist. The peer participant-observers were fellow participants who had either already completed the OSTE that day or who would complete it later on a different clinical scenario. Participants were later sent their recorded MWC and SG videos for review and were asked to complete a self-reflection assignment and a survey.

Feedback Checklist Development

The OSTE assessment tool was created by 10 medical education experts using a modified Delphi technique and was used to frame the formative feedback for the participants.10 Miller et al11 describe the modified Delphi method as an iterative process designed to establish expert consensus on a specific question or set of criteria by systematically collecting information. We used a method similar to that of Miller et al,11 consisting of three rounds of surveys assessing central tendency and dispersion until full consensus was reached.

The first round generated an initial list of important skills related to delivering feedback. To compile the list, we spoke with 10 faculty members with extensive expertise in medical education and feedback, whose clinical expertise spanned emergency medicine (2), family medicine (1), medicine and pediatrics (1), neurology (2), obstetrics and gynecology (3), and oral surgery (1). We constructed and distributed the list of possible criteria to this expert panel. Item ratings were based on a four-point scale: one, very important; two, important; three, not important; and four, eliminate. We also asked panelists to suggest different wording, note redundancies, or propose additional items for the instrument. Measures of central tendency and dispersion were used to analyze the data collected from the first survey round; calculating these measures allowed us to determine the level of group consensus for including or excluding each criterion.

The mean value of 2.5 is the midpoint of our four-point scale and was chosen as the numerical indicator of group consensus; criteria with mean values less than 2.5 were included. SD was used to measure the dispersion of responses for each criterion and provided further evidence of group consensus: the smaller the SD, the greater the consensus. Criteria with an SD of less than one were included. To qualify for consensus, an item had to meet both the mean criterion (<2.5) and the SD criterion (<1), as sketched below. The first round produced a list of 15 criteria; a second round narrowed the list to 10; and a third and final round yielded eight checklist items, which the group agreed on as the final list. Raters were not given feedback on the means and SDs between rounds; they were simply provided the updated checklist items after each round. The group also agreed on a rating scale to assess faculty performance on the checklist: one, needs improvement; two, satisfactory; and three, excellent, with an additional option to add comments. The final tool, along with average scores, is shown in Table 1.
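For illustration, here is a minimal Python sketch of this inclusion rule, assuming each criterion's panel ratings are stored as a list of integers on the four-point scale. The item names and ratings are hypothetical, not our panel's data.

```python
# Apply the Delphi inclusion rule: retain a criterion only if the panel's
# mean rating is below 2.5 AND the SD of ratings is below 1.
# Scale: 1 = very important, 2 = important, 3 = not important, 4 = eliminate.
from statistics import mean, stdev

# Hypothetical ratings from a panel of 10 experts for two candidate items.
ratings = {
    "Asks the learner for self-reflection": [1, 1, 2, 1, 2, 1, 1, 2, 1, 1],
    "Comments on the learner's attire":     [3, 4, 2, 3, 4, 3, 3, 4, 3, 3],
}

def reaches_consensus(scores, mean_cutoff=2.5, sd_cutoff=1.0):
    """True if the item meets both the mean and SD consensus criteria."""
    return mean(scores) < mean_cutoff and stdev(scores) < sd_cutoff

# Criteria carried forward into the next Delphi round.
retained = [item for item, scores in ratings.items() if reaches_consensus(scores)]
print(retained)  # -> ['Asks the learner for self-reflection']
```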

TABLE 1. Assessment Feedback Checklist

Checklist Item | Average Score (1 = Needs Improvement to 3 = Excellent)
1. Ask the learner for self-reflection on the situation you are discussing. | 2.66
2. Identify areas of strength. Engage the learner by using specific examples for positive feedback. | 2.65
3. Identify opportunities for improvement. Engage the learner by using specific examples for constructive feedback. | 2.43
4. Only give feedback based on what you observe (not based on perception or assumptions). | 2.61
5. Show appropriate level of engagement and learner support through voice tone/volume, body language, and eye contact. | 2.83
6. Use a good balance of listening and talking by listening more than talking. | 2.54
7. Synthesize and summarize at the end (can be provided for the learner or asked of the learner). | 2.54
8. Please rate the overall performance. | 2.67

Statistical Analysis

Instructors and peer participant-observers used the eight-item feedback assessment checklist to frame their formative feedback. Each item was scored on a Likert scale ranging from 1 (needs improvement) to 3 (excellent), and mean scores on individual items were calculated to identify patterns for curriculum quality improvement. The postsession survey consisted of 17 questions on a Likert scale ranging from 1 (strongly disagree) to 5 (strongly agree); we paralleled the 5-point Likert scale that Lee et al12 used to assess attitudes toward SG. A percentage response distribution was calculated for each question. The survey also included six dichotomous questions, which were reported by percentage response.
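As a concrete illustration, here is a minimal Python sketch of these two tabulations: a mean item score for the 3-point checklist and a percentage response distribution for a 5-point Likert item. All responses shown are hypothetical, not our study data.

```python
# Tabulate (1) the mean score for one checklist item and (2) the percentage
# response distribution for one 5-point Likert survey question.
from collections import Counter
from statistics import mean

# Hypothetical checklist scores from 14 raters (1 = needs improvement ... 3 = excellent).
checklist_item_scores = [3, 3, 2, 3, 2, 3, 3, 2, 3, 3, 2, 3, 3, 3]
print(f"Mean item score: {mean(checklist_item_scores):.2f}")

# Hypothetical Likert responses from 14 participants (1 = strongly disagree ... 5 = strongly agree).
likert_responses = [5, 4, 4, 5, 4, 4, 5, 4, 3, 4, 5, 4, 4, 5]
counts = Counter(likert_responses)
for level in range(5, 0, -1):  # report from 5 (strongly agree) down to 1
    pct = 100 * counts.get(level, 0) / len(likert_responses)
    print(f"{level}: {pct:.0f}%")
```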

For the qualitative analysis, the research team used a conventional content analysis approach because the data consisted primarily of surveys and a self-reflection assignment.13 The analysis began with the research team reading all the data to obtain a sense of the whole. The data were then read word by word to derive codes, first by highlighting the exact words from the text that appeared to capture key thoughts or concepts. Labels for codes emerged that reflected repeated concepts. Once the initial coding scheme was complete, codes were sorted into categories, and the emergent categories were used to organize and group the data into thematic categories.
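As a toy illustration of the coding-to-category step described above, the sketch below groups hypothetical codes into categories; in practice, conventional content analysis is performed by human coders, and these code labels and category assignments are invented for illustration only.

```python
# Group highlighted codes into emergent categories (hypothetical data).
from collections import defaultdict

# Codes derived from the text, each assigned to an emergent category.
code_to_category = {
    "saw myself as students see me":  "first-person perspective",
    "noticed my posture":             "body language",
    "noticed my eye contact":         "body language",
    "will change how I give praise":  "improving feedback delivery",
}

# Sort codes under their categories to build the thematic grouping.
categories = defaultdict(list)
for code, category in code_to_category.items():
    categories[category].append(code)

for category, codes in categories.items():
    print(f"{category}: {codes}")
```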

RESULTS

Seventeen physician participants completed the OSTE. Fourteen had both MWC and SG recordings and completed both the survey and the reflection; for the other three participants, SG user error caused technical difficulties, and some of their clinical scenario encounters were not recorded. All participants received MWC recordings. Surveys were received and analyzed from the 14 participants who had both MWC and SG recordings. Table 1 presents the breakdown of all checklist results.

All physician participants strongly agreed or agreed that they were comfortable with the SS wearing SG during the encounter, that the SG did not affect their ability to communicate effectively with the SS actor, and that the SG did not distract them during the encounter. Seventy-seven percent of participants believed the feedback received from viewing the SG recording was helpful, and 85% felt that the SG recording provided additional feedback not available with the MWCs (Table 2).

TABLE 2. Smart Glasses Experience Feedback Survey
Please provide feedback on the use of the smart glasses during the exercise (5 = strongly agree, 4 = agree, 3 = undecided, 2 = disagree, 1 = strongly disagree).

Item | 5 | 4 | 3 | 2 | 1
I was comfortable with the standardized student actor wearing smart glasses during the encounter. | 43% | 57% | 0% | 0% | 0%
The smart glasses did not affect my ability to communicate effectively with the standardized student actor during the encounter. | 50% | 50% | 0% | 0% | 0%
The smart glasses did not distract me during the encounter. | 36% | 64% | 0% | 0% | 0%
Knowing the smart glasses were recording did not affect my performance during the encounter. | 29% | 64% | 7% | 0% | 0%
The feedback I received from viewing the smart glasses was helpful. | 45% | 31% | 16% | 8% | 0%
I feel the smart glasses recording of me allowed an opportunity for additional feedback that did not exist. | 39% | 46% | 0% | 15% | 0%

Participants were also asked for their opinion on using SG in education: 64% looked forward to more opportunities to use SG, 86% saw value in using SG for medical education and for faculty development, and 79% felt that periodically using SG in their teaching efforts would help improve the quality of their teaching (Table 3).

TABLE 3. Smart Glasses Use in Education Survey
Please let us know your opinion of using smart glasses in education (5 = strongly agree, 4 = agree, 3 = no opinion, 2 = disagree, 1 = strongly disagree).

Item | 5 | 4 | 3 | 2 | 1
I look forward to more opportunities to use smart glasses. | 36% | 29% | 28% | 7% | 0%
I see the value of using smart glasses for medical education. | 29% | 57% | 7% | 7% | 0%
I see the value of using smart glasses for faculty development. | 36% | 50% | 7% | 7% | 0%
In my personal development, I feel that using smart glasses periodically in my teaching efforts would help improve the quality of my teaching. | 29% | 50% | 14% | 7% | 0%

Participants were also asked to compare the MWC recording with the SG recording to determine whether viewing the SG recording made additional feedback possible. Seventy-nine percent of physician participants believed the SG recording provided additional feedback on eye contact, 64% on body language, and 57% on voice inflection and tone (Figure 1).

FIGURE 1. Smart glasses and MWC recording comparison: additional feedback provided by SG recordings in comparison with MWC recordings. MWC, mounted wall camera; SG, smart glasses.

Our qualitative analysis yielded three themes: first-person perspective, body language, and opportunities to improve the delivery of feedback. Two exemplary quotes include “The standardized student encounter video, in particular, was a window into how students may see me and how I may improve my delivery and rapport” and “The standardized student feedback session was useful for me to step back to see how students might feel when they are observed in those situations.”

DISCUSSION

Our study demonstrated that using SG as part of an OSTE for faculty development provides helpful feedback, uniquely related to body language, eye contact, and voice tone, without causing a distraction in comparison with MWCs. Using Kolb's experiential learning theory as our theoretical framework, we deliberately drew on its four stages for our faculty development experience regarding feedback: concrete learning, reflective observation, abstract conceptualization, and active experimentation.

We chose to use an OSTE as the concrete learning experience. According to Boillat et al, OSTEs can assist in clarifying the goals that a faculty member needs to focus on in their development.6 Our OSTE experience was recorded through SG and MWCs, which allowed for reflective observation. Not only did participants receive immediate formative feedback to reflect on, but they also later watched their own performance, which provided another layer of reflective observation. These modalities facilitated abstract conceptualization, allowing participants to form new ideas and adjust their thinking based on the experience and reflection. The experience culminated in active experimentation, in which participants were charged with applying the lessons learned from the OSTE to subsequent interactions with learners.

SG allow for recording directly from the perspective of the person wearing them, as opposed to the “bird's-eye” perspective of an MWC. This first-person perspective has been helpful in complex care environments and in surgery, where the line of sight may be limited; the SG recording allows multiple people to share the same perspective.14 In addition, the first-person perspective through SG has been shown to provide medical students a novel vantage point, in comparison with traditional MWCs, for feedback on key nonverbal communication behaviors during clinical encounters.15 In a study by Son et al,15 SG used by residents in an outpatient clinical setting improved patient satisfaction scores on physician communication-related questions. Logically, one could extrapolate these data to other scenarios, including faculty development, but the details of how faculty would respond to SG use had not been published before our study.

Although our findings suggest a preference for using SG in faculty development, there have been reports that integrating SG may not be a positive experience for all learners. For example, Tully et al8 used Google Glass to record 30 student/SP encounters: fewer than 23% of students reported a positive and nondistracting experience, whereas fewer than 17% reported a positive but initially distracting experience. According to follow-up survey responses, 16 of 23 students (70%) found Google Glass “worth including in the clinical skills program,” whereas 7 (30%) did not. That study, however, assessed opinions of SG alone, whereas our study compared SG with the more standard approach of MWCs; comparing two options could bias people toward more positive responses and better impressions than independently evaluating a single option. Our results differed significantly in that the majority found the glasses nondistracting and reported a positive experience. This difference could be related to the type of glasses used, but it could also reflect our participant population: our participants chose to complete a faculty development course knowing they would receive instruction and feedback on teaching skills such as delivering feedback, although they did not know the details of the instructional methods or session goals. Employed faculty may also be biased toward giving agreeable marks to workplace experiences.

Faculty are charged with pursuing ongoing professional development, as required by the LCME, ACGME, and American Board of Medical Specialties (ABMS). Beyond these requirements, effective faculty development not only empowers and improves the physician but also improves their training of students and residents. We demonstrated that giving faculty the opportunity to obtain a first-person perspective with SG provides unique feedback related to body language, eye contact, and voice tone without being a distraction.

Creating this experience allowed faculty to pause from their ongoing clinical teaching. According to Di Stefano et al, once an individual has accumulated a certain amount of experience with a task, the benefit of accumulating additional experience is inferior to the benefit of articulating and codifying previous experiences.16 We subscribe to this theory and designed the course to provide that pause for our faculty. Our OSTE successfully provided a unique opportunity for faculty to participate in a simulated teaching experience while using SG, which allowed for peer and personal feedback and reflection.

CONCLUSION

Although SG are currently being used in medical education, specifically in undergraduate and graduate medical education, there is no literature investigating their use in faculty development. SG can be used for faculty development on delivering feedback by enhancing the clinical educator OSTE experience without causing a distraction. SG provide unique feedback, captured through body language, eye contact, content, and voice tone, that would not have been available using the “bird's-eye” MWC. Faculty see value in taking time out of their busy schedules to pause and reflect on feedback provided through the use of SG.

Lessons for Practice

■ Using smart glasses for faculty development provides helpful and unique feedback to participants related to body language, eye contact, and voice tone without causing a distraction.
■ Objective Structured Teaching Exercise recordings support faculty development by facilitating review of, and reflection on, faculty members' ability to provide feedback to learners.
■ Faculty acceptance of both smart glasses and Objective Structured Teaching Exercises supports their continued use and implementation for faculty teaching development.

REFERENCES

1. Leslie K, Baker L, Egan-Lee E, et al. Advancing faculty development in medical education: a systematic review. Acad Med. 2013;88:1038–1045.
2. Lee SS, Dong C, Yeo SP, et al. Impact of faculty development programs for positive behavioural changes among teachers: a case study. Korean J Med Educ. 2018;30:11–22.
3. Stull MJ, Duvivier RJ. Teaching physicians to teach: the underappreciated path to improving patient outcomes. Acad Med. 2017;92:432–433.
4. Lang F, Everett K, McGowen R, et al. Faculty development in communication skills instruction: insights from a longitudinal program with “real-time feedback”. Acad Med. 2000;75:1222–1228.
5. Simpson DE, Lawrence SL, Krogull SR. Using standardized ambulatory teaching situations for faculty development. Teach Learn Med. 1992;4:58–61.
6. Boillat M, Bethune C, Ohle E, et al. Twelve tips for using the objective structured teaching exercise for faculty development. Med Teach. 2012;34:269–273.
7. Vaughn CJ, Kim E, O'Sullivan P, et al. Peer video review and feedback improve performance in basic surgical skills. Am J Surg. 2016;211:355–360.
8. Tully J, Dameff C, Kaib S, et al. Recording medical students' encounters with standardized patients using Google Glass: providing end-of-life clinical education. Acad Med. 2015;90:314–316.
9. Klein G, Singh K, Heideken J. Smart glasses—a new tool in medicine. Stud Health Technol Inform. 2015;216:901.
10. Hasson F, Keeney S, McKenna H. Research guidelines for the Delphi survey technique. J Adv Nurs. 2000;32:1008–1015.
11. Miller KA, Collada B, Tolliver D, et al. Using the modified Delphi method to develop a tool to assess pediatric residents supervising on inpatient rounds. Acad Pediatr. 2020;20:89–96.
12. Lee Y, Kim SK, Yoon H, Choi J, et al. Integration of extended reality and a high-fidelity simulator in team-based simulations for emergency scenarios. Electronics. 2021;10:2170.
13. Hsieh HF, Shannon SE. Three approaches to qualitative content analysis. Qual Health Res. 2005;15:1277–1288.
14. Romare C, Skär L. Smart glasses for caring situations in complex care environments: scoping review. JMIR Mhealth Uhealth. 2020;8:e16055.
15. Son E, Halbert A, Abreu S, et al. Role of Google Glass in improving patient satisfaction for otolaryngology residents: a pilot study. Clin Otolaryngol. 2017;42:433–438.
16. Di Stefano G, Gino F, Pisano GP, et al. Making experience count: the role of reflection in individual learning. Harvard Business School Working Paper No. 14-093; March 2014, revised June 2016.
