Frameworks for Integrating Learning Analytics With the Electronic Health Record

BACKGROUND

There exists a considerable divide between the information systems used in health professions education (HPE) and those used for clinical care. Educational programs in the health professions, often affiliated with universities or colleges, typically have a dedicated learning management system (LMS) organized around classes with online lecture schedules, slide presentations, bulletin boards, quizzes, and examinations.1 Clinical systems, however, have grown out of administrative and laboratory systems, with a necessary focus on representing the current state of a patient accurately. As the knowledge and processes in both domains are increasingly digitalized, the divide that arose from the separate origins of the respective information systems risks becoming entrenched.

New ways of capturing the educational data contained in information systems are emerging. This evolution is reflected in the term Learning Analytics, defined as “the interpretation of a wide range of data produced by, and gathered on behalf of learners to assess progress, predict future performance and spot potential issues (p.13).”2 Although this process has always been a part of education, the new term captures what has changed in the data landscape with digitalization—the integration of computer science, statistics, and learning science to enable improved use of data, algorithms, and learning techniques.3 (See Figure 1). Computer science has advanced in its technical materials (improved information infrastructure) and its processes (improved algorithms for deep learning and artificial intelligence). Mathematical and statistical sciences have improved in quantitative approaches to modeling large heterogeneous datasets.4 The learning sciences have developed an improved understanding of how to use data to organize learning (eg, spaced repetition, adaptive learning) and provide individual feedback.5,6

FIGURE 1.:

Learning analytics conceptual model. Learning analytics represent the intersection of a number of contributory disciplines, operating within clinical practice. The authors argue that where clinical practice and learning analytics intersect, the electronic health record will be disproportionately important.

The same advances in analytic capacity are at play in clinical care, with clinical work increasingly facilitated by advanced digital environments with increased capacity to aggregate and analyze clinical data.7 The problem, from a learning organization standpoint, is that the development of these two new branches of analytics has happened in separate digital silos, with one oriented to learning and the other to clinical care.8,9

In this perspective, we advocate for the enhancement of health information systems, including electronic health records (EHRs), so that they intentionally facilitate learning. We describe well-regarded frameworks for learning that can point toward how health care information systems can best evolve. Our motivation in doing this is to highlight the considerable number of missed opportunities for learning that arise from separated conceptualizations of individual learning, organizational mission, and information infrastructure. We contend that organizations need to engage experts in HPE (especially continuing professional development) with the clinicians, informaticians, and quality/safety personnel charged with the design and use of EHRs.

IMPROVED MODELS FOR HEALTH PROFESSIONS LEARNING

The importance of the learning sciences in the health professions continues to grow. Advances in outcomes-based education,10 simulation,11 adaptive learning,2 interdisciplinary education,12 and health systems science curricula13 are examples that have the potential to improve how we develop clinicians and improve their care of patients.

Educators have long anticipated the day when patient outcomes would meaningfully be used to inform the design and delivery of HPE interventions. Several factors are bringing this closer to reality. Medical education models are increasingly aligned with those guiding clinical practice, following the lead of nursing education.14,15 At the same time, health care organizations are transitioning to become learning organizations.13,16,17 HPE assessment frameworks, such as Competency Milestones or Entrustable Professional Activities, are oriented to assessing the actual clinical activities of each profession, and less toward educational content measures such as examination scores.10,18,19 Although the assessment frameworks differ in their specifics, each calls for more workplace observation and tighter linkage to patient care delivery and outcomes.10,15,20

Consider the example of acute cerebrovascular stroke care. Since the advent of thrombolytic drug therapy, there has been an imperative to deliver the drug as quickly as possible, since the earlier it is given, the more cerebral function can be preserved.21 For emergency departments, “door-to-needle time,” which reflects timely clinical and radiologic diagnosis and an efficient workflow, has become an organizing metric with national standards.21,22 For the training and ongoing assessment of a neurologist, a key competency milestone states: “…manages common cerebrovascular disorders including appropriate use of thrombolytics.”23 In modern information systems, the door-to-needle time is tracked per patient down to the minute, with key decision-making transparently attributed to each individual clinician.22 The contribution of the trained clinician, measured in this information-rich context, can be assessed and integrated in a fashion that properly takes into account speed, accuracy, cost, and impact. This immediate availability of dense information is in sharp contrast to a time when incremental improvement depended on onerous and slow paper chart reviews.24
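The per-clinician tracking described above can be made concrete with a small sketch: given timestamped EHR events, door-to-needle summaries attributed to each clinician fall out of a simple aggregation. The field names and values below are invented for illustration and do not reflect any particular EHR schema.

```python
from datetime import datetime
from statistics import median

# Hypothetical extract of EHR timestamp events for thrombolysis cases, with the
# deciding clinician attributed. Fields and values are illustrative only.
cases = [
    {"clinician": "A", "door": datetime(2023, 5, 1, 10, 0), "needle": datetime(2023, 5, 1, 10, 52)},
    {"clinician": "A", "door": datetime(2023, 5, 8, 14, 30), "needle": datetime(2023, 5, 8, 15, 5)},
    {"clinician": "B", "door": datetime(2023, 5, 3, 9, 15), "needle": datetime(2023, 5, 3, 10, 40)},
]

def door_to_needle_minutes(case):
    """Minutes from emergency department arrival to drug administration."""
    return (case["needle"] - case["door"]).total_seconds() / 60

def per_clinician_medians(cases):
    """Aggregate door-to-needle times into a per-clinician median."""
    by_clinician = {}
    for case in cases:
        by_clinician.setdefault(case["clinician"], []).append(door_to_needle_minutes(case))
    return {name: median(times) for name, times in by_clinician.items()}

print(per_clinician_medians(cases))  # {'A': 43.5, 'B': 85.0}
```

In a real system, these summaries would be compared against the national benchmark and fed back to the clinician rather than printed.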

In the following sections, we first describe three learning frameworks that can guide organizational learning and then discuss how the frameworks can inform beneficial changes to EHR systems.

LEARNING FRAMEWORKS—MULTIPLE COMPLEMENTARY PERSPECTIVES

In Figure 2, we have listed three learning frameworks applicable to modern health care organizations. These were intentionally chosen for their relevance to different perspectives on learning in health care. The Plan-Do-Study-Act (PDSA) framework seeks to improve patient safety and support quality improvement.25,26 The Master Adaptive Learner (MAL) framework is loosely based on the PDSA framework but with a different focus: developing an individual clinician's identity as a learner.6 The third framework, Senge's9 Five Disciplines, also promotes a learning perspective, but operates at the organization level. These evidence-based structured frameworks, with their complementary perspectives on learning, offer a more integrated vision of how the data infrastructure of a health care organization can be organized to achieve the shared goal of delivering high-quality evidence-based health care.

FIGURE 2.: Learning frameworks. Three common conceptualizations of learning in the health professions. A, Master Adaptive Learner. B, Plan-Do-Study-Act. C, Senge learning organization. Each operates at a different level, from individual through to organization. Although the frameworks differ in details and intentions, we point out the commonalities. Each is concerned with learning. Each proposes an initial reflection, a subsequent phase of learning, and then monitoring of the success of that learning and its alignment with a larger goal. Taken together, the frameworks define the landscape across which some aspects of cognition must be distributed, with the electronic health record being an important mediator. The frameworks are described in greater detail in the Supplemental Digital Content (see Appendix, https://links.lww.com/JCEHP/A164).

The Plan-Do-Study-Act process, from the domain of Quality Improvement and Patient Safety, promotes improvement of health care quality through a rigorous process that involves careful data collection and analysis.26,27 The target for learning is the clinical system. Planning involves formulating a theory of how a process may be improved and includes specifying a goal. In the Do phase, the planned change is implemented in an intentional fashion that allows Study, in which the success of the change is evaluated. Refinements are made until the organization is ready to Act, that is, implement the change more widely.

The MAL framework is intentionally similar to the PDSA framework, although oriented to clinician development.6 It emphasizes adaptive rather than routine expertise, such that the practitioners are well-equipped to deal with patients and situations where the problems are complex, requiring individualized, in-the-moment solutions.6,28,29 In the MAL Planning phase, the practitioner formulates a theory as to how they may learn (improve), including recognizing gaps and setting goals. In the Learning phase they engage in learning in an evidence-based fashion, iteratively Assessing to determine whether their learning has been successful. The final phase of the MAL learning process is termed “Adjusting” where the practitioner, having successfully learned as an individual, now advocates for appropriate uptake by the health care system of that learning, whether through interdisciplinary quality initiatives, care pathways, checklists, or other means. This makes the role of change agent an explicit requirement of the clinician.30

The third framework, from the business literature, is Senge's five disciplines of a learning organization. It also makes clear the interplay between the mental models of the individual and the values and goals of an organization.9 A shared vision makes the goal of the organization clear to all concerned. The processes to accomplish the vision lean on system thinking and team learning so as to take into account the interconnectedness of the many stakeholders and processes involved. As in the MAL framework, personal mastery is an important orientation of the individual, even within the organizational context. Shared mental models bridge the space between individuals and the organization, allowing the ongoing updating of both.

Key to all three of these representative conceptual models is the intentional way cognition (and learning) is distributed among and between individuals, teams, and organizational structures to accomplish the cognitive and procedural work of health care.

A LEARNING FRAMEWORK PERSPECTIVE ON THE EHR

The core digital infrastructure of a health care organization is the EHR. Practitioners and staff interact with it throughout the day to deliver care. EHR process analysis is optimally guided by Quality Improvement and Patient Safety (clinical analytics).31 However, the potential of the EHR for promoting learning and lasting behavioral change in the workplace is currently underexploited. For example, influential descriptions of the EHR by the Institute of Medicine, World Health Organization, and Centers for Medicare & Medicaid Services make minimal mention of education or learning in their descriptions of the core functions of the EHR, usually referring only to the training necessary to use an EHR and not to the opportunities for learning by practitioners (Table 1).32,33

TABLE 1. - Core Functions of the Electronic Health Record

Health information and data
Result management
Order management
Decision support
Electronic communication and connectivity
Patient support
Administrative processes and reporting
Reporting and population health
Coordination of care through patient engagement
Health information exchange
Public health reporting
Collecting and reporting on quality of care measures
…[Missing] education and training of clinicians

The authors make the point that education and training opportunities are not listed amongst the core functions of an EHR.

Source: 2003 IOM report: key capabilities of an electronic health record system.

This represents a missed opportunity. The learning frameworks we have described point to how an EHR can be engineered to enhance learning throughout an organization. For example, the process of knowledge gap identification, where a clinician learns that there is a difference between their current performance and an external benchmark, is an important entry point for the MAL and PDSA models. Yet, it is known that clinicians have difficulty identifying their own weaknesses.34 Alerting functions of EHRs have considerable potential not only for clinical decision support, but also for identifying opportunities to induce lasting behavioral change on the part of the clinician.6,35 Coupled with this, EHRs can better support HPE if they capture data aligned with the requirements of regulatory bodies.36 However, current implementations have not realized this potential for informed self-assessment.37,38

The learning or doing phase of the MAL and PDSA cycles, respectively, could also be integrated with the EHR to take advantage of the individual clinician's data. Unfortunately, in many academic health centers, learning information system infrastructure is housed at a university, where instruction is delivered in a course framework that centers on scheduled classroom instruction based on a predetermined curriculum. These systems perform poorly in a clinical environment where the goal is different: to respond in a timely fashion to individual clinician gaps and opportunities detected during clinical care. The result is a mismatch between the built educational information systems and the learning needs of the practitioner in the workplace. In clinical areas, the existing educational activities (eg, grand rounds, bedside rounds, case conferences) are based on educational infrastructure that often overlooks constructivist adult learning principles.39 Even outside of university-affiliated health settings, the educational imperative tends to reside separately from the EHR, being housed within a separate LMS whose learning analytic metrics are poorly connected to clinical-practice data.

Drawing from the clinical example above, multiple-choice question responses on stroke management in an LMS are not typically connected to practice analytics metrics such as individual door-to-needle times for stroke thrombolysis.24 The EHR could provide valuable opportunities for learning and practice at the point-of-care through individualized clinician metrics.40,41 However, EHR data are not leveraged in this way. One large structural impediment is the relatively lower value placed on learning analytic data compared with billing or clinical data.

After their learning stages, the PDSA (studying stage) and MAL (assessing/monitoring stage) frameworks include a specific stage for reflecting on the effectiveness of learning and/or new processes. Here too, a learning perspective on the EHR could better support clinicians. Feedback is an essential element of assessment and learning in the health professions.42 The form and timing of feedback in the EHR is generally poorly connected to opportunities for the types of learning that can accomplish lasting behavioral change.43,44 Dashboards are based on an audit and feedback model that has been shown to have limited effectiveness in changing behavior.45 Dashboards could be improved by being paired with interventions that provide data-driven deliberate practice opportunities and coaching.46 Take the example of a pediatrician presented with benchmarked data showing that they order more chest x-rays for wheezing infants compared with their peers. Ideally, the normative data would be meaningfully connected to educational interventions such as online virtual cases or personal coaching. More broadly, we advocate for the development of instructional design capacity within any quality improvement and patient safety initiative.47 In fact, we would argue that any data analysis that lacks an educational perspective is a missed opportunity, because it perpetuates an ongoing structural divide between educational (LMS) and clinical (EHR) information systems.48
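As a sketch of how such benchmarked data might be generated, the following compares each clinician's ordering rate with the peer group and flags outliers who could then be offered coaching or virtual cases. The clinician names, rates, and z-score threshold are all invented for illustration.

```python
from statistics import mean, stdev

# Illustrative peer-benchmarking sketch: flag clinicians whose chest x-ray
# ordering rate for wheezing infants sits well above the group norm.
# Names, rates, and the threshold are assumptions, not real data.
xray_rates = {"dr_a": 0.12, "dr_b": 0.15, "dr_c": 0.11, "dr_d": 0.38, "dr_e": 0.14}

def flag_outliers(rates, z_threshold=1.5):
    """Return clinicians whose rate exceeds the peer mean by > z_threshold SDs."""
    values = list(rates.values())
    mu, sigma = mean(values), stdev(values)
    return [name for name, rate in rates.items() if (rate - mu) / sigma > z_threshold]

print(flag_outliers(xray_rates))  # ['dr_d']
```

The point of the sketch is that identifying the outlier is the easy part; connecting that flag to a tailored educational response is where current systems fall short.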

The final stage of each of the representative learning frameworks involves connecting the individual's learning to meaningful change in the health care system. The context could vary from the clinician's immediate microsystem (eg, a clinical practice group), to larger scales such as institutional, regional, or national health care organizations.49 The key is that the individual clinician is involved in a learning dialogue with the larger organization that results in effective change in both.50

The individual also has an important role to play in negotiating what data analysis and interpretation is carried out by the information system and what remains the responsibility of the clinician. EHR evolution is a synthesis of the community's evolving consensus on how best to address at least some clinical issues. Does this (new) process need to be learned by the individual clinician? Or does the information system need to learn it? And how will the two interact?51 In our stroke protocol example, which parts of the clinical process should depend on a trained clinician's judgment (eg, invoking the stroke protocol)? And which should depend on the information system (eg, a bundled stroke alert order-set that calculates drug dosages and alerts the radiology department to prioritize this patient for CT)? These are design decisions that require careful planning, and they have considerable learning implications at the system and individual levels. Increasingly, this can be a data-driven design process. Collecting practitioner data over time allows determination of optimal pathways for introducing new processes and of the proper distribution of work between individual clinicians and the information system.

LEARNING ANALYTICS IN HEALTH CARE

Cutting across the HPE trends we have discussed so far is the broader societal trend of moving data, information, knowledge, and, indeed, cognition into the digital realm where it can be more cheaply stored, more easily accessed and made available to algorithmic analysis and visualization.4,52 The optimal distribution of cognition is greatly facilitated in the digital realm.53 For example, large amounts of digital data enable the development, testing, and implementation of predictive algorithms that complement practitioner insight.54 Digital data are also the lifeblood of patient safety and quality improvement efforts, and can inform the attendant educational activities.55,56

The field of learning analytics in general education has made considerable strides in recent years, with scientific meetings,57 a discipline-specific journal,58 and a textbook.59 The intersection between learning analytics and HPE is only beginning to emerge. Chan et al60 provide a recent systematic review of learning analytics in HPE that revealed only 19 articles, describing heterogeneous applications, none of which involved the more advanced computer or statistical sciences we have described above. Furthermore, none of the included articles considered learning analytics from the organizational perspective, although this has been explored in a recent perspective61 and described within national milestones data.62,63

However, one can discern a growing role for centrally promoted online learning in the health professions. The opioid epidemic prompted many U.S. states to mandate specific online continuing medical education for all licensed physicians, a considerable undertaking.64 Despite the potential of this natural experiment, learning analytics data from continuing medical education courses were never linked to practice/prescription data to determine whether prescribing behaviors changed. Similarly, The Joint Commission mandated institution-wide sedation training by pediatric faculty, which was associated with decreased adverse events logged in a prospective one-off research study.65 Each of these examples demonstrates a “push” model in which educational materials are sent to the practitioner but little if any learning analytic data are pulled back from them to inform care processes or organizational learning. Qualitative studies of health professionals indicate their willingness to have their health data used for learning, with sensible caveats around attribution, privacy, and performance management.47,66

Thus, to promote a vibrant learning health care organization, the digital processes within and beyond the EHR need to be considered as a flow within an organization that continuously updates and upgrades based on a built-in propensity to learn at every level. In addition to business and clinical analytics, learning analytics need to flow to each place where lasting improvement is possible. This improvement may be in people (education), in the system (quality improvement), or in the interaction of the two (distribution and augmentation of cognition). Importantly, an information system infrastructure that balkanizes educational activities into segregated spaces and times will not achieve the full potential of a learning organization. In an information-abundant digital environment, education will no longer be about teaching an average learner about an average patient at regular but suboptimal intervals (Table 2). Instead, the deployment of more holistic information systems, disciplined by conceptual frameworks that scale, will allow the full development of individuals and of the learning health care organization.

TABLE 2. - Properties of the Learning Electronic Health Record

Property or Process | Existing EHR | Learning EHR
Data collection | Business analytics; clinical analytics | Business analytics; clinical analytics; and learning analytics
Boundaries | QI-PS distinct from education | Education-QI-PS integrated
Learning IT architecture | EHR and LMS are distinct | LMS integrated with EHR; recognizes common mission
Educational philosophy | Cognitivism (one right objective truth) | Constructivism (health care complexity and individuality taken into account)
Instructional design | Instructor | Coach; mentor; team
Target educational unit | Average brain | Individual brain; team/collective competence
EHR memory | Each learner interaction with EHR forgotten | Accumulated personal experience reflected in longitudinal learning views
Communication mode | Pushing from organization | Listening by EHR; dialogue with organization
Feedback | Scarce; decontextualized | Abundant; contextualized

EHR indicates electronic health record; LMS, learning management system; PS, patient safety; QI, quality improvement.


MOVING FORWARD: THE LEARNING HEALTH CARE ORGANIZATION

Connecting learning analytics, meant to improve educational outcomes, to clinical data often requires considerable technical and cultural bridging between different sections of an organization. Efforts to accumulate education data in longitudinal “education data warehouses” have sought to grow the HPE informatics infrastructure and join it to the health information system through data sharing and mirroring.48,67 Literature examples of this kind of educational epidemiology (ie, data amalgamated across the population of learners) exist68–70 but the promise of real-time actionable learning analytics is only beginning to be realized in HPE.55 One-off or siloed data approaches are being questioned as educators increasingly advocate for systemwide approaches.8,71

In Table 2, we describe ways that an EHR may be modified to realize its educational potential and become a “Learning EHR.” First, it should be designed to listen for, identify, and report data that can inform (individual and organizational) learning opportunities. What knowledge deficits does the practitioner have at the point-of-care? When is a practitioner looking up information? And which information? These can be tracked and accumulated to drive educational outreach efforts.41 Second, the built information infrastructure should be integrated between the EHR and the LMS. Can LMS data be integrated with EHR data? Can a single sign-on allow access to both? Is the LMS locked into presenting only learning modules? Can events in the EHR trigger learning activities in the LMS? For example, if the door-to-needle time for a stroke patient exceeded specified norms, could the practitioners of record be automatically sent appropriate materials? Third, the organizational entities charged with practitioner learning should all have access to all learning data. Can the QI-PS leaders access education data? Can the educators synthesize EHR data? We advocate for coaches who, using a rich trove of learning analytic data, can suggest the practitioner's next learning activity with surgical precision. Finally, in a learning organization as described by Senge, we need to signal that the educational ambition is higher: to increase the potential of every individual, and thus the organization. Instead of delivering standard learning based on a conceptualization of the average practitioner at disconnected intervals, the idea is to individualize learning based on a timely, rich understanding of the individual and the context. This requires data—from the EHR and the LMS—and the informatics sophistication to put it into action.
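The second point, an EHR event triggering a learning activity in the LMS, can be sketched minimally: assuming a stream of metric events with attributed practitioners, a simple rule queues targeted materials when a norm is breached. The event shape, the 60-minute norm, and the module identifier are all hypothetical assumptions.

```python
# Hypothetical EHR-to-LMS trigger: when a tracked metric breaches its norm,
# queue a targeted learning module for each practitioner of record.
# Event fields, the norm, and the module identifier are illustrative only.

DOOR_TO_NEEDLE_NORM_MIN = 60  # illustrative target, in minutes

def learning_triggers(ehr_events):
    """Yield (clinician, module) assignments for out-of-norm stroke cases."""
    for event in ehr_events:
        if event["metric"] == "door_to_needle_min" and event["value"] > DOOR_TO_NEEDLE_NORM_MIN:
            for clinician in event["practitioners_of_record"]:
                yield (clinician, "stroke-thrombolysis-refresher")

events = [
    {"metric": "door_to_needle_min", "value": 85, "practitioners_of_record": ["dr_b", "dr_c"]},
    {"metric": "door_to_needle_min", "value": 42, "practitioners_of_record": ["dr_a"]},
]
print(list(learning_triggers(events)))
```

In practice, the assignment would be written to the LMS through its enrollment interface rather than printed, and the rule set would be governed by the educators and QI-PS leaders described above.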

In summary, we advocate for health information systems that are explicitly guided by learning frameworks to increase the educational (and clinical) effectiveness of the learning health care organization. We have highlighted the MAL model, the PDSA cycle for quality improvement, and the Senge model for a learning organization, each offering a different lens on health organizations and a useful conceptual basis to organize learning analytics. The flow of learning analytics through a health care organization should be recognized for its educational and transformational potential and organized not by where the information tracks were laid historically, but rather by how people, microsystems, and organizations are known to learn.

Lessons for Practice

■ Information technology infrastructure for learning has classically developed in a separate silo from the EHR.
■ The digitalization of care presents an opportunity for increased learning data (learning analytics) to inform health care and continuing professional development.
■ The authors describe a Learning EHR based on well-regarded conceptual frameworks for individual and organizational learning.

REFERENCES 1. Ellaway R, Masters K. AMEE Guide 32: E-Learning in medical education Part 1: learning, teaching and assessment. Med Teach. 2008;30:455–473. 2. Bienkowski M, Feng M, Means B. Enhancing Teaching and Learning Through Educational Data Mining and Learning Analytics: An Issue Brief. U.S. Department of Education Washington, DC: 2013. 3. Chan T, Sebok-Syer S, Thoma B, et al. Learning analytics in medical education assessment: the past, the present, and the future. AEM Educ Train. 2018;2:178–187. Available at: http://www.ncbi.nlm.nih.gov/pubmed/30051086. Accessed May 2, 2022. 4. Cukier K, Mayer-Schoenberger V. Rise of big data: how it's changing the way we think about the world. Foreign Affairs. 2013;92(3), 28–40. 5. Rohrer D, Pashler H. Recent research on human learning challenges conventional instructional strategies. Educational Researcher. 2010;39:406–412. 6. Cutrer WB, Miller B, Pusic MV, et al. Fostering the development of master adaptive learners: a conceptual model to guide skill acquisition in medical education. Acad Med. 2017;92:70–75. 7. Classen DC, Metzger JB, Welebob E. Fostering systems change to drive continuous learning in health care. In: Engineering a Learning Healthcare System: A Look at the Future. Washington, DC: The National Academies Press; 2011:237–270. Available at: https://nap.nationalacademies.org/read/12213/chapter/7 Accessed May 2, 2022. 8. Gupta R, Arora VM. Merging the health system and education silos to better educate future physicians. JAMA. 2015;314:2349–2350. 9. Senge PM. The Fifth Discipline: The Art and Practice of the Learning Organization. 2nd ed. New York, NY: Random House; 1990:464. 10. Frank JR, Snell LS, Cate Oten, et al. Competency-based medical education: theory to practice. Med Teach. 2010;32:638–645. 11. Issenberg SB. Simulation technology for health care professional skills training and assessment. JAMA. 1999;282:861–866. 12. Reeves S, Perrier L, Goldman J, et al. 
Interprofessional education: effects on professional practice and healthcare outcomes. Cochrane Database Syst Rev. 2013;2013. Available at: https://www.cochranelibrary.com/cdsr/doi/10.1002/14651858.CD002213.pub3/full. Accessed May 2, 2022. 13. Gonzalo JD, Wolpaw T, Wolpaw D. Curricular transformation in health systems science. Acad Med. 2018;93:1431–1433. 14. Kalet AL, Gillespie CC, Schwartz MD, et al. New measures to establish the evidence base for medical education: identifying educationally sensitive patient outcomes. Acad Med. 2010;85:844–851. Available at: http://journals.lww.com/00001888-201005000-00030. Accessed May 2, 2022. 15. Schumacher DJ, Holmboe ES, van der Vleuten C, et al. Developing resident-sensitive quality measures. Acad Med. 2018;93:1071–1078. 16. Akhnif E, Macq J, Idrissi Fakhreddine MO, et al. Scoping literature review on the learning organisation concept as applied to the health system. Health Res Pol Syst. 2017;15:16. Available at: http://health-policy-systems.biomedcentral.com/articles/10.1186/s12961-017-0176-x. Accessed May 2, 2022. 17. Taylor B. The rise of the teaching organization. Harv business Rev. 2009;2–3. Available at: https://hbr.org/2009/11/companies-with-class-the-rise. Accessed May 2, 2022. 18. Holmboe ES, Batalden P. Achieving the desired transformation: thoughts on next steps for outcomes-based medical education. Acad Med. 2015;90:1215–1223. Available at: http://www.ncbi.nlm.nih.gov/pubmed/26083400. Accessed May 2, 2022. 19. Ten Cate O, Chen HC, Hoff RG, et al. Curriculum development for the workplace using entrustable professional activities (EPAs): AMEE guide No. 99. Med Teach. 2015;37:983–1002. 20. Englander R, Cameron T, Ballard AJ, et al. Toward a common taxonomy of competency domains for the health professions and competencies for physicians. Acad Med. 2013;88:1088–1094. 21. Fonarow GC, Smith EE, Saver JL, et al. 
Improving door-to-needle times in acute ischemic stroke: the design and rationale for the American Heart Association/American Stroke Association's target: stroke initiative. Stroke. 2011;42:2983–2989. Available at: http://www.ncbi.nlm.nih.gov/pubmed/21885841. Accessed May 2, 2022. 22. Kamal N, Holodinsky JK, Stephenson C, et al. Improving door-to-needle times for acute ischemic stroke: effect of rapid patient registration, moving directly to computed tomography, and giving alteplase at the computed tomography scanner. Circ Cardiovasc Qual Outcomes. 2017;10. Available at: http://www.ncbi.nlm.nih.gov/pubmed/28096208. Accessed May 2, 2022. 23. Accreditation Council for Graduate Medical Education. The Neurology Milestone Project. 2015;1–31. Available at: https://www.acgme.org/Portals/0/PDFs/Milestones/NeurologyMilestones.pdf. Accessed May 2, 2022. 24. Chong C. Using stroke thrombolysis to describe the role of repetition in learning a cognitive skill. Med Educ. 2016;50:250–258. 25. Cleghorn GD, Headrick LA. The PDSA cycle at the core of learning in health professions education. Jt Comm J Qual Improv. 1996;22:206–212. 26. Leis JA, Shojania KG. A primer on PDSA: executing plan-do-study-act cycles in practice, not just in name. BMJ Qual Saf. 2017;26:572–577. 27. Taylor MJ, McNicholas C, Nicolay C, et al. Systematic review of the application of the plan-do-study-act method to improve quality in healthcare. BMJ Qual Saf. 2014;23:290–298. 28. Mylopoulos M, Woods NN. When I say adaptive expertise. Med Edu. 2017;51:685–686. 29. Schumacher DJ, Englander R, Carraccio C. Developing the master learner: applying learning theory to the learner, the teacher, and the learning environment. Acad Med. 2013;88:1635–1645. Available at: http://www.ncbi.nlm.nih.gov/pubmed/24072107. Accessed May 2, 2022. 30. Pusic MV, Santen SA, Dekhtyar M, et al. Learning to balance efficiency and innovation for optimal adaptive expertise. Med Teach. 2018;40:820–827. 31. 
Bates DW, Saria S, Ohno-Machado L, et al. Big data in health care: using analytics to identify and manage high-risk and high-cost patients. Health Aff (Millwood). 2014;33:1123–1131. 32. Handbook for Electronic Health Records Implementation. World Health Organization, Pan American Health Organization; 2017. 33. Tang PC. Key Capabilities of an Electronic Health Record System. Letter Report. 2003. Available at: https://www.ncbi.nlm.nih.gov/books/NBK221800/#ddd00006. Accessed May 2, 2022. 34. Eva KW, Regehr G. Self-assessment in the health professions: a reformulation and research agenda. Acad Med. 2005;80:S46–S54. Available at: http://www.ncbi.nlm.nih.gov/pubmed/16199457. Accessed May 2, 2022. 35. Patel VL, Yoskowitz NA, Arocha JF, et al. Cognitive and learning sciences in biomedical and health instructional design: a review with lessons for biomedical informatics education. J Biomed Inform. 2009;42:176–197. 36. Grad RM, Pluye P, Shulha M, et al. EBM, CME and the EMR. Evidence-Based Med. 2021;19:1–3. 37. Cook DA, Holmboe ES, Sorensen KJ, et al. Getting maintenance of certification to work: a grounded theory study of physicians' perceptions. JAMA Intern Med. 2015;175:35–42. 38. Sargeant J, Armson H, Chesluk B, et al. The processes and dimensions of informed self-assessment: a conceptual model. Acad Med. 2010;85:1212–1220. Available at: http://www.ncbi.nlm.nih.gov/pubmed/20375832. Accessed May 2, 2022. 39. Chuang S. The applications of constructivist learning theory and social learning theory on adult continuous development. Perform Improvement. 2021;60:6–14. 40. Pusic MV, MacDonald WA, Eisman HO, et al. Reinforcing outpatient medical student learning using brief computer tutorials: the Patient-Teacher-Tutorial sequence. BMC Med Educ. 2012;12:70. Available at: http://www.biomedcentral.com/1472-6920/12/70. Accessed May 2, 2022. 41. Aakre CA, Pencille LJ, Sorensen KJ, et al. Electronic knowledge resources and point-of-care learning: a scoping review. Acad Med. 2018;93(11S Association of American Medical Colleges Learn Serve Lead):S60–S67. 42. Sargeant J, Bruce D, Campbell CM. Practicing physicians' needs for assessment and feedback as part of professional development. J Cont Educ. 2013;33:S54–S62. 43. Telio S, Ajjawi R, Regehr G. The “educational alliance” as a framework for reconceptualizing feedback in medical education. Acad Med. 2015;90:609–614. Available at: https://journals.lww.com/academicmedicine/Fulltext/2015/05000/The__Educational_Alliance__as_a_Framework_for.21.aspx. Accessed May 2, 2022. 44. Weinstein DF. Feedback in clinical education: untying the gordian knot. Acad Med. 2015;90:559–561. Available at: https://journals.lww.com/academicmedicine/Fulltext/2015/05000/Feedback_in_Clinical_Education__Untying_the.11.aspx. Accessed May 2, 2022. 45. Ivers N, Jamtvedt G, Flottorp S,
