A learning health systems approach to integrating electronic patient‐reported outcomes across the health care organization

1 INTRODUCTION

Capturing data from patients through patient-reported outcomes is an emerging aspect of healthcare delivery and aligns with the core objectives of a learning health system (LHS).1-3 As articulated by Friedman and others, an LHS aligns data, technology, and care delivery to generate and apply evidence to practice in a continuous cycle to advance care.4, 5 A central tenet of an LHS is harnessing the power of data in ways that empower and engage patients and care teams with the knowledge to provide the best care for each patient.5 Patient-reported outcome (PRO) data, when integrated into care, provide standardized assessments of how patients experience health and healthcare. Such data support LHS initiatives by providing a mechanism to bring the patient voice into real-time clinical documentation and decision-making.6-8 Leveraging health information technology (IT) to facilitate the digital capture of electronic patient-reported outcomes (ePROs) generates new or expanded data sources that can more readily enhance learning and care improvement across the organization.1 Specifically, ePROs allow a pivot from analog approaches (ie, paper-based), which are often cumbersome to synthesize and integrate into care, to electronic capture that supports real-time collection and reporting.

Despite the role ePROs can play in advancing patient-centered care, there is limited evidence on how an LHS should approach the implementation and evaluation of ePRO use in practice.2 That is, while an LHS emphasizes the goal of leveraging data to transform care, guidelines for the process of integrating patient-reported data into improved care delivery within an LHS context do not exist. Establishing such guidelines is especially important as efforts to scale and spread the use of ePROs across clinical contexts and the health system at large grow. We previously documented a wide diversity of use cases for ePROs within a single health system, and the nuanced differences between how ePROs are used in preventive, chronic or specialty, and interventional or surgical care contexts.3 There is therefore a strong imperative to better understand a systems approach for integrating ePROs throughout the organization in ways that enhance learning.9

The integration of ePROs into care delivery involves multiple considerations for patients, providers and care teams, and health system resources and infrastructure.9, 10 For example, using ePROs may introduce new uses of technology into a health system's ecosystem. Doing so requires understanding how clinical teams interact with patient-reported data at the point of care and may require rapid-cycle improvements to the design of how health IT tools are used. Existing evidence from the field highlights that integrating ePROs into clinical care includes structural considerations such as how the patient portal is used, how to leverage technology to triage and display ePRO data in actionable ways, how to introduce the use of tablets or kiosks into clinic workflow, and how to move ePRO data between data systems (eg, ePRO platform, electronic health records, and electronic data warehouses).10, 11 Health systems must also consider how workflows for clinical care activities will adapt to accommodate expanded ways of engaging with patients and support providers in applying new data sources to practice. Currently, ePRO use is often fragmented and siloed in scope, limiting the capacity for knowledge generation and sharing on best practice models and generalizable learnings. As a result, health systems face gaps in understanding how to maximize the ability of ePROs to enhance organization-wide learning and improve care.
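As one concrete illustration of the data-movement consideration above, the sketch below packages a single completed ePRO assessment as an HL7 FHIR QuestionnaireResponse so that it could pass from an ePRO platform to an EHR or data warehouse. This is a minimal, hypothetical example: the questionnaire URL, patient identifier, and item linkIds are assumptions for illustration, and the paper does not prescribe FHIR or any particular exchange standard.

```python
# Minimal sketch: packaging one completed PHQ-2 assessment as an HL7 FHIR
# QuestionnaireResponse so it can move from an ePRO platform into an EHR or
# data warehouse. Identifiers and the questionnaire URL are illustrative.
import json
from datetime import datetime, timezone

def build_questionnaire_response(patient_id: str, answers: dict) -> dict:
    """Return a FHIR R4 QuestionnaireResponse dict for a PHQ-2 screening."""
    return {
        "resourceType": "QuestionnaireResponse",
        "status": "completed",
        "questionnaire": "http://example.org/Questionnaire/phq-2",  # assumed URL
        "subject": {"reference": f"Patient/{patient_id}"},
        "authored": datetime.now(timezone.utc).isoformat(),
        "item": [
            {
                "linkId": link_id,
                "text": text,
                "answer": [{"valueInteger": score}],
            }
            for link_id, (text, score) in answers.items()
        ],
    }

if __name__ == "__main__":
    phq2_answers = {
        "phq2-1": ("Little interest or pleasure in doing things", 1),
        "phq2-2": ("Feeling down, depressed, or hopeless", 2),
    }
    resource = build_questionnaire_response("example-123", phq2_answers)
    # In practice this payload would be sent to the receiving system's API;
    # here we simply print it.
    print(json.dumps(resource, indent=2))
```

Whatever standard is used, representing each completed assessment as a discrete, structured resource is what allows downstream systems (EHR flowsheets, data warehouses, reporting tools) to consume the same ePRO result without re-entry.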

Addressing this gap merits an understanding of the sociotechnical, interpersonal, and organizational factors that characterize how providers and LHSs at large can effectively use ePROs in clinical care. Experimental designs, for example, may not be well suited to evaluate how ePRO tools function in complex, adaptive LHS settings that are marked by dynamic interactions between the individual, team, and system-level behaviors. Alternatively, action research methods allow for the emergence of learnings, or guidelines that are rooted in real-world experiences and can provide a foundation for continuous learning, iteration, and application across LHSs looking to implement ePROs.12, 13

2 RESEARCH AIMS

In this Experience Report, we share experiences and learnings from a 5-year initiative guided by action research that aimed to develop generalizable recommendations to support system-wide ePRO integration. Specifically, we approached this work with a research process (ie, action research) that is rooted in learning and able to accommodate the complexity of real-world practice, and with the intent of producing an output (ie, guidelines) that reflects the process by which health systems can harness patient data to advance the goals of an LHS. While the full results of this project are available at our project website (epros.becertain.org), this paper reports on our experience engaging in action research in a real-world health setting and distilling diverse learnings across multiple years, settings, and stakeholders into generalizable guidelines that can support LHSs at large.

3 METHODS

3.1 Research design

With funding from the Agency for Healthcare Research and Quality (R01HS023785), we sought to develop guidelines that could inform ePRO integration into clinical care through an LHS lens. To accomplish this, we took an action research approach to generate knowledge that informs change, marked by multiple, iterative cycles of planning, action, observation, and reflection, and continuous stakeholder participation.13, 14 Over the course of 5 years, our iterative cycles of action research14 centered on two areas of focus: (1) mobilize an ePRO community of practice to facilitate knowledge sharing, and (2) establish guidelines for ePRO use in the context of LHS practice. Throughout the project, the processes of stakeholder engagement, co-production of knowledge, and reflexivity guided and informed continuous feedback loops.12 Figure 1 provides a visual overview of how the community of practice (Table 1) and data-generating (Table 2) activities interacted throughout the guideline development process. By engaging in multiple, iterative action research cycles, we were able to enhance learning and better capture the continuous evolution of the practice. The majority of activities took place within our health system, located in the greater Seattle metropolitan area, with some community of practice activities engaging national and international audiences via professional society membership (eg, American Medical Informatics Association, International Society for Quality of Life Research, Academy Health). All activities were reviewed and approved by the University of Washington Institutional Review Board, and informed consent was obtained as appropriate for study activities.

FIGURE 1. ePRO guideline development process

TABLE 1. Activities and outputs related to mobilizing the ePRO community of practice throughout action research phases

Goal: Identify practice community members (local, national)
Activities: ePRO practice community within the health system (~50 members, inclusive of clinical, administrative, IT/informatics, research, and patient perspectives), 2016–2020; National ePRO advisory network (~20 members, inclusive of stakeholders involved in ePRO use within external health systems, research around ePRO use, and patient advisors), 2018–2020

Goal: Document community experiences & challenges with ePRO use
Activities: Identification of common use cases for ePROs in clinical practice (n = 3 use cases, details reported elsewhere)3; Site visits with external health systems and EHR vendors (n = 4 site visits, occurring in 2017, 2018, and 2019)

Goal: Facilitate knowledge sharing across the network
Activities: Community workshops (n = 8 workshops) with the local ePRO practice community; PRO Governance Committee meetings (n = 17 meetings) with stakeholders from the local ePRO practice community; National workshops (n = 2 workshops, totaling 130 participants, including patient advisors); Conference presentations and networking activities (n = 24 events), including with the national ePRO advisory network

Goal: Identify practice community priorities
Activities: Stakeholder needs assessment (involving patients, staff, providers, administration, leadership, and IT stakeholders) to evaluate needs for ePRO workflows and data use across the health system; Future research agenda (details reported elsewhere)15; Recommendations for ePRO practice guidelines (content, format)

TABLE 2. Description of data-generating activities that contributed to guideline development

Data-generating activity: Evidence review and qualitative synthesis
Associated research approach: Systematic search and qualitative synthesis16 of peer-reviewed literature (n = 82) describing ePRO use. Details and results of the qualitative synthesis are provided as supplemental material (A).
Alignment with action research phase: A synthesis of the current evidence around ePRO use supported the planning phase of action research activities.

Data-generating activity: Semi-structured interviews
Associated research approach: Qualitative interviews with clinical providers (n = 20) engaged in ePRO use across a variety of specialties.17 Interview questions were guided by the Sittig & Zhang sociotechnical models of HIT implementation.18, 19
Alignment with action research phase: Thematic analysis of interview data provided insights into the challenges associated with ePRO use and informed the planning phase.

Data-generating activity: Local ePRO use case catalog
Associated research approach: Catalog of existing ePRO implementations across the health system (n = 14 use cases) via structured survey to identify and report on the sociotechnical characteristics of ePRO tools, workflows, and ePRO measures used (details reported elsewhere).3
Alignment with action research phase: Cataloging of local implementations supported the planning phase by characterizing the breadth of ePRO uses across the health system.

Data-generating activity: Implementation monitoring data
Associated research approach: Monitoring of ePRO assessment tools (n = 9) within the EHR, spanning preventive (eg, annual health risk assessment), chronic (eg, depression & anxiety management), and interventional (eg, total joint replacement, spine fusion) care contexts, over the course of 4 years (2016–2020); see Table 4 for more details. Monitoring of the implementation process, which included bi-weekly review of EHR-generated metrics (eg, ePRO deployment and completion rates), brief interviews with patients (n = 15), and informal discussions with ePRO clinical sites (n = 4 sites, inclusive of providers and administrative staff) that evaluated usability and satisfaction with ePRO tools.20 Triangulation of quantitative and qualitative data to inform needed modifications and implementation support.
Alignment with action research phase: Involvement in the design of ePRO tools, workflows, and training materials enabled the action phase. Monitoring ePRO use over time, and across multiple settings of care, supported the observation phase. Additionally, data across implementations were compared and triangulated to support systemwide learnings and reflection.

Data-generating activity: Fieldnotes & observation
Associated research approach: Document analysis of field notes from implementation team meetings for individual ePRO implementations, clinic and staff trainings, and periodic observations of clinic workflow and ePRO use in practice. Document review and analysis performed on an iterative (eg, monthly/quarterly) basis with implementation teams and the broader community of practice for feedback and reflection.
Alignment with action research phase: Data from fieldnotes and participant observation supported the observation and reflection phases.

Data-generating activity: Guideline development
Associated research approach: Learnings across data collection efforts aggregated and organized into thematic areas. Guidelines drafted and iteratively reviewed with local and national ePRO practice communities for feedback and validation via workshops and networking events, and further tested against data gathered from ePRO implementations, as described above.
Alignment with action research phase: Reflection across implementation experiences contributed to the development of generalizable guidelines.

3.2 Mobilize an ePRO community of practice

Action research approaches are grounded in the continuous involvement and participation of stakeholders.12-14 Consequently, mobilizing an ePRO community of practice, or a learning community comprised of stakeholders that bring broad experiences and perspectives on the use of ePROs in practice, played a pivotal role in facilitating practice-based insights and reflexivity during all phases of the action research process. Table 1 details the community of practice activities and outputs that occurred throughout our action research phases and informed activities related to our second area of focus, developing guidelines for ePRO use. We began locally within our health system by canvassing existing committees and organizational units involved in patient engagement work or the use of patient-facing technologies in practice. From this, we identified a core group of stakeholders that reflected a breadth of experiences related to ePRO use, health information technology, workflow, and health system leadership to comprise our governance committee. We then expanded nationally via professional groups associated with PRO measurement and practice, medical informatics, and health services research. Once we had a foundation of members identified, we leveraged the diverse experience of members to document real-world use cases and learnings related to ePRO use in practice.3 We shared learnings across the community of practice to augment community knowledge and engagement and distill experiences into generalizable learnings and recommendations. Lastly, we presented learnings back to the practice community for review, feedback, and identification of future research and practice topics.15

3.3 Establish guidelines for ePRO use in practice

The central goal of our action research cycles and activities was to produce generalizable learnings in the form of guidelines (see Figure 1) that document the tangible best practices for integrating ePROs into clinical practice. Guidelines are advantageous in the context of complexity because they can offer simple rules that allow for application and adaptation across diverse settings and scenarios.21 In alignment with our action research approach, guidelines were informed by multiple, iterative data-generating activities that integrated existing evidence (eg, peer-reviewed evidence), new and contextualizing data (eg, prospective qualitative and quantitative data), and learnings from the field (eg, process and outcomes from real-world application, community of practice input) (Table 2).

To start, we mobilized an ePRO community of practice (as described above), and reviewed existing peer-reviewed evidence related to ePRO use in practice, both of which supported the planning and problem identification phase. As part of our action phase, we generated a variety of new data from real-world ePRO implementations to contextualize experiences from the field. In our observation phase, we integrated data across prior phases to develop draft guidelines that can inform how LHSs use ePROs to provide more patient-centered care. Lastly, in our reflection phase, guidelines were further tested and refined via collaboration with our community of practice (local, national). For example, we presented draft guidelines during a national workshop in 2018, during which we solicited feedback from attendees (n = 100) about how the guidelines aligned with their ePRO experiences. Workshop attendees provided practice-based insights that helped align the guidelines with real-world practice and refine supporting tools and resources. The next iteration of the guidelines was then presented at a national workshop in 2019, where new attendees (n = 30) provided additional feedback. In some instances, feedback from community of practice stakeholders identified unresolvable gaps, which we documented as areas for future research.15 Multiple iterations of feedback and refinement produced our final set of guidelines for ePRO use, which we organized around core thematic areas related to organizational and interpersonal readiness and ePRO intervention fit and alignment with broader LHS contexts.

4 RESULTS

We report results by three thematic areas identified through our action research approach: governance, integration, and reporting. Table 3 highlights the thematic areas generated during our ePRO guideline development process, along with target audiences and a contextual example of an issue addressed for each area in efforts to implement ePROs for depression management. We summarize our learnings in the form of 24 guidelines that emerged across these thematic areas, following the process outlined in Figure 1. The guidelines are agnostic to any specific health IT platform or PRO measurement approach. This is intentional, recognizing that health systems have different sociotechnical resources available. As a result, the guidelines provide a tangible framework to support considerations and decision-making around ePRO use that can be adapted to the local context and various LHS approaches. The section below provides an overview of guidelines (indicated by italics) that emerged, as well as reflections on the experience and lessons learned from the guideline development process. A full listing of guidelines is included as Data S2; additional guideline details and tools to support use in practice are available at epros.becertain.org.22

TABLE 3. Thematic areas for ePRO use in LHS practice

Thematic area: Governance
Purpose: Inform strategies, processes, and communication for governing ePROs across the LHS
Target audience: Stakeholders involved in ePRO oversight, governance, and evaluation activities
Example issue addressed: How will the health system efficiently manage the collection of ePROs for depression across multiple areas of care?

Thematic area: Integration
Purpose: Inform approaches to ePRO workflow, technology, and user adoption at the point of care
Target audience: Stakeholders involved in the design and implementation of ePRO tools in practice
Example issue addressed: How will workflows for ePRO depression data capture differ for preventive care vs specialty care?

Thematic area: Reporting
Purpose: Inform the design, functionality, and presentation of ePRO reporting tools
Target audience: Stakeholders involved in designing or integrating ePRO reporting tools into care delivery
Example issue addressed: How should ePRO depression data be reported to support appropriate clinical action?

4.1 Guideline theme 1: Governance

The role of governance in healthcare delivery is rapidly expanding as health systems face increasingly complex health IT tools and implementation needs.9, 23, 24 Our experience working with the local community of practice (see Table 1), including leading workshops, needs assessments, and governance activities, provided insights that helped clarify the scope and role of governance for ePRO data within the system. Further, governance provides a holistic view across all ePRO needs, not just single implementations, and of how to balance those needs to support complex practice change. For example, in the case of measuring depression, we identified multiple needs for ePRO capture, reporting, and point-of-care and system-level decision-making.3 The most critical role of governance often centered on boundary setting: encouraging and enforcing health system behavior in ways that balance health system goals while avoiding overextension of roles, responsibilities, or resources.

We identified five guidelines that encompass the governance practices needed to support organizational readiness for ePRO use across an LHS. These guidelines consider both the technical and clinical ramifications of scaling ePRO implementation across a healthcare delivery system and echo the importance of multidisciplinary stakeholder engagement and continuous learning around best practices. First, health systems must define objectives for ePRO use and align ePRO objectives with broader health system goals, creating pathways for organizational support, incentive alignment, and clear leadership for ePRO objectives. Next, health systems should develop an IT strategy for ePRO use that considers the desired technical capabilities for ePROs, existing IT architecture, and needed modifications to the IT environment to support the development of ePRO tools. Third, health systems should establish formal structures for ePRO governance that define membership, systemwide roles and responsibilities, decision-making capacity, and relationships to other governing structures within the organization. ePRO governance teams can then develop the tools, resources, and practices needed to operationalize governance objectives. For example, ePRO governance teams may need to develop processes to manage the intake and prioritization of new ePRO project requests. Lastly, ePRO governance will need to actively disseminate best practice models for ePRO use by facilitating organizational learning across stakeholder networks and engaging in knowledge management of ePRO learnings over time.
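As a hypothetical illustration of the intake-and-prioritization process mentioned above, the sketch below models an ePRO project request as a simple data structure with a toy scoring heuristic that a governance committee might use to order new requests. The fields and scoring rule are illustrative assumptions, not criteria drawn from the published guidelines.

```python
# Hypothetical sketch of an ePRO project intake record that a governance
# committee might use to triage new requests. Fields and the prioritization
# heuristic are illustrative, not part of the published guidelines.
from dataclasses import dataclass, field
from typing import List

@dataclass
class EproProjectRequest:
    requesting_unit: str                  # eg, "Orthopedic Surgery"
    clinical_objective: str               # how ePRO data will support care
    measures: List[str] = field(default_factory=list)  # eg, ["PHQ-9"]
    aligns_with_system_goals: bool = False
    requires_new_it_build: bool = True
    estimated_patients_per_year: int = 0

    def priority_score(self) -> int:
        """Toy heuristic: favor aligned, low-build, high-reach requests."""
        score = 3 if self.aligns_with_system_goals else 0
        score += 2 if not self.requires_new_it_build else 0
        score += min(self.estimated_patients_per_year // 1000, 3)
        return score

intake_queue = [
    EproProjectRequest("Primary Care", "Depression screening at wellness visits",
                       ["PHQ-2"], aligns_with_system_goals=True,
                       requires_new_it_build=False, estimated_patients_per_year=12000),
    EproProjectRequest("Spine Clinic", "Track post-surgical disability",
                       ["ODI"], aligns_with_system_goals=True,
                       requires_new_it_build=True, estimated_patients_per_year=800),
]
for request in sorted(intake_queue, key=lambda r: r.priority_score(), reverse=True):
    print(request.requesting_unit, request.priority_score())
```

The value of such a structure is less the scoring itself than making governance criteria explicit so that intake decisions can be compared and revisited over time.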

Experience reflections: Our experience with ePRO governance highlighted that expanding to new forms of patient-reported data, such as ePROs, often disrupts existing governance structures, requiring the formation of new, multidisciplinary models of governance that span clinical practice and IT infrastructure. We also learned that the needs for ePRO governance evolve over time, initially requiring a more specialized team (eg, “ePRO Governance Committee”) that can foster knowledge-building and strategy. Later, ePRO governance requires greater integration with existing teams (eg, groups that oversee the patient portal) that can align decision-making with existing processes. When it comes to leveraging ePRO data to inform organizational decisions, the evidence for using ePRO data to guide patient care and assess care quality is still emerging. Establishing governance can enable ongoing support for continued learning and refinement of processes as new knowledge is generated.

4.2 Guideline theme 2: Integration

The theme of integration encapsulates the work of moving ePRO use from conceptual to practical, considering the interactions between workflow, technology, and human factors that influence how ePROs are used at the point of care. In this, it is crucial to take a systems approach that acknowledges the dynamic nature of how ePRO data facilitates LHS goals across the micro, meso, and macro layers of the organization.25 While many evidence-based models to guide the implementation process exist, our goal was to identify the core considerations for ePRO implementation that augment the LHS goal of harnessing the power of data. Table 4 provides a brief overview of the ePRO implementations that contributed to our experiences and learnings related to ePRO integration. Throughout these implementations, combinations of direct observation, data collection (eg, implementation monitoring), and reflective evaluation with implementation teams helped to document experiences and adaptations that improved workflows and the broader infrastructure to support ePRO use across the organization.

TABLE 4. Summary of ePRO practice sites and implementation reach

Practice setting: Surgical/interventional (eg, total joint replacement)
Driver of ePRO use: Statewide contractual reporting and Accountable Care Network
ePRO measures: Oswestry Disability Index (ODI); Hip Disability and Osteoarthritis Outcome Score (HOOS) Jr.; Knee Injury and Osteoarthritis Outcome Score (KOOS) Jr.; PROMIS-10; Patient Health Questionnaire (PHQ)-4
Implementation reach: Timeframe: 2016–2018; launch of ePRO tools across 3 specialty clinics
Experiential learning opportunity: Understanding ePRO workflow across inpatient and outpatient care settings; understanding ePRO reporting for population health contexts

Practice setting: Preventive care (eg, annual wellness visits)
Driver of ePRO use: Medicare billing; system-wide care transformation initiative
ePRO measures: PHQ-2; Alcohol Use Disorders Identification Test (AUDIT-C)
Implementation reach: Timeframe: 2018–2019; launch of tools across 16 primary care clinics
Experiential learning opportunity: Understanding approaches to maximize automation and efficiency in ePRO data collection

Practice setting: Chronic care (eg, depression care management)
Driver of ePRO use: System-wide care transformation initiative
ePRO measures: PHQ-9; Generalized Anxiety Disorder (GAD)-7
Implementation reach: Timeframe: 2018–2020; launch of tools across 15 primary care clinics & behavioral health care teams
Experiential learning opportunity: Understanding approaches to align ePRO tools with individualized care and existing care pathways

We identified five guidelines that reflect the unique considerations for supporting ePRO integration at the point of care, including preventive, chronic, and interventional care delivery settings.3 An important starting point is clarifying how ePRO data will support care delivery. Project teams looking to implement ePROs should explore questions around how care teams will need to receive ePRO results, what the appropriate clinical response to ePRO data is, and whether clinical teams have the capacity to facilitate that response. Next, teams should design workflows and tools that enable easy ePRO data capture for patients and care teams, considering alignment with existing workflows for clinical visits. Once workflows have been developed, teams should leverage health IT to facilitate efficiencies and improved user experience in ePRO use. As teams prepare to launch ePRO use, they should identify and implement strategies that actively engage users (eg, clinical teams, patients) in ePRO adoption and use. Lastly, teams should encourage continuous learning throughout ePRO implementation to facilitate ongoing process improvement, and share learnings throughout broader organizational networks that support the identification of best practice models.
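As an illustration of the "appropriate clinical response" question raised above, the sketch below maps an incoming PHQ-9 result to a suggested follow-up category using the measure's standard severity bands. The routing actions are hypothetical placeholders; in practice, response protocols are defined locally by each care team.

```python
# Illustrative triage helper for the "what is the appropriate clinical
# response?" question. Severity bands follow standard PHQ-9 scoring ranges;
# the suggested actions are hypothetical placeholders, not a site protocol.
def triage_phq9(total_score: int, item9_positive: bool = False) -> str:
    """Map a PHQ-9 total (0-27) to a suggested follow-up category."""
    if not 0 <= total_score <= 27:
        raise ValueError("PHQ-9 total must be between 0 and 27")
    if item9_positive:
        # Any endorsement of item 9 (thoughts of self-harm) is flagged for
        # same-day clinical review regardless of the total score.
        return "urgent: same-day clinical review"
    if total_score >= 20:
        return "severe: route to behavioral health team"
    if total_score >= 15:
        return "moderately severe: provider follow-up during the visit"
    if total_score >= 10:
        return "moderate: flag for care manager outreach"
    if total_score >= 5:
        return "mild: monitor and repeat screening"
    return "minimal: no action beyond documentation"

if __name__ == "__main__":
    print(triage_phq9(12))                      # moderate
    print(triage_phq9(3, item9_positive=True))  # urgent
```

Making the response logic explicit in this way also clarifies whether the clinical team has the capacity to act on each category before the ePRO tool is launched.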

Experience reflections: Learnings from our reflections on ePRO integration showcase the value of standardized implementation models that can enable comparison of experiences across diverse settings and more rapid learning. Furthermore, the availability of real-time implementation monitoring data, paired with ongoing access to stakeholders with expertise in clinical practice and IT to support knowledge brokering, were key factors in our ability to promptly understand and address barriers to implementation and support organizational learning. The need for and benefit of monitoring ePRO implementations have not been well studied and best practices are not yet clear; this gap warrants future research and the development of practice standards.
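The implementation monitoring described above can rest on fairly simple metrics. The sketch below computes deployment and completion rates per clinic from a hypothetical extract of EHR-generated records; the column names and values are illustrative, as the paper does not specify a metrics schema.

```python
# Minimal sketch of the bi-weekly monitoring metrics mentioned above:
# ePRO deployment and completion rates per clinic, computed from a
# hypothetical extract of EHR-generated records. Column names are assumed.
import pandas as pd

records = pd.DataFrame({
    "clinic":    ["A", "A", "A", "A", "B", "B", "B", "C", "C"],
    "deployed":  [True, True, True, False, True, True, True, True, False],
    "completed": [True, True, False, False, True, False, True, False, False],
})

summary = (
    records.groupby("clinic")
    .agg(eligible=("deployed", "size"),   # eligible encounters in the extract
         deployed=("deployed", "sum"),    # ePRO actually sent to the patient
         completed=("completed", "sum"))  # patient submitted the assessment
)
summary["deployment_rate"] = summary["deployed"] / summary["eligible"]
summary["completion_rate"] = summary["completed"] / summary["deployed"]
print(summary.round(2))
```

Reviewing a table like this on a recurring cadence makes it easier to spot clinics where workflow or technology barriers are suppressing deployment or completion, which can then be explored through the qualitative monitoring activities described in Table 2.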

4.3 Guideline theme 3: Reporting

The reporting of ePRO data is perhaps the linchpin for translating ePRO data into action that supports healthcare transformation. While ePROs have a long history of use in research, their application in point-of-care decision-making still requires strategies that can support the data-to-knowledge-to-action pathway, considering the user needs of patients, providers, and other key stakeholder groups. Consequently, our focus for reporting centered on the needs and desires for how ePRO data could inform shared decision-making, recognizing that underlying IT capabilities will continually evolve. Through in-depth interviews with current ePRO users and documentation of experiences supporting the design and use of ePRO reports for our implementation practice sites (Table 4), our experience highlights the challenge of aligning health system and technology functional capabilities with user needs and preferences. To align with and support the goals of an LHS, the ePRO reports and options selected in practice should leverage feedback loops that facilitate continuous learning about the use of ePRO reports and their application across diverse areas of practice.

We identified 14 guidelines that address considerations for curating data and information, designing system functions and interactions, and appropriately presenting ePRO data to end users. These guidelines reflect the breadth of ePRO reporting needed to support providers and other stakeholders across clinical and organizational use cases, including point of care, quality improvement, population health, and contractual reporting. The theme of reporting also recognizes that reporting involves the interaction between the reporting tools themselves and the use of those tools by patients and care teams. When determining what data and information are included in ePRO reporting options, teams need to consider what information will most appropriately facilitate the interpretation of ePRO data. As part of this process, project teams should identify whether to provide longitudinal or comparative ePRO information to clinical users, determine the most useful statistical presentation to display, and consider how to augment PRO data with contextual information, such as clinical variables that support decision-making. As teams look to design the ePRO reporting tool functions and interactive elements, they should identify functions that facilitate ease of use and reduce barriers to ePRO data access and review. Teams may face decisions around using automation to improve ePRO reporting workflows, allowing customization to enhance usability (including drill-down or drill-up capacities), or providing means to filter PRO data for different review needs.

Additionally, teams should consider the need to integrate ePRO reporting tools with clinical data platforms (including those that are internal or external to the EHR) and accommodate reporting across multiple platforms where clinical users may interact. Lastly, as teams consider approaches for presenting ePRO data to clinical users, they should identify design strategies that enhance ePRO data review, especially quick review at the point of care. Strategies could include visually enhancing key information, providing simple and familiar graph formats, organizing displays of multiple ePRO visualizations together, and modeling clinical use of ePRO reports via simulation and training.
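To make the presentation strategies above more concrete, the sketch below renders one possible longitudinal display of a patient's PHQ-9 scores with a reference line at the conventional moderate-severity cutoff of 10. The styling, hypothetical data, and cutoff annotation are illustrative design choices rather than the visualization format recommended by the guidelines.

```python
# Illustrative longitudinal display of one patient's PHQ-9 scores with a
# reference line at the conventional moderate-severity cutoff (10). Data
# and styling are illustrative, not the paper's recommended format.
import matplotlib.pyplot as plt

dates  = ["2019-01", "2019-03", "2019-05", "2019-07", "2019-09"]
scores = [16, 13, 11, 8, 6]   # hypothetical PHQ-9 totals over follow-up

fig, ax = plt.subplots(figsize=(6, 3))
ax.plot(dates, scores, marker="o", color="tab:blue", label="PHQ-9 total")
ax.axhline(10, linestyle="--", color="tab:red", label="Moderate cutoff (10)")
ax.set_ylim(0, 27)                      # show the full PHQ-9 range for context
ax.set_xlabel("Assessment date")
ax.set_ylabel("PHQ-9 score")
ax.set_title("Longitudinal ePRO display (illustrative)")
ax.legend(loc="upper right")
fig.tight_layout()
plt.show()
```

Simple, familiar formats like this, with key thresholds visually marked and the full score range shown for context, support the quick point-of-care review emphasized above.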

Experience reflections: Our experience clarified that a critical aspect of ePRO reporting is considering how data needs to move between platforms to support the layers of clinical and ePRO reporting needs across stakeholders. The introduction of new forms of patient data, such as ePROs, can challenge the existing infrastructure in place to support data movement and a standard approach to ePRO report visualization. The use of ePRO reports will vary widely in practice and may draw from different data sets and systems. As new ePRO reporting capabilities are developed, health systems will need to balance the capacity for user customization with the continually evolving need to aggregate ePRO data to provide more generalized evidence on how this data should inform practice within and across specialties.

5 DISCUSSION

This work and the resulting guidelines address a gap in the resources available for LHSs working to integrate ePRO data across the organization. Prior work outlines the vision for how the components of an LHS (organizational, structural, behavioral) should align to improve care delivery.3 In this work, we anchor this vision in the context of leveraging ePROs to provide more patient-centered care. We leveraged action research methodology and a robust community of practice to develop guidelines that support the integration of ePROs into LHS contexts. Three thematic areas (ie, governance, integration, and reporting) reflect the application of ePRO use to LHS goals and reinforce the need for LHS processes to translate across multiple layers of the organization to support continuous learning and practice change.

This work echoes prior work by Harrison and Shortell (2020) that describes the functions and inner workings of the LHS, emphasizing the importance of a system-level approach that facilitates “deep learning”1 across layers of the organization and ensures that innovations such as ePROs align with system incentives, leadership goals, and available infrastructure to support data and knowledge management.1, 5 For example, one of our guidelines related to the theme of governance recommends identifying an IT strategy for ePRO use that considers the technology used, the approach to electronic health record (EHR) integration, and the management of IT resources and standards. Achieving this guideline will require input and influence from all organizational layers, and the output of this guideline (ie, the IT strategy) will vary significantly from one health system to another. Yet those differences do not preclude health systems from achieving the guideline objective, which in turn provides a framework for cross-site learning and evaluation. Our experience highlighted continued challenges in understanding how to disseminate learnings through an LHS effectively. In the spirit of fostering LHS principles, it will be essential to continue to evolve our guidelines in response to their use in practice. Yet there is little guidance on how to establish feedback loops that facilitate effective continuous learning across layers of the organization, as well as beyond a single healthcare setting.

In addition to the tangible guidelines and recommendations that emerged from this work, several cross-cutting learnings can continue to inform how LHSs integrate ePROs and other forms of patient-generated data. ePROs are complex interventions whose implementation impacts multiple layers of a healthcare organization. From a technical standpoint, ePROs represent much more than the addition of a few clinical variables to the EHR; ePROs require dynamic technical capabilities for data flow and data representation. Workflows for ePRO data capture can involve complex and often hybrid combinations of EHR-integrated and external ePRO platforms, as well as electronic and paper-based tools. Approaches to visualizing ePRO data may need to consider the visual requirements of diverse users, the nuanced parameters for ePRO score interpretation, and the potential need for normative or population health displays of ePRO data in context with other clinical variables. From a social perspective, effective ePRO data collection and use rely on the engagement of multiple user groups, including patients, clinical staff, providers, and administrators or leadership. Consequently, while LHSs are well-poised to advance the data-knowledge-practice continuum, LHSs may need to expand their focus on the complex relationship between implementation strategies, processes, and outcomes that influence the quality and adoption of ePRO data in practice.25-27

A core feature of an LHS is its focus on evaluating the intersection between data, technology, and how care delivery evolves. From this perspective, evaluation is not just about meeting performance goals; it is about reflexive learning from experience across levels of organizations (individual, interpersonal, team/unit, department) to drive improvements that enhance system performance across multiple dimensions.1, 5 There is increasing recognition that understanding the impact of innovative tools in practice needs to move beyond positivist approaches to research.23 A goal of our work was to develop guidelines that were durable, practice-based, transferable, and generalizable across diverse settings. The process of developing guidelines required more robust learning and evaluation models to maintain a thoughtful balance between specificity and adaptability. Our experience demonstrated that action research methodology offers a few key benefits that make it particularly well-suited for evaluating complex health innovations such as ePROs in practice.12 First, action research emphasizes the critical role of stakeholder engagement throughout the research process. Second, action research is marked by the iterative cycles of evaluating and triangulating learnings in practice, which help translate learnings from multiple single experiences into generalizable learnings.28 And third, action research's constant focus on reflection and reflexivity offers opportunities to engage health system stakeholders in more thoughtful approaches to knowledge generation than traditional healthcare operations often afford.12 Though the use of action research methods was a strength, it also presented some challenges; in particular, the challenge of recognizing when learnings had reached a point of saturation, such that they could be summarized and shared. In our experience, engaging a community of practice that involved stakeholders beyond our practice site (eg, stakeholders involved in ePRO use at other healthcare settings) played a pivotal role in informing decisions around how inquiry activities were designed, conducted, and concluded. When we were able to achieve consensus between our local and national ePRO stakeholders, this confirmed that saturation had been reached; however, when community of practice stakeholders continued to bring up experiential examples that did not align with our learnings, this signified the need to continue the inquiry and the action research process.

There are a few limitations that are important to acknowledge in this work. First, our focus was on generating learnings that could inform health systems looking to be responsive to external pressures and policy changes related to patient data, considering the perspectives of providers, administrators, and other stakeholders working to provide clinical care. While we consulted with patients as part of both the local and national community of practice (including members of the Patient and Family Advisory Council, and patient advisors who served on operational committees within the health system), we did not directly involve patients on the research team. During the conduct of this work, we recognized this gap and involved patients in our local planning and development efforts for ePROs and, more broadly, patient-generated health data.29 As the use of technology for involving and engaging patients in healthcare delivery advances, more work is needed to focus on the methods, processes, and results of involving patients in the co-design of system-wide approaches. Second, it is essential to acknowledge that technology, such as the tools used to facilitate ePROs, is continually evolving, and in parallel, so are stakeholder interests and uses of those tools in practice. The learnings presented from this work reflect the experiences of our community of practice at this point in time, but are subject to continued change and evolution. Finally, the learnings presented reflect our experiences within a U.S.-based healthcare system; while the guidelines produced from this work are intended to be agnostic to healthcare setting, there are likely important differences to acknowledge around the use of ePROs in other contexts outside the U.S.
