Partnership to drive implementation science and practice

A person who wants to find a solution to a public health problem has a different task than someone who wants to create or test a theory.1 (p. 8)

While those who seek to implement and those who seek to understand implementation might be required to undertake different tasks, those tasks are complementary, and deeper partnership has the potential to extend and advance the field. Others have similarly noted ‘the somewhat paradoxical gap between scientific knowledge concerning implementation and actual real-life implementation’ and the need to close this gap.2 They argue that ‘for the practice of implementation to be furthered, we as researchers have an obligation to contribute to improved utilization and translation of the knowledge produced in the implementation science field’.

Partnership between clinical and academic settings has frequently been cited as beneficial.3 Partnerships within the JBI Collaborating Centres (JBI's global evidence network), in which academics support pragmatic implementation projects, offer a unique opportunity to evaluate utility further by testing, refining, and integrating existing theories and frameworks into programmatic research and guideline development undertaken with implementation collaboratives. This approach is informed by Wensing and Grol's call for implementation science to contribute more to knowledge translation in health.4

JBI's approach to the practice of implementation is centred on audit and feedback: a powerful but complex team-based approach to practice change, with benefits for quality improvement, systems, patients, and practitioners through its emphasis on structures, processes, and outcomes. Each audit and feedback project requires substantive depth of expertise and draws on a diversity of professions across clinical and academic groups collaborating to improve system-level issues affecting care delivery and patient outcomes. In contrast with routine health service audits that focus on standardized reporting data, the JBI approach relies on a pretest/posttest model with audit criteria derived from high-quality evidence (a minimal illustrative sketch of this model follows the list below). The evidence informing these audit criteria is pluralistic, drawn from diverse research traditions (FAME), including:

(1) Feasibility: the extent to which an activity or intervention is practical or viable in a context or situation, including cost-effectiveness.
(2) Appropriateness: the extent to which an intervention or activity fits within a context or situation.
(3) Meaningfulness: the way an intervention or activity is experienced by an individual or group, and the meanings they ascribe to that experience.
(4) Effectiveness: the extent to which an intervention achieves the intended result or outcome.5
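To make the pretest/posttest audit model concrete, the following minimal sketch tags each audit criterion with the FAME evidence type underpinning it and compares baseline and follow-up compliance rates. It is our illustration only: the criteria, counts, and labels are hypothetical and do not represent JBI audit software or any actual audit data.

```python
from dataclasses import dataclass

# FAME evidence types that may underpin an audit criterion (illustrative labels).
FAME = ("feasibility", "appropriateness", "meaningfulness", "effectiveness")

@dataclass
class Criterion:
    """An evidence-based audit criterion, tagged with its FAME evidence type."""
    text: str
    evidence_type: str  # one of FAME

def compliance_rate(compliant: int, audited: int) -> float:
    """Percentage of audited cases meeting the criterion."""
    return 100.0 * compliant / audited if audited else 0.0

# Hypothetical criteria for a falls-prevention audit (illustrative only).
criteria = [
    Criterion("Falls risk assessed within 24 h of admission", "effectiveness"),
    Criterion("Prevention strategies discussed with the patient", "meaningfulness"),
]

# Pretest (baseline) and posttest (follow-up) counts: (compliant, audited).
baseline = {0: (12, 40), 1: (8, 40)}
follow_up = {0: (33, 40), 1: (28, 40)}

for i, c in enumerate(criteria):
    pre = compliance_rate(*baseline[i])
    post = compliance_rate(*follow_up[i])
    print(f"[{c.evidence_type}] {c.text}: {pre:.0f}% -> {post:.0f}%")
```

The design point is simply that each criterion carries its evidence type, so a feedback report can show not only where compliance changed but which kind of evidence (feasibility, appropriateness, meaningfulness, or effectiveness) the practice change rests on.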

The integration of FAME evidence types within evidence-based healthcare (EBHC) presents a unique opportunity for research and guideline groups to develop implementation and evaluation approaches supported by pluralistic conceptualizations of evidence and its relevance to research and guideline planning. This is already practised extensively in evidence synthesis (i.e. ensuring methodologies are available to systematically review diverse types of evidence across paradigms). However, there is opportunity to extend the use of FAME to implementation science, to complement and inform pragmatic implementation efforts.

Audit and feedback is a clinical prerogative. The absence of complementary, integrated primary research exploring or describing the feasibility, appropriateness, meaningfulness, and effectiveness of the clinical problem or topic, and specifically of methods that complement clinician-led audit and feedback, is a missed opportunity. JBI and the JBI Collaboration (JBIC) should consider how to address this gap. As Cameron6 wrote in 1963, 'not everything that can be counted counts, and not everything that counts can be counted'; research methods for the study of feasibility, appropriateness, and meaningfulness in particular should therefore be considered in parallel with implementation. The framework is in place: the JBI Model defines EBHC, and the JBI Collaboration operationalizes it. These academic centres of excellence in synthesis and implementation science achieve this by working in partnership with local health services, using evidence from high-quality, reliable JBI reviews to inform practice change.

We propose that clinician-led audits of the structures and processes of care could integrate research methods and guideline recommendations that add value to the academic contribution to clinical partnerships. This could be realized by combining the individual research strengths of academic staff with the clinical leadership of practising health professionals. This work is already underway but is under-reported.

Lead researchers at the Finnish Centre for Evidence-Based Healthcare at the Nursing Research Foundation in Helsinki (https://www.hotus.fi/jbi-cc/) use FAME as a comprehensive framework for determining the feasibility, appropriateness, meaningfulness, and effectiveness dimensions of the need for a clinical practice guideline. The operational definitions of Feasibility and Appropriateness guide consensus processes that evaluate the proposed practices and methods; the prevalence of the phenomenon or health problem in the Finnish population; knowledge gaps or inconsistencies in knowledge; and the load or burden on the system, professionals, or the economy. Meaningfulness guides analysis of relevance to quality of life, while Effectiveness is the lens for questions about the quantitative benefits versus the risks of health problems: whether risks and harms can be reduced, whether health promotion interventions have the potential to prevent health problems, and whether other societal impacts are associated with the recommended nursing methods or practices.
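A FAME-structured screening process of this kind can be pictured as a simple consensus rubric. The sketch below is entirely our illustration, assuming yes/no consensus answers grouped by FAME domain; the question wording and scoring are hypothetical and are not the Hotus instrument.

```python
# Hypothetical FAME-based screening rubric for guideline need
# (illustrative only; not the Hotus consensus instrument).
screening_questions = {
    "feasibility": [
        "Are the proposed practices and methods viable in this setting?",
        "Is the burden on the system, professionals, or economy acceptable?",
    ],
    "appropriateness": [
        "Is the health problem prevalent in the target population?",
        "Are there gaps or inconsistencies in current knowledge?",
    ],
    "meaningfulness": [
        "Is the topic relevant to patients' quality of life?",
    ],
    "effectiveness": [
        "Do quantitative benefits outweigh risks?",
        "Can risks and harms be reduced, or health problems prevented?",
    ],
}

def domains_supporting_need(answers: dict[str, list[bool]]) -> list[str]:
    """Return the FAME domains where every consensus answer was 'yes'."""
    return [domain for domain, votes in answers.items() if votes and all(votes)]

# Example consensus answers, one bool per question above (hypothetical data).
answers = {
    "feasibility": [True, True],
    "appropriateness": [True, False],
    "meaningfulness": [True],
    "effectiveness": [True, True],
}
print(domains_supporting_need(answers))
# ['feasibility', 'meaningfulness', 'effectiveness']
```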

It is important to establish what principles should guide research planning associated with clinician-led implementation. The guiding principles should be to select methods that align with academic staff research expertise and capacity; equally, research must be ethical and must not spuriously investigate questions that add no value to an implementation site. FAME defines key parameters for types of evidence to ensure relevance and consistency across the JBIC, and researchers should consider where and how their inquiry might be of maximum benefit to the implementation site, thereby contributing to a growing body of knowledge within the parameters of evidence of feasibility, appropriateness, meaningfulness, and effectiveness.

The JBI Implementation Framework's seven phases across preimplementation, implementation, and evaluation of uptake and sustainability each represent a significant opportunity for academic-led research that complements and strengthens knowledge of clinician-led implementation. Finally, research should primarily aim to build a stronger basis for implementation within the clinical context, and it should align with professional expectations while contributing to knowledge.

The JBI Evidence Implementation Training Program presents a unique opportunity for research by treating audit and feedback not only as a vehicle for practice change but also as a means of accumulating knowledge on the benefits of implementation within the FAME criteria, for evidence generation and transfer through research and guideline group planning and activity.

Acknowledgements

The work which underpinned this submission was unfunded. The authors have a professional and scholarly interest in implementation science, and conceived of the idea, undertook the methods, and wrote the article without any direct or indirect financial support.

We, the listed authors for this article, have agreed to the version which is being submitted and confirm that no part of this work has previously been accepted for publication elsewhere. In accordance with the guidelines for authorship, we declare that all named authors have appropriately met the criteria for authorship.

Conflicts of interest

The authors have no conflicts of interest to declare.

References

1. Eldredge LKB, Markham CM, Ruiter RAC, Fernández ME, Kok G, Parcel GS. Planning health promotion programs: an intervention mapping approach. 4th ed. San Francisco, CA: Jossey-Bass; 2016.
2. Westerlund A, Sundberg L, Nilsen P. Implementation of implementation science knowledge: the research-practice gap paradox. Worldviews Evid Based Nurs 2019; 16:332–334.
3. Robinson JS, Turnbull DA. Changing healthcare organisations to change clinical performance. Med J Aust 2004; 180(S6):S61–S62.
4. Wensing M, Grol R. Knowledge translation in health: how implementation science could contribute more. BMC Med 2019; 17:88.
5. Jordan Z, Lockwood C, Munn Z, Aromataris E. The updated Joanna Briggs Institute Model of Evidence-Based Healthcare. Int J Evid Based Healthc 2019; 17:58–71.
6. Cameron W. Informal sociology: a casual introduction to sociological thinking. New York: Random House; 1963. p. 170.
