Leveraging academic initiatives to advance implementation practice: a scoping review of capacity building interventions

Our initial search strategy yielded 1349 studies, which were entered into Covidence for screening. After title/abstract and full-text screening, a total of 14 studies were included in our final analysis (Fig. 2). Reviewers achieved a Cohen’s kappa of 0.49 during the review process, indicating a moderate level of agreement. Common reasons for exclusion were lack of intervention outcome measurement, description of an intervention not unique to implementation practice capacity, and failure to clearly describe a specific capacity building intervention. The majority of included studies were descriptive in nature; none used experimental or quasi-experimental designs to evaluate the efficacy of the capacity building interventions.

Fig. 2 PRISMA flow diagram of study selection process

Targets of capacity building interventions

Researchers, practitioners, and practitioners-in-training (e.g., professional program graduate students) were all targets of implementation practice capacity building interventions. Eight interventions targeted researchers, practitioners, and practitioners-in-training simultaneously [23,24,25,26,27,28,29,30], three targeted practitioners only [31,32,33], and the remaining three targeted practitioners-in-training in the fields of social work [34], public health [35], and nursing [36].

Researchers, practitioners, and practitioners-in-training interventions

Studies that targeted researchers, practitioners, and practitioners-in-training were conducted in the USA [25,26,27], Canada [28, 30], South Africa [29], and Sweden [23], and jointly across the USA, Mexico, and India [24]. Each intervention consisted of a combination of core components, the most common being didactic activities, mentorship and expert consultations, knowledge sharing activities, and practical application activities. For example, the University of Kentucky’s Value of Innovation to Implementation Program (VI2P) sought to build implementation capacity across its health system and six health professional colleges [25]. The VI2P was also designed to foster a local learning collaborative to facilitate implementation knowledge sharing, similar to the local implementation learning communities developed by the Colorado Research in Implementation Science Program (CRISP [26]) (Table 3).

Table 3 Characteristics of capacity building interventions for a combination of researchers, practitioners, and/or practitioners-in-training

Practitioners

Three capacity building interventions targeted practitioners, comprising healthcare providers [31], clinical leaders and managers in behavioral health settings [32], and individuals in community-based service delivery [33]; these were conducted in Canada [31] and the USA [32, 33]. All of these practitioner-only interventions incorporated didactic activities as a core intervention component, with other common components including knowledge sharing activities and practical application exercises. For instance, developers of the Training in Implementation Practice Leadership (TRIPLE) intervention led activities to build implementation capacity among clinicians and administrators in behavioral health practice settings, using the term “practice leader” to describe intervention targets. Core components of the TRIPLE intervention included the facilitation of knowledge sharing activities, didactic activities, consultations with implementation experts, practical application activities, and the provision of technical assistance [32] (Table 4).

Table 4 Characteristics of capacity building interventions for practitioners only

Practitioners-in-training

A total of three interventions, all delivered in the USA, described graduate-level academic work aimed at engaging students in implementation for professional careers in social work [34], public health [35], and nursing [36]. For instance, nurse practitioner students applied implementation concepts during their doctoral practice to gain experience leading efforts to implement evidence in practice [36]. Similarly, public health students engaged in opportunities to assess implementation determinants, strategies, and outcomes in real-world contexts [35]. Master of social work students developed implementation capacity by examining the delivery of a specific evidence-based practice at an assigned field site over a 16-week field rotation [34] (Table 5).

Table 5 Characteristics of capacity building interventions for practitioners-in-training

Summary of capacity building intervention components

Our analysis yielded five common components of capacity building interventions across studies. These core components included didactic activities, mentorship and expert consultation, practical application activities, knowledge sharing activities, and technical assistance.

Didactic activities

Most often, capacity building interventions consisted of structured didactic activities delivered in-person, online, or in a hybrid format using a combination of lectures, readings, case studies, and self-paced modules. In-person didactic content was delivered in the form of university-level courses [23, 27, 34,35,36] or through workshop events led by implementation science experts [26, 28, 30,31,32,33]. Duration of in-person didactic workshops ranged from 1.5 to 2 days. Hybrid didactic content was highly variable in structure and length. For instance, the PH-LEADER program lasted a total of 1 year, consisting of a 2-month preparation period, a 3-week in-person summer short course, and an in-country mentored project phase. Throughout the program, participants received didactic instruction from expert faculty as well as through recorded webinars [24]. Didactic content to build implementation capacity was also provided in an ongoing manner, such as through the Knowledge Translation Strategic Training Initiative in Canada [30]. The only didactic content delivered entirely online was structured as a four-course series to build implementation capacity through the University of North Carolina at Chapel Hill’s Gillings School of Global Public Health [35].

Mentorship and expert consultation

Eleven of the capacity building interventions included descriptions of formal mentorship and/or consultation provided by implementation experts to researchers, practitioners, or practitioners-in-training. Mentorship and expert consultation were provided to increase participants’ capacity to understand implementation principles and/or lead implementation-focused projects at their respective organizations. Mentorship and consultation activities occurred in settings such as academic institutions [25,26,27, 32, 34,35,36] and specialized institutes [28, 30, 31]. One global model of mentorship consisted of implementation faculty who provided training to field mentors. As part of this model, public health students and practitioners then completed implementation projects related to HIV/AIDS research in South Africa and received routine mentorship from their trained field mentors with a focus on implementation determinants, strategies, and outcomes [29].

Practical application activities

Interventions to build implementation capacity also included the development and deployment of practical application activities. These activities allowed intervention participants to lead their own implementation projects in real-world contexts through pilot projects or evaluations of implementation determinants. Practical application activities were completed through graduate student field placements [34, 36], as well as small-scale implementation projects [24, 28, 30, 32, 35]. In the academic medical center setting, Li et al. [25] facilitated practical application of implementation principles by convening an informal network of implementation researchers, practitioners, students, quality improvement experts, and community stakeholders. Individuals within this network formed implementation teams that submitted grant applications (funded through the University of Kentucky) to complete implementation-related projects by partnering with medical center affiliates. Of the 26 teams that submitted applications, four projects were funded as of 2019, allowing teams to gain practical experience conducting projects informed by implementation methodologies.

Knowledge sharing activities

Knowledge sharing took the form of small group reflections and exercises, expert panel discussions, breakout activity sessions, and the development of learning collaboratives [25, 26, 29, 30, 32, 33, 35, 36]. Among practitioners-in-training, for example, social work graduate students examined the implementation of evidence-based practices over a 16-week field placement and completed weekly field portfolios that described the process of implementing specific evidence-based practices. Weekly cohort seminars allowed students to share content from their portfolios and their experiences implementing evidence within their field placement sites [34]. For practitioners involved in the Practicing Knowledge Translation program, participants were encouraged to engage in small group discussions and share their progress towards completing their implementation and learning goals [31].

Technical assistance

Capacity building interventions also provided technical assistance to facilitate the development of implementation projects and grant proposals. Two studies described using this type of assistance, referred to explicitly as technical assistance or technical support, which included building implementation support networks, identifying appropriate implementation projects to deploy in the community setting, and facilitating implementation training opportunities for participants [32, 33]. Specifically, the TRIPLE program aimed to build implementation capacity among behavioral health organizations and provided technical support to help mid-level organizational leaders implement evidence-based practices at their own agencies. Technical assistance (e.g., developing project activities; planning for outcome measurement) was offered by TRIPLE faculty, who provided coaching on the development of implementation projects, the theories, models, and frameworks that inform implementation, strategies for implementing change, and methods for conducting organizational change evaluations [32].

Outcomes of capacity building interventions

All interventions included in this review described favorable outcomes. Outcomes were categorized into the following groups that are described in further detail below: knowledge attainment, increased perceived ability to implement evidence, productivity, and satisfaction.

Knowledge attainment

Six studies in the review explicitly gathered data on overall knowledge of implementation [23, 26, 28, 31, 34, 37], defined as the understanding and awareness of implementation models, factors, and strategies influencing evidence use. Moore and colleagues [31], for example, conducted a longitudinal study of practitioners’ knowledge and application of implementation principles at intervals over a year; all participants showed a significant increase in knowledge, which Moore et al. associated with improved application of implementation methodologies to local projects. Bertram and colleagues [34] built implementation capacity through graduate coursework and found that the course components led to increased knowledge and understanding of implementation models, factors influencing program implementation, and implementation interventions.

Increased perceived ability to implement evidence

A total of seven studies had outcomes specifically targeting participants’ ability to implement evidence into practice, as measured through self-efficacy, confidence, and competence in evidence implementation [24, 28, 31,32,33,34, 36]. Of these studies, only four measured self-efficacy or confidence at multiple time points [24, 28, 31, 32], whereas the two academic courses targeting social work [34] and nursing [36] students used student feedback in the form of narrative course evaluation comments to assess outcomes. Tools used at baseline and follow-up included the Evidence-Based Practice Confidence Scale [38], a 3-item tool measuring intentions to use evidence [39], the Implementation Leadership Scale [40], the Implementation Climate Scale [
