Methodology for evaluation of complex school-based health promotion interventions

Description of Project Spraoi

Project Spraoi, an Irish, whole-school, multi-component primary school intervention, aims to deliver an extra 20 min of daily PA to students during the school day and to improve students’ nutritional knowledge and behaviours [7]. We assessed the primary outcome measures of the intervention (body mass index (BMI) standard deviation score (BMI z-score), waist circumference (cm), 550 m walk/run time (s) and PA levels) only in children who at induction were in ‘senior infants’ (6–7 years of age) and ‘fourth class’ (9–10 years). A parent or guardian granted permission for each child to participate in the measurement element of the study. All children in the intervention schools participated in the intervention activities, because the lead author delivered them to all students.
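
BMI z-scores standardise each child’s BMI against age- and sex-specific reference data. As a hedged illustration only (the specific reference growth charts used by Project Spraoi are not restated here), one widely used standardisation is the LMS method:

$$ z = \frac{\left(\mathrm{BMI}/M\right)^{L} - 1}{L\,S} \quad (L \neq 0), \qquad z = \frac{\ln\left(\mathrm{BMI}/M\right)}{S} \quad (L = 0), $$

where L, M and S are the reference skewness, median and coefficient of variation for the child’s age and sex.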

The intervention ran throughout the school year (September 2015 to June 2016), with the Energizer visiting the school for a maximum of 2 days per week. An overview of the process evaluation methodology we developed for Project Spraoi is presented in Fig. 1.

Fig. 1 Process evaluation methodology of Project Spraoi

We measured outcomes among the same senior infant and fourth class cohorts of children at the beginning and end of the academic school year. Process evaluation ran concurrently with intervention delivery, with data collected at multiple timepoints throughout the school year (Table 1), using a variety of process evaluation data collection tools (Table 2).

Table 1 Process evaluation data collection timeline
Table 2 Process evaluation data collection tools

Evaluation process

As the intervention evolved, so too did the scope of the process evaluation. We developed and refined methods for the process evaluation over three phases from 2013 to 2016 (Fig. 2).

Fig. 2 Progression of process evaluation of Project Spraoi

The process evaluation of this study alone generated close to 2,000 data sheets from interviews (n = 40), PA logs (n = 630), reflective journals (n = 69), write and draw tasks (n = 585) and questionnaires (n = 391).

Following phase one of Project Spraoi (2013–2014), an impact evaluation of the intervention revealed discrepancies between expected and observed outcomes (unpublished observations). An initial attempt at process evaluation, undertaken on a post hoc basis during the first phase, did not yield sufficient data to provide a valid explanation for the unexpected results observed (substantial improvements in the control cohort) (unpublished observations by the Energizer/evaluator). Our inability to interpret the phase one findings highlighted the need to implement and report a robust methodology to document the process by which the Project Spraoi intervention achieved the observed effects and to distinguish the relative contribution of each intervention component to overall outcomes, in context.

PENZ is a health promotion programme from New Zealand in which a team of 26 ‘Energizers’ works with local schools and communities to increase children’s PA, improve nutrition, and enhance overall health [8]. Because no process evaluation data were available from PENZ, the authors, in consultation with the Project Spraoi research team, conducted a preliminary study during phase two, the second year of implementation (2014–2015), in site A, a rural, mixed-gender school. This preliminary study was intended to inform the final design of the process evaluation methods and data collection tools. Following the phase two study, we expanded the refined methodology for process evaluation of Project Spraoi to include all intervention and control schools (n = 7) for the third year of implementation (2015–2016; phase three) (Table 3).

Table 3 Summary of data collected by questionnaires relative to process evaluation

Before beginning the process evaluation of Project Spraoi, the research team first needed to define the intervention, its activities and its theory of change, and to distinguish both how we expected the effects of each specific intervention activity to occur and how these effects might be replicated by similar future interventions [24]. As already stated, Project Spraoi is based on a proven methodology, PENZ; as such, the theories of change on which it rests were set out by its NZ counterpart.

The process evaluation of Project Spraoi employed the three themes described by the British MRC guidance for process evaluation: context, implementation and mechanisms of impact [19, 23]. These themes are described in more detail in Supplementary Material Part 1. Each theme was further subcategorised into three key dimensions for evaluation by the authors: reach, fidelity and dose, which are explained in more detail below.

The HEALTHY study [25] was a large-scale, multi-component school-based intervention targeting both diet and PA; its evaluation combined observations of intervention sessions, interviews and focus groups (with school staff and children), alongside teacher feedback forms on class behaviour. Such a diversity of methods allowed triangulation of data from different sources, and the authors of Project Spraoi therefore chose to follow a similar approach. In line with the HEALTHY study, the process evaluation of Project Spraoi omitted the analysis of ‘reach’ [25]. ‘Reach’ evaluates how widely an intervention is adopted within the intended population. This dimension was fixed throughout the course of the intervention because the programme was delivered under controlled conditions to all classes in intervention schools each year; that is, students could not ‘opt out’ of the intervention, as it applied to all students in the school. Therefore, implementation of Project Spraoi was analysed by the authors using two (fidelity and dose) of the three dimensions (fidelity, dose and reach) described by Linnan and Steckler [26]. Fidelity refers to the degree to which an intervention is implemented as planned or intended; it ensures that the programme is delivered consistently, in line with its original design and objectives [27]. Dose refers to the amount of an intervention delivered or received during an intervention [27].

We categorised adaptations (intentional modifications made to the Project Spraoi intervention when it was delivered) under three headings, ‘innovation’, ‘drift’ and ‘subversion’ [27], in line with the FRAME structure [28], which supports research on the timing, nature, goals, and impact of adaptations to evidence-based interventions. In the Project Spraoi context, innovation relates to modifications to the implementation of the PA and healthy eating components that were intended to lead to positive change or improvement. Drift refers to unintended deviations in school delivery from the intended design of the intervention. Finally, subversion refers to deliberate efforts to undermine or alter the intended purpose of an intervention or policy; it may involve resistance, intentional modifications, or circumvention of established procedures.

Fidelity (whether Project Spraoi was delivered as planned) was evaluated by the authors using questionnaires and interviews with intervention implementers (Energizers and teachers) and participants (students) (see Table 2). Student interviews were conducted one week after completion of the write and draw task, and questions were grouped under three headings: (i) write and draw, (ii) intervention activities and interactions with Project Spraoi, and (iii) the school environment. Questionnaire items were presented as a set of statements with which respondents were asked to indicate their level of agreement on a 5-point Likert scale. Open-ended questions were used to gather information about respondents’ interactions with Project Spraoi, to prompt any unanticipated information, and to suggest innovations to maximise intervention delivery and support at each site.

Dose was evaluated by the authors using PA logs and an Energizer reflective journal to quantify the total minutes of extra daily PA and the number of healthy eating lessons delivered by implementers of the Project Spraoi intervention. Teachers indicated, by ticking a box, the time spent (5, 10, 15 or 20 min) and the type of activity (Huff and Puff, learning games, activity breaks or other) delivered to their class each day during a given week. Teachers also indicated on which day, and for how long, physical education (PE) lessons were delivered each week by ticking the appropriate box and writing in the minutes of PE delivered. If teachers did not deliver any activity on a given day, they indicated the reason in a comments box.
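
As an illustration only (hypothetical field names and records, not the project’s actual analysis code), the following sketch shows how weekly tick-box PA log entries of this kind can be aggregated into total minutes of extra PA delivered per class:

```python
from collections import defaultdict

# Hypothetical PA log entries: one record per class per day, mirroring
# the tick-box options described above (minutes delivered, activity type,
# and an optional comment when no activity took place).
pa_log = [
    {"class": "Senior Infants", "day": "Mon", "activity": "Huff and Puff",  "minutes": 20},
    {"class": "Senior Infants", "day": "Tue", "activity": "Activity break", "minutes": 10},
    {"class": "Fourth Class",   "day": "Mon", "activity": "Learning game",  "minutes": 15},
    {"class": "Fourth Class",   "day": "Wed", "activity": None,             "minutes": 0,
     "comment": "School closed for staff training"},
]

def weekly_dose(entries):
    """Sum the extra PA minutes delivered to each class in a given week."""
    totals = defaultdict(int)
    for entry in entries:
        totals[entry["class"]] += entry["minutes"]
    return dict(totals)

print(weekly_dose(pa_log))  # e.g. {'Senior Infants': 30, 'Fourth Class': 15}
```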

The Energizer reflective journal included structured questions to quantify the dose delivered to and received by participants. It also documented which activities were delivered that week and how participants interacted with them, using an enjoyment scale. Open-ended questions that allowed Energizers to reflect openly and honestly on their daily experiences as the interventionist (“What went well?”/“What could be improved?”) were also included.

To evaluate the influence of context on the delivery of Project Spraoi, we focused on two evaluation dimensions: (i) barriers (obstacles or challenges that hindered the successful implementation of the project) and facilitators (factors that enhanced or supported the implementation of Project Spraoi), and (ii) adaptations (intentional modifications made to the intervention). Questionnaires and focus groups with implementers of Project Spraoi (Energizers and teachers) were conducted by the lead author and analysed throughout the course of the intervention year (phase three) to document and track adaptations made to implementation of the intervention across sites. This information enabled Energizers to make mid-course adaptations to intervention delivery in response to individual teachers’ perceived barriers and facilitators. These adaptations were then documented by the lead author using the Energizers’ reflective journals and questionnaires.
