The results of the qualitative analysis explored in detail how implementation teams worked through the toolkit to undertake tailored implementation in relation to (1) developing goals and the focus of their projects; (2) matching strategies to determinants; (3) executing the projects they designed; and (4) reflecting on the impact of their strategies. We also present data on (5) how the context of the effectiveness trial may have shaped the conduct of tailored implementation work in the ImpleMentAll study.
Project goals and focus

Sites varied in how they approached setting goals and determining the focus of their projects. Within sites that developed multiple projects, some projects aimed to address different goals and barriers, others to address a common barrier across projects but with different strategies, and others to undertake similar projects but with different stakeholder groups. This first step was not always straightforward. Agreeing on goals and a focus for implementation projects was reported to ‘take time’:
…it took quite some time to (…) find agreement on this, because like the ICT guys, they want to improve the usability of the system, but yeah, but for us it’s we just want patients to use the system. So it was a bit of a discussion, so it took us, I think, two meetings. [Site 13]
Developing goals and a focus for projects to be developed using ItFits-toolkit took negotiation. The project-related goals generally aligned with the trial’s primary aim of increasing service uptake. The concept of ‘uptake’ however involves consideration of different points along a trajectory from initial awareness to completing therapy programmes. Sites’ implementation projects targeted different points in this process – for example, some were targeting initial referral and engagement; others were working on increasing the conversion rate from expression of interest to commencing therapy.
For some sites, facilitating staff engagement with the iCBT service was seen as an essential first step to achieving increased service uptake. For example, one site approached this by working on the usability of the iCBT platform and technical integration, alongside assessing local needs from a clinician perspective in order to develop more targeted strategies. In contrast, another site worked on increasing the proportion of ‘treatable’ patients coming through the registration process. Such an approach may not align directly with outcomes related to increasing service uptake, yet could potentially have a positive impact on therapy completion.
Choices made about goals should be understood with reference to the ways in which ItFits-toolkit is designed: to get implementation teams to ‘focus on one thing at a time’, while also engaging in stakeholder discussions and designing projects collectively. In one site, the team were initially interested in (and planning to undertake) a ‘project on reach’. After an initial core team meeting, they decided to focus on patient suitability, as this would align better with the preferences of the therapists involved in the service. After that project ended, they set up a new project on reach. Here, there seemed to be a tension, and a balance to be achieved, between staying within the focus of the trial (increasing uptake) and working on problems that are central to certain stakeholders. The toolkit’s requirement to choose one goal to focus on (P2 ‘be focused’) was also seen as a challenge by some. One team noted that:
We have to choose at the beginning three goals, and for each goal three barriers, and at the end we have to choose only one…and we wanted to keep all of them, so at the end finally what we are going to do with in any case we will work on the goals we had. It’s just that finally we have one large goal [Site 3]
This team chose the highest-level goal, and then developed a multi-faceted strategy to represent the different goals initially established in Module 1. The teams, at times, found ‘work-around’ solutions for navigating the needs of the IMA trial and the ItFits-toolkit process.
Matching barriers and strategies

Working to engage with a range of stakeholders to generate and verify ideas was considered a key driver of the matching process. Most of the implementers described how the initial discussions and engagement with stakeholders resulted in a broad range of barriers. These discussions could also be used to check whether teams were addressing problems that stakeholders considered relevant:
And then we wanted to double check if we’re going into the right direction, so I created this very small survey and sent it out to the [therapists] to collect their opinions if this is really the important next step or if we are on the […] if we’re moving to the wrong direction. And actually, it fits really well […] Yeah, so our goals were not changed through the survey results [Site 7]
Working with stakeholders in this way supported implementers in the process of narrowing down their focus, translating their broad ideas into more specific solutions. Engaging with stakeholders also increased confidence in the implementation activities being designed.
They also described how various steps and principles embedded in the toolkit helped them match one barrier to a specific strategy.
I think the toolkit has been very helpful in limiting us, like really narrowing it down. We had lots of ideas in the beginning […]. But when the […] barriers are chosen [it was then] that we really…we really saw that it’s important to be concrete and (..) yeah, I think it helps narrowing us down in our way towards the strategy so that we didn’t just pick, pick and choose everything that we could think of. [Site 2]
This narrowing process was perceived as useful, in part because it was different from implementers’ normal way of working. Implementers liked that the toolkit provided them with both the content – repositories of barriers and strategies – and the technical instructions for carrying out the matching work. In one site, the list of strategies in the toolkit helped them to ‘think outside the box’ (P3 ‘be different’) and to consider ‘approaches that I think we would not have usually come up with’ [Site 7]. The pre-specified lists of barriers and strategies for selection were generally considered sufficient and ‘workable’. Some sites did, however, need to use the toolkit’s option to customise and add their own barriers or strategies relevant to their problem of focus (listed in Table 2).
At another site, a strategy to address incoming legal changes needed faster action and progress than the toolkit process would normally suggest, so the iterative cycles of stakeholder engagement were not undertaken.
Planning and executing implementation projects

Sites acknowledged that developing strategies and executing projects takes time. At some sites, it was the engagement work with stakeholders on the ground that was labour- and time-intensive. At others, the project work itself, like adding new content to an existing iCBT platform, ‘took quite some time’. Unexpected challenges sometimes emerged. Despite early involvement of relevant stakeholders, one site discovered that the production and dissemination of the new promotional materials they had developed required additional organisational approval.
[We] were not aware that we need an official permission. So, we thought we only have to show it to inform, to present it, but not that it is required as official permission. [Site 10]
Gaining that official permission took an extended period of time. At other sites, problems identified at a later stage through working with stakeholders could be more easily accommodated. For example, one solution that had early direction and backing from stakeholders turned out not to align with their needs.
So it was kind of crazy because you were in contact so much with each other “and yes, this is what we need” but then in the end it seemed like the process was just a little different and it was not exactly what they need […]. [Site 6]
The team ended up shifting their focus. The ItFits-toolkit process was designed to allow for, and to encourage, the adaptation of implementation projects (P6 ‘be flexible’) as they progressed. Some sites also achieved this either by running multiple projects in parallel within the toolkit, or by finishing projects and initiating new ones with a different focus.
The sites’ implementation projects were also influenced by a range of situational factors outside their direct control. These included earthquakes, the coronavirus pandemic, and extensive bushfires that caused major disruption to regular service provision at some sites, as well as routine ‘holiday periods’ (summer, festive holidays, school holidays) in which delays in progressing project plans were anticipated or experienced. The coronavirus pandemic directly influenced many projects. Some impacts were potentially positive, as iCBT services at some sites gained more relevance to target participants during periods of (face-to-face) contact restriction. However, sites that relied on referral processes taking place in service settings (such as general practices) were negatively impacted. The pandemic also prompted a re-prioritisation of the work of those involved in providing care and services. The implementation teams recognised that some of their intended plans and strategies might not be possible to progress during that period. The toolkit allowed teams to adapt and re-prioritise what they focused on, which is what some teams did.
Finally, changes within the implementation teams themselves and their organisations became a limiting factor in some sites. Here, issues relating to staff capacity, staff turnover and changes in management impacted on decisions about the number of projects that teams undertook or delayed or halted specific projects altogether. For example, a site reported dropping one of their (three) planned toolkit projects as a direct result of a team member going on long term leave; in another, an important ‘problem area’ identified by the implementation team was seen as unlikely to be progressed under an incoming manager at the site.
Reflecting on impact

In Module 4, the implementation teams were prompted to make a self-assessment of the impact of the strategies they had implemented. Some teams could see immediate impacts of their work. At Site 3, it was observed that the ‘nicely made’ guides and resources they had developed for practitioners in the iCBT service were well received, and that the practitioners were ‘motivated’ as a result. In another site, face-to-face training was thought to have increased the number of general practitioners referring patients to their service. However, sites generally expressed that within the six-month exposure period there ‘hasn’t been enough time’ to observe any impacts from their toolkit projects in terms of the ultimate goal of increased service uptake.
Expectations of impact on referral or uptake rates were at times tempered by a sense of modesty about the scope of the implementation projects that were possible. One respondent explained that ‘I’m not that optimistic […] I think you need a bigger [awareness raising] campaign’ (Site 10). Due to a range of external constraints, including partner organisations’ policies, the project did not have the scope originally intended; instead it was kept ‘modest’. Discussion in another site centred on two projects being developed with the toolkit, both focused on increasing referrals. One targeted psychologists, the other general practitioners and nurses:
I am not sure yet to say how much [impact we] have, for example, in increasing number of referrals, but I think that in overall it was good for us because gives us an impression of what we might do better, what we might change in our activities, implementation as usual. [Site 12]
So, despite such positive assessments—that both these projects enabled a ‘learning process’ and a focus on ‘what they might do better’—an increase in referrals and uptake could not be assumed.
For several sites, a lack of access to data limited their ability to make confident evaluations and to assess impact as instructed in the toolkit. Some found it difficult to gain access to service-level data for assessing progress against metrics like conversion rates (from screening/referral), service uptake, and programme completion. For example, one site received the numbers of patients registered to use the iCBT platform directly and had basic information about those patients. However, they could not access the numbers of patients engaged in therapy or information about how many sessions they completed, as ‘it was very difficult to get this kind of information by our therapists’ (Site 9). As such, they could not ‘confirm’ the anecdotal information they were gaining. Generally, sites used a multi-faceted approach to evaluating the impact of their implementation work. Alongside accessing service data, many sites undertook surveys (a functionality of the toolkit), discussions with informants, or even more formal ‘qualitative’ research.
Tailored implementation work in the context of the IMA trial

The implementation work undertaken with ItFits-toolkit was also shaped by the very reason for initially engaging with this process: that the work was taking place within a trial. For example, some mechanisms were built into the toolkit, in part, to enable better data collection from the toolkit within the trial. These included ‘movement’ restrictions (being unable to move back and forth through modules and change ideas and responses) and progression restrictions (separating the work into ‘one step at a time’). This was not ideal for some:
[F]or example, we have two goals, which I think maybe they need to be modified or we can edit a little bit, not change them but just edit them, and I don’t think I can go back, for example, to goals and then review the barriers and then go further. [Site 9]
People at several sites expressed a desire to ‘read ahead’ through the toolkit, to anticipate upcoming work and to allow more efficient planning with the implementation team. Although this was possible to an extent, and was advised during training, some either did not notice this or found it limited. For some, the separation of Modules 1 (about barriers) and 2 (about strategies), and the inability to move back and forth between them and adjust them, was counter-intuitive. However, several teams undertook multiple projects in the toolkit in parallel, so there was scope to create ‘work-arounds’ where teams wanted to work on different strategies at the same time. Notably, the six-month period in which sites were asked to complete the full toolkit programme through to the end of Module 4 was reported as challenging. Such a timeline was seen, by some, as ‘too short’ to collaboratively develop and introduce new strategies, let alone see and evaluate any impact from the strategies they implemented.
Although for some the trial timeline and allocation of resources for implementation work was seen as ‘motivating’, the pace of working through the toolkit modules was nonetheless slowed. Some teams reflected that the trial placed research-driven constraints on the way they worked:
If it was an effectiveness trial, we wouldn’t change things [about the service]. [...] if we were trying to implement the service then […] it would be a different process again, I think. We would be trying to (.) constantly and rapidly optimise how we do the implementation. (.) But I think we want to do it systematically and carefully in the context of the trial because we want to know, we want to document what we’re doing to see what effect that is having rather than just trying to throw everything at it and not research them as systematically. [Site 5]
In this way, the broader trial context, alongside the ideas embedded in the toolkit (P1 ‘be pragmatic’; P2 ‘be focused’), changed what teams were doing with regard to iCBT service implementation. However, this raises questions about the sustainability of this slower, more methodical approach. As one Implementation Lead noted,
So, if you are part of a trial and you know “OK now you are under this intervention phase”, you want to use [ItFits-toolkit], and we are using it. And we were very engaged in this and motivated and so on, but it was not like “oh now we have in our company a new ItFits-toolkit, it’s part of our processes” […] so it was not the perception that it is a normal workflow in our company. [Site 10]
This was clearly ‘different’ to everyday implementation work. Outwith the motivating context of the trial, at this site, ItFits-toolkit was yet to be seen as routine, everyday, normal implementation work.