Protocol for Improving Care by FAster risk-STratification through use of high sensitivity point-of-care troponin in patients presenting with possible acute coronary syndrome in the EmeRgency department (ICare-FASTER): a stepped-wedge cluster randomised quality improvement initiative

Data monitoring

From a data monitoring perspective, there is no requirement for a Data Monitoring Committee. The data on ED presentations that hospitals regularly provide to the MoH undergo routine cleaning. From a safety perspective, monitoring is part of routine care and is described below.

Harms

For the purpose of evaluating this quality improvement initiative, any death within 30 days, from any cause related to ischaemic heart disease and/or the patient’s original presentation, in a patient discharged from the ED by the pathway will be considered a serious adverse event.

Serious adverse events

These will be reported to the principal investigator within 7 days of the event. Reports will include the following:

Grade of event

Date and time of ED visit

Details of ED investigations and biochemistry

Date and time of event

Description of event

The site champion’s assessment of the relatedness (possible, probable, or definite) to the ED visit.
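For illustration only, a serious adverse event report could be captured as a structured record along the lines of the sketch below; the class, field names and reporting-window check are assumptions made for this example and are not specified by the protocol.

```python
from dataclasses import dataclass
from datetime import datetime
from enum import Enum


class Relatedness(Enum):
    """Site champion's assessment of relatedness to the ED visit."""
    POSSIBLE = "possible"
    PROBABLE = "probable"
    DEFINITE = "definite"


@dataclass
class SeriousAdverseEventReport:
    """Hypothetical record mirroring the report items listed above."""
    grade: str                   # grade of event
    ed_visit_datetime: datetime  # date and time of ED visit
    ed_investigations: str       # details of ED investigations and biochemistry
    event_datetime: datetime     # date and time of event
    description: str             # description of event
    relatedness: Relatedness     # possible, probable or definite

    def within_reporting_window(self) -> bool:
        # Reports are due to the principal investigator within 7 days of the event.
        return (datetime.now() - self.event_datetime).days <= 7
```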

A multidisciplinary clinical governance committee will review any adverse events at each site. Following review, events will be acted on according to their severity and in keeping with the standard governance processes of that health institution. Any site may withdraw from the study as a result of such a review if it has safety concerns.

Auditing

An interim data extract for each individual hospital, containing data from the control and run-in periods only, will be taken prior to the end of the initial evaluation period. A simple hospital-level analysis of these data will be undertaken by the statistician (JWP) to inform hospital management (independent of the investigators), who need to decide whether to continue using POC-cTnI after the end of the initiative evaluation period. This analysis will include the uptake of the POC assay and continued use of the laboratory assay.
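For illustration, the hospital-level uptake summary could resemble the sketch below; the data frame, the column names (hospital, assay_type) and the values are hypothetical stand-ins for the interim extract rather than the actual MoH data.

```python
import pandas as pd

# Hypothetical interim extract: one row per troponin test in the control/run-in periods.
# Column names and values are illustrative; the real MoH extract will differ.
tests = pd.DataFrame({
    "hospital": ["A", "A", "A", "B", "B", "B"],
    "period": ["run-in"] * 6,
    "assay_type": ["POC-cTnI", "laboratory", "POC-cTnI",
                   "laboratory", "laboratory", "POC-cTnI"],
})

# Simple hospital-level summary: counts by assay type and the proportion of tests
# performed on the POC assay versus continued use of the laboratory assay.
uptake = (
    tests.groupby(["hospital", "assay_type"])
         .size()
         .unstack(fill_value=0)
)
uptake["poc_uptake_%"] = 100 * uptake["POC-cTnI"] / uptake.sum(axis=1)
print(uptake)
```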

Assessment for pragmatism

Pragmatic studies, as opposed to explanatory studies, are designed to work within real-world clinical situations. They allow for issues encountered in clinical practice, so their results take into account the complexities of routine practice.30 The PRagmatic Explanatory Continuum Indicator Summary (PRECIS-2) tool has been designed to help design pragmatic studies.32 It asks nine domain questions, each scored on a 5-point scale with 5 being the most pragmatic and 1 the most explanatory. We assessed our design on each domain from both the patient and hospital site perspectives, using the guidance provided by PRECIS-2 (figure 3).

Figure 3

PRECIS-2 diagram demonstrating how pragmatic the initiative is from the hospital (open circles) and patient (closed circles) perspective; 5 is very pragmatic, 1 is very explanatory. PRECIS-2, PRagmatic Explanatory Continuum Indicator Summary.
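As an illustration of how the figure 3 assessment could be tabulated, the sketch below lists the nine PRECIS-2 domains with placeholder 1 to 5 ratings for the two perspectives; the numerical scores shown are hypothetical and should be read from figure 3, not from this example.

```python
import pandas as pd

# The nine PRECIS-2 domains (5 = very pragmatic, 1 = very explanatory).
domains = [
    "Eligibility", "Recruitment", "Setting", "Organisation",
    "Flexibility (delivery)", "Flexibility (adherence)",
    "Follow-up", "Primary outcome", "Primary analysis",
]

# Placeholder scores for each perspective; replace with the ratings shown in figure 3.
scores = pd.DataFrame(
    {"hospital": [5, 4, 5, 4, 5, 5, 5, 5, 4],
     "patient": [5, 5, 5, 5, 5, 5, 5, 5, 5]},
    index=domains,
)
print(scores)
```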

Methodology limitations

Without a detailed inspection of the notes of every ED presentation, we are unable to ascertain which presentations were adjudicated for possible ACS and which had a troponin measured for other reasons (eg, myocarditis or pulmonary embolism). While the run-in phase allows for some delay in starting to use the POC device at a hospital, there could still be delays due to IT development, laboratory approvals, nurse training, site-specific requirements or unforeseen local or national events (eg, a pandemic). The size of the study precludes blinded adjudication of myocardial infarction outcomes. However, an audit of ICare-ACS found that ICD-10 codes for MACE corresponded to adjudicated MACE in 98% of cases.1 Mortality outcomes must be adjudicated to identify those with a cardiac cause, but as these will be small in number, this is feasible.

A particular challenge with implementing change and assessing it with a stepped-wedge design is that the outcome may be influenced by time-varying effects.33 Including shift, season and time from the start of the study in the statistical model attempts to account for this. How this time is entered into the model, as a categorical, continuous, fixed or random effect, may affect the estimated outcome. Our design and analysis approach is similar to that of Anand et al, in which hospitals assessing patients with suspected ACS underwent a change in assessment pathway.17 A post-hoc investigation of that study, which compared modelling time from study start as a cubic spline, as a categorical variable and as a random effect, found very little influence compared with using time as a simple linear fixed-effect variable.34 Nevertheless, treating time as a fixed effect assumes a similar linear trend in all hospitals, which may have been the case in the Anand study but may not be the case in this study. Therefore, we will not finalise the model until after comparing several options, using a simulated intervention and LOS data from a period preceding the present study.
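A minimal sketch of the kind of model comparison we have in mind is given below, using mixed models from statsmodels on simulated data standing in for the pre-study LOS extract; the variable names (los, intervention, shift, season, study_time, hospital), the simulated values and the candidate time specifications are assumptions for illustration only, and the final specification will be chosen as described above.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n = 2000

# Simulated stand-in for the pre-study LOS extract; all names and values are illustrative.
df = pd.DataFrame({
    "hospital": rng.integers(0, 6, n).astype(str),        # six single-hospital clusters
    "shift": rng.choice(["day", "evening", "night"], n),
    "season": rng.choice(["summer", "autumn", "winter", "spring"], n),
    "study_time": rng.uniform(0, 24, n),                   # months since study start
    "intervention": rng.integers(0, 2, n),                 # simulated intervention indicator
})
df["los"] = (
    240 - 30 * df["intervention"] + 2 * df["study_time"]
    + rng.normal(0, 60, n)                                 # ED length of stay in minutes
)
df["time_block"] = (df["study_time"] // 6).astype(int)     # 6-month blocks for a categorical spec

# Candidate specifications of calendar time, all with hospital as a random intercept.
formulas = {
    "linear fixed effect": "los ~ intervention + shift + season + study_time",
    "categorical (6-month blocks)": "los ~ intervention + shift + season + C(time_block)",
    "cubic B-spline": "los ~ intervention + shift + season + bs(study_time, df=4)",
}

for label, formula in formulas.items():
    fit = smf.mixedlm(formula, data=df, groups="hospital").fit(reml=False)
    print(f"{label}: intervention effect = {fit.params['intervention']:.1f} min, "
          f"AIC = {fit.aic:.0f}")

# Time as a hospital-specific random slope rather than a single shared trend.
rand = smf.mixedlm(
    "los ~ intervention + shift + season + study_time",
    data=df, groups="hospital", re_formula="~study_time",
).fit(reml=False)
print(f"random slope for time: intervention effect = {rand.params['intervention']:.1f} min")
```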

This study design is known for issues when clusters split, drop out or do not start at the expected time, or when the intervention effect changes over time.35 The run-in period, which allows for gradual adoption (including short delays) and familiarisation with the intervention pathway, will help mitigate the latter two issues. We screened all New Zealand hospitals and determined that we could include up to nine hospitals, in five clusters of up to two hospitals per cluster. At the time of writing, six clusters of one hospital each have agreed to participate and have been randomised. This provides a sufficient sample size for the preplanned primary and secondary analyses.
