Training health professionals to reduce overreporting of birthing people who use drugs to child welfare

Overall study design

This evaluation study collected data from health professionals who registered to participate in a professional education webinar about child welfare reporting. Data were collected upon webinar registration (baseline), immediately post-webinar, and 6 months later (follow-up). We compared baseline, post-webinar, and 6-month follow-up data to examine changes from before to after the webinars in beliefs, attitudes, and practices related to child welfare reporting and to pregnant and birthing people who use drugs in general and who have opioid use disorder (OUD) in particular. The immediate post-webinar surveys also collected participants’ feedback on the webinars.

Webinar description

The webinar is one component of a larger project (Doing Right at Birth) that convened experts in health, policy, law, and advocacy, along with people affected by the child welfare system (CWS), to co-develop, implement, and evaluate trainings (webinars, videos, toolkit) for healthcare providers on the legal, scientific, and ethical aspects of CWS reporting and its consequences for birthing people with OUD. The goal of the overarching Doing Right at Birth (DRB) project is to make interactions between birthing people with OUD and both health care providers and the CWS more ethically sound, respectful, grounded in evidence, and within but not exceeding legal requirements, and thereby to reduce stigma and discrimination and improve treatment engagement and recovery postpartum and beyond. DRB also seeks to address racial inequities in CWS reporting [19]. DRB is guided by two core values: First, people who use drugs should be treated with dignity and respect when they seek healthcare. Second, parenting is hard, and we support non-punitive approaches that empower the parent, infant, dyad, and family to thrive together. In this manuscript, we focus on one component of DRB: the webinar.

We developed webinar content collaboratively, with input and contributions from physicians, nurse midwives, public health researchers, nurses, lawyers, social workers, and people with lived experience of CWS involvement, some of whom participated in the process as members of a community advisory board (CAB). The CAB, which met quarterly by videoconference for the duration of the project, included nine people with relevant personal and professional experience. CAB members were compensated for preparing for and participating in each meeting and for providing input and feedback outside of meetings. CAB members who contributed content to the webinar, including videos sharing their own experiences, received additional compensation. We also drew on findings from previous implementation science research led by the first author on how clinicians make decisions about reporting pregnant and birthing people who use alcohol and/or drugs to child welfare [1]. The webinar was approximately 90 minutes long and included didactic presentations by physicians and lawyers, which we interspersed with brief recorded videos from CAB members and others with relevant lived experience. Topics covered included: the history of the CWS, the legal parameters for reporting, the effects of substance exposure on child development, racial inequities in the CWS, and the consequences of CWS reports for pregnant people, their children, families, and communities (Table 1). We also kept the chat feature open so webinar participants could communicate with webinar panelists and each other in real time.

Table 1 Webinar content outline

We delivered the webinar three separate times over a four-month period. After each of the first two webinars, we conducted preliminary analyses of changes in selected beliefs/attitudes items from the pre- to immediate post-webinar surveys to assess whether changes in attitudes and beliefs were occurring and, if so, whether they were in the hoped-for direction. We also used open-ended feedback from the immediate post-webinar surveys to identify areas for improvement in subsequent webinar deliveries. Because the preliminary pre-post analyses suggested changes in some attitudes/beliefs items in the hoped-for direction, we did not alter the content of the webinars. Based on open-ended response feedback and the real-time reactions and conversations in the chat, we made some changes to webinar logistics and to how participants experienced the webinar. Logistical changes included tightening sections for length and improving audio. Participant experience changes involved adapting the way we framed content, including shifting the wording we used to describe the webinar purpose and content; adding language to name emotions that might be coming up for people in response to webinar content; and adapting our approaches to managing the chat.

Study participants and data collection

We advertised the webinar widely through professional networks, our own and others’ listservs, and social media. A total of 1,279 people registered to participate in one of the three webinars. People who registered were invited to participate in the evaluation. Those interested in participating then completed eligibility screening. People were eligible if they worked: with pregnant or birthing people; with newborns; and/or on programs, policies, protocols, or systems related to pregnant or birthing people. Eligible participants then reviewed electronic consent materials, and those who consented were able to begin the baseline survey. At the end of each of the three webinars, we sent the post-webinar survey to participants who had registered for that session and asked them to complete it. We sent the 6-month follow-up survey 6 months after the post-webinar survey and asked participants to complete it. For both the post-webinar and 6-month follow-up surveys, we sent up to three reminders during the week after the initial request. Participants were remunerated with a $10 gift card upon completing the post-webinar survey and a $50 gift card upon completing the 6-month follow-up.

Measures

Outcomes: We included attitudes/beliefs measures in four domains: two (opioid beliefs and child welfare beliefs) that were central foci of the webinar, one (urine testing beliefs) that was a minor focus of the webinar, and one (general attitudes/beliefs or “control statements”) that included items we did not address in the webinar and did not expect would change after webinar participation. These belief items were rated on 7-point Likert scales ranging from strongly disagree to strongly agree, with “neither agree nor disagree” as the midpoint. While we focus on individual items in our analyses, we assessed internal consistency for each domain using Cronbach’s alpha. Alpha was 0.83 for opioid beliefs (11 items), 0.72 for child welfare beliefs (9 items), 0.76 for urine testing beliefs (5 items), and 0.66 for general beliefs (8 items).
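As an illustration of this internal-consistency check, Cronbach’s alpha for a domain can be computed from the per-item variances and the variance of participants’ total scores. The sketch below uses only the Python standard library; the responses are hypothetical, not the study’s data.

```python
# Minimal sketch of Cronbach's alpha for one belief domain.
# Rows = participants, columns = items on the 7-point Likert scale.
# The responses below are made up for illustration only.
from statistics import pvariance

def cronbach_alpha(rows):
    """rows: list of per-participant item-score lists (equal length)."""
    k = len(rows[0])                            # number of items
    items = list(zip(*rows))                    # transpose to per-item columns
    sum_item_vars = sum(pvariance(col) for col in items)
    total_var = pvariance([sum(r) for r in rows])  # variance of total scores
    return (k / (k - 1)) * (1 - sum_item_vars / total_var)

# Hypothetical responses from five participants on a 3-item domain
responses = [
    [7, 6, 7],
    [5, 5, 6],
    [2, 3, 2],
    [6, 5, 5],
    [3, 4, 3],
]
print(round(cronbach_alpha(responses), 2))  # -> 0.95
```

Higher alpha indicates that items within a domain vary together; values near 0.7 or above are conventionally taken as acceptable internal consistency.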

We also assessed people’s descriptions of their own professional role in terms of child welfare reporting. Participants could report multiple roles, including roles such as “I have final say in the reporting decision” and “I can influence reporting decisions that other people are making.” We also assessed reporting behavior by asking participants to choose the option that best described their usual practice in relation to reporting birthing people who used drugs during pregnancy to child welfare: report everyone, report most people, report a few people, or report no one. We asked about their overall reporting practices and then their reporting practices for a range of different substances (e.g. alcohol, cannabis, prescription opioids not taken as prescribed, heroin/fentanyl). At baseline, we asked about their past-year usual practice. At the 6-month follow-up, we asked about their past 6-month usual practice. We developed these items drawing from existing scales measuring opioid-related stigma [20, 21], implementation science domains we sought to target with the webinar [1, 22], content expertise of project team members, community advisory board input, and feedback from a few health professionals who participated in a pilot of the survey.

At the 6-month follow-up, we also asked participants two open-ended questions about what they had done differently in the past 6 months related to reporting birthing people’s drug use to child welfare.

Participant characteristics: We also collected data about participant characteristics, all categorical variables. These included health professional role; whether they provide direct patient care; U.S. region in which they practice; urbanicity (rural, suburban, or urban); career stage; age; race/ethnicity; and gender.

Analysis

Analyses focused on describing participant attitudes/beliefs, professional role descriptions, and practices, including how they varied across the baseline, post-webinar, and 6-month follow-up time points. Before the first webinar, we identified one child welfare belief as our main outcome: “I would rather err on the side of overreporting to child welfare than underreporting to child welfare.” At the grant proposal stage, we specified that changes in (self-reported) reporting practices would be a measure of success.

For attitudes and beliefs items, we created dichotomous outcomes of agree versus disagree/neither agree nor disagree. We conducted chi-square tests to assess whether there were statistically significant differences across the three study time points. We assessed whether findings changed if we used mixed effects regressions, which account for clustering within individuals; findings were almost entirely consistent with the chi-square analyses, so we report the chi-square analyses. We also checked whether findings changed if we restricted the sample to those completing one or more follow-ups. For the reporting practices outcome, the main analysis includes only participants who stated that they provided direct patient care, as we asked this question differently of people not providing direct patient care. For attitudes/beliefs and professional role items, we conducted sensitivity analyses restricting to those providing direct patient care.
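For illustration, the comparison across the three time points for one dichotomized belief item amounts to a Pearson chi-square test on a 2x3 contingency table. The counts below are hypothetical; in practice, a library routine such as scipy.stats.chi2_contingency would return both the statistic and the p-value (with 2 degrees of freedom for a 2x3 table).

```python
# Sketch of the pre/post/follow-up chi-square test for one dichotomized
# belief item. The counts are illustrative, not the study's data.

def chi_square(table):
    """Pearson chi-square statistic for a list-of-rows contingency table."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            # Expected count under independence of row and column factors
            expected = row_totals[i] * col_totals[j] / grand
            stat += (observed - expected) ** 2 / expected
    return stat

# Rows: agree, disagree/neither; columns: baseline, post-webinar, 6-month
counts = [
    [120, 60, 70],   # agree
    [180, 90, 80],   # disagree / neither agree nor disagree
]
print(round(chi_square(counts), 2))  # -> 2.06, compared against chi-square with 2 df
```

The statistic is then compared against the chi-square distribution with (rows − 1) × (columns − 1) degrees of freedom to obtain the p-value.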

Open-ended responses were reviewed by the first and third authors. After reading through the responses, the first author coded each response according to whether the described change was: in the intended direction (e.g. toward less testing/reporting, or sharing information from the DRB webinar with others); no change or “not applicable”; not possible to characterize by direction; or in a non-intended direction (e.g. toward more testing/reporting). The third author then reviewed the coding and identified points of disagreement. The first and third authors then discussed any discrepancies and came to consensus about coding. We then identified sample quotations to illustrate the range of changes participants described.
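The study resolved coding disagreements through consensus discussion rather than reporting a reliability statistic, but the two-coder review step could in principle be summarized with a chance-corrected agreement measure such as Cohen’s kappa. The sketch below is purely illustrative: the code labels and assignments are hypothetical, not the study’s data.

```python
# Hypothetical sketch of two-coder agreement (Cohen's kappa) for the
# open-ended coding step. Labels and assignments are invented for
# illustration; the study reports consensus coding, not kappa.
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    n = len(coder_a)
    # Observed proportion of exact agreement
    p_o = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Chance agreement from each coder's marginal category frequencies
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Codes: I = intended direction, N = no change/NA, U = unclear, X = non-intended
a = ["I", "I", "N", "U", "I", "N", "X", "I"]
b = ["I", "I", "N", "I", "I", "N", "X", "N"]
print(round(cohens_kappa(a, b), 2))  # -> 0.61
```

Kappa of 1 indicates perfect agreement and 0 indicates agreement no better than chance; disagreements flagged this way would then go to the consensus discussion the authors describe.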
