“You know, it feels like you can trust them”: mixed methods implementation research to inform the scale up of a health disparities-responsive COVID-19 school testing program
Implementation Science Communications volume 5, Article number: 136 (2024)
Abstract
Background
Health disparities lead to negative COVID-19 outcomes for Hispanic/Latino communities. Rapid antigen testing was an important mitigation tool for protecting schools and their communities as in-person learning resumed. Within the context of a 3-middle-school non-inferiority trial, we assessed the acceptability and appropriateness of at-home and school-based COVID-19 antigen testing and identified implementation barriers and facilitators to inform district-wide scale up.
Methods
Guided by the Consolidated Framework for Implementation Research (CFIR) and acceptability and appropriateness implementation outcomes, we collected post-implementation qualitative (n = 30) and quantitative (n = 454) data in English and Spanish from trial participants, in-depth feedback sessions among program implementers (n = 19) and coded 137 project meeting minutes. Verbatim transcripts were thematically analyzed. We used multivariate linear models to evaluate program acceptability and appropriateness by COVID-19 testing modality and mixed qualitative and quantitative findings for interpretation.
Results
Questionnaire respondents closely matched school demographics (> 80% Hispanic/Latino and 8% Filipino/Asian Pacific Islander). While both testing modalities were rated as highly acceptable and appropriate, at-home testing was consistently rated more favorably. Qualitative findings identified actionable areas for refining the at-home testing program to guide district-wide scale up: maintaining a learning climate that accommodated modifications in response to changing guidelines, the needs of the school community, and implementation challenges; ensuring engaged school leadership and sufficient human resources; improving educational communication about COVID-19 and the ease of use of the reporting technology; and allowing more time for pre-implementation planning and engagement.
Conclusions
Results underscore the value of the CFIR to inform program implementation, particularly for programs to reduce disparities during a public health emergency. They also support optimal testing implementation strategies that center the needs and perspectives of Hispanic/Latino communities.
Background
Systematic and structural inequities lead to disproportionately negative health outcomes for under-resourced communities, particularly racial and ethnic minorities [1]. Further, public health emergencies like COVID-19 exacerbate these disparities, with marginalized communities facing increased exposure risk and barriers to accessing mitigation resources [2]. Therefore, identifying factors and barriers that drive these disparities and overcoming them must be central to public health emergency response planning. A robust body of literature supports application of implementation frameworks as a tool to guide the systematic identification of factors impacting intervention success to facilitate program implementation and scalability [3,4,5].
Hispanic/Latino communities experienced heightened vulnerability to COVID-19 during the SARS-CoV-2 pandemic. In California, persons of Hispanic/Latino ethnicity represented over 4 million cases (44.3%) and over 42,000 COVID-19-related deaths (41.9%) through May 2023 [6]. Consistent with broader trends, in San Diego County, the Hispanic/Latino population (34.3% of residents) experienced significantly higher COVID-19 incidence, hospitalization, and mortality than non-Hispanic white (NHW) residents [7]. These disparities were driven by lower health insurance coverage rates compared to NHWs (potentially hindering access to diagnostic testing and treatment [8]); fear, mistrust, and stigma regarding testing; and concerns about immigration status and financial repercussions from missed work due to a positive test result [9,10,11]. Furthermore, susceptibility to COVID-19 was exacerbated by elevated rates of pre-existing chronic comorbidities and barriers to healthcare access, leading to delayed diagnosis and heightened exposure within frontline industries [9, 12]. To effectively overcome pandemic-related inequity, tailored interventions responsive to the unique cultural and contextual drivers of health disparities among Hispanics/Latinos are critical. Implementation science, with its growing equity focus, is increasingly recognized as a central tool for systematically developing and implementing effective disparities-focused interventions [13], including during the COVID-19 pandemic [14,15,16].
Early research supported the efficacy of rapid antigen testing to mitigate the impacts of COVID-19 in schools [17]. School-based testing has several advantages that may address access and utilization disparities among Hispanics/Latinos: testing occurs in a familiar environment (the school), which can help overcome medical mistrust and reduce fear and stigma around testing, and it precludes parent involvement, overcoming barriers related to work schedules and transportation. However, school-based testing is resource-intensive (staffing, test costs, and logistical administration) and requires missed instruction time [14]. It is also rarely accessible to family members. Distributing rapid antigen at-home tests in schools for use by students, staff, and their households leverages some strengths of a school-based program while extending access to the broader community and reducing household transmission. This modality also relieves much of the logistical burden of testing programs on school administration and could reduce school transmission by allowing individuals to remain home upon testing positive (as opposed to testing positive after coming to school). However, this model may be perceived as burdensome to parents, as it places the onus of testing and reporting on them.
We previously conducted a non-inferiority trial comparing at-home versus school-based COVID-19 antigen testing at three predominantly low-income Hispanic/Latino-serving middle schools in San Diego County to ensure COVID-19 testing programming was responsive to contextual community needs. The at-home testing model was not inferior to school-based testing for participation rates and adherence to weekly testing by students and staff [18]. Although participation dipped during school breaks, the dips were much less pronounced in the at-home testing school than in the schools allocated to school-based testing, which were provided community testing resources during breaks.
While these results are promising, prior to scale up of at-home COVID-19 testing in schools, it is critical to first analyze post-implementation multi-stakeholder perspectives to identify facilitators and barriers to program success (i.e., implementation process and determinants) and to optimize program fit. In addition to promoting equity-focused programming, early consideration of implementation processes and outcomes (e.g., appropriateness and acceptability) expedites research translation into practice, which is vital during public health crises. The Consolidated Framework for Implementation Research (CFIR) [19] is a determinants framework designed to evaluate complex factors influencing intervention implementation and effectiveness. CFIR is flexible and comprehensive in addressing factors influencing implementation success [20, 21]. Despite widespread recognition of the value of implementation science in supporting health equity, there are few evidence-based interventions tailored for addressing health disparities in pandemics or other public health crises, with even fewer guided by implementation science approaches. In the context of COVID-19 testing, some studies have evaluated implementation outcomes of at-home testing in community [22] or workplace [23] settings, though few have done so in a school setting. Among those conducted in school settings, several were outside the United States (US) and are not directly generalizable to US settings [24, 25]. Of those peer-reviewed studies conducted in the US, one was an exposure-driven testing program [26] and the other collected only hypothetical testing attitudes data [27]. Another study had research teams from six school project sites quantitatively rate CFIR [19] implementation constructs and barriers related to school COVID-19 testing and document strategies used to overcome barriers using the Expert Recommendations for Implementation Change (ERIC) matching tool [14, 28].
This study reported consensus ratings from each site, providing valuable insights into factors influencing implementation processes across sites, although no direct qualitative feedback from participants or study teams was collected. We aim to demonstrate the utility of the CFIR for tailoring school-based interventions in an equity-minded way in preparation for scale up, even within complex and rapidly evolving settings such as the COVID-19 pandemic.
This manuscript describes mixed methods implementation research findings from our study Communities Fighting COVID!: Returning Our Kids Back to School Safely, funded by the National Institutes of Health Rapid Acceleration of Diagnostics—Underserved Populations (RADx-UP) Return to Schools Initiative [29, 30]. The objectives of this study were: 1) to quantitatively and qualitatively compare implementation outcomes of acceptability and appropriateness [31] between school-based and at-home COVID-19 antigen testing within a predominantly underserved Hispanic/Latino-serving school district from the perspective of test users to ensure that at-home testing is suitable to scale up, and 2) to examine implementation process barriers and facilitators to inform scale up of a final testing program across district middle schools using CFIR [19]. We also qualitatively examined how testing programs may mitigate COVID-19 testing disparities in this setting.
Methods
The present mixed methods implementation study was conducted within the context of a non-inferiority randomized controlled trial, the methods of which have been previously described [18]. In brief, between October 2021-March 2022, we enrolled students and staff at three predominantly low-income Hispanic/Latino-serving middle schools in San Diego County. Among students in these schools, an average of 76.7% were eligible for free and reduced-price meals and an average of 80.2% were Hispanic/Latino [32]. Two of the three schools were randomized to school-based COVID-19 testing while one was randomized to at-home tests distributed at the school. In the at-home testing school, participants were asked to self-report their results (and any symptoms) using a custom test reporting app that was accessible via mobile phone, tablet/iPad, or PC. In December 2021, COVID-19 testing was expanded and promoted to include household members of students and staff at one school-based testing school and the at-home testing school. Household members at the other school-based testing school could participate, but participation was not promoted.
A total of 264 participants (199 students and 65 staff, representing 34.7% of the school's population) were enrolled in the at-home testing group, while 588 participants (500 students and 88 staff, accounting for 38.6% of the schools’ population) were enrolled in the onsite testing group. Thirty-five students and 17 staff in the at-home group, and 43 students and 30 staff in the onsite group, missed reporting or completing tests during the last 4 or more weeks of the trial [18]. Overall, 51.02% and 45.56% of the suggested weekly screening tests were completed in the at-home and onsite school testing groups, respectively.
Figure 1 describes the implementation staffing structure during the non-inferiority trial. Most study staff working directly with the school staff, students, and families were bilingual (English and Spanish) and bicultural.
To inform the scale up of a refined COVID-19 testing program to the remaining nine district middle schools, we collected post-implementation mixed methods data evaluating implementation outcomes and process. We collected qualitative data from parents/guardians of students who participated in the trial to compare the acceptability and appropriateness of the two models. We also collected quantitative acceptability and appropriateness data during one week of testing soon after the trial was concluded. These mixed methods data on implementation outcomes ensured that at-home testing was appropriate to scale up from the user perspective. Lastly, we collected qualitative implementation process data from program implementers and stakeholders post-trial to determine facilitators and modifiable barriers to the at-home testing implementation, informing modifications in the final testing program.
Participants and data collection
Qualitative
Following the 21-week noninferiority trial, DC and CS co-conducted 13 parent listening sessions (n = 30) lasting an average of 30 min each; eight of these were held face-to-face in a private room at a participating school (n = 17) and the remainder via Zoom (San Jose, CA). Eight were conducted in Spanish, the remainder in English. DC, CS, and EO conducted 14 in-depth staff feedback sessions via Zoom organized by project role (n = 19). Liaisons/Community Health Workers (CHWs) and supervisors/coordinators participated for an average of 3.5 h over three to four sessions, and school/district staff participated for an average of two hours, to cover the full range of questions, ensure saturation of themes, and accommodate availability.
DC and CS were Spanish/English bilingual and bicultural; DC was a male Master of Public Health student and CS a female project staff member with B.S.-level education, both trained and experienced in qualitative facilitation on COVID-19. EO, a study MPI not involved in day-to-day operations, was available to answer participant questions and provide any needed clarification. DC and CS were familiar with the study and did not have prior relationships with the parents, but CS may have had some familiarity with staff through project meetings. To develop rapport, the facilitators explained their roles, the session purpose, and confidentiality guidelines, and asked attendees to take turns speaking and provide candid responses. Parent participants were recruited through an open email invitation to parents with students participating in the trial; district and participating school staff identified as key stakeholders were purposively sampled through email or phone. Notes were taken during all sessions.
Sessions were audio recorded and transcribed; Spanish transcripts were translated to English. In addition, minutes from meetings involving study staff and school administrators/staff (n = 58), school district staff (n = 60), and community organization partners (n = 19) from September 3, 2021, through June 6, 2022, were reviewed.
Theoretical underpinnings of qualitative component
The CFIR [19] informed interview guide development, codebook development, analysis, and interpretation for the implementer groups/in-depth interviews, whereas Proctor et al.’s [31] implementation outcomes of acceptability and appropriateness framed the parent listening session methods. The CFIR comprises thirty-nine constructs organized into five domains: inner setting (e.g., implementation climate), outer setting (e.g., patient needs and resources), intervention characteristics (e.g., relative advantage), process (e.g., engaging), and characteristics of individuals (e.g., knowledge and beliefs about the intervention). The CFIR website, which offers resources for the development of both qualitative and quantitative tools to measure CFIR constructs (www.CFIRguide.org), supported study materials development, including interview guide and codebook development. Of note, at the time of study development and data collection, the CFIR was still in its first iteration (CFIR 1.0). Since then, an updated version of the framework, CFIR 2.0, has been developed [33]. However, given that CFIR 1.0 was used to design our study materials, for consistency we opted to continue to use the original CFIR for analysis and interpretation. Participants were not engaged in reviewing transcripts. The findings of the focus groups, interviews, and listening sessions are reported using the Consolidated Criteria for Reporting Qualitative Research (COREQ) (see Additional file 1) [34].
Quantitative
Acceptability outcomes, capturing Proctor et al.’s (2011) [31] conceptualization of acceptability (content, complexity, comfort), included two items from Weiner et al.’s (2017) implementation outcomes acceptability scale [35] and five items adapted from Kurth et al.’s (2016) HIV self-testing acceptability items [36]: e.g., how much individuals liked the testing, ease of access, confidence in the test result, and comfort. Appropriateness, which also followed Proctor et al.’s (2011) [31] conceptualization, captured the perceived fit, relevance, and suitability of the program for the respondent's school using four items adapted from Weiner et al.’s (2017) implementation outcomes appropriateness scale [35]. These items were assessed on a 1 (completely disagree) to 5 (completely agree) scale.
The questionnaire was offered to participants over one week in April 2022 in English and Spanish. School-based testing participants completed it at the time of testing via Android tablet. At-home testing participants completed it when they reported their test results online. This yielded 409 responses (n = 97 school-based and n = 312 at-home participants). Additionally, we emailed the questionnaire link to participants who did not test that week, yielding 45 additional responses (n = 16 school-based and n = 29 at-home participants). The final sample of 454 represents a 41.2% response rate of the 1,080 enrolled.
All study measures were approved by the San Diego State University Institutional Review Board and written informed consent and child assent/parental consent were obtained before participation.
Data analysis
Qualitative
Guided by the CFIR 1.0 (2009) [19], we used the Framework Method [37], a highly systematic type of thematic analysis that utilizes matrices to visualize data, to guide the interpretation of our study results. Codebook development was deductive, with CFIR domains and constructs comprising parent and child codes. We followed a parallel approach for Proctor et al.’s [31] implementation outcomes of acceptability and appropriateness in parent listening sessions. Transcripts of audio files were imported into HyperRESEARCH version 4.5.4 (Researchware, Inc.) software for analysis. Transcripts were individually coded by two trained qualitative researchers (CS and DC) supervised by one of the study MPIs (SMK). The two researchers coded an initial batch of transcripts in parallel and reconciled code application to reach consensus. The remaining transcripts were individually coded; a final kappa of > 0.85 was achieved, suggesting strong interrater reliability [38]. After data coding, the Framework Method [37] was used to organize the qualitative data and undertake interpretive thematic analysis, supporting a systematic and rigorous review of the findings [39]. Summaries and coded excerpts from HyperRESEARCH were entered into a framework matrix in Excel to facilitate comparison of responses across participants by CFIR construct or implementation outcome. To promote credibility and confirmability of findings, we used a reflexive team-based approach to review the matrix content and further synthesize emergent themes. Our study team consisted of individuals with extensive qualitative experience and familiarity with the study community. Finally, our use of purposeful sampling promotes transferability by capturing a broad range of experiences and voices. Participants did not provide feedback on the findings.
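The interrater reliability statistic used above, Cohen's kappa, compares the observed agreement between two coders against the agreement expected by chance from each coder's marginal code frequencies. A minimal sketch follows; the code labels and excerpt assignments are hypothetical (the study computed kappa on real transcript coding within its own software):

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa for two coders' code assignments on the same excerpts."""
    assert len(coder_a) == len(coder_b)
    n = len(coder_a)
    # Observed proportion of excerpts where the two coders agree
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Chance agreement from each coder's marginal code frequencies
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    expected = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical CFIR construct codes applied by two coders to ten excerpts
a = ["climate", "leadership", "climate", "resources", "climate",
     "leadership", "resources", "climate", "leadership", "resources"]
b = ["climate", "leadership", "climate", "resources", "climate",
     "leadership", "climate", "climate", "leadership", "resources"]
print(round(cohens_kappa(a, b), 2))  # → 0.85
```

Here nine of ten assignments agree (observed = 0.90), but because a few codes dominate, chance agreement is 0.35, giving kappa ≈ 0.85, the threshold the study cites for strong reliability.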
Quantitative
Using multivariate linear regression for the outcomes of acceptability and appropriateness, we constructed models with individual construct items as outcomes to enable detection of potentially subtle differences in aspects of acceptability and appropriateness, with COVID-19 testing modality (at-home vs. school-based) as the exposure of interest. Ethnicity was dichotomized (Latino/a vs. non-Latino/a) because the school demographics did not allow statistical comparisons for additional racial/ethnic groups due to small cell sizes. We examined ethnicity as a main effect and as a potential moderator of differences in the outcomes by testing modality (by creating an ethnicity*testing modality interaction term). We adjusted for gender identity, participant type (student, staff, parent), weeks able to participate in testing since enrolling, and the proportion of weekly tests completed since enrolling. We present results of the overall multivariate tests for differences by testing modality and ethnicity. IBM SPSS Statistics version 29 was used for analysis.
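To illustrate the model structure described above (this is a sketch with simulated data and illustrative variable names, not the study's SPSS analysis, which also adjusted for gender identity and participant type), an ordinary least squares fit with a modality*ethnicity interaction term and a covariate can be set up as:

```python
import numpy as np

# Simulated respondent-level data; coding mirrors the description above:
# modality (1 = at-home, 0 = school-based), ethnicity (1 = Latino/a, 0 = not)
rng = np.random.default_rng(0)
n = 200
modality = rng.integers(0, 2, n)
ethnicity = rng.integers(0, 2, n)
tests_completed = rng.uniform(0, 1, n)  # proportion of weekly tests completed
# Simulated 1-5 item response with a modest at-home advantage plus noise
item = 4.0 + 0.4 * modality + rng.normal(0, 0.5, n)

# Design matrix: intercept, main effects, interaction term, covariate
X = np.column_stack([
    np.ones(n), modality, ethnicity,
    modality * ethnicity, tests_completed,
])
beta, *_ = np.linalg.lstsq(X, item, rcond=None)
names = ["intercept", "modality", "ethnicity",
         "modality_x_ethnicity", "tests_completed"]
print({name: round(float(b), 2) for name, b in zip(names, beta)})
```

With the interaction term included, the `modality` coefficient estimates the at-home effect within the reference ethnicity group, and `modality_x_ethnicity` captures whether that effect differs by ethnicity, which is the moderation test described above.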
Mixed methods
Mixed methods research can enhance the rigor of implementation research by integrating empirical data on who adopts or does not adopt an intervention with qualitative data that explains the reasons behind these adoption decisions. Both qualitative and quantitative results were available on intervention appropriateness and acceptability from the perspective of study participants and/or parents/guardians. These data were gathered concurrently and given equal weight in the analysis. First, qualitative and quantitative data were analyzed separately. We then used a joint display to integrate the mixed methods results, using a side-by-side comparison of statistics (quantitative) and themes (qualitative) to examine convergence, expansion, and complementarity when developing overall conclusions. The quantitative results were entered into an Excel sheet by scale item within each implementation outcome; we then reviewed our qualitative framework matrix for data relevant to each item and interpreted the integrated mixed methods results. Considering the qualitative and quantitative data collectively supported a more comprehensive interpretation of which components of each intervention were working and for whom, improving our ability to identify areas for improvement.
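The joint-display step can be sketched as pairing each quantitative scale item with the qualitative themes coded to it, producing one row per item for side-by-side interpretation. The item wording, statistics, and themes below are hypothetical placeholders (the study built its joint display in Excel):

```python
# Hypothetical per-item quantitative summaries (means on the 1-5 scale)
quant = {
    "easy to use": {"at_home_mean": 4.6, "school_mean": 4.3},
    "welcome the testing program": {"at_home_mean": 4.5, "school_mean": 4.2},
}
# Hypothetical themes pulled from the qualitative framework matrix
qual = {
    "easy to use": ["online result reporting was confusing for some parents"],
    "welcome the testing program": ["families valued free household tests"],
}

# One row per scale item: statistics alongside the matching themes
joint_display = [
    {"item": item, **stats, "themes": "; ".join(qual.get(item, []))}
    for item, stats in quant.items()
]
for row in joint_display:
    print(row)
```

Reading each row side by side supports the convergence/expansion judgments described above, e.g., whether the themes explain or complicate an item's quantitative difference.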
Results
Acceptability and appropriateness: quantitative and mixed methods results
From the quantitative implementation outcomes data, just over half of participants were female (55.7%, n = 253), 42.5% were male (n = 193), while 1.8% (n = 8) identified as a gender minority. The overwhelming majority of participants were Hispanic/Latino (83.5%, n = 379), 8.1% (n = 37) were Filipino/Asian Pacific Islander (local definition), 0.9% (n = 4) were Black, 4.4% (n = 20) were NHW, and 3.1% (n = 14) reported another race/ethnicity, closely matching school demographics. Most respondents were students (80.2%, n = 364), while the remainder were staff (11.5%, n = 52), and family/household members (8.4%, n = 38).
Figures 2 and 3 compare acceptability and appropriateness items by testing modality and by Hispanic/Latino vs. non-Hispanic/Latino ethnicity. These figures show that while both testing modalities were highly endorsed, at-home testing was consistently rated more favorably. Table 1 presents mixed methods analyses of acceptability and appropriateness, adding qualitative data from staff and from parents/guardians whose children participated in the intervention in a joint display.
Fig. 2 Acceptability item implementation outcomes for middle school COVID-19 testing programs: school-based vs. at-home testing. Notes: At-home vs. school-based multivariate test: F11,435 = 5.72, p < 0.001, η² = 0.13. Hispanic/Latino vs. non-Hispanic/Latino multivariate test: F11,435 = 1.54, p = 0.12. All individual item comparisons from the multivariate model between at-home and school-based testing were significant at p < 0.001, with η² ranging from 0.06 to 0.10. Individual item comparisons for Hispanic/Latino vs. non-Hispanic/Latino were not significantly different except “welcome the testing program,” B = -0.19, t = -2.10, p = 0.04, η² = 0.01. No statistically significant interaction between testing modality and ethnicity
Fig. 3 Appropriateness item implementation outcomes for middle school COVID-19 testing programs: school-based vs. at-home testing. Notes: At-home vs. school-based multivariate test: F11,435 = 5.72, p < 0.001, η² = 0.13. Hispanic/Latino vs. non-Hispanic/Latino multivariate test: F11,435 = 1.54, p = 0.12. All individual item comparisons from the multivariate model between at-home and school-based testing were significant at p < 0.001, with η² ranging from 0.07 to 0.10. Individual item comparisons for Hispanic/Latino vs. non-Hispanic/Latino were all significantly different: B = -0.20, t = -2.13, p = 0.03, η² = 0.01; B = -0.27, t = -2.95, p < 0.01, η² = 0.02; B = -0.24, t = -2.71, p = 0.01, η² = 0.02; and B = -0.21, t = -2.23, p = 0.03, η² = 0.01 for the items in Fig. 3, respectively. No statistically significant interaction between testing modality and ethnicity
Key stakeholder perspectives on contextual determinants of success and challenges in implementing the COVID-19 in-school and at-home testing programs
Qualitative data included in-depth discussions with school district and study staff (n = 19) guided by CFIR domains and subdomains and coded meeting minutes from 137 programmatic meetings with school staff, administrators, district officials and representatives from a local community-based organization providing COVID-19 related services. These results are organized broadly by CFIR domain and then further organized by specific CFIR construct.
Inner setting
Compatibility
At-home testing was described by school and study staff as a better fit for existing workflow processes because it did not require students to miss instruction time and the primary implementers were external staff. The at-home modality also reduced the risk of in-school transmission through early identification of cases.
“There's a lot of benefits to make this accessible, so they can do it [test] at home. So, they're not coming already, you know, when they're positive getting other people exposed and also helping alleviate a little bit-, taking a little bit of the burden off school staff to act as like healthcare providers.” -Liaison/CHW (regarding the at-home testing modality).
Learning climate
Participants from both study arms described a flexible learning environment that accommodated real-time modifications and innovation including being responsive to changing testing guidelines. Modifications also included logistical adjustments for faster test kit distribution such as database and process modifications, pre-printed test kit labels, and pre-bagging test kits. As one Liaison/CHW (implementing the at-home modality) described, “I feel like I have the full support of the [program] administration and um, basically everybody that I work with in making any changes…”.
Leadership engagement
Engaged leadership at all levels was identified as crucial for scale up of either testing modality. CHWs unanimously indicated that supervisors played a critical role in implementation success. At the district level, a key individual was identified as a testing program champion, able to step in and foster leadership engagement when school leaders were less involved. Engagement among school leadership was variable. For both testing modalities, Liaisons/CHWs reported a relationship between leadership engagement level and program enrollment, and schools with consistently high participation rates had broad support from various cadres of school staff.
Available resources
Custodial staff were identified as key support for their assistance in accessing the space and resources needed to implement the program (e.g., setting up tables, chairs, and tents, transporting test kit cases).
“a lot of [program success] has to do with the support that you have from the administration and then and the staff because […] in some schools, the administration or staff were not like full on supportive of our project, and I think that's the first step in order for the project to be effective. […] and with time we saw that with different schools, administration and staff members were more supportive and then that was reflected on the numbers.” -Liaison/CHW (implementing the school-based testing modality).
School-based material resources were deemed sufficient to support implementation efforts for both testing modalities, though school and district staff indicated that additional human resources (onsite staff) would be required to scale up and sustain at-home test kit distribution. While at-home testing bridged gaps in testing access during school breaks, some parents had challenges completing the online test results reporting because students did not have school-provided iPads during breaks.
Outer setting
Patient needs and resources
Both testing modalities increased access to free COVID-19 testing, a driver of disparities in testing access and uptake. Some implementers described how the school-based testing modality was less burdensome for parents because it did not require any action on their part other than initial student enrollment.
“I feel that, when it was in-person testing, it was less of a burden from the parents and the students, because they could they knew like: “ok, I just registered my child and I know they're getting tested at school”, so they didn't even bother to do anything at home or be worried if they had to test or anything because they knew they were in the program and they were able to test every scheduled date that the school was assigned.” -Study staff (involved with both modalities).
Most school district and study staff indicated the at-home testing modality was more responsive to the needs of families as it overcame testing access barriers also identified by parents (e.g., access to free tests for the entire household). The school district serves predominantly low-income households, many of which are multigenerational. At-home tests available in local stores were costly and often out-of-stock. While we expanded school-based testing to household members three months into the trial, this still required in-person school visits during typical working hours. External to the study, lines at community testing sites were long (> 2 h) at times of high community transmission and many had limited hours, often during traditional working hours. The following quotes illustrate the benefits of the at-home modality for the entire household.
“One story that sticks out to me is that the dad has a lot of illnesses, so he would be considered high-risk. And so, they (the family) spent so much, I mean just talking about hundreds of dollars on testing for the whole family, and it was a big financial burden. […] She was just so happy that now they would have access to these at-home tests, so they can continue to do that for the whole household. It was really good-, I mean you should have seen her face. She was just so grateful that something like this existed in her school.” -Liaison/CHW (implementing the at-home modality).
“So, if they were to use, I think it would be super helpful and helping future outbreaks and just keeping their families healthy and safe, because in, at least in my community, um, it's multicultural. I mean-, we have a lot of-, like we have Filipinos and Latinos and they tend to live in multi-generational homes, so if they go to school, they catch COVID and they bring it to grandma. Or other, you know, family members. So, we can help them, you know.” -Liaison/CHW (implementing the at-home modality).
Study, school, and school district staff felt that parents in higher SES communities would not place as much value on the increased access to test kits afforded through the at-home testing program, underscoring the value of assessing context and needs prior to scale up.
Intervention characteristics
Relative advantage
Convenience (testing any time) and breadth (household enrollment) of the at-home testing program were noted as advantages over the school-based testing model, which often involved parents losing work time or students missing instruction or lunch time to get tested at school. At-home testing reduced time spent waiting in line by allowing families to pick up four weeks of test kits at once; it also increased privacy and reduced community exposure. However, some stakeholders noted that at-home testing shifted the responsibility of reporting test results, a program requirement, to parents.
“If you come in person, and then you get your results right then, I don't know. If someone else is doing it for you versus take this home–you're responsible for testing, you're responsible for scanning, you know. Some parents don't-, it's kind of 50/50, some parents like doing it at home when-, at 10 o'clock at night if that's what they choose, compared to like “Oh we're only open till one o'clock,” you know. It doesn't always work for everybody to be off work and take your student and-, or yourself so um yeah…” -school district staff (familiar with both modalities).
Adaptability
Stakeholders described several instances where they made modifications to improve program implementation, which indicated that the testing program was adaptable to both the changing context of COVID-19 as a disease and evolving guidelines. However, at the beginning, it was not always clear to school and district staff whom to consult within the university team when questions arose. This was addressed by identifying communication channels, clarifying roles, and assigning Liaisons and CHWs to the schools daily to ease consultation.
“…schools have had to shift, based on you know, like “Oh, you know what. We had a meeting, and it wasn’t well attended” and that happens during like, your coffee with the principal, you know. This time didn't work out so we're going to shift to go to another time, so there's constant shifting to increase outcomes…”. -school district staff (familiar with both modalities).
Complexity
Stakeholders thought that the instructions for swabbing at home were easy to follow; parents and their children (middle school students) could do the swabbing/tests. However, using technology to register and submit results was uncomfortable for many families, with confusion remaining even after the process was explained.
“So I was thinking of in terms of, so, I do a lot of outreach to parents, and the difficulty that we had was the-, the technological aspect of it, so like many parents didn’t understand how to submit results or just overall understanding how the program works, and so there was a ton of confusion around that even though we explained several times how to do it–I believe that was the biggest barrier that they had, was the submitting results process”. -school district staff (familiar with both modalities describing at home testing).
Process
Reflecting and evaluating
Stakeholders described an iterative pattern of implementing, reflecting, and evaluating as a team and then responding to implementation challenges and barriers (adapting) as they arose. Although there was some confusion around communication channels at the start of each testing program, it was resolved with time.
“So, we have weekly staff meetings with them and we give each school like 3 to 4 min to share anything from their school–it's like share any updates, any complications, any issues, you know, from your school and then anything that is brought up to us, you know, then we take it over to [the school district nurse] which is from the school district side. So, if there's something that it's like school district related, you know, it's like ok, we take it to [the district nurse] because that's their side, and then if there's anything from the [study/research] perspective, then we take it to our weekly [study leadership] staff meeting.” -Supervisor/coordinator (who oversaw both testing modalities).
‘Engaging’ and ‘Formally appointed internal implementation leaders’ constructs
Again, level of support from school staff and leadership was mentioned as critical to faithful program implementation for either modality. Stakeholders at schools with broad support reported reaching recruitment and participation targets and achieving implementation as planned while those at schools with less support described challenges.
“It was thanks to everyone who was involved—the assistant principal, the COVID liaison and spreading the word out […] we got to the 20% [recruitment] that was required. So that was one of the goals for the implementation. But again it's-, it's thanks to the collaboration between the institutions and the good communication.” -Liaison/CHW (implementing the at-home modality).
While recruitment goals were met, all respondents noted that getting individuals to sign up for either testing program was relatively easy, whereas convincing them to remain engaged was more challenging.
“Engaging hasn't really been an issue, at least not for me for my end it's the retention piece that's the hardest […] I think it's the retention, because when you can-, you can see it on the queue when we're making calls and stuff for testing and pickups. You can see that, like we were able to engage them, we were able to enroll them. Most we were able to get them to pick up their tests, not all, but it's the retention.” -Liaison/CHW (implementing the at-home modality).
Individual characteristics
Knowledge and beliefs about the intervention
The politicization of COVID-19 was described by several stakeholders as a barrier to ongoing testing, both in terms of engaging school staff and leadership as well as members of the community.
“I think until we can separate politics from testing, that testing will always be political. And so, I don't think anything has to do, necessarily, with the-, the study but-, but-, but because of that [view], it is clouded by it, by that. just-, due to the sheer nature of what's happening with, the nature and politics of testing, and, vaccines and you know protocol and safety, unfortunately, right? And until that’s separated, I think there's always going to be this division or barrier, and I don't have an answer on how to separate that unfortunately.” -School district staff (familiar with both modalities).
Self-efficacy
Most Liaisons/CHWs expressed confidence in their execution of both programs, citing lessons learned along the way to improve their self-efficacy around implementation. Liaisons/CHWs also indicated that confidence in their ability to implement either program was largely dependent on the level of support received from staff and leadership (as described in the inner setting and process domains) that varied across schools.
“I think right now the program is more polished, more solid. We have a better foundation than when we started. So, for the next school year, there's more confidence in the program and the support that we have from schools and from the school district, too. So, it's just a matter of implementing the things that we have learned since from the beginning.” -Liaison/CHW (implementing the at-home modality).
Cross-cutting themes on facilitators and recommendations to improve scale up of at-home testing, the adopted modality for scale up
To inform the scale up of at-home testing to all middle schools (the 2 prior onsite testing schools and 8 new schools), Table 2 presents the themes that emerged from the qualitative data, with accompanying quotes, capturing recommendations from individuals involved in implementing both modalities to facilitate implementation of the at-home testing program and overcome barriers to it.
Discussion
This study demonstrated that distribution of at-home COVID-19 tests in middle schools was viewed as more acceptable and appropriate to participating students, parents, and school staff than school-based rapid antigen testing in a school district serving predominantly low-SES and majority Hispanic/Latino students in southern San Diego County. Both programs were highly acceptable and appropriate overall, supplementing results from our non-inferiority trial [18] supporting the scale up of at-home testing in this setting. Our primary qualitative findings, guided by the CFIR 1.0 [19], provided actionable areas for at-home testing program refinement to improve implementation during scale up. We also gained insights into how the at-home testing modality may have mitigated COVID-19 testing disparities.
Participating schools were in communities disproportionately burdened by COVID-19 morbidity and mortality. Many households are multigenerational, lack comprehensive healthcare, and have heads of household engaged in essential occupations and frontline industries with limited work flexibility, consistent with prior literature [9,10,11,12]. Prior research suggested that free at-home testing kits for all household members offer benefits for those who are uninsured and lack financial resources [9, 12, 40]. This emerged as a strength of the at-home testing modality in our study as well. At-home testing facilitates early testing, timely diagnosis, and home isolation. Stakeholders highlighted the value of early testing and identification of positive cases at home to avoid school spread [17, 41]. Additionally, at the time of the non-inferiority trial, appointments for testing were challenging to secure, often only available during the workday with prohibitively long wait times [8]. Stakeholders highlighted that the at-home modality was more accommodating, provided immediate test results, and supported identification of a positive case before children were sent to school, which avoided mid-day phone calls from the school to pick up a sick child. Participants described feeling safer knowing they could support their children and other vulnerable family members, like the elderly, through increased testing access, ultimately overcoming structural testing barriers that contribute to health inequity. In line with these findings, to address broader community needs, future school-based public health programs should use students as a gateway to extend services to the wider community through household engagement.
We identified several recommendations to improve scale up of at-home testing program implementation. A recurring sentiment was the need for adequate staff resources at each school to accommodate testing activities, including technical support and guidance for participants struggling to report weekly test results. Hispanics/Latinos from lower-income communities face health literacy challenges and greater barriers to accessing health technologies [42]. Ensuring access to study staff at convenient times, face-to-face or by phone, was considered essential to program success and consistent with principles of equitable design [43]. The at-home testing model should continue using dedicated and skilled bilingual school-based staff to avoid adding new demands and responsibilities on already constrained school systems [14]. Further, consistency in program staffing was viewed as important to foster community trust, which is critical for Hispanics/Latinos, who often experience higher rates of mistrust and past discrimination [44]. These recommendations are broadly transferable to public health interventions implemented in lower-resourced and minority communities, which will need to prioritize overcoming challenges related to health literacy, familiarity with technology, and community trust in order to achieve widespread adoption.
To address barriers in health literacy that may impact both test utilization and accuracy of testing, stakeholders indicated the importance of easy-to-follow instructions, including videos, for at-home testing (swabbing) and reporting in both Spanish and English. Study staff emphasized that COVID-19 fatigue presented a threat to program retention and underscored the need for ongoing communication and messaging around the benefits of testing to encourage continued engagement, particularly during periods of lower community transmission. Tailoring messaging to appeal to those served by the schools will increase its salience and further support scale up.
Improved communication was recognized as key to scale up success within the implementation setting (i.e., between school staff/leadership and university staff). Study staff indicated that greater involvement of leadership at the school and district level would improve implementation outcomes, including community uptake, and recommended identifying and engaging additional school champions (e.g., a coach) to support testing activities. Engaging multiple layers of leadership will be useful in scale up efforts to maintain a flexible learning climate adaptable to ever-changing pandemic conditions. Stakeholders described an open learning environment during the trial that facilitated modifications to program implementation and adaptive redeployment of resources, improving the fit of the at-home testing modality with participant needs, particularly for Hispanics/Latinos. Continued flexibility and attentiveness to the changing needs and priorities of the Hispanic/Latino community will be essential to ensuring the program reaches those most likely to experience health disparities [45,46,47].
A final noted challenge impacting both testing programs was the politicization of COVID-19, which bred skepticism among some school leadership and may have dampened parent enthusiasm. The opinions of community leaders influence community attitudes and could impact program success [48]. Considering this, early and close engagement of leadership in program roll-out and sensitization to the importance of ongoing testing may improve program support. While this finding is context-specific, it underscores the importance of tailoring messages to resonate with communities and maintaining political neutrality.
The present study demonstrates the value of CFIR as a tool to support understanding of the complex factors influencing the success of school-based public health programming. Of note, CFIR 1.0’s outer setting domain (which captures political will and attitudes in the community) has previously been noted as underdeveloped. In response, CFIR 2.0 (2022) [33] introduced several new constructs to fill these gaps, including those relevant to our study, such as the ‘critical incident’ construct, which can help analyze the impact of politics on programmatic uptake. Although the timing of our work precluded the use of CFIR 2.0, its application in future studies will enable a more comprehensive assessment of the determinants of implementation success across various domains.
Limitations and strengths
Quantitative survey participation was voluntary and those who opted in may differ from those who declined, reducing generalizability. Similarly, qualitative participants were purposively sampled, and their views may not be representative of all perspectives. Small but significant differences were noted by ethnic group in the quantitative data, with Latino/a participants reporting slightly lower appropriateness than non-Latino/a participants for both testing programs. However, the qualitative data did not corroborate these findings. Finally, like all studies that rely on self-reported data, results from home-based tests may be prone to reporting bias. We did not delve into this possible bias in this mixed methods research.
Many strengths offset these limitations. We used a mixed methods approach to generate rich findings, applied an implementation science framework as a tool to ameliorate health disparities, and centered the voices of those disproportionately burdened by the COVID-19 pandemic.
Conclusions
This study highlights the value of leveraging CFIR to inform program implementation to reduce health disparities. While access to both testing modalities was considered acceptable and appropriate, the at-home testing modality overcame barriers to testing access faced by marginalized communities. Integration of feedback from this study into the at-home testing program scale up will support program implementation optimization and promote sustainability for Hispanics/Latinos most in need of the testing program services.
Data availability
The datasets supporting the conclusions of this article are available in the NIH RADx Data Hub repository (https://radx-hub.nih.gov/home). Qualitative data are not publicly available for confidentiality reasons given that they contain information that could pose a risk of identifying research participants.
Abbreviations
- CBO: Community based organization
- CDC: Centers for Disease Control and Prevention
- CFIR: Consolidated Framework for Implementation Research
- CHW: Community Health Worker
- NHW: Non-Hispanic white
- MPI: Multiple principal investigator
- SARS-CoV-2: Severe acute respiratory syndrome coronavirus 2
- SES: Socioeconomic status
References
Brown AF, Ma GX, Miranda J, et al. Structural interventions to reduce and eliminate health disparities. Am J Public Health. 2019;109:S72–8. https://doi.org/10.2105/AJPH.2018.304844.
Raker EJ, Arcaya MC, Lowe SR, et al. Mitigating health disparities after natural disasters: lessons from The RISK project: study examines mitigating health disparities after natural disasters. Health Aff. 2020;39:2128–35. https://doi.org/10.1377/hlthaff.2020.01161.
Kirk MA, Kelley C, Yankey N, et al. A systematic review of the use of the consolidated framework for implementation research. Implement Sci. 2016;11:72. https://doi.org/10.1186/s13012-016-0437-z.
Moullin JC, Dickson KS, Stadnick NA, et al. Systematic review of the Exploration, Preparation, Implementation, Sustainment (EPIS) framework. Implement Sci. 2019;14:1. https://doi.org/10.1186/s13012-018-0842-6.
Gaglio B, Shoup JA, Glasgow RE. The RE-AIM framework: a systematic review of use over time. Am J Public Health. 2013;103:e38–46. https://doi.org/10.2105/ajph.2013.301299.
California Department of Public Health. COVID-19 age, race and ethnicity data. https://www.cdph.ca.gov/Programs/CID/DCDC/Pages/COVID-19/Age-Race-Ethnicity.aspx. Accessed 30 Apr 2024.
De Ramos IP, Lazo M, Schnake-Mahl A, et al. COVID-19 outcomes among the Hispanic population of 27 large US cities, 2020–2021. Am J Public Health. 2022;112:1034–44. https://doi.org/10.2105/AJPH.2022.306809.
Pond EN, Rutkow L, Blauer B, et al. Disparities in SARS-CoV-2 testing for Hispanic/Latino populations: an analysis of state-published demographic data. J Public Health Manag Pract. 2022;28:330–3. https://doi.org/10.1097/PHH.0000000000001510.
Pedraza L, Villela R, Kamatgi V, et al. The impact of COVID-19 in the Latinx community. HCA Healthc J Med. 2022;3:97–104. https://doi.org/10.36518/2689-0216.1387.
Searcy JA, Cioffi CC, Tavalire HF, et al. Reaching Latinx communities with algorithmic optimization for SARS-CoV-2 testing locations. Prev Sci. 2023;24:1249–60. https://doi.org/10.1007/s11121-022-01478-x.
Garcini LM, Pham TT, Ambriz AM, et al. COVID-19 diagnostic testing among underserved Latino communities: barriers and facilitators. Health Soc Care Community. 2022;30:e1907–16. https://doi.org/10.1111/hsc.13621.
Calo WA, Murray A, Francis E, et al. Reaching the Hispanic community about COVID-19 through existing chronic disease prevention programs. Prev Chronic Dis. 2020;17. https://doi.org/10.5888/pcd17.200165.
Brownson RC, Kumanyika SK, Kreuter MW, et al. Implementation science should give higher priority to health equity. Implement Sci. 2021;16:1–16. https://doi.org/10.1186/s13012-021-01097-0.
Haroz EE, Kalb LG, Newland JG, et al. Implementation of school-based COVID-19 testing programs in underserved populations. Pediatrics. 2022;149. https://doi.org/10.1542/peds.2021-054268G.
Stadnick NA, Laurent LC, Cain KL, et al. Community-engaged optimization of COVID-19 rapid evaluation and testing experiences: roll-out implementation optimization trial. Implement Sci. 2023;18:46. https://doi.org/10.1186/s13012-023-01306-y.
McLoughlin GM, Martinez O. Dissemination and implementation science to advance health equity: an imperative for systemic change. Commonhealth (Philadelphia, Pa). 2022;3:75.
Malone JD, Thihalolipavan S, Bakhtar O, et al. COVID-19 rapid antigen testing implementation in California K-12 schools. J School Health. 2022. https://doi.org/10.1111/josh.13219.
Kiene SM, McDaniels-Davidson C, Lin C-D, et al. At-home versus onsite COVID-19 school-based testing: a randomized noninferiority trial. Pediatrics. 2023;152. https://doi.org/10.1542/peds.2022-060352F.
Damschroder LJ, Aron DC, Keith RE, et al. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci. 2009;4:50. https://doi.org/10.1186/1748-5908-4-50.
Keith RE, Crosson JC, O’Malley AS, et al. Using the Consolidated Framework for Implementation Research (CFIR) to produce actionable findings: a rapid-cycle evaluation approach to improving implementation. Implement Sci. 2017;12:15. https://doi.org/10.1186/s13012-017-0550-7.
Safaeinili N, Brown-Johnson C, Shaw JG, et al. CFIR simplified: pragmatic application of and adaptations to the Consolidated Framework for Implementation Research (CFIR) for evaluation of a patient-centered care transformation within a learning health system. Learn Health Syst. 2020;4:e10201. https://doi.org/10.1002/lrh2.10201.
Cross LM, DeFosset A, Yusuf B, et al. Exploring barriers and facilitators of implementing an at-home SARS-CoV-2 antigen self-testing intervention: the Rapid Acceleration of Diagnostics-Underserved Populations (RADx-UP) initiatives. PLoS One. 2023;18:e0294458. https://doi.org/10.1371/journal.pone.0294458.
Nguyen N, Lane B, Lee S, et al. A mixed methods study evaluating acceptability of a daily COVID-19 testing regimen with a mobile-app connected, at-home, rapid antigen test: implications for current and future pandemics. PLoS One. 2022;17:e0267766. https://doi.org/10.1371/journal.pone.0267766.
Overmars I, Justice F, Kaufman J, et al. Acceptability of an asymptomatic COVID-19 screening program for schools in Victoria, Australia: a qualitative study with caregivers from priority populations. Public Health Res Pract. 2024. https://doi.org/10.17061/phrp34232407.
Colom-Cadena A, Martínez-Riveros H, Bordas A, et al. Feasibility of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) antigen self-testing in school and summer camp attendees. Front Pediatr. 2022;10:975454. https://doi.org/10.3389/fped.2022.975454.
Boutzoukas AE, Zimmerman KO, Mann TK, et al. A school-based SARS-CoV-2 testing program: testing uptake and quarantine length after in-school exposures. Pediatrics. 2022;149. https://doi.org/10.1542/peds.2021-054268J.
Unger JB, Soto D, Lee R, et al. COVID-19 testing in schools: perspectives of school administrators, teachers, parents, and students in Southern California. Health Promot Pract. 2023;24:350–9. https://doi.org/10.1177/15248399211066076.
Waltz TJ, Powell BJ, Fernández ME, et al. Choosing implementation strategies to address contextual barriers: diversity in recommendations and future directions. Implement Sci. 2019;14:42. https://doi.org/10.1186/s13012-019-0892-4.
National Institutes of Health. Rapid Acceleration of Diagnostics (RADx). https://www.nih.gov/research-training/medical-research-initiatives/radx/radx-programs. Accessed 4 Apr 2024.
D’Agostino EM, Haroz EE, Linde S, et al. School-academic partnerships in support of safe return to schools during the COVID-19 pandemic. Pediatrics. 2022;149:e2021054268C. https://doi.org/10.1542/peds.2021-054268C.
Proctor E, Silmere H, Raghavan R, et al. Outcomes for implementation research: conceptual distinctions, measurement challenges, and research agenda. Adm Policy Ment Health. 2011;38:65–76. https://doi.org/10.1007/s10488-010-0319-7.
Education Data Partnership. District summary: Sweetwater Union High. https://www.ed-data.org/district/San-Diego/Sweetwater-Union-High. Accessed 9 Apr 2023.
Damschroder LJ, Reardon CM, Widerquist MAO, et al. The updated Consolidated Framework for Implementation Research based on user feedback. Implement Sci. 2022;17:75. https://doi.org/10.1186/s13012-022-01245-0.
Tong A, Sainsbury P, Craig J. Consolidated criteria for reporting qualitative research (COREQ): a 32-item checklist for interviews and focus groups. Int J Qual Health Care. 2007;19:349–57. https://doi.org/10.1093/intqhc/mzm042.
Weiner BJ, Lewis CC, Stanick C, et al. Psychometric assessment of three newly developed implementation outcome measures. Implement Sci. 2017;12:1–12. https://doi.org/10.1186/s13012-017-0635-3.
Kurth AE, Cleland CM, Chhun N, et al. Accuracy and acceptability of oral fluid HIV self-testing in a general adult population in Kenya. AIDS Behav. 2016;20:870–9. https://doi.org/10.1007/s10461-015-1213-9.
Ritchie J, Lewis J, Nicholls CM, et al. Qualitative research practice: a guide for social science students and researchers. London: Sage Publications; 2013. p. 456.
Lincoln Y, Guba E. Naturalistic inquiry. Newbury Park, CA: Sage Publications; 1985. p. 416.
Braun V, Clarke V. Using thematic analysis in psychology. Qual Res Psychol. 2006;3:77. https://doi.org/10.1191/1478088706qp063oa.
Fielding-Miller RK, Sundaram ME, Brouwer K. Social determinants of COVID-19 mortality at the county level. PLoS One. 2020;15:e0240151. https://doi.org/10.1371/journal.pone.0240151.
Shimizu K, Kondo K, Osugi Y, et al. Early COVID-19 testing is critical to end the pandemic. J Gen Fam Med. 2021;22:67. https://doi.org/10.1002/jgf2.420.
Staccini P, Lau AY. Consuming health information and vulnerable populations: factors of engagement and ongoing usage. Yearb Med Inform. 2022;31:173–80. https://doi.org/10.1055/s-0042-1742549.
Figueroa CA, Murayama H, Amorim PC, et al. Applying the digital health social justice guide. Front Digit Health. 2022;4:807886. https://doi.org/10.3389/fdgth.2022.807886.
Bazargan M, Cobb S, Assari S. Discrimination and medical mistrust in a racially and ethnically diverse sample of California adults. Ann Fam Med. 2021;19:4–15. https://doi.org/10.1370/afm.2632.
Moir T. Why is implementation science important for intervention design and evaluation within educational settings? Front Educ. 2018;3:61.
Kilbourne AM. What can implementation science do for you? Key success stories from the field. J Gen Intern Med. 2020;35. https://doi.org/10.1007/s11606-020-06174-6.
Diederichs M, van Ewijk R, Isphording IE, et al. Schools under mandatory testing can mitigate the spread of SARS-CoV-2. Proc Natl Acad Sci. 2022;119:e2201724119. https://doi.org/10.1073/pnas.2201724119.
Embrett M, Sim SM, Caldwell HA, et al. Barriers to and strategies to address COVID-19 testing hesitancy: a rapid scoping review. BMC Public Health. 2022;22:1–10. https://doi.org/10.1186/s12889-022-13127-7.
Acknowledgements
We are grateful to study participants for sharing their feedback and insights. We thank Dr. Sonia Lee from NICHD for her guidance and support; school principals, assistant principals, nurses, COVID-19 liaisons, after-school program coordinators, custodial staff, and other school staff who supported the testing programs at each school; engagement retention assistants, Marisela Arechiga-Romero and Cheenee Rose Real for program implementation support; SBCS for supporting program outreach; the school district leadership for having the vision to implement testing programs within the district middle schools. Elsa Ghebrendrias, MPH from San Diego State University School of Public Health contributed to the literature review and Drs. John Malone and Kelly Motadel from the County of San Diego Health and Human Services Agency provided crucial input on an earlier draft of the manuscript.
Funding
This research was, in part, funded by the RADx-Up Return to School Program of the National Institutes of Health (NIH) Agreement Nos. 1OT2HD108112-01 and 3OT2HD108112-01S1 (MPIs: Kiene, McDaniels-Davidson, Oren). The views and conclusions contained in this document are those of the authors and should not be interpreted as representing the official policies, either expressed or implied, of the NIH. At the investigators request, NIH provided the Quidel QuickVue OTC and Quidel QuickVue tests used in the overall study through an existing contract agreement (75N92020C00013) between NIH and Quidel Corporation. Amanda P. Miller was supported by a National Institute on Alcohol Abuse and Alcoholism postdoctoral fellowship (T32AA013525, PIs: Riley & Spadoni).
Author information
Authors and Affiliations
Contributions
SMK, CM, and EO conceptualized and designed the overall study. SMK designed the focus group/interview guides and oversaw data collection and analysis with assistance from CM in oversight of qualitative data analysis. DC and CS collected, coded, and analyzed the qualitative data. JM, LF, and RVM, contributed to study implementation. SMK and APM wrote the first draft of the manuscript with assistance from DT. SMK led revisions to the manuscript with assistance from APM, CM, and JM. All authors contributed to results interpretation in context, commented on previous versions of the manuscript, and read and approved the final manuscript.
Corresponding author
Ethics declarations
Ethics approval and consent to participate
This study was performed in line with the principles of the Declaration of Helsinki. Approval was granted by the San Diego State University Institutional Review Board (IRB #HS-2021–0208). Written informed consent and child assent/parental consent were obtained from all individual participants included in the study.
Consent for publication
N/A.
Competing interests
CM has received compensation as a consultant for Gilead Scientific. In addition, her spouse is employed by QuidelOrtho Corporation and has participated in their employee stock purchase program. The other authors have no competing interests to declare.
Additional information
Publisher’s Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License, which permits any non-commercial use, sharing, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if you modified the licensed material. You do not have permission under this licence to share adapted material derived from this article or parts of it. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by-nc-nd/4.0/.
About this article
Cite this article
Kiene, S.M., Miller, A.P., Tuhebwe, D. et al. “You know, it feels like you can trust them”: mixed methods implementation research to inform the scale up of a health disparities-responsive COVID-19 school testing program. Implement Sci Commun 5, 136 (2024). https://doi.org/10.1186/s43058-024-00669-7
DOI: https://doi.org/10.1186/s43058-024-00669-7