  • Study protocol
  • Open access

Improving cancer prevention and control through implementing academic-local public health department partnerships – protocol for a cluster-randomized implementation trial using a positive deviance approach

Abstract

Background

Local public health departments in the United States are responsible for implementing cancer-related programs and policies in their communities; however, many staff have not been trained to use evidence-based processes, and the organizational climate may be unsupportive of evidence-based processes. A promising approach to address these gaps is through academic-public health department (AHD) partnerships, in which practitioners and academics collaborate to improve public health practice and education through joint research projects and educational opportunities. Prior research has demonstrated the benefits of AHD partnerships to public health practice and education. However, knowledge about how AHD partnerships should be structured to support implementation of programs and policies is sparse.

Methods

This is a mixed methods, two-phase study, guided by the Exploration, Preparation, Implementation, and Sustainment (EPIS) Framework, in which AHD partnerships are a relational type of bridging factor. A positive deviance approach will be used to understand how AHD partnerships are best structured and supported. In the formative phase, we will survey academics and local health department staff (n = 500) to characterize AHD partnerships and understand contextual influences. We will conduct in-depth interviews with eight AHD partnerships (four high and four low engagement), to identify differences between high and low engagement partnerships. The second, experimental phase will be a paired group randomized trial with 28 AHD partnerships (n = 14 randomized to implementation arm and n = 14 to the control arm). A menu of strategies will be refined through survey and interview findings, literature, and our team’s previous work. The trial will assess whether these strategies can be used to strengthen partnerships and improve adoption of cancer prevention and control programs and policies. We will evaluate changes in AHD partnership engagement and implementation of evidence-based programs and policies.

Discussion

This first-of-its-kind study will focus on collaborations that leverage complementary expertise of health department staff and academics to improve public health practice. Our results can impact the field by identifying new, sustainable models for how public health practitioners and academics can work together to meet common goals, increase the use of evidence-based programs and policies, and expand our understanding of bridging factors within the EPIS framework.

Trial registration

Prospectively registered on 9/17/2024 at ClinicalTrials.gov, no. NCT06605196 (https://clinicaltrials.gov/study/NCT06605196).


Background

Cancer is the second leading cause of death in the United States (US) [1]; however, between one-third and one-half of cancer deaths are preventable [2–80]. By applying evidence-based programs and policies (EBPPs) that focus on improving modifiable risk factors, such as diet, physical activity, tobacco use, and screening and early detection, the incidence of cancer and its impacts on healthcare systems can be reduced [2, 5–8]. National-level efforts (e.g., the Community Guide) highlight a variety of EBPPs that cancer control practitioners can implement in their communities [9–11]. Approaches such as evidence-based public health, i.e., the process of integrating science-based interventions with community preferences to improve the health of populations [12], are also available to and recommended for public health professionals to successfully adopt and implement EBPPs in real-world public health practice [12–16]. However, EBPPs are used inconsistently among practitioners [12, 17]. One study found that less than half of cancer control planners had ever used evidence-based resources [18], which indicates that additional, active strategies are needed to ensure that public health professionals effectively use the resources available to them and implement EBPPs.

An important group of public health professionals are the practitioners in the 2,800 US local health departments (LHDs), who are on the “front lines” of public health. LHDs work at the city, county, or regional level and provide two-thirds of all public health activities [19–22]. LHD activities and expenditures are associated with reduced deaths from diabetes, heart disease, and cancer [23]. LHDs are ideally positioned to address chronic diseases because of their strengths in assessing public health problems within the communities they serve and cultivating the partnerships needed to implement an EBPP [24–27]. Activities such as reviewing the best available peer-reviewed evidence, using data, applying program planning frameworks, and conducting sound evaluation are among those most important for evidence-based public health to occur [12]. Many LHD practitioners lack the skills to conduct these activities [28, 29]. Organization-level structures and processes (e.g., leadership, organizational climate and culture, access to research information) also need to be present [12, 15, 30, 31, 81, 82]. Yet barriers to using evidence-based practices persist, such as a lack of incentives, inadequate connections between research and practice, and an absence of cultural and leadership support [14, 32–34, 82].

In the last two decades, there has been growing recognition of academic-health department (AHD) partnerships as a strategy to improve public health practice and education [35–38]. These partnerships offer an existing structure for delivering training, technical assistance, and other supports to LHDs. AHD partnerships are a type of community-academic partnership, defined as “an arrangement between an academic institution and a governmental public health agency that provides mutual benefits in teaching, research, and service” [39]. Academicians can improve the real-world relevance of their research and teaching, and public health practitioners and agencies can receive support to implement EBPPs. Erwin and colleagues found that LHDs engaged in an AHD partnership were 2.3 times more likely to implement EBPPs than LHDs with no AHD partnership [40]. The arrangements can be formal (e.g., through a memorandum of understanding) or informal and provide collaborative opportunities across academia and practice [41, 42]. These partnerships are common: in the 2022 Profile of Local Health Departments, 89% of LHDs reported engagement with schools or programs of public health, and 64% reported formal written agreements [43].

To our knowledge, no studies have systematically examined the structures and processes within AHD partnerships that support EBPP implementation or used experimental designs to understand what is needed for AHD partnerships to support EBPP implementation. This is a critical gap, as AHD partnerships represent a “relational tie” type of bridging factor [44]. Bridging factors are an emerging construct in the Exploration, Preparation, Implementation and Sustainment (EPIS) framework [45]. This study advances bridging factors research in two critical ways. First, it advances the methodology of studying bridging factors by using a suite of methods to describe AHD partnership characteristics and isolate which features of these partnerships can be intervened upon through implementation strategies. Second, the implementation strategies identified and tested in this study will directly support how these AHD partnerships function to support EBPP implementation and sustainment. Findings can build implementation strategy knowledge around this commonly used relational tie type of bridging factor.

Our specific aims are as follows:

  • Aim 1: Understand the structures of and processes within AHD partnerships, and the contextual factors that influence the ability of AHD partnerships to implement EBPPs.

  • Research Questions: How do successful AHD partnerships build and maintain individual skills and organizational capacity required to implement and evaluate implementation of EBPPs? What contextual factors promote or hinder a successful AHD partnership?

  • Aim 2: Test the effectiveness of strategies designed to increase the implementation of EBPPs for cancer prevention and control by strengthening AHD partnerships.

  • Hypothesis: Two years post randomization, LHDs within partnerships randomized to receive supports to improve AHD partnerships will have implemented significantly more cancer-related EBPPs compared to LHDs within partnerships randomized to the control arm.

Methods/design

Overview of study design

This is a mixed methods implementation science study that consists of two complementary phases. The formative phase focuses on refining our understanding of AHD partnerships and the structures and processes that may contribute to the ability of a partnership to facilitate EBPP implementation. The experimental phase uses a paired group-randomized trial to test the effectiveness of strategies to increase EBPP adoption by improving AHD partnerships. The study uses a positive deviance approach and is guided by the Exploration, Preparation, Implementation, and Sustainment (EPIS) framework [45, 46].

Exploration, Preparation, Implementation, and Sustainment (EPIS) framework

The work is guided by the EPIS framework, which is a process and determinant framework [47] that identifies four phases of the implementation process: exploration of the health needs and the best available EBPP to address them; preparation for potential barriers and facilitators during implementation and necessary adaptations to the EBPP; implementation and evaluation of the EBPP; and sustainment of the EBPP [48]. The framework highlights determinants in the outer context, i.e., the public health and policy environment and characteristics of EBPP recipients, and inner organizational context of LHDs, such as leadership, organizational structures, and procedures, on the implementation process [45, 48]. Features of an EBPP itself, including the fit of the EBPP with its recipients and the implementing organizations, may also influence implementation. This study offers an opportunity to deepen our knowledge of the processes and determinants outlined in the framework and understand how they operate in different systems. A recent addition to EPIS is the concept of bridging factors, which are structures and processes that cross and connect the outer system and inner organizational context and influence the implementation process [45]. Bridging factors include community-academic partnerships, such as AHD partnerships of inquiry in this study, intermediary organizations that provide support for EBPP implementation, and formal arrangements and processes (e.g., contracts) [45, 49]. Using EPIS and the concept of bridging factors in this study will allow us to understand the activities within partnerships that could be modified to support EBPP implementation [44, 45, 49], i.e., how AHD partnerships should be structured and what processes and resources should be used within an AHD partnership to enhance the adoption of EBPPs [49]. 
This study provides a rich context for identifying specific ways that LHDs and academic partners influence each other and exchange key resources between the inner context of an LHD and the outer setting to influence EBPP adoption, implementation, and sustainment.

Positive deviance methodology

The overall methodological approach for the project is the positive deviance methodology, which acknowledges that solutions to problems within a community (here, public health practitioners and researchers) often exist within that community, and the knowledge and experiences of certain members can be generalized to improve the engagement of other members [46]. The positive deviance approach accomplishes two goals – identifying practices associated with top engagement and promoting the use of these practices within a community, using a mixed methods approach [50]. Four steps are used within a positive deviance approach [46]. First is to identify positive and negative deviants, i.e., AHD partnerships that demonstrate high or low implementation of cancer-related EBPPs. Next, organizations identified as positive and negative deviants are studied in-depth using qualitative methods to understand which structures and processes enable AHD partnerships to support implementation of EBPPs and how our existing strategies need to be refined to fit within AHD partnerships. Third, hypotheses generated from the qualitative work are quantitatively tested in a broader sample of partnerships. Last, results are disseminated with input from our practice partners.

Research setting and partners

This study will focus on AHD partnerships identified through the Council on Linkages Between Academia and Public Health Practice’s Academic Health Department Learning Community, hereafter the Learning Community. The Learning Community is a national network designed to support the development, maintenance, and growth of AHD partnerships [51]. The Learning Community is comprised of members who are interested in sharing knowledge and experiences related to AHD partnerships and creating resources and tools. Members include LHD directors and staff and academicians (in public health or related fields). The Learning Community hosts resources to support AHD partnerships and opportunities for interaction, e.g., meetings, webinars, and online discussions. To date, these resources have focused on partnership building and public health education, but not specifically on how AHD partnerships can support EBPP implementation.

We will engage additional research partners through a Practice Advisory Group that will 1) provide overall project guidance; 2) review data collection methods and instruments; 3) give input on refining the strategies used in the experimental phase; and 4) assist in disseminating findings. The group includes members of national public health practice organizations, public health practitioners, and practice-focused academics, all with expertise in AHD partnerships.

Formative phase: learn from existing AHD partnerships

This phase focuses on the first two steps of the positive deviance approach: 1) survey AHD partnerships to identify high and low engagement AHD partnerships based on their adoption of cancer-related EBPPs and 2) study these partnerships in depth using qualitative methods (interviews and document reviews).

Step 1. Identify positive and negative deviants

We will survey AHD partnerships to identify positive and negative deviants, i.e., AHD partnerships that have high or low adoption of cancer-related EBPPs, and collect data on other AHD outcomes for public health research and practice according to the logic model by Erwin et al. [41].

Sampling and recruitment

We will survey LHD practitioners and their academic partners to assess the adoption of cancer-related EBPPs and quantify other measures of partnership success according to a composite measure described below, theoretically guided by the AHD partnership logic model [41]. Using a modified snowball sampling approach, we will invite members of the Learning Community to participate in the survey and will ask those participating to identify other individuals in their partnership. These additional individuals will be invited to participate in the survey. LHD practitioners and their academic partners will be eligible to participate in the survey. We anticipate 500 responses from 100 AHD partnerships (average of 5 respondents per partnership).

Measures

The survey is designed to assess engagement related to implementation of cancer prevention and control EBPPs (Table 1) [11, 52]. Adoption of EBPPs will be the variable used to characterize positive and negative deviants and is the primary outcome for the trial. Respondents are presented with a list of evidence-based programs and policies taken from the Community Guide, reflecting the program areas in which respondents work. EBPPs are focused on primary or secondary prevention of cancer, grouped into categories used in the Community Guide: healthy weight management, physical activity promotion, healthy eating promotion, tobacco use prevention and/or cessation, HPV vaccination, cancer screening, health equity/social determinants of health, maternal and child health, and environmental health [11]. EBPPs recommended with sufficient or strong evidence in Community Preventive Services Task Force reviews are included in the survey. Respondents indicate whether they currently implement (i.e., have adopted) a given EBPP, and a summary variable will be created to reflect adopted EBPPs. This item has been used in our previous work and has demonstrated sufficient test–retest reliability [63]. We will also assess well-recognized outputs and outcomes of successful, productive AHD partnerships [41]. Quantitative indicators include: AHD partnership structures and activities; AHD partnership strengths; organizational supports for evidence-based practice; delivery of evidence-based interventions to address chronic diseases; individual public health skills regarding evidence-based public health; leadership and organizational characteristics; and sustainability of public health programs.

Table 1 Survey measures

Survey refinement

The study team and our research and practitioner colleagues reviewed survey drafts to ensure the survey could be easily understood by respondents and accurately captured relevant experiences and opportunities. We conducted cognitive response testing with approximately 20 AHD partnership members (practitioners and academics) to improve the quality of data collection [53–56]. Survey items were revised iteratively based on the results of the cognitive response testing; testing was considered complete when interviewees found the items clear and able to capture the relevant information.

Quantitative data collection

Pre-invitation emails will be sent to participants to inform them of the survey purpose, and invitation emails with the survey link will be sent one week later. The survey will take no more than 20 min to complete and will be programmed for data collection in Qualtrics, an online survey software. Those who have not completed the survey will receive up to three reminder emails and two phone calls over a six-week period to address questions about the study and encourage participation. The study timeline is shown in Table 2.

Table 2 Study timeline

Data analysis

Data analysis will be conducted in SAS version 9.4 (SAS Institute, Cary, NC). Summary measures will be created separately for each domain by summing items within the domain. For use in the qualitative interviews, an average partnership-level score will be created for each summary measure. Partnerships will be ranked according to the average number of EBPPs reported by partnership members. The four highest- and four lowest-ranked AHD partnerships, i.e., those implementing the most and fewest cancer-related EBPPs, will be considered to have the highest and lowest engagement and will be invited to participate in step 2. In the event of a tie, we will randomly choose one partnership to invite.
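As a sketch of this ranking step, the following Python snippet averages member-reported EBPP adoption within each partnership and selects the top- and bottom-ranked partnerships, breaking ties at random as described above. The partnership IDs, counts, and cutoff are invented for illustration and are not study data:

```python
import random
from collections import defaultdict

# Hypothetical survey records: (partnership_id, number of EBPPs reported)
responses = [
    ("P1", 6), ("P1", 8), ("P2", 2), ("P2", 1),
    ("P3", 9), ("P3", 7), ("P4", 3), ("P4", 4),
    ("P5", 5), ("P5", 5),
]

# Average number of EBPPs reported by members of each partnership
by_partnership = defaultdict(list)
for pid, n in responses:
    by_partnership[pid].append(n)
scores = {pid: sum(v) / len(v) for pid, v in by_partnership.items()}

# Rank partnerships by average score; ties broken at random per the protocol
rng = random.Random(0)
ranked = sorted(scores, key=lambda pid: (scores[pid], rng.random()), reverse=True)

k = 2  # the protocol selects the four highest and four lowest partnerships
positive_deviants = ranked[:k]
negative_deviants = ranked[-k:]
```

In the study itself these scores would come from the Qualtrics survey export rather than an in-memory list, but the ranking logic is the same.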

Step 2. Study positive and negative deviants

Individual interviews will be used to study positive and negative deviants in depth. Once a group of positive and negative deviants is identified in step 1, LHD practitioners and their academic partners in positive and negative deviant AHD partnerships will be invited to participate in qualitative, key informant interviews and provide information for document reviews. Interviews and document reviews will yield data to understand the extent to which AHD partnerships operate, as a bridging factor, to contribute to EBPP implementation, i.e., the structures, processes, and resources used within a partnership to connect the inner and outer contexts. Data from this step will inform which strategies are prioritized in the experimental phase to enhance AHD partnerships and improve EBPP implementation.

Sampling, recruitment, and interview domains

We will use a purposive, snowball sampling approach to identify key informants for interviews [57]. This approach allows us to identify those who have first-hand knowledge about the AHD partnerships from the perspective of LHD staff and academics. For partnerships identified as a positive or negative deviant, we will invite the survey respondents from step 1 to participate in an interview. Similar to the approach used to identify additional respondents for the quantitative survey, at the end of each interview we will ask the respondent to name other individuals within the partnership (i.e., practitioners and academics). An average of five participants per AHD partnership will be recruited, including three participants representing the LHD (total recruitment: 8 AHD partnerships × 5 participants/partnership = 40 participants).

The interviews will take an ontological approach and a pragmatic interpretive framework, as we will seek to understand what is useful, practical, and works from the perspectives of more and less successful AHD partnership members [58]. The interviews will focus on several domains: 1) how the partnership formed; 2) how it has been maintained; 3) key characteristics of the partnership; 4) relevant details about the outer context, especially those related to funding and policy; and 5) how well the structures, processes, and resources within the partnership bridge the inner and outer contexts to support EBPP implementation.

Document review

In addition to qualitative interviews of those in successful AHD partnerships, we will collect formal documents that provide guidance about how successful AHD partnerships are structured. Documents will be requested at the conclusion of the interview and will be shared digitally with the study team. Example documents are Memoranda of Understanding (MOU), which outline the terms of a partnership, project reports from joint research projects, and accreditation documents, which often are required to have information about EBPP implementation and practice-based education (student practica).

Qualitative data and document review coding and analysis

Digital recordings will be transcribed verbatim by the Internet-based service Rev (https://www.rev.com/). All transcripts and documents will be analyzed using NVivo 12 software [59]. A codebook will be developed from the interview guide, which is based on EPIS. Each transcript will be coded independently by two team members [60]. The two team members will then review discrepancies in their text blocking and coding and reach agreement. Themes from the coded transcripts and documents will be summarized and highlighted with exemplary quotes or in data matrices.

Refining strategies to enhance AHD partnership engagement and increase use of EBPPs

A list of AHD partnership features (e.g., characteristics, structures) and processes used to form or maintain AHD partnerships that may contribute to EBPP implementation will be compiled based on emerging themes from the qualitative interviews and document reviews. To create an initial menu of strategies, we will match these features to strategies used in prior work [61,62,63,64,65] or in the Learning Community using a nominal group technique, a structured variation of a small-group discussion, to reach consensus [66]. To refine this list, the investigative team and Practice Advisory Group will review themes to prioritize those that are most likely to be influential for EBPP implementation, as some features may be influential for other outcomes of AHD partnerships such as student practica. We will build consensus as a group using prioritization matrices commonly employed in small group decision making and prioritization activities [67,68,69]. We will rate partnership features according to characteristics such as appeal, perceived relevance to EBPP implementation, feasibility, cost, and sustainability potential and prioritize based on the combinations of these ratings.
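As an illustration of how a prioritization matrix can combine such ratings, the sketch below scores candidate partnership features against weighted criteria. The features, criterion weights, and 1–5 ratings are all invented for illustration; they are not the study's actual rubric:

```python
# Criteria mirror those named in the protocol; weights are assumptions.
criteria = ["appeal", "relevance", "feasibility", "cost", "sustainability"]
weights = {"appeal": 1, "relevance": 3, "feasibility": 2, "cost": 1, "sustainability": 2}

# Hypothetical group-consensus ratings on a 1-5 scale (5 = most favorable;
# for cost, 5 = least expensive).
features = {
    "shared staff position": {"appeal": 4, "relevance": 5, "feasibility": 2, "cost": 2, "sustainability": 4},
    "joint grant writing":   {"appeal": 5, "relevance": 4, "feasibility": 4, "cost": 4, "sustainability": 3},
    "standing MOU review":   {"appeal": 3, "relevance": 3, "feasibility": 5, "cost": 5, "sustainability": 5},
}

def score(ratings):
    """Weighted sum of criterion ratings for one feature."""
    return sum(weights[c] * ratings[c] for c in criteria)

# Highest combined score first
prioritized = sorted(features, key=lambda f: score(features[f]), reverse=True)
```

In practice the ratings would come from the investigative team and Practice Advisory Group discussions rather than fixed numbers, and the weights themselves would be a consensus decision.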

Experimental Phase: test strategies to improve EBPP implementation through AHD partnerships (positive deviance step 3)

In the experimental phase, we will test strategies to support AHD partnerships and increase the use of EBPPs. To test the hypotheses generated in the formative phase, we will conduct a paired, group-randomized study to determine whether the strategies used within top engagement AHD partnerships can improve adoption of cancer prevention and control EBPPs among lower engagement AHD partnerships. Fourteen partnerships will be randomized to the implementation arm, which will receive resources and guided facilitation, and 14 to a control arm, which will be referred to existing resources in the Learning Community but will not receive new resources or guided facilitation.

Study population and selection of AHD partnerships

Our target audience for this study is LHD practitioners and academics in lower engagement AHD partnerships, as defined by the extent of LHD EBPP implementation. We will recruit participants from partnerships in the bottom half of the EBPP implementation score created in the formative phase. Randomization will occur after pair matching, as in previous community trials, to improve power [70–72]. Partnerships will be matched on the size of the LHD and how long the AHD partnership has existed. Individuals from the LHD and academic institution will be invited to participate in the study. We will recruit 13 LHD staff and academic partners per partnership. LHD staff will include the division or program director for community health promotion or chronic disease prevention (primary contact), along with staff identified by the program director as key to supporting cancer prevention and control, e.g., health educators. Academic partners will include professors, research staff, and graduate students from schools or programs of public health, nursing, or other related health disciplines. To increase achievability of the primary outcome, we will limit our sample to LHDs with at least 2 FTEs focused on chronic disease control, per our prior research [63]. This provides a total of 364 individuals (28 partnerships × 13 partnership members/partnership) for recruitment into the trial.

AHD partnership strengthening strategies

AHD partnerships randomized to receive remote, tailored, guided support to improve their partnerships will first participate in a planning period at the beginning of the intervention period to establish goals for the partnership (Table 3). The study team will facilitate discussions to determine how AHD partnerships want to modify their partnership. We will use the prioritized list of AHD partnership features developed in the formative phase to guide the initial planning period and then assist partnership members in planning to implement the corresponding strategies focused on their areas of interest. This approach acknowledges that a uniform approach is unlikely to be effective across partnerships with different needs, capacity, and contexts [73–76].

Table 3 Implementation strategy description

Once the initial planning period is complete, the research team will act as knowledge brokers, providing facilitation to improve AHD partnership engagement in ways that match the goals set by the partnership. Based on our prior work, we anticipate LHD staff will need guided facilitation and training focused on: 1) assistance with strategic planning processes for implementing a strategy of interest; 2) consultation on overcoming barriers to implementing EBPPs; 3) help identifying funding sources and writing grants to fund joint research projects; and 4) identification of online training sources to support evidence-based decision making and strengthen partnerships. Instead of providing these supports directly to LHD staff as in previous work, we will work with the AHD partnership to meet these needs. We will regularly assess AHD partnerships’ progress toward goals and adjust goals and strategies based on that progress.

Quantitative measures

To understand the effects of AHD partnership strengthening activities, we will measure the adoption of cancer-related EBPPs (primary outcome) and other partnering, individual- and organizational-level covariates. Data will be collected at baseline (prior to randomization) and at the end of the study.

AHD partnership engagement

Measures will be the same as those collected in the formative phase, including adoption (use) of cancer prevention and control EBPPs (primary outcome); AHD partnering activities; organizational supports for evidence-based decision making [77]; joint research projects; publishable practice-based research; shared staff; and shared financial resources.

Implementation costs

Given the importance of cost in public health decision making and for future scale up and replication studies [78,79,80, 84, 85], costs associated with various strategies will be collected using a pragmatic approach proposed by Cidav and colleagues [86]. The approach blends time-driven activity-based costing, a process-based micro-costing method used in business accounting, with Proctor’s framework for reporting implementation strategies [86]. Costs are tracked in a step-wise manner: 1) name each implementation strategy and list the associated actions, actors, and temporality; 2) determine the frequency and average duration of each implementation action by actors and actors’ total time spent on each action; 3) determine the price per hour of each actor; 4) determine non-personnel, fixed resources and their expenses; and 5) calculate total costs. This approach will allow us to describe the relationship between costs and outcomes, both cumulatively and by implementation action. Costs associated with research activities will be separated from other implementation costs.
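The step-wise costing above can be sketched as a simple calculation over a cost log. The strategy, actors, hourly rates, session counts, and fixed costs below are illustrative assumptions, not study estimates:

```python
# Hypothetical cost log for one implementation strategy ("guided facilitation"),
# following the five steps: name the strategy's actions and actors, count
# frequency and duration, price each actor's hour, add fixed resources, total.
actions = [
    # (actor, hourly_rate_usd, sessions, hours_per_session)
    ("facilitator",       55.0, 10, 1.5),
    ("LHD program staff", 40.0, 10, 1.5),
    ("academic partner",  60.0,  6, 1.0),
]
fixed_costs = 200.0  # e.g., web-conferencing license (illustrative)

# Step 5: total personnel cost = rate x frequency x duration, summed over actors
personnel_cost = sum(rate * sessions * hours for _, rate, sessions, hours in actions)
total_cost = personnel_cost + fixed_costs
```

Tracking each strategy's actions in this structure makes it straightforward to report costs both cumulatively and by implementation action, and to keep research-only activities in a separate log.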

Organizational covariates

Information about the AHD partnership will be collected, including the year the partnership was established, the number of individuals involved in the partnership, and bridging factor dimensions of the partnership (e.g., partnership resources and structures to support EBPP implementation). Characteristics of the LHD (FTEs, annual expenditures, jurisdiction size, Public Health Accreditation Board accreditation status) and academic institutions (2- or 4-year institutions, Public Health Schools or Programs) will be collected.

Individual covariates

LHD staff and academicians’ age group, years in current position, job title, and academic degrees will be collected. Previously validated surveys will assess LHD staff’s evidence-based decision making skills [61, 87, 88].

Qualitative measures

Participants in partnerships randomized to the active implementation arm will be invited to participate in a post-study interview to understand how and why changes did or did not occur, according to EPIS framework constructs that may explain the quantitative findings in the study. Similar to the formative phase, interview questions will assess how partnerships were structured and what processes and resources AHD partnership members thought were most and least impactful for enhancing the adoption of EBPPs.

Process evaluation

Process evaluation data will help us determine the use and reach of our strategies and assess whether outside events affected our findings. Examples of process measures include requests for technical assistance, participation in and satisfaction with capacity building initiatives, inclusion of a greater focus on EBPH and EBPPs in agency plans, becoming an accredited or reaccredited health department, and external funding granted to an LHD for cancer control-related programming.

Power calculation

This study uses a paired, group-randomized design. By using tight matching criteria, we will balance potential confounding factors (e.g., FTEs, county size, duration of partnership) that may affect EBPP implementation. As a result, between-cluster variation will be reduced and statistical power increased [89, 90]. Based on our preliminary studies and values of intraclass correlation coefficients (ICCs) in the literature [30, 62, 91,92,93,94,95,96], we estimated a range of effect sizes and ICCs. We calculated a median ICC from similar studies and developed a range based on a 50% decrease and increase around the median (range 0.009 to 0.027). We are interested in the changes in AHD partnership characteristics and adoption of EBPPs from baseline between the intervention and control arms. Drawing from previous work [63, 92, 93], we hypothesize that scores in the intervention arm will be 10–20% higher than in the control arm for implementation of EBPPs. Following Donner [97] and Thompson [98], and using the most conservative estimates of effect sizes and ICCs, we estimate that 28 AHD partnerships (n = 14 per arm) are needed, with 13 subjects per cluster at baseline (total = 364) to ensure 9 per cluster at two years post-randomization (total = 252), given power of > 90% and an overall Type I error rate of 5%. This assumes about a 70% retention rate of subjects and allows for attrition of one cluster.
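For orientation, the design-effect arithmetic underlying a cluster-randomized sample size can be sketched as follows. This is a simplified, unmatched approximation rather than the Donner [97] and Thompson [98] procedure used in the protocol, and every input (effect size, SD, cluster size, ICC) is an illustrative assumption:

```python
from math import ceil
from statistics import NormalDist

def clusters_per_arm(delta, sd, m, icc, alpha=0.05, power=0.90):
    """Approximate clusters per arm for a two-arm comparison of means,
    inflating the standard two-sample formula by the design effect
    1 + (m - 1) * icc. Matching on confounders, as in this protocol,
    reduces between-cluster variance further, so this unmatched
    estimate is conservative."""
    z = NormalDist().inv_cdf
    # Individuals per arm ignoring clustering
    n_unclustered = 2 * (z(1 - alpha / 2) + z(power)) ** 2 * (sd / delta) ** 2
    deff = 1 + (m - 1) * icc  # variance inflation from clustering
    return ceil(n_unclustered * deff / m)

# Illustrative inputs: detect a 10-point difference on a 0-1 scale
# (delta = 0.10, sd = 0.25) with 9 respondents per partnership at
# follow-up and the upper-bound ICC of 0.027
n_clusters = clusters_per_arm(delta=0.10, sd=0.25, m=9, icc=0.027)
```

Under these hypothetical inputs the unmatched formula yields somewhat more clusters per arm than the matched design requires, which is the gain in efficiency that tight pair-matching provides.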

Quantitative analyses

We will follow Donner [97] for the analysis of matched-pair quantitative data. The unit of analysis is the difference in mean change in EBPP implementation scores between the intervention and control members of an AHD partnership pair; if cluster size varies across pairs, we will use a weighted paired t-test with cluster size as weights. We will also use a permutation test, comparing the observed difference with the null distribution derived from the permutation procedure. For cluster-level adjusted analyses, we will follow Thompson’s approach [98]: obtain the adjusted mean first for each individual and then for each partnership, and then the adjusted difference in mean change within each pair. Modeling details are available upon request.
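The permutation logic can be illustrated with a short sign-flip sketch. The pair-level differences below are hypothetical, and the actual analysis will additionally weight by cluster size and adjust for covariates as described above:

```python
import random

def paired_permutation_test(pair_diffs, n_perm=10_000, seed=1):
    """Two-sided permutation (sign-flip) test for matched-pair cluster
    data: under the null hypothesis, the intervention/control labels
    within a pair are exchangeable, so each pair-level difference is
    equally likely to carry either sign."""
    rng = random.Random(seed)
    observed = sum(pair_diffs) / len(pair_diffs)
    extreme = 0
    for _ in range(n_perm):
        flipped = [d if rng.random() < 0.5 else -d for d in pair_diffs]
        if abs(sum(flipped) / len(flipped)) >= abs(observed):
            extreme += 1
    return extreme / n_perm  # Monte Carlo estimate of the p-value

# Hypothetical pair-level changes (intervention minus control) in EBPP
# implementation scores for 14 matched AHD partnership pairs
diffs = [0.12, 0.05, -0.02, 0.18, 0.07, 0.10, 0.03,
         0.09, 0.15, -0.01, 0.06, 0.11, 0.04, 0.08]
p_value = paired_permutation_test(diffs)
```

Because the null distribution is built from the data themselves, this test makes no normality assumption about the pair-level differences, which is useful with only 14 pairs.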

Qualitative analyses

Interview data will be analyzed using the same approach outlined for the interviews conducted in the formative phase.

Mixed methods analyses

In the formative phase, the quantitative and qualitative data will be collected and analyzed in a concurrent, embedded design (qual + QUAN), in which the qualitative findings elaborate upon the quantitative findings to triangulate results, according to accepted best practices [99, 100]. The two types of data will be connected such that the qualitative data builds on the quantitative data and is analyzed with complementarity in mind, allowing us to elaborate upon quantitative findings [99]. We will use a social constructivist approach and ontological assumptions during data analysis, as our emphasis will be on understanding AHD partnerships from the perspectives of partnership members (i.e., LHD staff and academic partners), acknowledging that each AHD partnership likely has different experiences [58]. Data can be integrated in a joint results display to visualize linkages between the quantitative and qualitative data [50].

Step 4. Disseminate findings

To complete the positive deviance approach, we will use a multi-component, active dissemination strategy to communicate findings [73, 101, 102]. Our systematic approach will be guided by principles of Designing for Dissemination (D4D) to effectively reach all relevant groups [103,104,105,106]. D4D applies the concept of audience segmentation, acknowledging that communication with different audiences requires messages and channels specific to their needs and preferences [107,108,109,110]. Keeping a focus on dissemination from the beginning of the project ensures that our findings are useful, relevant, and ready for dissemination before funding ends.

Because we will have LHD practitioners and academics as members of our Practice Advisory Group, we will have natural advocates for project dissemination. Our Practice Advisory Group will: 1) review the initial findings from the project phases, especially with regard to their relevance for practitioners; 2) identify resources for enhancing the reach of the project; and 3) identify opportunities for disseminating project findings. LHD practitioners and their academic partners will be asked to provide input on formats and channels for dissemination materials (e.g., reports, webinars, toolkits, social media platforms).

Strategies will then be employed according to the target audience. Scientific researchers will be reached via publications in high-impact, practice-oriented peer-reviewed journals and at research meetings. Public health practitioners will be reached through a combination of channels, including our partner organizations, which regularly communicate with their members through email and webinars. All materials from our project (key findings, survey instruments) will be posted on our center website. A particular emphasis will be placed on social media (e.g., LinkedIn, Facebook), as these platforms are increasingly important for communicating with practitioners and researchers [111, 112].

Discussion

This study will contribute to the public health-focused implementation science literature in several ways. First, to our knowledge, this is the first experimental, longitudinal study focused on local-level collaborations that leverage the expertise of LHDs and academics to improve public health practice. Second, this study will advance bridging factors research [44, 45, 49], specifically by refining methods used to describe bridging factors and how they influence EBPP implementation, identifying modifiable aspects of bridging factors, and testing implementation strategies to strengthen bridging factors. Third, this study examines new models for how public health practice and academic public health can work together to meet common goals. The Institute of Medicine (now the National Academy of Medicine) noted the potential benefits of community-academic partnerships for public health education and practice [26], and this study could help make those recommendations a reality. Lastly, a project on the scale intended here has the potential to begin to shift the paradigm on how research on EBPPs can be more quickly and effectively translated to those in the best position to use the evidence (here, LHD practitioners) and how public health research and practice can be efficiently funded to improve public health outcomes.

There are several potential limitations to this study. First, many AHD partnerships enhance workforce development broadly through activities such as internship programs and training programs across multiple topics and may or may not specifically address cancer prevention and control. Second, while the strategies AHD partnerships employ may strengthen the partnership overall, impacts on cancer control EBPP implementation may take longer than the allotted intervention period to appear. Third, the number of cancer control EBPPs implemented by LHDs, our main outcome variable, is a narrow parameter that, while measurable, might not capture enhanced implementation quality, sustainability, impact, or strength of the AHD partnerships.

Conclusion and impact

AHD partnerships play a critical role in advancing the implementation of EBPPs [40]. By learning how AHD partnerships work and how effective various strategies are in supporting them, we can identify relevant, effective capacity building and implementation strategies to increase local capacity for cancer prevention and control and other, similar EBPPs. More broadly, knowledge from this study can inform future work involving the assessment of or intervention on bridging factors. The implementation strategy cost information we will collect may be useful not only to LHDs and AHD partnerships seeking to increase EBPP uptake, but also to inform other goals in AHD partnerships and public health practice. Our results have the potential to impact the field by identifying new, sustainable models for how public health practitioners and academics can work together to meet common goals, increase use of EBPPs, and make efficient use of limited resources.

Data availability

Not applicable.

Abbreviations

EBPPs: Evidence-based programs and policies

LHD: Local health department

AHD: Academic-public health department

EPIS: Exploration, Preparation, Implementation, and Sustainment

D4D: Designing for Dissemination

References

  1. American Cancer Society. Cancer Facts & Figures 2020. http://www.cancer.org. Accessed 27 Feb 2024.

  2. Colditz GA, Wolin KY, Gehlert S. Applying what we know to accelerate cancer prevention. Sci Transl Med. 2012;4(127):127rv4. https://doi.org/10.1126/scitranslmed.3003218.

  3. Byers T, Mouchawar J, Marks J, et al. The American Cancer Society challenge goals. How far can cancer rates decline in the U.S. by the year 2015? Cancer. 1999;86(4):715–727.

  4. Willett WC, Colditz GA, Mueller NE. Strategies for Minimizing Cancer Risk. Sci Am. 1996;275(3):88–95. https://doi.org/10.1038/scientificamerican0996-88.

  5. Emmons KM, Colditz GA. Realizing the Potential of Cancer Prevention — The Role of Implementation Science. N Engl J Med. 2017;376(10):986–90. https://doi.org/10.1056/nejmsb1609101.

  6. National Research Council. Fulfilling the Potential for Cancer Prevention and Early Detection. National Academies Press; 2003. https://doi.org/10.17226/10263.

  7. Remington PL, Brownson RC, Wegner MV. Chronic Disease Epidemiology, Prevention, and Control, 4th ed. Washington, DC: APHA Press; 2016.

  8. The National Cancer Institute. Cancer Moonshot Blue Ribbon Panel Report 2016. https://www.cancer.gov/research/key-initiatives/moonshot-cancer-initiative/blue-ribbon-panel/blue-ribbon-panel-report-2016.pdf. Accessed 26 Feb 2024.

  9. Briss PA, Brownson RC, Fielding JE, Zaza S. Developing and Using the Guide to Community Preventive Services: Lessons Learned About Evidence-Based Public Health. Annu Rev Public Health. 2004;25(1):281–302. https://doi.org/10.1146/annurev.publhealth.25.050503.153933.

  10. Sanchez MA, Vinson CA, La Porta M, Viswanath K, Kerner JF, Glasgow RE. Evolution of Cancer Control P.L.A.N.E.T.: moving research into practice. Cancer Causes & Control. 2012;23(7):1205–1212. https://doi.org/10.1007/s10552-012-9987-9.

  11. Task Force on Community Preventive Services. Guide to Community Preventive Services. Centers for Disease Control and Prevention. www.thecommunityguide.org. Accessed 5 June 2016.

  12. Brownson RC, Fielding JE, Maylahn CM. Evidence-Based Public Health: A Fundamental Concept for Public Health Practice. Annu Rev Public Health. 2009;30(1):175–201. https://doi.org/10.1146/annurev.publhealth.031308.100134.

  13. Brownson RC, Gurney JG, Land GH. Evidence-Based Decision Making in Public Health. J Public Health Manag Pract. 1999;5(5):86–97. https://doi.org/10.1097/00124784-199909000-00012.

  14. Baker EA, Brownson RC, Dreisinger M, McIntosh LD, Karamehic-Muratovic A. Examining the Role of Training in Evidence-Based Public Health: A Qualitative Study. Health Promot Pract. 2009;10(3):342–8. https://doi.org/10.1177/1524839909336649.

  15. Kohatsu ND, Robinson JG, Torner JC. Evidence-based public health. Am J Prev Med. 2004;27(5):417–21. https://doi.org/10.1016/j.amepre.2004.07.019.

  16. Public Health Accreditation Board. Public Health Accreditation Board Standards and Measures, Version 1.5. 2013.; 2013. http://www.phaboard.org/wp-content/uploads/SM-Version-1.5-Board-adopted-FINAL-01-24-2014.docx.pdf. Accessed 26 Feb 2024.

  17. Hannon PA, Maxwell AE, Escoffery C, et al. Adoption and Implementation of Evidence-Based Colorectal Cancer Screening Interventions Among Cancer Control Program Grantees, 2009–2015. Prev Chronic Dis. 2019;16:180682. https://doi.org/10.5888/pcd16.180682.

  18. Hannon PA, Fernandez ME, Williams RS, et al. Cancer Control Planners’ Perceptions and Use of Evidence-Based Programs. J Public Health Manag Pract. 2010;16(3):E1–8. https://doi.org/10.1097/PHH.0b013e3181b3a3b1.

  19. National Association of County and City Health Officials. Local Health Departments Protect the Public’s Health.; 2014. http://www.naccho.org/advocacy/upload/flyer_what-is-lhd-2013_v3.pdf. Accessed 26 Feb 2024.

  20. Mays G. Organization of the public health delivery system. In: Novick L, Morrow C, Mays G, eds. Public Health Administration Principles for Population-Based Management. 2nd ed. Jones and Bartlett; 2008:69–126.

  21. Turnock BJ. Public Health: Career Choices That Make a Difference. Sudbury, MA: Jones and Bartlett Publishers; 2006.

  22. National Association of County and City Health Officials. Diabetes. National Association of County and City Health Officials. http://www.naccho.org/topics/hpdp/diabetes/. Accessed 26 Feb 2024.

  23. Mays GP, Smith SA. Evidence Links Increases In Public Health Spending To Declines In Preventable Deaths. Health Aff. 2011;30(8):1585–93. https://doi.org/10.1377/hlthaff.2011.0196.

  24. Institute of Medicine (US) Committee for the Study of the Future of Public Health. The Future of Public Health. Washington, DC: National Academy Press; 1988.

  25. Green LW, Ottoson JM, García C, Hiatt RA. Diffusion Theory and Knowledge Dissemination, Utilization, and Integration in Public Health. Annu Rev Public Health. 2009;30(1):151–74. https://doi.org/10.1146/annurev.publhealth.031308.100049.

  26. Institute of Medicine. The Future of the Public’s Health in the 21st Century. Washington, DC: The National Academies Press; 2003.

  27. Yancey AK, Fielding JE, Flores GR, Sallis JF, McCarthy WJ, Breslow L. Creating a Robust Public Health Infrastructure for Physical Activity Promotion. Am J Prev Med. 2007;32(1):68–78. https://doi.org/10.1016/j.amepre.2006.08.029.

  28. Frieden TR. Asleep at the Switch: Local Public Health and Chronic Disease. Am J Public Health. 2004;94(12):2059–61. https://doi.org/10.2105/AJPH.94.12.2059.

  29. Prentice B, Flores G. Local health departments and the challenge of chronic disease: lessons from California. Prev Chronic Dis. 2007;4(1):A15.


  30. Brownson RC, Ballew P, Dieffenderfer B, et al. Evidence-Based Interventions to Promote Physical Activity. Am J Prev Med. 2007;33(1):S66–78. https://doi.org/10.1016/j.amepre.2007.03.011.

  31. Brownson RC, Allen P, Duggan K, Stamatakis KA, Erwin PC. Fostering More-Effective Public Health by Identifying Administrative Evidence-Based Practices. Am J Prev Med. 2012;43(3):309–19. https://doi.org/10.1016/j.amepre.2012.06.006.

  32. Jacobs JA, Dodson EA, Baker EA, Deshpande AD, Brownson RC. Barriers to Evidence-Based Decision Making in Public Health: A National Survey of Chronic Disease Practitioners. Public Health Rep. 2010;125(5):736–42. https://doi.org/10.1177/003335491012500516.

  33. Armstrong R, Waters E, Roberts H, Oliver S, Popay J. The role and theoretical evolution of knowledge translation and exchange in public health. J Public Health (Bangkok). 2006;28(4):384–9. https://doi.org/10.1093/pubmed/fdl072.

  34. Maylahn C, Bohn C, Hammer M, Waltz EC. Strengthening Epidemiologic Competencies among Local Health Professionals in New York: Teaching Evidence-Based Public Health. Public Health Rep. 2008;123(1_suppl):35–43. https://doi.org/10.1177/00333549081230S110.

  35. Keck WC. Lessons Learned from an Academic Health Department. J Public Health Manag Pract. 2000;6(1):47–52. https://doi.org/10.1097/00124784-200006010-00008.

  36. Council on Education for Public Health. Accreditation criteria and procedures. https://ceph.org/about/org-info/criteria-procedures-documents/criteria-procedures/. Accessed 8 July 2020.

  37. Public Health Accreditation Board. Standards and Measures for Initial Accreditation. https://phaboard.org/standards-and-measures-for-initial-accreditation/. Accessed 8 July 2020.

  38. Gordon AK, Chung K, Handler A, Turnock BJ, Schieve LA, Ippoliti P. Final Report on Public Health Practice Linkages Between Schools of Public Health and State Health Agencies. J Public Health Manag Pract. 1999;5(3):25–34. https://doi.org/10.1097/00124784-199905000-00006.

  39. Erwin PC, Keck CW. The Academic Health Department. J Public Health Manag Pract. 2014;20(3):270–7. https://doi.org/10.1097/PHH.0000000000000016.

  40. Erwin PC, Parks RG, Mazzucca S, et al. Evidence-Based Public Health Provided Through Local Health Departments: Importance of Academic-Practice Partnerships. Am J Public Health. 2019;109(5):739–47. https://doi.org/10.2105/AJPH.2019.304958.

  41. Erwin PC, McNeely CS, Grubaugh JH, Valentine J, Miller MD, Buchanan M. A Logic Model for Evaluating the Academic Health Department. J Public Health Manag Pract. 2016;22(2):182–9. https://doi.org/10.1097/PHH.0000000000000236.

  42. Erwin PC, Barlow P, Brownson RC, Amos K, Keck CW. Characteristics of Academic Health Departments. J Public Health Manag Pract. 2016;22(2):190–3. https://doi.org/10.1097/PHH.0000000000000237.

  43. 2019 National Profile of Local Health Departments. Washington, DC: National Association of County and City Health Officials; 2020.

  44. Lengnick-Hall R, Stadnick NA, Dickson KS, Moullin JC, Aarons GA. Forms and functions of bridging factors: specifying the dynamic links between outer and inner contexts during implementation and sustainment. Implement Sci. 2021;16(1):34. https://doi.org/10.1186/s13012-021-01099-y.

  45. Moullin JC, Dickson KS, Stadnick NA, Rabin B, Aarons GA. Systematic review of the Exploration, Preparation, Implementation, Sustainment (EPIS) framework. Implement Sci. 2019;14(1):1. https://doi.org/10.1186/s13012-018-0842-6.

  46. Bradley EH, Curry LA, Ramanadhan S, Rowe L, Nembhard IM, Krumholz HM. Research in action: using positive deviance to improve quality of health care. Implement Sci. 2009;4(1):25. https://doi.org/10.1186/1748-5908-4-25.

  47. Nilsen P. Making sense of implementation theories, models and frameworks. Implement Sci. 2015;10:53. https://doi.org/10.1186/s13012-015-0242-0.

  48. Aarons GA, Hurlburt M, Horwitz SM. Advancing a conceptual model of evidence-based practice implementation in public service sectors. Adm Policy Ment Health. 2011;38(1):4–23. https://doi.org/10.1007/s10488-010-0327-7.

  49. Lengnick-Hall R, Willging C, Hurlburt M, Fenwick K, Aarons GA. Contracting as a bridging factor linking outer and inner contexts during EBP implementation and sustainment: a prospective study across multiple U.S. public sector service systems. Implement Sci. 2020;15(1):43. https://doi.org/10.1186/s13012-020-00999-9.

  50. Creswell JW. A Concise Introduction to Mixed Methods Research. 2nd ed. Thousand Oaks, CA: Sage Publications; 2014.

  51. Public Health Foundation. Academic Health Department Learning Community. http://www.phf.org/programs/AHDLC/Pages/Academic_Health_Department_Learning_Community.aspx. Accessed 4 July 2020.

  52. National Cancer Institute. Research-tested intervention programs (RTIPs). http://rtips.cancer.gov/rtips/index.do. Accessed 12 July 2013.

  53. Forsyth B, Lessler J. Cognitive laboratory methods: a taxonomy. In: Biemer P, Groves R, Lyberg L, Mathiowetz N, Sudman S, editors. Measurement Errors in Surveys. Wiley-Interscience; 1991. p. 395–418.


  54. Jobe JB, Mingay DJ. Cognitive laboratory approach to designing questionnaires for surveys of the elderly. Public Health Rep. 1990;105(5):518–24.


  55. Jobe JB, Mingay DJ. Cognitive research improves questionnaires. Am J Public Health. 1989;79(8):1053–5. https://doi.org/10.2105/AJPH.79.8.1053.

  56. Willis G. Cognitive Interviewing: A Tool for Improving Questionnaire Design. Thousand Oaks, CA: Sage Publications; 2005.

  57. Palinkas LA, Horwitz SM, Green CA, Wisdom JP, Duan N, Hoagwood K. Purposeful Sampling for Qualitative Data Collection and Analysis in Mixed Method Implementation Research. Adm Policy Ment Health. 2015;42(5):533–44. https://doi.org/10.1007/s10488-013-0528-y.

  58. Creswell JW, Poth CN. Qualitative Inquiry and Research Design. Thousand Oaks, CA: Sage Publications; 2018.

  59. QSR International. NVivo 12 for Windows. https://www.qsrinternational.com/nvivo-qualitative-data-analysis-software/about/nvivo. Accessed 22 July 2020.

  60. Patton MQ. Qualitative Research and Evaluation Methods. 3rd ed. Thousand Oaks, CA: Sage Publications; 2002.

  61. Brownson RC, Allen P, Jacob RR, et al. Controlling Chronic Diseases Through Evidence-Based Decision Making: A Group-Randomized Trial. Prev Chronic Dis. 2017;14:170326. https://doi.org/10.5888/pcd14.170326.

  62. Jacobs JA, Duggan K, Erwin P, et al. Capacity building for evidence-based decision making in local health departments: scaling up an effective training approach. Implement Sci. 2014;9(1):124. https://doi.org/10.1186/s13012-014-0124-x.

  63. Parks RG, Tabak RG, Allen P, et al. Enhancing evidence-based diabetes and chronic disease control among local health departments: a multi-phase dissemination study with a stepped-wedge cluster randomized trial component. Implement Sci. 2017;12(1):122. https://doi.org/10.1186/s13012-017-0650-4.

  64. Brownson RC, Diem G, Grabauskas V, et al. Training practitioners in evidence-based chronic disease prevention for global health. Promot Educ. 2007;14(3):159–63.


  65. Gibbert WS, Keating SM, Jacobs JA, et al. Training the Workforce in Evidence-Based Public Health: An Evaluation of Impact Among US and International Practitioners. Prev Chronic Dis. 2013;10:130120. https://doi.org/10.5888/pcd10.130120.

  66. Delbecq A, Van de Ven A, Gustafson D. Group Techniques for Program Planning: A Guide to Nominal Group and Delphi Processes. Scott, Foresman and Company; 1975.

  67. Lugo J. Pugh Method: How to decide between different designs? University of Notre Dame. https://sites.nd.edu/jlugo/2012/09/24/pugh-method-how-to-decide-between-different-designs/. Accessed 17 Aug 2020.

  68. Duttweiler M. Priority Setting Tools: Selected Background and Information and Techniques. Ithaca, NY: Cornell Cooperative Extension; 2007.

  69. Kumar A, Sah B, Singh AR, et al. Multicriteria decision-making methodologies and their applications in sustainable energy system/microgrids. In: Decision Making Applications in Modern Power Systems. Elsevier; 2020:1–40. https://doi.org/10.1016/B978-0-12-816445-7.00001-3.

  70. Donner A. Statistical methodology for paired cluster designs. Am J Epidemiol. 1987;126(5):972–9. https://doi.org/10.1093/oxfordjournals.aje.a114735.

  71. Green SB. The advantages of community-randomized trials for evaluating lifestyle modification. Control Clin Trials. 1997;18(6):506–13. https://doi.org/10.1016/S0197-2456(97)00013-5.

  72. Todd J, Carpenter L, Li X, Nakiyingi J, Gray R, Hayes R. The effects of alternative study designs on the power of community randomized trials: evidence from three studies of human immunodeficiency virus prevention in East Africa. Int J Epidemiol. 2003;32(5):755–62. https://doi.org/10.1093/ije/dyg150.

  73. Bero LA, Grilli R, Grimshaw JM, Harvey E, Oxman AD, Thomson MA. Getting research findings into practice: Closing the gap between research and practice: an overview of systematic reviews of interventions to promote the implementation of research findings. BMJ. 1998;317(7156):465–8. https://doi.org/10.1136/bmj.317.7156.465.

  74. Glasgow RE, Marcus AC, Bull SS, Wilson KM. Disseminating effective cancer screening interventions. Cancer. 2004;101(5 Suppl):1239–50. https://doi.org/10.1002/cncr.20509.

  75. Kerner J, Rimer B, Emmons K. Introduction to the Special Section on Dissemination: Dissemination Research and Research Dissemination: How Can We Close the Gap? Health Psychol. 2005;24(5):443–6. https://doi.org/10.1037/0278-6133.24.5.443.

  76. Rabin BA, Glasgow RE, Kerner JF, Klump MP, Brownson RC. Dissemination and Implementation Research on Community-Based Cancer Prevention. Am J Prev Med. 2010;38(4):443–56. https://doi.org/10.1016/j.amepre.2009.12.035.

  77. Mazzucca S, Parks RG, Tabak RG, et al. Assessing Organizational Supports for Evidence-Based Decision Making in Local Public Health Departments in the United States: Development and Psychometric Properties of a New Measure. J Public Health Manag Pract. 2019;25(5):454–63. https://doi.org/10.1097/PHH.0000000000000952.

  78. Bond GR, Drake RE, McHugo GJ, Peterson AE, Jones AM, Williams J. Long-Term Sustainability of Evidence-Based Practices in Community Mental Health Agencies. Adm Policy Ment Health. 2014;41(2):228–36. https://doi.org/10.1007/s10488-012-0461-5.

  79. Powell BJ, Fernandez ME, Williams NJ, et al. Enhancing the Impact of Implementation Strategies in Healthcare: A Research Agenda. Front Public Health. 2019;7. https://doi.org/10.3389/fpubh.2019.00003.

  80. Raghavan R. The Role of Economic Evaluation in Dissemination and Implementation Research. Vol 1. Oxford University Press; 2017. https://doi.org/10.1093/oso/9780190683214.003.0006.

  81. Allen P, Brownson RC, Duggan K, Stamatakis KA, Erwin PC. The Makings of an Evidence-Based Local Health Department: Identifying Administrative and Management Practices. Public Health Services and Systems Research. 1(2). https://doi.org/10.13023/FPHSSR.0102.02.

  82. Dobbins M, Cockerill R, Barnsley J, Ciliska D. Factors of the innovation, organization, environment, and individual that predict the influence five systematic reviews had on public health decisions. Int J Technol Assess Health Care. 2001;17(4):467–78.


  83. Proctor EK, Powell BJ, McMillen JC. Implementation strategies: recommendations for specifying and reporting. Implement Sci. 2013;8:139. https://doi.org/10.1186/1748-5908-8-139.

  84. Roundfield KD, Lang JM. Costs to Community Mental Health Agencies to Sustain an Evidence-Based Practice. Psychiatr Serv. 2017;68(9):876–82. https://doi.org/10.1176/appi.ps.201600193.

  85. Vale L, Thomas R, MacLennan G, Grimshaw J. Systematic review of economic evaluations and cost analyses of guideline implementation strategies. Eur J Health Econ. 2007;8(2):111–21. https://doi.org/10.1007/s10198-007-0043-8.

  86. Cidav Z, Mandell D, Pyne J, Beidas R, Curran G, Marcus S. A pragmatic method for costing implementation strategies using time-driven activity-based costing. Implement Sci. 2020;15(1):28. https://doi.org/10.1186/s13012-020-00993-1.

  87. Mazzucca S, Jacob RR, Valko CA, Macchi M, Brownson RC. The Relationships Between State Health Department Practitioners’ Perceptions of Organizational Supports and Evidence-Based Decision-Making Skills. Public Health Rep. 2021;136(6):710–8. https://doi.org/10.1177/0033354920984159.

  88. Jacob RR, Baker EA, Allen P, et al. Training needs and supports for evidence-based decision making among the public health workforce in the United States. BMC Health Serv Res. 2014;14(1):564. https://doi.org/10.1186/s12913-014-0564-7.

  89. Freedman LS, Green SB, Byar DP. Assessing the gain in efficiency due to matching in a community intervention study. Stat Med. 1990;9(8):943–52. https://doi.org/10.1002/sim.4780090810.

  90. Gail MH, Byar DP, Pechacek TF, Corle DK. Aspects of statistical design for the community intervention trial for smoking cessation (COMMIT). Control Clin Trials. 1992;13(1):6–21. https://doi.org/10.1016/0197-2456(92)90026-V.

  91. Jacobs JA, Dodson EA, Baker EA, Deshpande AD, Brownson RC. Barriers to Evidence-Based Decision Making in Public Health: A National Survey of Chronic Disease Practitioners. Public Health Rep. 2010;125(5):736–42. https://doiorg.publicaciones.saludcastillayleon.es/10.1177/003335491012500516.

    Article  PubMed  PubMed Central  Google Scholar 

  92. Brownson RC, Ballew P, Brown KL, et al. The Effect of Disseminating Evidence-Based Interventions That Promote Physical Activity to Health Departments. Am J Public Health. 2007;97(10):1900–7. https://doiorg.publicaciones.saludcastillayleon.es/10.2105/AJPH.2006.090399.

    Article  PubMed  PubMed Central  Google Scholar 

  93. Dobbins M, Hanna SE, Ciliska D, et al. A randomized controlled trial evaluating the impact of knowledge translation and exchange strategies. Implement Sci. 2009;4(1):61. https://doiorg.publicaciones.saludcastillayleon.es/10.1186/1748-5908-4-61.

    Article  PubMed  PubMed Central  Google Scholar 

  94. Gulliford MC, Adams G, Ukoumunne OC, Latinovic R, Chinn S, Campbell MJ. Intraclass correlation coefficient and outcome prevalence are associated in clustered binary data. J Clin Epidemiol. 2005;58(3):246–51. https://doiorg.publicaciones.saludcastillayleon.es/10.1016/j.jclinepi.2004.08.012.

    Article  CAS  PubMed  Google Scholar 

  95. Gulliford MC, Ukoumunne OC, Chinn S. Components of Variance and Intraclass Correlations for the Design of Community-based Surveys and Intervention Studies: Data from the Health Survey for England 1994. Am J Epidemiol. 1999;149(9):876–83. https://doiorg.publicaciones.saludcastillayleon.es/10.1093/oxfordjournals.aje.a009904.

    Article  CAS  PubMed  Google Scholar 

  96. Turner RM, Thompson SG, Spiegelhalter DJ. Prior distributions for the intracluster correlation coefficient, based on multiple previous estimates, and their application in cluster randomized trials. Clin Trials. 2005;2(2):108–18. https://doiorg.publicaciones.saludcastillayleon.es/10.1191/1740774505cn072oa.

    Article  PubMed  Google Scholar 

  97. Donner A, Klar N. Design and Analysis of Cluster Randomization Trials in Health Research. London: Arnold Publishers; 2000.

  98. Thompson SG, Pyke SDM, Hardy RJ. The design and analysis of paired cluster randomized trials: an application of meta-analysis techniques. Stat Med. 1997;16(18):2063–79. https://doiorg.publicaciones.saludcastillayleon.es/10.1002/(SICI)1097-0258(19970930)16:18%3c2063::AID-SIM642%3e3.0.CO;2-8.

    Article  CAS  PubMed  Google Scholar 

  99. Palinkas LA, Aarons GA, Horwitz S, Chamberlain P, Hurlburt M, Landsverk J. Mixed Method Designs in Implementation Research. Administration and Policy in Mental Health and Mental Health Services Research. 2011;38(1):44–53. https://doiorg.publicaciones.saludcastillayleon.es/10.1007/s10488-010-0314-z.

    Article  PubMed  Google Scholar 

  100. Albright K, Gechter K, Kempe A. Importance of Mixed Methods in Pragmatic Trials and Dissemination and Implementation Research. Acad Pediatr. 2013;13(5):400–7. https://doiorg.publicaciones.saludcastillayleon.es/10.1016/j.acap.2013.06.010.

    Article  PubMed  Google Scholar 

  101. Beaulieu MD, Brophy J, Jacques A, Blais R, Battista R, Lebeau R. Drug treatment of stable angina pectoris and mass dissemination of therapeutic guidelines: a randomized controlled trial. QJM. 2004;97(1):21–31. https://doiorg.publicaciones.saludcastillayleon.es/10.1093/qjmed/hch006.

    Article  PubMed  Google Scholar 

  102. Elder JP, Ayala GX, Campbell NR, et al. Interpersonal and Print Nutrition Communication for a Spanish-Dominant Latino Population: Secretos de la Buena Vida. Health Psychol. 2005;24(1):49–57. https://doiorg.publicaciones.saludcastillayleon.es/10.1037/0278-6133.24.1.49.

    Article  PubMed  Google Scholar 

  103. Brownson RC, Jacobs JA, Tabak RG, Hoehner CM, Stamatakis KA. Designing for Dissemination Among Public Health Researchers: Findings From a National Survey in the United States. Am J Public Health. 2013;103(9):1693–9. https://doiorg.publicaciones.saludcastillayleon.es/10.2105/AJPH.2012.301165.

    Article  PubMed  PubMed Central  Google Scholar 

  104. Tabak RG, Reis RS, Wilson P, Brownson RC. Dissemination of Health-Related Research among Scientists in Three Countries: Access to Resources and Current Practices. Biomed Res Int. 2015;2015:1–9. https://doiorg.publicaciones.saludcastillayleon.es/10.1155/2015/179156.

    Article  Google Scholar 

  105. Tabak RG, Stamatakis KA, Jacobs JA, Brownson RC. What Predicts Dissemination Efforts among Public Health Researchers in the United States? Public Health Rep. 2014;129(4):361–8. https://doiorg.publicaciones.saludcastillayleon.es/10.1177/003335491412900411.

    Article  PubMed  PubMed Central  Google Scholar 

  106. Knoepke CE, Ingle MP, Matlock DD, Brownson RC, Glasgow RE. Dissemination and stakeholder engagement practices among dissemination & implementation scientists: Results from an online survey. PLoS ONE. 2019;14(11):e0216971. https://doiorg.publicaciones.saludcastillayleon.es/10.1371/journal.pone.0216971.

    Article  CAS  PubMed  PubMed Central  Google Scholar 

  107. National Cancer Institute. Designing for Dissemination: Conference Summary Report. 2002:28; 2002. http://dccps.nci.nih.gov/d4d/d4d_conf_sum_report.pdf. Accessed 3 May 2024.

  108. Owen N, Goode A, Fjeldsoe B, Sugiyama T, Eakin E. Designing for the Dissemination of Environmental and Policy Initiatives and Programs for High-Risk Groups. In: Dissemination and Implementation Research in HealthTranslating Science to Practice. Oxford University Press; 2012:114–127. https://doiorg.publicaciones.saludcastillayleon.es/10.1093/acprof:oso/9780199751877.003.0006.

  109. Slater MD. Theory and Method in Health Audience Segmentation. J Health Commun. 1996;1(3):267–84. https://doiorg.publicaciones.saludcastillayleon.es/10.1080/108107396128059.

    Article  CAS  PubMed  Google Scholar 

  110. Brownson RC, Eyler AA, Harris JK, Moore JB, Tabak RG. Getting the word out: New approaches for disseminating public health science. J Public Health Manag Pract. 2018;24(2):102–11. https://doiorg.publicaciones.saludcastillayleon.es/10.1097/PHH.0000000000000673.

    Article  PubMed  PubMed Central  Google Scholar 

  111. Miller MR, Snook WD, Yoder EW. Social Media in Public Health Departments: A Vital Component of Community Engagement. J Public Health Manag Pract. 2020;26(1):94–6. https://doiorg.publicaciones.saludcastillayleon.es/10.1097/PHH.0000000000001125.

    Article  PubMed  Google Scholar 

  112. Luc JGY, Archer MA, Arora RC, et al. Does Tweeting Improve Citations? One-Year Results From the TSSMN Prospective Randomized Trial. Ann Thorac Surg. 2021;111(1):296–300. https://doiorg.publicaciones.saludcastillayleon.es/10.1016/j.athoracsur.2020.04.065.

    Article  PubMed  Google Scholar 

Acknowledgements

We acknowledge our Practice Advisory Group members, Julie Grubaugh, Laura Valentino, Dixie Duncan, and Mac McCullough, who have provided integral guidance for this study. We also acknowledge our partnership with the Public Health Foundation and the Council on Linkages Between Academia and Public Health Practice’s AHD Learning Community in the development, implementation, and dissemination of findings from this study. We thank the Prevention Research Center at Washington University in St. Louis team members Mary Adams and Linda Dix for administrative assistance and Cheryl Valko for the center’s support.

Funding

This study is funded by the National Cancer Institute (R37262011). Additional support is provided by the National Cancer Institute (P50CA244431), the National Institute of Diabetes and Digestive and Kidney Diseases (P30DK092950), and the Centers for Disease Control and Prevention (Cooperative Agreement number U48DP006395). The findings and conclusions in this paper are those of the authors and do not necessarily represent the official positions of the National Institutes of Health or the Centers for Disease Control and Prevention.

Author information

Authors and Affiliations

Authors

Contributions

All authors contributed to the study design. SMR, JG, PA, and MB drafted the manuscript and all authors provided input to critically revise the manuscript. RRJ and FG provided statistical expertise and will conduct the primary implementation and effectiveness analyses. ARB provided economic evaluation expertise and will conduct the economic evaluation. RLH provided advice regarding use of the EPIS framework. KA, PA, RCB, and PCE provided expertise regarding local public health departments, academic health department partnerships, and strategies to support partnerships. All authors reviewed and provided critical feedback on the final manuscript.

Corresponding author

Correspondence to Stephanie Mazzucca-Ragan.

Ethics declarations

Ethics approval and consent to participate

The study was approved by the Washington University in St. Louis Institutional Review Board in December 2023.

Consent for publication

Not applicable.

Competing interests

Ross Brownson is part of the Editorial Board for Implementation Science Communications.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.

About this article

Cite this article

Mazzucca-Ragan, S., Allen, P., Amos, K. et al. Improving cancer prevention and control through implementing academic-local public health department partnerships – protocol for a cluster-randomized implementation trial using a positive deviance approach. Implement Sci Commun 6, 20 (2025). https://doi.org/10.1186/s43058-025-00706-z

  • DOI: https://doi.org/10.1186/s43058-025-00706-z

Keywords