
The role of organizational characteristics in intervention sustainment: findings from a quantitative analysis in 42 HIV testing clinics in Vietnam

Abstract

Background

Evidence-based intervention (EBI) sustainment is one of public health’s largest translational research problems. Fewer than half of public health EBIs are sustained long-term, and sustainment challenges are even more pressing in low- and middle-income countries (LMICs). Organizational characteristics, including organizations’ inner structures, culture, and climate, may play a key role in EBI sustainment. However, little quantitative research has examined these relationships, particularly in LMICs.

Methods

In this observational study, we assessed the association between baseline organizational characteristics and EBI sustainment within a cluster randomized implementation trial in Vietnam testing strategies to scale up Systems Navigation and Psychosocial Counseling (SNaP) for people who inject drugs (PWID) living with HIV across 42 HIV testing clinics. From the Exploration, Preparation, Implementation, and Sustainment (EPIS) Framework, five baseline organizational characteristics were selected for investigation: 1) organizational readiness for implementing change; 2) implementation leadership; 3) implementation climate; 4) percent PWID; and 5) staff workload. Six to ten months post-study completion, clinic staff and leadership completed a survey that included the Provider Report of Sustainment Scale (PRESS), a measure of EBI sustainment across a clinic. We conducted clinic-level simple and multiple linear regression analyses to evaluate the association between organizational characteristics and sustainment.

Results

A total of 218 participants (94% completion rate) completed the PRESS survey. All implementation scales had good individual-level internal consistency reliability. Clinics with high organizational readiness to change at baseline had significantly greater SNaP sustainment than clinics with low organizational readiness to change (β = 1.91, p = 0.015). None of the other organizational characteristics were associated with sustainment, controlling for study arm.

Conclusions

We identified the importance of organizational readiness for SNaP sustainment in Vietnam. This study adds to the evidence base around the relationship between organizational characteristics and HIV intervention sustainment and could inform the development of future sustainment strategies. We also identified several areas for organizational characteristic and sustainment measure advancement, including the need for pragmatic sustainment measures that also capture EBI adaptation. This research demonstrates that assessing clinics’ organizational readiness pre-implementation and providing tailored support to those with low readiness scores could improve HIV intervention sustainment for key populations.


Background

Intervention sustainment is one of the largest translational research problems in public health, as fewer than half of public health interventions are sustained long-term [1,2,3,4]. Sustainment is the extent to which a practice or intervention that has been newly implemented is continued as part of an organization’s regular operations [5]. We know that sustainment is impeded by many factors, such as lack of planning, challenges with obtaining long-term funding, and staff turnover [6]. However, less is known about the most effective approaches for sustaining evidence-based interventions (EBIs) [7].

One set of factors that may be important for intervention sustainment is organizational characteristics [8]. Organizational characteristics are organizations’ inner structures, culture, and climate; they compose the contexts needed for effective implementation and can inform the selection of implementation strategies [8, 9]. For example, organizations with weak implementation leadership (meaning that leaders do not engage in behaviors to encourage EBI implementation) will likely experience challenges with implementing new interventions and may need to undertake strategies to improve implementation leadership to achieve EBI scale-up [10].

While conceptual frameworks suggest a wide range of organizational characteristics, four key characteristics hypothesized to be critical for successful EBI implementation are: 1) implementation leadership, 2) implementation climate, 3) organizational readiness, and 4) clinic demographic variables (e.g., size) (defined below) [8, 11, 12]. Qualitative research suggests that these organizational characteristics may also be important for sustainment [13]. However, their relationships with sustainment have rarely been examined quantitatively, largely due to sample size challenges, and the few studies that have been conducted report inconsistent findings around the associations between these characteristics and sustainment [7, 14, 15]. It is also unknown which of these organizational characteristics are most important for EBI sustainment; thus, it is unclear how to intervene most effectively. Even fewer studies have quantitatively examined sustainment determinants within low- and middle-income country (LMIC) settings [13, 16, 17].

Sustained, effective interventions are particularly needed for people who inject drugs (PWID) living with HIV. While significant advances have been made in addressing the HIV epidemic globally, PWID are at greater risk for HIV infection and have poorer HIV outcomes compared with the general population [18]. There are EBIs, most of which are clinic-based, that are effective in improving HIV outcomes among PWID [19]. However, given low rates of intervention sustainment and significant sustainment barriers, more work is needed to identify factors within clinics that may facilitate EBI sustainment for PWID.

To address these gaps, we conducted a quantitative study, as an extension of a larger cluster randomized implementation trial, to assess the association between organizational characteristics and sustainment (after the removal of external study support) of an EBI for PWID living with HIV across 42 HIV testing clinics in Vietnam. We hypothesized that clinics with stronger organizational characteristics (i.e., higher implementation leadership, implementation climate, organizational readiness, and percent PWID and smaller workloads) would also have higher sustainment.

Methods

Parent study and setting

In Vietnam, despite the government’s concerted efforts to address the HIV epidemic among PWID, this population has continued to have worse HIV prevention and care outcomes than the general population [20]. In 2023, PWID in Vietnam experienced an HIV prevalence of 9.1%, compared to a prevalence of 0.3% among adults ages 15–49 in the general population. PWID also had an HIV testing and status awareness of 63% and an antiretroviral therapy (ART) coverage of 62% compared to 94% and 78%, respectively, among the general population [21]. These disparities underscore the need for the sustainment of effective EBIs for this population in Vietnam.

This quantitative study is an extension of a hybrid type III implementation-effectiveness trial (referred to as “the parent study”; NCT03952520, multiple PIs: Go and Miller) [22]. The trial assessed the scale-up of an EBI, Systems Navigation and Psychosocial Counseling (SNaP), designed to improve ART uptake and adherence among PWID. We tested two implementation strategies to scale up SNaP across HIV testing clinics in Vietnam. SNaP includes two sessions of systems navigation (over the phone or in-person) and at least one session of psychosocial counseling (with the option for additional booster sessions, as needed). SNaP was proven to be effective in HIV Prevention Trials Network (HPTN) 074, a randomized controlled trial conducted from 2015–2018 in Ukraine, Vietnam, and Indonesia [23]. HPTN 074 showed that SNaP was effective in reducing HIV mortality and transmission. It also improved ART use as well as use of medications for opioid use disorder and increased rates of viral suppression among PWID who were newly diagnosed with HIV or re-engaging in care [23, 24].

The parent study took place from 2020–2023 in 42 HIV testing clinics across ten provinces in Vietnam. Clinics were selected because of their high concentrations of PWID, both at the clinic and within the larger catchment area. The goal of the trial was to compare the effectiveness of a standard package of implementation strategies (SA arm) to a package of strategies tailored to address each site’s barriers to implementation (TA arm) in scaling up SNaP. The SA and TA arm strategies were developed through Implementation Mapping, a systematic approach for designing and tailoring implementation strategies in collaboration with implementation partners [25]. The Implementation Mapping process resulted in 15 “discrete” implementation strategies in the SA arm (e.g., booster training sessions) and a menu of 10 additional strategies (e.g., audit and feedback) that sites in the TA arm could tailor to their needs. Clinics were randomized 1:1 to the SA or TA arm, and both arms received the SNaP intervention. The study activities included surveys with clinic staff at study baseline, 12, and 24 months. Inclusion criteria in the parent study for clinic staff were being a clinic director or staff member, including systems navigators and psychosocial counselors, involved with delivering SNaP at the selected HIV testing clinics, and willingness to participate [22]. See Nguyen et al. (2020) for further details on the parent study, including the list of implementation strategies, study design, and outcomes measurement [22].

Recruitment and data collection

Between April and December 2023, research staff re-contacted all clinic staff (n = 232) (clinic directors, navigators/counselors, and phlebotomists) who had been involved with the parent study across all 42 SNaP study clinics and invited them by email to participate in a SNaP sustainment survey. This survey included a reminder summary of the core components of the SNaP intervention and the Provider Report of Sustainment Scale (PRESS) [26]. Participants were invited to take the survey six to ten months after the SNaP study had ended at their clinic, which was staggered based on their clinic’s SNaP initiation date. A member of the study team, who is fluent in English and Vietnamese and has extensive experience with translations, translated the survey into Vietnamese. The survey was self-administered in Qualtrics, and participants were compensated (50,000 VND, ~ 2 USD) for their participation. Within each clinic, we stopped participant recruitment after reaching six surveys, which was the maximum number of staff participating in SNaP at each clinic. All participants provided informed consent before completing the survey, and the study was approved by the University of North Carolina at Chapel Hill, Hanoi Medical University, and Viet Nam Ministry of Health Institutional Review Boards. This study adheres to the STROBE checklist.

Conceptual frameworks and measures

Conceptual frameworks

The widely-used Exploration, Preparation, Implementation, Sustainment (EPIS) Framework informed our selection of the five organizational characteristics that we assessed in this paper for their association with sustainment: 1) organizational readiness for implementing change; 2) implementation leadership; 3) implementation climate; 4) percent PWID; and 5) staff workload (definitions are listed below) [8, 12, 27]. These organizational characteristics are all represented within the EPIS Framework and were chosen because of their importance for EBI implementation [8, 11, 12].

PRESS scale

We used the PRESS, a 3-item scale designed as a pragmatic assessment of providers’ perceptions of an intervention’s sustainment within their clinic [26]. The questions ask participants to rate how much they agree with the following items: 1) Staff use SNaP as much as possible when appropriate; 2) Staff continue to use SNaP throughout changing circumstances; and 3) SNaP is a routine part of our practice. PRESS uses five-point Likert scale response options with responses ranging from 0 = “not at all” to 4 = “a great extent” (scale score range: 0–12). In a study in the US, the PRESS had a high Cronbach’s alpha (0.947) and face and content validity [26]. From the PRESS, we constructed an average sustainment score for each clinic.
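As a minimal illustration of the scoring described above (the item responses are hypothetical, not study data), the PRESS can be scored by summing each participant's three items (each rated 0–4, giving 0–12) and averaging within a clinic:

```python
# Illustrative sketch of PRESS scoring; responses below are hypothetical.
def press_score(items):
    """Sum of the three PRESS items (each rated 0-4) -> 0-12."""
    assert len(items) == 3 and all(0 <= v <= 4 for v in items)
    return sum(items)

def clinic_press(responses):
    """Average participant-level PRESS scores within one clinic."""
    return sum(press_score(r) for r in responses) / len(responses)

clinic = [(3, 4, 3), (2, 3, 2), (4, 4, 4)]  # three hypothetical participants
print(round(clinic_press(clinic), 2))       # -> 9.67
```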

Organizational characteristics

The five organizational characteristic measures, which correspond to our five organizational characteristics, came from the parent study’s baseline survey, which was conducted with all participating clinic staff (n = 247) at the study clinics. These measures included: 1) Organizational Readiness for Implementing Change (ORIC) [28]; 2) the Implementation Leadership Scale (ILS) [29]; and 3) the Implementation Climate Scale (ICS) [30], in addition to two clinic demographic variables: 4) PWID size and 5) staff workload.

ORIC assesses how prepared “psychologically and behaviorally” members of an organization are to implement a new change [28]. It is a 12-item measure with Likert scale response options ranging from 0 to 5 and two sub-scales that include “change commitment” and “change efficacy.” In the initial psychometric assessment of the ORIC in the US, it had high reliability (Alpha: 0.89–0.91) and strong validity [28]. We calculated an average ORIC score for each participant (scale range = 0–5, with higher scores indicating higher organizational readiness to change) and then averaged scores within clinics to create a clinic score. The ORIC was heavily skewed, so we dichotomized it at its median to create “high” and “low” values to improve interpretation. We selected the median because it is a widely-used cutoff for creating binary variables [31].
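The ORIC scoring steps above can be sketched as follows (all values are hypothetical; how a clinic falling exactly at the median would be classified is an assumption here, since the paper does not specify):

```python
# Illustrative sketch of ORIC scoring: average 12 items per participant,
# average participants within a clinic, then split clinic scores at the
# sample median into "high"/"low". Values are hypothetical.
from statistics import median

def participant_oric(items):
    """Mean of the 12 ORIC items (each 0-5)."""
    return sum(items) / len(items)

def clinic_oric(participants):
    """Mean of participant-level ORIC scores in one clinic."""
    return sum(participant_oric(p) for p in participants) / len(participants)

clinic_scores = {"A": 4.6, "B": 3.1, "C": 4.9, "D": 2.8}  # hypothetical
cut = median(clinic_scores.values())
readiness = {c: ("high" if s > cut else "low")
             for c, s in clinic_scores.items()}
print(cut, readiness)
```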

The Implementation Leadership Scale (ILS) assesses actions that leaders have taken to facilitate implementation efforts in their organizations. It includes two scale versions: one for leadership that asks about their own implementation leadership and a second for clinic staff that asks about leaders’ implementation leadership. We used both versions of the scale. The ILS contains 12 items across four sub-scales: 1) proactive leadership; 2) knowledgeable leadership; 3) supportive leadership; and 4) perseverant leadership. It has 5-point Likert scale response options ranging from 0 = “not at all” to 4 = “a very great extent” and has demonstrated good validity and reliability [29]. We averaged across the sub-scales to create a score for each participant and then averaged across participants within a clinic (total score range = 0–4, with higher scores indicating higher implementation leadership).

The Implementation Climate Scale (ICS) is an 18-item scale that assesses “the shared meaning organizational members attach to the events, policies, practices, and procedures they experience and the behaviors they see being rewarded, supported, and expected” [30]. The ICS contains six sub-scales: 1) Focus on evidence-based practice (EBP); 2) Educational support for EBP; 3) Recognition for EBP; 4) Rewards for EBP; 5) Selection for EBP; and 6) Selection for openness. We created a total score for each participant by averaging across sub-scales and then calculated an average score for each site (total score range = 0–4, with higher scores indicating stronger implementation climates). The ICS has the same response options as the ILS and also has demonstrated good validity and reliability [30].

We also assessed two clinic demographic variables, PWID size and staff workload. As part of the baseline survey, clinic leaders at each site were asked to enter these data for their site. PWID size was assessed as the percentage of PWID HIV tests out of total HIV tests in the past year. Staff workload was assessed as the total number of HIV tests in the past year divided by the number of clinic staff. Four clinic leaders did not provide data for these variables. After conducting an analysis to confirm that there was not a significant difference between clinics’ baseline and 12-month scores on these variables, we replaced the missing values with the four clinics’ respective 12-month survey responses. Both variables were highly skewed, so we dichotomized them at their medians to create “high” and “low” values to facilitate interpretation.
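The two demographic variables defined above reduce to simple ratios, followed by the same median split. A hedged sketch with hypothetical clinic records:

```python
# Illustrative sketch: percent PWID = PWID HIV tests / total HIV tests,
# staff workload = total HIV tests / number of staff, each then split at
# the sample median. All records below are hypothetical.
from statistics import median

clinics = [
    {"pwid_tests": 120, "total_tests": 600,  "staff": 4},
    {"pwid_tests": 300, "total_tests": 1500, "staff": 3},
    {"pwid_tests": 50,  "total_tests": 1000, "staff": 5},
]
for c in clinics:
    c["percent_pwid"] = 100 * c["pwid_tests"] / c["total_tests"]
    c["workload"] = c["total_tests"] / c["staff"]

cut = median(c["workload"] for c in clinics)
for c in clinics:
    c["workload_level"] = "high" if c["workload"] > cut else "low"
```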

Data analysis

Using StataBE version 17 [32], we prepared and cleaned the data, including reviewing item distributions, checking for outliers, assessing correlations and Cronbach’s alphas for the scales, and generating basic descriptive statistics. We first conducted a bivariate linear regression analysis to assess the associations between each of the organizational characteristics and the PRESS score (assessed with α = 0.05). After checking for collinearity, we ran a multiple linear regression to assess the association between all organizational characteristics and sustainment score, controlling for study arm (assessed with α = 0.05). Results are presented as regression coefficients, standard errors, 95% confidence intervals, and p-values.
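To make the bivariate step concrete (the study used Stata's regression commands; the data here are hypothetical), a closed-form ordinary least-squares fit of the clinic PRESS score on one dichotomized organizational characteristic looks like this. With a binary predictor, the slope equals the difference in mean PRESS score between "high" and "low" clinics:

```python
# Illustrative sketch of simple (bivariate) OLS; hypothetical data.
def simple_ols(x, y):
    """Closed-form OLS intercept and slope for a single predictor."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return my - slope * mx, slope

readiness = [0, 0, 1, 1, 1, 0]           # low = 0, high = 1 (hypothetical)
press = [7.0, 8.0, 9.5, 10.0, 9.0, 7.5]  # clinic PRESS scores (hypothetical)
intercept, slope = simple_ols(readiness, press)
print(round(slope, 2))  # difference in mean PRESS, high vs. low readiness
```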

Results

Two hundred forty-seven participants completed the baseline survey, which included the clinic demographics questions and the implementation climate, implementation leadership, and organizational readiness scales. Most baseline survey participants were clinic staff (63%), while 37% were site directors or vice directors. In addition to HIV counseling and testing, half of the sites also offered HIV confirmation testing and 61% offered HIV treatment. The median number of tests conducted in the last year across sites was 1,657, while the median number of positive tests was 17. The median percentage of PWID at the sites in the past year was 23% (Inter-quartile range (IQR) = 11–40%), while the median percent positive HIV tests among PWID was 36% (IQR = 10–70%). The median total number of staff members per site was 4 (IQR = 2–5) (Table 1).

Table 1 Clinic demographic characteristics (n = 42)

In total, 218 participants completed the PRESS (out of 232 possible participants) at 6–10 months post-study completion, which represented a 94% completion rate. This number of potential participants is smaller than the number at baseline, given that some clinic staff/leadership retired or changed positions. The average number of surveys completed per site was five (range = 2–6). The average PRESS score across sites was 8.54 (Standard Deviation (SD) = 1.71) (scale score range = 0–12, with higher scores indicating higher sustainment). See Table 2 for predictor descriptive statistics. All the scales had high Cronbach’s alphas: ORIC alpha = 0.96; ICS alpha = 0.95; ILS alpha = 0.95. The PRESS Cronbach’s alpha was slightly lower but still good (alpha = 0.88).
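The internal consistency figures above follow the standard Cronbach's alpha formula; a minimal sketch (with hypothetical responses, not study data):

```python
# Illustrative sketch: Cronbach's alpha for a k-item scale, using
#   alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores)).
# Responses below are hypothetical.
from statistics import pvariance

def cronbach_alpha(rows):
    """rows: one tuple of item responses per participant."""
    k = len(rows[0])
    item_vars = sum(pvariance(col) for col in zip(*rows))
    total_var = pvariance([sum(r) for r in rows])
    return k / (k - 1) * (1 - item_vars / total_var)

rows = [(4, 4, 3), (2, 3, 2), (4, 4, 4), (1, 2, 1), (3, 3, 3)]
print(round(cronbach_alpha(rows), 2))  # -> 0.96
```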

Table 2 Organizational characteristics descriptive statistics and simple linear regression of the association between organizational characteristics and sustainment score

In the simple linear regression, implementation climate (β = −0.10, p = 0.896), implementation leadership (β = −0.06, p = 0.950), percent PWID (β = 0.02, p = 0.965), staff workload (β = 0.11, p = 0.843), and study arm (TA vs. SA) (β = 0.40, p = 0.456) were not associated with sustainment. Organizational readiness was borderline significantly associated with the PRESS sustainment score (β = 0.93, p = 0.078) (Table 2).

In the multiple linear regression, we found that clinics with high organizational readiness to change had significantly greater sustainment (β = 1.91, p = 0.015) than clinics with low organizational readiness to change, controlling for study arm. Implementation climate (β = −0.79, p = 0.564), implementation leadership (β = −1.68, p = 0.350), percent PWID (β = −0.12, p = 0.825), staff workload (β = −0.15, p = 0.802), and study arm (β = 0.30, p = 0.580) were not significantly associated with PRESS sustainment score (Table 3).

Table 3 Multiple linear regression of the association between organizational characteristics and sustainment score

Discussion

To our knowledge, this is one of the first studies to examine the association between organizational characteristics and sustainment of an HIV intervention for PWID. In our analysis of the relationship between organizational characteristics and sustainment of the SNaP intervention in HIV testing clinics in Vietnam, we found that organizational readiness to change was associated with reported sustainment. In contrast, we found that implementation climate, implementation leadership, percent PWID, staff workload, and study arm were not significantly associated with sustainment. These findings were somewhat contrary to our hypothesis that all the organizational characteristic variables would be significantly associated with sustainment of SNaP post-study completion.

This was also one of the first studies to quantitatively examine the association between organizational readiness to change and intervention sustainment within an LMIC setting. The few studies that have assessed this relationship were all conducted in the US and had inconsistent findings related to the association between these two variables [14, 33,34,35,36]. These studies had different settings, interventions, and measures of readiness and sustainment than ours. They used the Organizational Readiness for Change and Organizational Readiness for Change Assessment to measure readiness, whereas we used the ORIC [37, 38]. To measure sustainment, they used claims data or unvalidated survey questions that asked participants to what extent they had continued delivering the intervention. They then dichotomized responses into “sustained” versus “not sustained” [14, 33,34,35,36]. Given the wide range of methods for measuring sustainment, including the use of unvalidated study-specific measures, greater standardization of how sustainment is measured globally would be helpful to improve confidence in sustainment outcomes and facilitate cross-study comparisons [39].

One possible explanation for why organizational readiness was associated with sustainment while the other organizational characteristics were not is that, as some researchers have hypothesized, organizational readiness may sit closer on the causal pathway to EBI implementation success (and potentially sustainment) than characteristics like implementation leadership and climate [8]. Although organizational characteristics are viewed as inter-related, if organizational readiness is more proximal to sustainment, we would be more likely to detect its relationship with sustainment and less likely to detect relationships for characteristics further from sustainment on the causal pathway (particularly given our relatively small sample size).

Additionally, given that organizational readiness was more strongly associated with sustainment in the multivariate analysis than in the bivariate analysis, and that organizational readiness, implementation climate, and implementation leadership were fairly strongly correlated, a suppressor effect (in which adding a predictor augments the predictive power of another independent variable) was likely occurring in this relationship [40]. This is one reason why bivariate relationships alone should not determine which variables are included in multivariate analyses [41].

In terms of the broader implications of this research, our findings indicate that in future scale-up of SNaP in Vietnam (and elsewhere), clinic leadership and policymakers may want to assess clinics’ readiness for implementation and provide tailored support to clinics with low readiness to change scores to increase the likelihood of long-term sustainment of SNaP. There are several approaches that may be effective for increasing organizational readiness, but they have yet to be widely tested. These include organizations using Implementation Mapping and the Readiness Building System to prioritize readiness goals and select strategies to improve readiness, and use of strategies identified from the Organizational Readiness Typology (e.g., conduct local consensus discussions, conduct educational meetings) to increase organizational readiness [42, 43]. Testing these systematic strategy selection approaches and assessing their effect on readiness will be an important next step.

Another factor that could have contributed to our null findings is that external factors may be relatively more important for SNaP sustainment than some of the organizational-level factors. In a qualitative study of factors related to SNaP sustainment, we identified external factors, such as lack of funding, barriers to getting PWID into care, and government mandates, as particularly influential in the long-term sustainment of SNaP [44]. It is possible that these external factors were driving differences in SNaP sustainment across clinics more than some of our organizational-level factors. In the literature, while one mixed methods review found that inner setting barriers/facilitators were the most commonly identified type of sustainment determinant across studies [13], a predictive study of factors influencing EBI sustainment found that external factors, including funding stability, political support, and partnerships, appeared to be more important for sustainment than inner setting factors [14]. If these external factors have a greater influence on sustainment than some of our clinic-level factors, this could have important implications for the selection of sustainment strategies in future studies.

Additionally, there are challenges with the measurement of organizational characteristics within LMIC settings. All of the organizational characteristic measures used in this study were developed in high-income countries (HICs). While these measures have previously been used within LMICs [45,46,47], researchers have questioned their applicability outside of HICs [45], given global differences in the structure and financing of healthcare and differing cultural values [48]. For example, research has found that leadership may look different in countries influenced by Confucianism (like Vietnam) than in Western countries [49]. These countries may place greater value on collectivism and paternalism than Western countries, which could have implications for the adaptation of organizational measures (e.g., the addition of a “directive leadership” dimension to the ILS that focuses on leadership mandates or vocal support around EBI implementation). In response, researchers have called for the development of more context-specific measures as part of a broader movement towards the decolonization of global implementation science [50]. We also identified several challenges in translating these measures into Vietnamese (e.g., some items, when translated directly, produced identical sentences) and had to further modify them as a result. In response to these challenges, a few implementation science measures, like the Mental Health Implementation Science Tools [45], have recently been developed to be more applicable to LMIC settings. More work is needed to adapt implementation measures to advance the study of implementation and sustainment within LMICs.

In addition to challenges with measuring organizational characteristics, there is also a dearth of sustainment measures that are both pragmatic and psychometrically strong, and even fewer that measure sustainment as an outcome rather than measuring sustainment determinants [39, 51,52,53,54]. In a systematic review of 28 sustainment measures (all of which were developed in HICs), the PRESS had the best psychometric and pragmatic properties across sustainment measures [39]. However, the PRESS does not allow for a nuanced assessment of adaptations or of partial sustainment of some, but not other, components of an EBI. This is an issue given that intervention adaptations are common and may be helpful for ensuring continued EBI fit within clinics over time [13, 55]. This illustrates the trade-off in using a highly general sustainment measure, like the 3-item PRESS, that is pragmatic and easy for busy providers to answer but may not fully capture whether an EBI has been sustained. More work on sustainment measurement is needed to develop psychometrically strong measures that can capture EBI adaptation while maintaining pragmatism. One possibility could be to develop a scale that asks implementers about the degree to which they have continued to implement each of an EBI’s core components, with open-ended questions to list adaptations that were made [7].

There are some limitations to this research. First, some of the self-reported organizational characteristics (particularly implementation leadership) and sustainment may have been subject to social desirability bias. This could have been a particular challenge in Vietnam, where there is cultural respect for hierarchy and people in authority [56]. While participants were reminded at the start of the surveys that their responses were anonymous, these factors still might have encouraged clinic staff to score leaders and organizations more highly on the implementation and sustainment measures, leading to skew in the data. Use of innovative observational methods could yield a more objective measure of sustainment. For example, standardized patients, who are trained to present with a certain condition, could be used to assess whether providers in a clinic have continued to implement an intervention and to what degree they have adapted it [57].

Additionally, while 42 clinics is a relatively large number for implementation research, it is possible that with this number of clinics and the skew in the data, we were unable to detect small to medium effect sizes in the relationships between organizational characteristics and SNaP sustainment. There was also variability in the number of surveys conducted at each site, given that the number of staff members varied across sites. As a result, extreme responses would have had a greater effect on scores in clinics with fewer staff than in clinics with more staff involved in SNaP. Finally, while there is no standard interval after which an intervention should be assessed for sustainment, measuring sustainment two or more years after study end is generally viewed as ideal in the literature [7]. This study allowed us to see what happened to SNaP in the early sustainment phase, and even at 6–10 months, we already saw variation in sustainment across the sites. An area for future research would be to repeat the sustainment measure at a later point to determine both whether SNaP has been sustained long-term in the clinics and whether organizational readiness remains significantly associated with sustainment at that later time point.

Conclusions

From our analysis of the association between theoretically important organizational characteristics and SNaP sustainment, we identified that organizational readiness to change was significantly associated with sustainment, while other organizational characteristics (implementation climate, implementation leadership, percent PWID, and staff workload) were not. We also identified several factors that might have contributed to these null findings, including potential measurement challenges. This study adds to the limited quantitative research on the association between organizational characteristics and sustainment, particularly within LMIC settings, and could inform future selection of strategies for HIV intervention sustainment.

Data availability

The datasets used and/or analysed during the current study are available from the corresponding author on reasonable request.

Abbreviations

ART: Anti-retroviral Therapy

EBI: Evidence-based Intervention

EBP: Evidence-based Practice

EPIS: Exploration, Preparation, Implementation, Sustainment

HIC: High-income Country

HPTN: HIV Prevention Trials Network

ICS: Implementation Climate Scale

ILS: Implementation Leadership Scale

IQR: Inter-quartile Range

LMIC: Low and Middle-income Country

ORIC: Organizational Readiness for Implementing Change

PRESS: Provider Report of Sustainment Scale

PWID: People Who Inject Drugs

SA: Standard Arm

SD: Standard Deviation

SNaP: Systems Navigation and Psychosocial Counseling

TA: Tailored Arm

References

  1. Proctor E, Luke D, Calhoun A, McMillen C, Brownson R, McCrary S, et al. Sustainability of evidence-based healthcare: research agenda, methodological advances, and infrastructure support. Implement Sci. 2015;10(1):88.

  2. Hodge LM, Turner KM. Sustained implementation of evidence-based programs in disadvantaged communities: a conceptual framework of supporting factors. Am J Community Psychol. 2016;58(1–2):192–210.

  3. Wiltsey Stirman S, Kimberly J, Cook N, Calloway A, Castro F, Charns M. The sustainability of new programs and innovations: a review of the empirical literature and recommendations for future research. Implement Sci. 2012;7(1):17.

  4. Pluye P, Potvin L, Denis J-L. Making public health programs last: conceptualizing sustainability. Eval Program Plann. 2004;27(2):121–33.

  5. Proctor E, Silmere H, Raghavan R, Hovmand P, Aarons G, Bunger A, et al. Outcomes for implementation research: conceptual distinctions, measurement challenges, and research agenda. Adm Policy Ment Health. 2011;38(2):65–76.

  6. Hailemariam M, Bustos T, Montgomery B, Barajas R, Evans LB, Drahota A. Evidence-based intervention sustainability strategies: a systematic review. Implement Sci. 2019;14(1):57.

  7. Shelton RC, Cooper BR, Stirman SW. The sustainability of evidence-based interventions and practices in public health and health care. Annu Rev Public Health. 2018;39(1):55–76.

  8. Aarons GA, Moullin JC, Ehrhart MG. The role of organizational processes in dissemination and implementation research. In: Brownson RC, Colditz GA, Proctor EK, editors. Dissemination and implementation research in health: translating science to practice. 2nd ed. New York: Oxford Academic; 2017. https://doi.org/10.1093/oso/9780190683214.003.0008.

  9. Powell BJ, Mandell DS, Hadley TR, Rubin RM, Evans AC, Hurford MO, et al. Are general and strategic measures of organizational context and leadership associated with knowledge and attitudes toward evidence-based practices in public behavioral health settings? A cross-sectional observational study. Implement Sci. 2017;12(1):64.

  10. Castiglione SA. Implementation leadership: a concept analysis. J Nurs Manag. 2020;28(1):94–101.

  11. Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci. 2009;4(1):50.

  12. Aarons GA, Hurlburt M, Horwitz SM. Advancing a conceptual model of evidence-based practice implementation in public service sectors. Adm Policy Ment Health. 2011;38(1):4–23.

  13. Zurynski Y, Ludlow K, Testa L, Augustsson H, Herkes-Deane J, Hutchinson K, et al. Built to last? Barriers and facilitators of healthcare program sustainability: a systematic integrative review. Implement Sci. 2023;18(1):62.

  14. Hunter SB, Han B, Slaughter ME, Godley SH, Garner BR. Predicting evidence-based treatment sustainment: results from a longitudinal study of the Adolescent-Community Reinforcement Approach. Implement Sci. 2017;12(1):75.

  15. Aarons GA, Green AE, Trott E, Willging CE, Torres EM, Ehrhart MG, et al. The roles of system and organizational leadership in system-wide evidence-based intervention sustainment: a mixed-method study. Adm Policy Mental Health Mental Health Serv Res. 2016;43(6):991–1008.

  16. Yano EM. The role of organizational research in implementing evidence-based practice: QUERI Series. Implement Sci. 2008;3:29.

  17. Allen JD, Towne SD, Maxwell AE, DiMartino L, Leyva B, Bowen DJ, et al. Measures of organizational characteristics associated with adoption and/or implementation of innovations: a systematic review. BMC Health Serv Res. 2017;17(1):591.

  18. UNAIDS. Key populations. Available from: https://www.unaids.org/en/topic/key-populations.

  19. Uusküla A, Feelemyer J, Des Jarlais DC. HIV treatment, antiretroviral adherence and AIDS mortality in people who inject drugs: a scoping review. Eur J Public Health. 2023;33(3):381–8.

  20. Nguyen TH, Nguyen TL, Trinh QH. HIV/AIDS epidemics in Vietnam: evolution and responses. AIDS Educ Prev. 2004;16(3 Suppl A):137–54.

  21. UNAIDS. Vietnam country factsheets. 2023. Available from: https://www.unaids.org/en/regionscountries/countries/vietnam.

  22. Nguyen MXB, Chu AV, Powell BJ, Tran HV, Nguyen LH, Dao ATM, et al. Comparing a standard and tailored approach to scaling up an evidence-based intervention for antiretroviral therapy for people who inject drugs in Vietnam: study protocol for a cluster randomized hybrid type III trial. Implement Sci. 2020;15(1):64.

  23. Miller WC, Hoffman IF, Hanscom BS, Ha TV, Dumchev K, Djoerban Z, et al. A scalable, integrated intervention to engage people who inject drugs in HIV care and medication-assisted treatment (HPTN 074): a randomised, controlled phase 3 feasibility and efficacy study. Lancet. 2018;392(10149):747–59.

  24. Miller W, Hoffman I, Hanscom B, Ha T, Dumchev K, Djoerban Z, et al. Impact of systems navigation and counseling on ART, SUT, and death in PWID: HPTN 074. 2018. Available from: http://www.croiconference.org/sessions/impact-systems-navigation-and-counseling-art-sut-and-death-pwid-hptn-074.

  25. Fernandez ME, ten Hoor GA, van Lieshout S, Rodriguez SA, Beidas RS, Parcel G, et al. Implementation mapping: using intervention mapping to develop implementation strategies. Front Public Health. 2019;7:158.

  26. Moullin JC, Sklar M, Ehrhart MG, Green A, Aarons GA. Provider REport of Sustainment Scale (PRESS): development and validation of a brief measure of inner context sustainment. Implement Sci. 2021;16(1):86.

  27. Moullin JC, Dickson KS, Stadnick NA, Rabin B, Aarons GA. Systematic review of the exploration, preparation, implementation, sustainment (EPIS) framework. Implement Sci. 2019;14(1):1.

  28. Shea CM, Jacobs SR, Esserman DA, Bruce K, Weiner BJ. Organizational readiness for implementing change: a psychometric assessment of a new measure. Implement Sci. 2014;9(1):7.

  29. Aarons GA, Ehrhart MG, Farahnak LR. The implementation leadership scale (ILS): development of a brief measure of unit level implementation leadership. Implement Sci. 2014;9(1):45.

  30. Ehrhart MG, Aarons GA, Farahnak LR. Assessing the organizational context for EBP implementation: the development and validity testing of the Implementation Climate Scale (ICS). Implement Sci. 2014;9(1):157.

  31. DeCoster J, Gallucci M, Iselin AMR. Best practices for using median splits, artificial categorization, and their continuous alternatives. J Exper Psychopathol. 2011;2(2):197–209.

  32. StataCorp. Stata statistical software: release 17. College Station, TX: StataCorp LLC; 2023.

  33. Swindle T, Bellows LL, Mitchell V, Johnson SL, Shakya S, Zhang D, et al. Predictors of sustainment of two distinct nutrition and physical activity programs in early care and education. Front Health Serv. 2022;2:1010305.

  34. Rodriguez A, Lau AS, Wright B, Regan J, Brookman-Frazee L. Mixed-method analysis of program leader perspectives on the sustainment of multiple child evidence-based practices in a system-driven implementation. Implement Sci. 2018;13(1):44.

  35. Lau AS, Lind T, Motamedi M, Lui JHL, Kuckertz M, Innes-Gomberg D, et al. Prospective predictors of sustainment of multiple EBPs in a system-driven implementation context: Examining sustained delivery based on administrative claims. Implement Res Pract. 2021;2:26334895211057884.

  36. Cooper BR, Bumbarger BK, Moore JE. Sustaining evidence-based prevention programs: correlates in a large-scale dissemination initiative. Prev Sci. 2015;16(1):145–57.

  37. Helfrich CD, Li Y-F, Sharp ND, Sales AE. Organizational readiness to change assessment (ORCA): development of an instrument based on the Promoting Action on Research in Health Services (PARIHS) framework. Implement Sci. 2009;4(1):38.

  38. Lehman WEK, Greener JM, Simpson DD. Assessing organizational readiness for change. J Subst Abuse Treat. 2002;22(4):197–209.

  39. Hall A, Shoesmith A, Doherty E, McEvoy B, Mettert K, Lewis CC, et al. Evaluation of measures of sustainability and sustainability determinants for use in community, public health, and clinical settings: a systematic review. Implement Sci. 2022;17(1):81.

  40. Watson D, Clark LA, Chmielewski M, Kotov R. The value of suppressor effects in explicating the construct validity of symptom measures. Psychol Assess. 2013;25(3):929–41.

  41. Pandey S, Elliott W. Suppressor variables in social work research: Ways to identify in multiple regression models. US: Society for Social Work and Research; 2010. p. 28–40.

  42. Watson AK, Hernandez BF, Kolodny-Goetz J, Walker TJ, Lamont A, Imm P, et al. Using implementation mapping to build organizational readiness. Front Public Health. 2022;10:904652.

  43. Vax S, Farkas M, Russinova Z, Mueser KT, Drainoni M-L. Enhancing organizational readiness for implementation: constructing a typology of readiness-development strategies using a modified Delphi process. Implement Sci. 2021;16(1):61.

  44. Bartels SM, Nguyen MX, Nguyen TT, Sibley AL, Dang HLT, Nong HTT, et al. Sustainment and adaptation of systems navigation and psychosocial counseling across HIV testing clinics in Vietnam: a qualitative assessment. Implement Res Pract. 2025;6:26334895251319812.

  45. Aldridge LR, Kemp CG, Bass JK, Danforth K, Kane JC, Hamdani SU, et al. Psychometric performance of the Mental Health Implementation Science Tools (mhIST) across six low- and middle-income countries. Implement Sci Commun. 2022;3(1):54.

  46. Hazim CE, Dobe I, Pope S, Ásbjörnsdóttir KH, Augusto O, Bruno FP, et al. Scaling-up and scaling-out the systems analysis and improvement approach to optimize the hypertension diagnosis and care cascade for HIV infected individuals (SCALE SAIA-HTN): a stepped-wedge cluster randomized trial. Implement Sci Commun. 2024;5(1):27.

  47. Iwelunmor J, Ezechi O, Obiezu-Umeh C, Oladele D, Nwaozuru U, Aifah A, et al. Factors influencing the integration of evidence-based task-strengthening strategies for hypertension control within HIV clinics in Nigeria. Implement Sci Commun. 2022;3(1):43.

  48. Mills A. Health care systems in low- and middle-income countries. N Engl J Med. 2014;370(6):552–7.

  49. McDonald P. Confucian foundations to leadership: a study of Chinese business leaders across Greater China and South-East Asia. Asia Pac Bus Rev. 2012;18(4):465–87.

  50. Bartels SM, Haider S, Williams CR, Mazumder Y, Ibisomi L, Alonge O, et al. Diversifying implementation science: a global perspective. Glob Health Sci Pract. 2022.

  51. Lewis CC, Mettert KD, Stanick CF, Halko HM, Nolen EA, Powell BJ, et al. The psychometric and pragmatic evidence rating scale (PAPERS) for measure development and evaluation. Implement Res Pract. 2021;2:26334895211037390.

  52. Stanick CF, Halko HM, Nolen EA, Powell BJ, Dorsey CN, Mettert KD, et al. Pragmatic measures for implementation research: development of the Psychometric and Pragmatic Evidence Rating Scale (PAPERS). Transl Behav Med. 2021;11(1):11–20.

  53. Moullin JC, Sklar M, Green A, Dickson KS, Stadnick NA, Reeder K, et al. Advancing the pragmatic measurement of sustainment: a narrative review of measures. Implement Sci Commun. 2020;1(1):76.

  54. Lewis CC, Fischer S, Weiner BJ, Stanick C, Kim M, Martinez RG. Outcomes for implementation science: an enhanced systematic review of instruments using evidence-based rating criteria. Implement Sci. 2015;10(1):155.

  55. Chambers DA, Glasgow RE, Stange KC. The dynamic sustainability framework: addressing the paradox of sustainment amid ongoing change. Implement Sci. 2013;8:117.

  56. Nguyen DTN, Teo STT, Grover SL, Nguyen NP. Respect, bullying, and public sector work outcomes in Vietnam. Public Manag Rev. 2019;21(6):863–89.

  57. Beullens J, Rethans JJ, Goedhuys J, Buntinx F. The use of standardized patients in research in general practice. Fam Pract. 1997;14(1):58–62.

Funding

This study was supported by a grant from the National Institute on Drug Abuse (NIDA), 1R01DA047876-01. SMB was also supported by NIDA through 1F31DA057893-01A. BJP was supported in part by the NIH through the following grants: R01DA047876, R25MH080916, U24HL154426, R01CA262325, and R01HD112323.

Author information

Contributions

SMB conceived the study and obtained the funding with support from VG, WCM, CB, LMR, LMG, and BJP, who also helped with protocol development. WCM and VG conceived the parent study and obtained its funding. HVT, NTKN, MXN, TTN, VATC, and VATT facilitated or contributed to data collection and interpretation of results. LMR also supported the data analysis. SMB drafted the manuscript, and VG, WCM, CB, LMR, TS, and HTTP revised it critically for important intellectual content. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Sophia M. Bartels.

Ethics declarations

Ethics approval and consent to participate

The study was approved by the University of North Carolina at Chapel Hill, Hanoi Medical University, and Viet Nam Ministry of Health Institutional Review Boards. All participants provided written informed consent before participating in the study interviews.

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License, which permits any non-commercial use, sharing, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if you modified the licensed material. You do not have permission under this licence to share adapted material derived from this article or parts of it. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by-nc-nd/4.0/.

About this article

Cite this article

Bartels, S.M., Nguyen, M.X., Nguyen, T.T. et al. The role of organizational characteristics in intervention sustainment: findings from a quantitative analysis in 42 HIV testing clinics in Vietnam. Implement Sci Commun 6, 60 (2025). https://doi.org/10.1186/s43058-025-00745-6

  • Received:

  • Accepted:

  • Published:

  • DOI: https://doi.org/10.1186/s43058-025-00745-6

Keywords