Relationships between internal facilitation processes and implementation outcomes among hospitals participating in a quality improvement collaborative to reduce cesarean births: a mixed-methods embedded case study

Abstract

Background

Quality improvement collaboratives (QICs) are a common strategy for implementing evidence-based practices; however, there is often variable performance between participating organizations. Few studies of QICs assess the internal facilitation (IF) processes engaged in by participating organizations, which may be key to understanding and enhancing the effectiveness of QICs as an implementation strategy. We examined IF processes among hospitals participating in Maryland’s perinatal QIC to implement national guidelines for reducing primary cesarean births.

Methods

This study followed a mixed-methods embedded case study design. We conducted qualitative interviews with internal implementation leaders at 21 QIC-participating hospitals using a guide informed by the iPARIHS and CFIR frameworks. Two investigators independently coded transcripts in Dedoose using a modified CFIR codebook that included seven IF process codes adapted from published categorizations. The investigators also independently applied the CFIR rating system to rate each IF process at each hospital as a barrier (−2, −1), facilitator (+1, +2), neutral (0), or mixed (X). Final ratings were established through consensus discussions. Average ratings were calculated by hospital and by process and charted alongside implementation outcomes from secondary data sources to identify patterns.

Results

Hospital leaders engaged in a variety of activities within each IF process. The average hospital rating across IF processes ranged from −1.1 to +1.5. The IF process with the highest average rating was project management (average: 1.0; SD: 0.9), the lowest was planning (average: 0.5; SD: 1.0), and the most variable was providing individual support and accountability (average: 0.5; SD: 1.2). Negative ratings resulted from hospital teams not engaging in an IF process or from their activities being insufficient to overcome related contextual barriers. Average IF process ratings were significantly higher among hospitals that implemented more than the median number of practice changes. Multiple contextual determinants influenced each IF process; work infrastructure and relational connections were the most frequent influences across IF processes.

Conclusions

IF processes played an important role in determining implementation success at hospitals participating in a perinatal QIC. Monitoring and strengthening IF processes at participating organizations may enhance the effectiveness of QICs as an implementation strategy.

Background

Quality improvement collaboratives (QICs), or learning collaboratives, are an implementation strategy [1] that has been used in a wide range of clinical settings for incorporating evidence-based practices into routine care [2, 3]. The collaborative approach, popularized by the Institute for Healthcare Improvement (IHI) “Breakthrough Series” model in the 1990s [4], typically convenes a group of teams from multiple healthcare organizations who commit to work towards a common goal for improving practice and patient outcomes during a period of one to two years. The effectiveness of QICs as an implementation strategy has been the subject of many studies across clinical areas. Systematic reviews have concluded that QICs are generally, but not always, effective strategies for improving clinical outcomes across organizations [2, 5].

In the United States, the QIC has become a primary strategy for improving maternal health outcomes. Maternal outcomes in the United States are poor compared to peer nations and have worsened in recent decades [6]. Unwarranted variations in care and suboptimal implementation of evidence-based practices contribute to poor maternal outcomes [7]. To support implementation of evidence-based maternity care, U.S. agencies have funded the development of consensus maternal safety bundles [8] and their implementation through state perinatal quality collaboratives [9]. Since the first state perinatal QICs were founded between 2005 and 2010 [10], the strategy has expanded to almost all states, including 36 state perinatal QICs that receive funding from the U.S. Centers for Disease Control and Prevention (CDC) [11]. Several state perinatal QICs have reported marked success in improving outcomes, including reducing morbidity for patients with obstetric hemorrhage in California [12] and perinatal substance use disorder in Ohio [13], as well as reducing cesarean births in Maryland and California [14, 15].

Strategies deployed within a collaborative that are considered to contribute to its success may include facilitation or technical assistance from the entity coordinating the collaborative [16], cross-organization learning activities [17, 18], and accountability and normative pressure [19, 20]. However, overall improvement across all organizations participating in a collaborative can mask failures to improve among some participating organizations [5, 18, 20]. In fact, evaluations of perinatal QICs commonly report that up to one-third of participating hospitals do not make the expected improvements in patient outcomes [12, 21, 22]. As a result, recent studies have emphasized the need to understand the processes and mechanisms through which QICs achieve improvements in clinical care, both within and between participating organizations [20].

The concept of implementation facilitation, as defined in the integrated PARIHS framework, provides a lens for understanding variable performance within QICs. The iPARIHS framework considers implementation facilitation as “the construct that activates implementation through assessing and responding to characteristics of the innovation and the recipients (both as individuals and in teams) within their contextual setting” [23]. In other words, implementation facilitation is a series of actions performed by one or more individuals in a facilitator role. Within QICs, implementation facilitation can be viewed as operating at two levels—external facilitation by collaborative organizers, who operate outside participating organizations, and internal facilitation that is performed by organizational leaders and is particular to each participating organization.

External facilitation is one of the most well-tested implementation strategies, and the evidence largely supports its effectiveness [24]. However, implementation facilitation has been characterized as a “black box” implementation strategy, with insufficient attention paid to its processes and mechanisms [25]. Internal facilitation is not as well studied as external facilitation, and is less easily observed by implementation support personnel and researchers. Few studies assess the internal facilitation activities at organizations participating in QICs [2, 26], which may be key to understanding and enhancing the effectiveness of QICs as an implementation strategy. For example, the extent of activity by members of participating organizations has been noted as a determinant of QIC success [27].

This study examined internal facilitation processes among hospitals participating in Maryland’s perinatal QIC to implement an obstetric patient safety bundle for reducing primary cesarean births between 2016 and 2018. The objectives of the study were to: 1) describe the internal facilitation roles and processes of labor & delivery staff leading their organization’s participation in the QIC; and, 2) assess the relationship between internal facilitation processes and implementation outcomes.

Methods

Setting and intervention

This study took place as part of an evaluation of the Maryland Perinatal-Neonatal Quality Collaborative’s initiative to reduce cesarean birth rates between July 2016 and December 2018. The initiative aligned with the Healthy People 2020 goal of a cesarean rate of 24.7% among first-time mothers with low-risk pregnancies [28]. The year prior to the initiative, Maryland’s primary cesarean rate was 28.5%, 3 percentage points above the national average [29].

Thirty-one of the thirty-two birthing hospitals in Maryland participated in the initiative and committed to implement practice changes from the Alliance for Innovation on Maternal Health (AIM) Program’s “Safe Reduction of Primary Cesarean Birth” maternal safety bundle (hereafter, the bundle) [30]. At that time, the bundle included 26 discrete policies, practices, and implementation strategies aligned with practice guidelines of the American College of Obstetricians and Gynecologists and other professional associations in maternity care [15]. The Maryland Patient Safety Center, a non-profit organization, served as the external facilitator for the collaborative. External facilitation activities were modelled after the IHI Breakthrough Series [4], and included presentations from national experts, collaborative-wide calls for sharing information and progress, and required monthly data submission of select process and structure measures specified by the AIM program (see Appendix 1 for further details) [15].

Conceptual framework and design

This embedded case study [31] followed a sequential-explanatory mixed-methods design [32]. Quantitative data collected through a previously-published evaluation of Maryland’s initiative [15] allowed for description of implementation outcomes among participating hospitals. These outcomes informed the qualitative methods, which sought to understand the implementation processes and contextual determinants that characterized high- and low-performing hospitals [33]. We adapted the iPARIHS framework to the context of QICs to guide the study (Fig. 1). Specifically, we considered facilitation as taking place both external to participating hospitals (e.g., from the Maryland Patient Safety Center) and internal to participating hospitals. This study focused on internal facilitation (IF) processes and context within each hospital.

Fig. 1 Conceptual model of external and internal facilitation in quality improvement collaboratives. Note: Adapted from the iPARIHS framework

Implementation outcomes

We conducted secondary analysis of implementation outcomes among participating hospitals to identify those with the highest and lowest performance during the Maryland perinatal QIC's cesarean initiative. Hospital implementation outcomes were assessed through surveys of internal facilitators at participating hospitals. The survey, described in detail elsewhere [15], included questions about the individuals participating in internal facilitation and the implementation status of all 26 bundle-recommended practices at the beginning and end of the collaborative. Response options for the implementation status of each practice included, “not started,” “planning,” “partially implemented during the collaborative,” “fully implemented during the collaborative,” and “fully implemented prior to the collaborative.” To compare relative implementation performance, we calculated the number of practices, from the list of 26 recommendations, that each hospital reported fully implementing during the collaborative.
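To make this outcome calculation concrete, the sketch below shows one way the count of fully implemented practices could be computed from survey responses. It is a minimal illustration with hypothetical column names and invented example rows, not the study team's actual data structure or code.

```python
# Minimal sketch of the implementation-outcome calculation described above.
# Column names and example rows are hypothetical, not the study's survey data.
import pandas as pd

# One row per hospital-practice pair: each internal facilitator reported the
# end-of-collaborative implementation status of all 26 bundle practices.
responses = pd.DataFrame({
    "hospital": ["A", "A", "A", "B", "B", "B"],
    "practice": [1, 2, 3, 1, 2, 3],
    "status": [
        "fully implemented during the collaborative",
        "partially implemented during the collaborative",
        "fully implemented prior to the collaborative",
        "fully implemented during the collaborative",
        "fully implemented during the collaborative",
        "not started",
    ],
})

# Count, per hospital, the practices fully implemented *during* the collaborative;
# practices already in place beforehand do not count toward the outcome.
fully_during = responses["status"] == "fully implemented during the collaborative"
changes_per_hospital = (
    responses.loc[fully_during]
    .groupby("hospital")["practice"]
    .nunique()
    .reindex(responses["hospital"].unique(), fill_value=0)
)
print(changes_per_hospital)  # hospital A: 1, hospital B: 2
```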

Qualitative data collection and analysis

Following a qualitative descriptive approach [34], we subsequently conducted in-depth interviews with internal implementation facilitators at QIC-participating hospitals between September 2019 and March 2020. Twenty-one interviewed leaders were in their positions during the cesarean initiative and were included in this analysis. Leaders from eight hospitals could not be interviewed, due to the QI team leader leaving their position at the hospital (n = 4) or nonresponse to interview invitations (n = 6).

JCK conducted all interviews following a guide that was informed by the iPARIHS and Consolidated Framework for Implementation Research (CFIR 1.0) [35] frameworks and developed with input from co-investigators. Hospital profiles with prior survey results were compiled for each hospital before the interview and served as a basis for discussing implementation outcomes. Following the first two interviews, transcripts were shared with co-investigators, and interview guides were revised based on feedback. All but three interviews were audio recorded and transcribed. Three hospitals declined recording; in those cases, a research assistant attended the interview to type verbatim notes.

A coding team of two investigators (JCK and RBB) independently coded each transcript in Dedoose using a codebook based on CFIR 1.0, which was selected as the analysis framework for its robust definitions of included constructs. The investigators inductively refined operational definitions for each construct through data familiarization. During initial thematic analysis with CFIR 1.0, the analysis team determined that the existing CFIR 1.0 process constructs (engaging, planning, executing, and reflecting/evaluating) did not adequately capture the IF activities described by participants, and conducted a literature review for expanded process frameworks. We modified the most detailed framework, published by Dogherty et al. [36], to differentiate seven categories of IF processes utilized by hospitals in the study: engaging internal facilitators; planning; increasing awareness and motivation; project management; changing systems and structures of care; providing support and accountability; and assessment. The revised coding framework included operational definitions for each of these categories and references to their correspondence with constructs in the Dogherty framework, iPARIHS, and CFIR, which we updated upon publication of CFIR 2.0 [37] (see Appendix 2).

The coding team completed coding in sets of two to three transcripts at a time, and met within one week of coding each transcript to discuss the application of codes and assignment of ratings for IF processes, following a modified CFIR rating system [38]. For each hospital, the IF processes were assigned a rating as a barrier (−2, −1), facilitator (+1, +2), neutral (0), or mixed (X). The rating for each category of IF activities reflects the degree to which the activities undertaken—or not undertaken—by the internal facilitators had a positive or negative influence on implementation of the cesarean bundle (Table 1). The two members of the coding team rated each facility transcript independently. During analysis meetings, the team discussed the rationale for their independent ratings to establish a consensus rating. Average ratings were calculated to describe patterns in the strength of IF processes by hospital and IF category. We also calculated descriptive statistics to characterize the individuals engaged in IF across the collaborative.

Table 1 Internal facilitation rating rubric
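As a simplified illustration of how these consensus ratings could be aggregated, the sketch below computes average ratings by hospital and by IF process. It is an assumption-laden sketch rather than the authors' analysis code; in particular, the text does not specify how mixed (X) ratings were handled when averaging, so they are simply excluded from the numeric means here.

```python
# Sketch of aggregating consensus IF process ratings (illustrative only).
# Handling of "X" (mixed) ratings in averages is an assumption: excluded here.
import pandas as pd

# Consensus rating per hospital (rows) and IF process (columns):
# -2/-1 = barrier, 0 = neutral, +1/+2 = facilitator, "X" = mixed.
ratings = pd.DataFrame(
    {
        "engaging":   [2, 1, -1],
        "planning":   [1, "X", -2],
        "assessment": [1, 0, 1],
    },
    index=["Hospital A", "Hospital B", "Hospital C"],
)

numeric = ratings.apply(pd.to_numeric, errors="coerce")  # "X" -> NaN, dropped from means

by_hospital = numeric.mean(axis=1)  # average rating across IF processes, per hospital
by_process = numeric.mean(axis=0)   # average rating across hospitals, per IF process
process_sd = numeric.std(axis=0)    # variability of each process across hospitals

print(by_hospital.round(2))
print(pd.DataFrame({"average": by_process, "SD": process_sd}).round(2))
```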

Integration of qualitative and quantitative data

Following completion of qualitative analyses, the quantitative and qualitative results for each hospital were merged for further cross-case comparison, pattern identification, and interpretation [31, 39]. To achieve this integration of methods, we utilized joint displays, recommended for synthesizing complex data in multisite implementation research studies [40]. We prepared a case-ordered predictor-outcomes matrix [41], equivalent to a matrix heat map [40], to display IF process ratings alongside implementation outcomes. To evaluate the relationship between implementation outcomes and average IF process rating, we categorized hospitals into two outcome groups (≤ the median of 4 practice changes fully implemented, and > 4 practice changes) and performed a Mann–Whitney test in Stata 18 [42].
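The sketch below illustrates this integration step under stated assumptions: hospitals are ordered by the number of practices fully implemented, split at the median of 4, and their average IF ratings compared between groups. The study performed the Mann–Whitney test in Stata 18; the scipy equivalent is shown here only for illustration, and all values are invented placeholders rather than study data.

```python
# Illustrative sketch of the integration step: case-ordered display plus a
# Mann-Whitney comparison of average IF ratings between outcome groups.
import pandas as pd
from scipy.stats import mannwhitneyu

hospitals = pd.DataFrame({
    "hospital": list("ABCDEFG"),
    "practices_implemented": [0, 2, 4, 4, 6, 8, 12],          # outcome measure
    "avg_if_rating": [-1.1, -0.2, 0.6, 1.0, 0.9, 1.3, 1.6],   # mean of seven IF ratings
})

# Case-ordered predictor-outcomes matrix: sort hospitals by outcome before charting.
ordered = hospitals.sort_values("practices_implemented", ascending=False)
print(ordered)

# Split at the median number of fully implemented practice changes.
median_changes = hospitals["practices_implemented"].median()
low = hospitals.loc[hospitals["practices_implemented"] <= median_changes, "avg_if_rating"]
high = hospitals.loc[hospitals["practices_implemented"] > median_changes, "avg_if_rating"]

stat, p_value = mannwhitneyu(high, low, alternative="two-sided")
print(f"Mann-Whitney U = {stat:.1f}, p = {p_value:.3f}")
```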

The predictor-outcomes matrix also served as the basis for interpretation discussions among the team of investigators. Team members represented a variety of disciplines, including two nurses, three PhD-trained implementation scientists with research experience in a variety of healthcare settings, and a sociologist. Two team members (BD and JCK) were known to participants through their roles supporting the QIC, which may have influenced data and interpretations.

Results

The 21 internal facilitators who participated included 4 physicians with unit management roles, 12 nurses with unit management roles, 3 nurses not in management roles, 1 certified nurse midwife with a management role, and 1 performance improvement coordinator (Table 2). The majority of participants were white (n = 18; 85%) and female (n = 20; 95%). All but two participants had 10 or more years of experience in labor & delivery. Nineteen (90%) participants identified as the implementation leader at their hospital, while two (10%) had facilitation roles but were not the leader. Participants represented hospitals with delivery volumes ranging from 300 to over 9,000 births per year. During the two-year collaborative, the median number of practice changes fully implemented by hospitals was 4 (range: 0 to 12).

Table 2 Characteristics of interview participants and hospitals included in the study (n = 21)

Engaging internal facilitators

Sixteen hospitals were rated positively for engaging facilitators, four had mixed or neutral ratings, and one had a negative rating (see Appendix 3 for average and median ratings by process). Examples of activities within this process for positive, neutral, and negative ratings are provided in Table 3. All but one hospital reported having one or more facilitators for this initiative who served as the main point of contact for the collaborative and led the planning and execution of quality improvement activities for the unit. Many interviewed facilitators reported being assigned the role by the nature of their positions as nurse managers or medical leadership for the unit. Some hospitals had only one or two individuals serving as facilitators, while hospitals with a standing leadership committee often described that committee as playing the role of an IF team:

“We have a group called our patient safety and quality team, which has a representative from every physician group, and from every nursing unit. We meet once a month to approve and discuss new projects... Part of it was done certainly in small groups, but it always came back to the quality and safety team. They were the decision-making team.” (Nurse Facilitator, Hospital T)

Table 3 Examples of facilitator activities corresponding to positive and negative ratings within each internal facilitation process

Nurse managers served as facilitators at 20 hospitals and typically described the facilitator role as a part of their job duties. Physician engagement as facilitators was more variable between hospitals. Hospitals with mixed/neutral ratings did not have physicians playing supportive IF roles, even when physicians held leadership positions and were identified as members of an IF team. Participants described physician engagement as being influenced by individual motivation and interest, often referencing a “passion” for the topic. Several hospitals with positive ratings stated that they selected individuals to involve as facilitators based on their personal interest in the topic: “whenever we’re starting anything we always look for people that seem to have a passion around the specific thing that we’re working on” (Nurse Facilitator, Hospital M).

The one hospital with a negative rating did not assign any internal facilitator due to low relative priority of the initiative in the hospital at that time. Other contextual determinants that emerged from analysis as influencing this process were relational connections, work infrastructure, individual motivation, and individual capability (Table 4; see Appendix 4 for examples of each determinant by IF process). Work infrastructure, specifically turnover in leadership positions, was the most common barrier to the process of engaging internal facilitators. Team approaches to IF helped to ameliorate this barrier. One hospital that lost implementation momentum due to nursing leadership turnover recovered quickly due to the presence of three committed physicians on the IF team (Hospital U).

Table 4 Contextual determinants that commonly influenced internal facilitation, by process

Planning

Planning—which encompassed the extent to which facilitators developed a plan for implementation of the bundle, the nature of the plan, and how decisions were made—tied with providing support and accountability as the process with the lowest average rating across hospitals (average: 0.5; SD: 1.0). Three hospitals received a negative rating for planning, six received a neutral rating, and 12 a positive rating. The majority of hospitals considered themselves to have developed implementation plans but did not document them, as one leader stated: “I can’t say that we had something formally written out, it was just an agreed upon plan on what we were going to do” (Nurse Facilitator, Hospital P). Most hospitals planned one change at a time: “we would go down one path, and if that wasn’t going anywhere, we would try a different angle” (Nurse Facilitator, Hospital K). Planning occurred at IF team meetings in some hospitals, while others, particularly smaller hospitals, described planning during more casual interactions, “not so much sit down group meetings, but discussions at the desk” (Nurse Facilitator, Hospital P). In two outlier hospitals with strong positive planning ratings, facilitators drew on prior training to develop formally written plans following structured quality improvement approaches, such as Kotter’s Eight Steps. Hospitals with negative or neutral ratings did not plan due to competing priorities, or limited the scope of practice changes they would consider.

The AIM bundle was used as the basis for planning at close to half of hospitals, which qualitatively assessed their current practices against those recommended in the bundle. Their decisions were often influenced by the feasibility of practice changes—selecting easier changes first—and by facilitators’ perceptions of the acceptability of practice changes among staff:

“It was evaluation and discussion of the different bundle components and sort of whether the department supported this specific part of the bundle or didn’t and if they would sign off with support, it was more likely we could push that part out.” (Nurse Facilitator, Hospital D)

Other hospitals reviewed unit-level data to decide which changes to make. These hospitals typically used case reviews of primary cesarean births—a required reporting measure for the collaborative—to identify practice changes that could reduce the unit’s rate. One physician leader explained, “I had data that would indicate our greatest opportunity was around the induced patient, as their rate of primary cesarean was higher” (Physician Facilitator, Hospital I). Review of unit-level data led to different decisions at different hospitals, some of which prioritized improving labor support or electronic fetal monitoring interpretation based on discussion of their data.

Raising awareness and motivation

Ratings for the IF process of raising awareness and motivation considered the extent to which facilitators effectively made the case for change on their unit through unit or group-level communications intended to maximize participation, overcome resistance, and build consensus. Fifteen hospitals received a positive rating for this IF process, four a neutral or mixed rating, and two a negative rating. Communication activities were discussed as critical for implementation success among most participants, who considered effective communications necessary to secure staff support for practice changes. Describing her unit’s approach, one nurse leader said, “we wanted buy in, so we needed to spread a consistent vision and what our goals were early on” (Nurse Facilitator, Hospital N).

Hospitals with positive ratings used multiple channels of communication. Describing the benefits of multiple channels, one participant said, “not everybody’s physically present, not everybody reads their email much, so you’re now having to communicate across multiple modalities and multiple venues” (Physician Facilitator, Hospital I). When present, strong existing internal communications on the unit helped facilitators spread messages; communication channels that facilitators utilized included email, newsletters, fliers delivered to offices and posted in hallways and lounges, department-wide meetings including grand rounds, daily huddles, and staff education. Hospitals with negative or neutral ratings typically relied on one or two modes of communication, communicated with the unit infrequently (e.g., quarterly or less), and noted their inability to gain nurse or physician buy in.

Among most hospitals with positive ratings, facilitators described tailoring the content of messages to address implementation barriers related to perceptions of the innovation and individual motivation. Messages highlighting the strength of evidence, relative advantage, and credibility of the source of the bundle were used to overcome skepticism at many hospitals:

“Getting people to do standard work is a little more of a challenge, because nurses and physicians are like, ‘but I can’t use my critical thinking when I have standard work.’ So, we spent a lot of time helping them understand that we were going by the standard of their professional organizations. We were using evidence-based information to drive the changes we needed to make.” (Nurse Facilitator, Hospital T)

Some hospitals communicated the need for change by sharing unit-level data. Other hospitals fostered staff motivation by appealing to their concern for their patients: “we tried really hard always to put the patient in the center” (Nurse Facilitator, Hospital N). Two hospitals also described communications intended to foster external pressure for practice change from the collaborative: “It definitely helps when we can say, ‘look, other hospitals in the state are doing this,’ so that they don’t feel quite so independent” (Physician Facilitator, Hospital I).

Project management

Project management activities engaged in by hospitals included assigning roles and delegating tasks, coordinating with staff or other departments to complete activities, and mobilizing resources, such as purchasing new equipment or securing staff time. Project management was one of two IF processes with the highest average rating across hospitals (average rating: 1.0; SD: 0.9); 15 hospitals received a positive rating, two a neutral rating, and two a negative rating. Securing and mobilizing resources was the aspect of project management that participants most commonly discussed as important for implementation success. While the collaborative provided guidance and structure to hospitals, one participant noted, “it is a fair amount of work and it’s the internal hospital resources that [are] needed to get the work done” (Nurse Facilitator, Hospital D).

Hospitals with positive ratings secured and deployed multiple types of internal resources, including financial and human resources. Many facilitators distributed tasks to multiple members of the IF team or the unit. For example, several hospitals engaged nurses to abstract data for case reviews. Multiple hospitals engaged internal providers to develop and/or deliver recommended staff trainings. Facilitators also commonly deployed financial resources to purchase recommended equipment, such as peanut balls for labor positioning and wireless fetal monitoring systems. Financial resources were sometimes also needed when staff training was provided during work hours:

“We found money to pull people out of class . . . and we provided CMEs to our providers, but [at first] we really didn’t have a plan of, ok, so this is not productive time... how are we going to pay for it... and it could have created a barrier for us.” (Nurse Facilitator, Hospital N)

Many facilitators also secured some resources from outside of the unit, such as IT department staff time, and some engaged resources outside of their hospital, such as experts to provide training.

Characteristics of the hospital and unit influenced facilitators’ project management activities. Low staffing levels at some hospitals limited the human resources that could be devoted to the initiative. Some units had operating budgets that could provide financial resources for implementation activities, while others did not. Many hospitals had a performance improvement division that could support the initiative with data abstraction and quality improvement methods. Effectively leveraging this resource still required active management by the facilitators, and two hospitals acknowledged that they had not taken advantage of their hospitals’ performance improvement resources—“Every six months, they look for a research or QI candidate to help... I think people are there, but we would have to engage them ourselves” (Nurse Facilitator, Hospital H).

Changing systems & structures of care

This IF process encompassed the extent to which facilitators made changes to systems, policies, and processes on the unit, and whether adaptations were made for recommended practices. Sixteen hospitals received a positive rating for this process, two received a neutral or mixed rating, and three received a negative rating (average rating: 0.9). Hospitals with strong positive ratings instituted multiple formal policies or procedures that required or reinforced a clinical practice change. Examples of procedures that reinforced a clinical practice change at strong positive hospitals were pre-operative huddles and induction scheduling approval processes. Internal work to further define and operationalize practice changes recommended in the bundle was common among high performing hospitals. For example, hospitals described adapting recommendations to develop unit-specific induction guidelines and algorithms for managing fetal heart rate concerns.

Among hospitals with negative and neutral ratings, facilitators made no or few changes to unit policies or procedures. These hospitals typically focused on promotion of practice changes through didactic or informal education, without mandating changes:

“It’s not something the organization has put their foot down on. It’s the physician’s practice and we can’t dictate what that practice looks like, but I think the more we talk about it... I do think that we’ll get there” (Nurse Facilitator, Hospital B).

A common determinant for this IF process was mid-level leaders’ motivation to mandate change. Leaders at hospitals with positive ratings were willing to require standard practice. At one hospital, physicians who initially resisted new standardized induction guidelines eventually adopted the practice, “because they had to, because the chair of the department made it required” (Nurse Facilitator, Hospital U). In contrast, hospitals with negative ratings did not have leadership support for standardizing care: “most of the strategies were really about changing policy and many things about provider practice and we just didn’t have any leverage” (Nurse Facilitator, Hospital A). Additional barriers to this IF process at some hospitals included concerns about medical liability and unit cultures that valued clinician autonomy over standardizing clinical care.

Providing support & accountability

The providing support and accountability process included the extent to which facilitators provided feedback to individual providers in the form of mentoring or coaching, correcting practices, providing encouragement and positive reinforcement for practice changes, and audit and feedback of provider performance. This was the IF process with the highest proportion (52%) of included hospitals receiving a negative or neutral rating, and it tied with planning for the lowest average rating across hospitals (average rating: 0.5; SD: 1.2). Hospitals with strong positive ratings provided multiple forms of feedback to staff, primarily directed at physicians. Commonly used feedback mechanisms, which were recommended in the bundle, included sharing cesarean rates directly with the physician and posting physician rates on the unit. Posting of unblinded rates was considered more effective by several participants:

“We posted blinded for a while, and that didn’t get us anywhere. We found that when you started posting people’s names, they started paying a different level of attention.” (Nurse Facilitator, Hospital T)

At a few hospitals, medical leadership held individual feedback meetings with physicians whose cesarean cases were not consistent with guidelines. Supportive coaching was offered at some hospitals with strong relational connections and teamwork.

Hospitals with negative ratings either did not deliver feedback to providers on their cesarean rates and/or compliance with guidelines, or the feedback they delivered was infrequent and ineffective. In one example of ineffectiveness, physicians received their individual cesarean rates once, but expressed skepticism about the data and their negative reaction presented a barrier for further practice changes: “since everybody was so upset, [department leadership] didn’t want to push it and they didn’t want to say... we need to start doing some of the things that the collaborative had said” (Nurse Facilitator, Hospital C).

Mid-level leaders’ motivation to provide feedback to physicians was a primary determinant of this IF process at most hospitals. Medical leadership was more highly engaged in providing direct feedback to other physicians at hospitals with positive ratings than hospitals with neutral or negative ratings. For example, case reviews at one negatively-rated hospital identified opportunities for improving practice, but no corrective feedback was delivered to individual providers:

“When we finished chart reviews, they would then go to the chair of the department. She did not want to be involved in counseling or giving education and feedback to providers, other than in a classroom setting... not the one-on-one, so that kind of tied our hands a little bit.” (Nurse Facilitator, Hospital N)

Another common barrier for this process was difficulty attributing a cesarean birth to one provider due to shared management of patients. Physicians were less receptive to feedback when they considered the calculations to be incorrect or unfair.

Assessment

The assessment process included use of unit-level data to evaluate internal practices and change efforts, and often overlapped with the processes of planning and raising awareness and motivation. Assessment tied with project management as the most positively rated IF process across hospitals (average rating: 1.0; SD: 0.9); 14 hospitals received a positive rating, six received a neutral rating, and only one received a negative rating. The structure provided by the collaborative may explain the strength of most hospitals' assessment activities: a detailed data collection plan accompanied the AIM bundle, and hospitals were expected to report the specified measures to AIM’s data portal on a quarterly basis.

Among hospitals with positive ratings for assessment, internal facilitators strengthened data systems to support collaborative goals. They often worked with IT or performance improvement divisions to modify electronic records systems and build automated reports. Most hospitals with strong positive ratings also worked to improve the quality of data in existing systems, such as improving the completeness of provider documentation. Hospitals with positive ratings also used data to support other IF processes, including using unit-level data to inform planning decisions and sharing unit data to increase awareness and motivation. In contrast, hospitals with neutral ratings completed the data requirements of the collaborative, but made limited or no use of their data to advance implementation of the bundle—“I don’t think [the data] ever got back to physicians, I don’t think they actually utilized it to change practice” (Nurse facilitator, Hospital B). At the one hospital with a negative rating for assessment, completing required data collection led to disagreements between medical and nursing leaders, which stalled further implementation of the bundle (Hospital C).

Many participants noted that completing the data requirements for the initiative, particularly the case reviews, was a time-consuming activity. Work infrastructure presented a barrier for assessment when there was insufficient staff time that could be allocated to case review abstraction and analysis. The strength of each hospital’s support divisions, such as performance improvement and IT, was frequently discussed as a positive or negative contextual determinant for assessment activities. Strong support divisions were able to modify and automate assessment processes to meet many data requirements of the initiative. In some cases, metrics for this initiative were included on enterprise dashboards or reports for hospital administrators, which enhanced accountability for facilitators. In hospitals with weaker support divisions, facilitators identified inaccuracies in the data reports that they received and spent time correcting those errors.

Implementation outcomes and association with IF process ratings

The number of practice changes that participating hospitals fully implemented during the collaborative ranged from 0 to 12, with a median of four. The average IF process rating ranged from − 1.1 to 1.3 for hospitals that implemented the median number of practice changes or fewer (≤ 4), and from 0.4 to 1.6 for hospitals that implemented more than the median (≥ 5) number of changes. Figure 2 displays a matrix of IF process ratings for each hospital, sorted by the number of practice changes reported by the hospital (see Appendix 5 for black and white version). Among the three hospitals with negative average IF process ratings, the highest number of practices implemented was 2. Six of the seven hospitals with an IF process rating > 1 implemented more than the median number of practice changes. Average IF process ratings were significantly higher among hospitals that implemented more than the median number of changes (P = 0.008).

Fig. 2 Internal facilitation ratings and implementation outcomes, by hospital

Discussion

This comparative case study describes seven IF processes that labor & delivery unit facilitators engaged in as a part of their participation in a perinatal QIC. These results expand on prior evidence that the functionality and activity level of internal quality improvement teams influence QIC success [20, 26, 27] by identifying the breadth and depth of activity displayed by highly engaged hospitals. This study also demonstrates a large degree of variability in IF processes between participating hospitals and suggests a positive relationship between the strength of IF processes and implementation outcomes. Weak IF processes may help explain why, in multiple evaluations of perinatal QICs, up to one-third of hospitals did not achieve the desired improvements [12, 21, 22]. Practical implications of these results for QIC organizers, particularly in perinatal health, include the need to consider and support internal facilitators’ preparation, skills, and resources to fulfill their roles.

The seven IF processes in this study represent an expansion of some prior process frameworks [35, 43] and a simplification of others [23, 36]. These seven process areas adequately captured the vast majority of IF activities discussed by participants, while maintaining meaningful distinctions between the functions of activities. The IF activities observed in this study are also consistent with evaluations of IF skills [44, 45] and aligned with external facilitation activity categories developed by Ritchie et al. [46]. Like external facilitators, internal facilitators engage stakeholders, plan, market, adapt, and assess [46]. External facilitators demonstrated additional activities, such as network development and technical support; another distinction is that internal facilitators usually lack specialized training in facilitation [44]. In this study, internal facilitators were nurse managers or physician leaders, and implementation and change management were relatively small areas within their scope of responsibility. To better support internal facilitators, the seven process categories could be developed into tools or guides that translate implementation research findings into quality improvement practice [47] and strengthen the effectiveness of QICs.

The contextual determinants of IF processes identified in this study substantiate those that have been described in other qualitative studies of hospital quality improvement including in perinatal care [48,49,50,51]. These determinants range from high-level leadership support and collaborative, interdisciplinary unit culture to integration with IT systems, communications, and adequacy of staffing and other resources [48]. What this study adds is a mapping of contextual determinants to the internal facilitation processes in which they have the largest influence. For example, tension for change was particularly influential in the planning process, while communications had the strongest influence on the process of increasing awareness and motivation. Applying the IF process definitions to implementation studies of QICs in other settings could further our understanding of these patterns.

Strengths of this study include the integration of quantitative and qualitative data, inclusion of twenty-one hospitals, and rigorous multi-rater analysis. A limitation of this study is that we interviewed only one person per hospital, selecting the individual with the greatest facilitation role whenever possible. Interviews took place after completion of QIC activities; this retrospective approach allowed participants to reflect on their activities, barriers, and facilitators across the complete implementation period, but recall may have been limited for some individuals and for the earliest activities. As a result of staff turnover at participating hospitals, some of the individuals interviewed may not have been in their role for the entire length of the QIC’s cesarean initiative, and our inability to interview facilitators at some hospitals may have introduced bias. While most external facilitation activities were provided equally for all collaborative members (see Appendix 1), some hospitals received one ad hoc site visit from the collaborative organizer to address implementation challenges such as difficulty using the data portal and orientation of new leadership. This study was not able to evaluate the influence of those visits. An additional limitation is that this study did not consider some IF processes present in other frameworks [36] that were not prominent in this particular collaborative, or for which there was insufficient data. Knowledge management, for example, was not prominent in this case because the innovations and evidence were provided to participating hospitals in the AIM bundle. Fostering teamwork, an IF process that can be important for reducing cesarean birth [49], was not included due to limited explicit engagement and insufficient data for most hospitals.

Conclusions

This study contributes to advancing our understanding of internal facilitation and describes seven IF processes engaged in by internal facilitators at hospitals participating in a perinatal QIC. The strength of IF processes was variable between hospitals and multiple contextual determinants of implementation influenced each process. Engagement in IF processes may determine the success of organizations participating in QICs. These findings can be used to develop tools for guiding and strengthening IF in QICs.

Data availability

Data are not available to protect the privacy of qualitative interview participants and to comply with the terms of secondary data received from the Maryland Department of Health.

Abbreviations

AIM:

Alliance for Innovation on Maternal Health

CFIR:

Consolidated Framework for Implementation Research

IF:

Internal facilitation

iPARIHS:

Integrated Promoting Action on Research Implementation in Health Services

QIC:

Quality improvement collaborative

References

1. Powell BJ, Waltz TJ, Chinman MJ, et al. A refined compilation of implementation strategies: results from the Expert Recommendations for Implementing Change (ERIC) project. Implement Sci. 2015;10(1):21. https://doi.org/10.1186/s13012-015-0209-1.

2. Wells S, Tamir O, Gray J, Naidoo D, Bekhit M, Goldmann D. Are quality improvement collaboratives effective? A systematic review. BMJ Qual Saf. 2018;27(3):226–40. https://doi.org/10.1136/bmjqs-2017-006926.

3. Garcia-Elorrio E, Rowe SY, Teijeiro ME, Ciapponi A, Rowe AK. The effectiveness of the quality improvement collaborative strategy in low- and middle-income countries: a systematic review and meta-analysis. PLoS ONE. 2019;14(10):e0221919. https://doi.org/10.1371/journal.pone.0221919.

4. Institute for Healthcare Improvement. The Breakthrough Series: IHI’s Collaborative Model for Achieving Breakthrough Improvement. Institute for Healthcare Improvement; 2003.

5. Schouten LMT, Hulscher MEJL, van Everdingen JJE, Huijsman R, Grol RPTM. Evidence for the impact of quality improvement collaboratives: systematic review. BMJ. 2008;336(7659):1491–4. https://doi.org/10.1136/bmj.39570.749884.BE.

6. Douthard RA, Martin IK, Chapple-McGruder T, Langer A, Chang S. US maternal mortality within a global context: historical trends, current state, and future directions. J Women’s Health. 2021;30(2):168–77. https://doi.org/10.1089/jwh.2020.8863.

7. Miller S, Abalos E, Chamillard M, et al. Beyond too little, too late and too much, too soon: a pathway towards evidence-based, respectful maternity care worldwide. Lancet. 2016;388(10056):2176–92. https://doi.org/10.1016/S0140-6736(16)31472-6.

8. Mahoney J. The alliance for innovation in maternal health care: a way forward. Clin Obstet Gynecol. 2018;61(2):400–10. https://doi.org/10.1097/GRF.0000000000000363.

9. Henderson ZT, Ernst K, Simpson KR, et al. The national network of state perinatal quality collaboratives: a growing movement to improve maternal and infant health. J Women’s Health. 2018;27(3):221–6. https://doi.org/10.1089/jwh.2018.6941.

10. Main EK. Reducing maternal mortality and severe maternal morbidity through state-based quality improvement initiatives. Clin Obstet Gynecol. 2018;61(2):319–31. https://doi.org/10.1097/GRF.0000000000000361.

11. Centers for Disease Control and Prevention. State Perinatal Quality Collaboratives.

12. Main EK, Cape V, Abreo A, et al. Reduction of severe maternal morbidity from hemorrhage using a state perinatal quality collaborative. Am J Obstet Gynecol. 2017;216(3):298.e1–298.e11. https://doi.org/10.1016/j.ajog.2017.01.017.

13. Crane D, Marcotte M, Applegate M, et al. A statewide quality improvement (QI) initiative for better health outcomes and family stability among pregnant women with opioid use disorder (OUD) and their infants. J Subst Abuse Treat. 2019;102:53–9. https://doi.org/10.1016/j.jsat.2019.04.010.

14. Rosenstein MG, Chang SC, Sakowski C, et al. Hospital quality improvement interventions, statewide policy initiatives, and rates of cesarean delivery for nulliparous, term, singleton vertex births in California. JAMA. 2021;325(16):1631–9. https://doi.org/10.1001/jama.2021.3816.

15. Callaghan-Koru JA, DiPietro B, Wahid I, et al. Reduction in cesarean delivery rates associated with a state quality collaborative in Maryland. Obstet Gynecol. 2021;138(4):583. https://doi.org/10.1097/AOG.0000000000004540.

16. Dückers ML, Spreeuwenberg P, Wagner C, Groenewegen PP. Exploring the black box of quality improvement collaboratives: modelling relations between conditions, applied changes and outcomes. Implement Sci. 2009;4(1):74. https://doi.org/10.1186/1748-5908-4-74.

17. Carter P, Ozieranski P, McNicol S, Power M, Dixon-Woods M. How collaborative are quality improvement collaboratives: a qualitative study in stroke care. Implement Sci. 2014;9(1):32. https://doi.org/10.1186/1748-5908-9-32.

18. Nembhard IM. All teach, all learn, all improve?: the role of interorganizational learning in quality improvement collaboratives. Health Care Manage Rev. 2012;37(2):154–64. https://doi.org/10.1097/HMR.0b013e31822af831.

19. Lyndon A, Cape V. Maternal hemorrhage quality improvement collaborative lessons. MCN Am J Matern Child Nurs. 2016;41(6):363–71. https://doi.org/10.1097/NMC.0000000000000277.

20. Zamboni K, Baker U, Tyagi M, Schellenberg J, Hill Z, Hanson C. How and under what circumstances do quality improvement collaboratives lead to better outcomes? A systematic review. Implement Sci. 2020;15(1):27. https://doi.org/10.1186/s13012-020-0978-z.

21. Callaghan-Koru JA, DiPietro B, Wahid I, et al. Reduction in cesarean delivery rates associated with a state quality collaborative in Maryland. Obstet Gynecol. Published online September 9, 2021. https://doi.org/10.1097/AOG.0000000000004540.

22. Main EK, Chang SC, Cape V, Sakowski C, Smith H, Vasher J. Safety assessment of a large-scale improvement collaborative to reduce nulliparous cesarean delivery rates. Obstet Gynecol. 2019;133(4):613–23. https://doi.org/10.1097/AOG.0000000000003109.

23. Harvey G, Kitson A. PARIHS revisited: from heuristic to integrated framework for the successful implementation of knowledge into practice. Implement Sci. 2016;11(1):33. https://doi.org/10.1186/s13012-016-0398-2.

24. Hero J, Goodrich D, Ernecoff N, et al. Implementation Strategies for Evidence-Based Practice in Health and Health Care: A Review of the Evidence. RAND Health Care; 2023.

25. Kilbourne AM, Geng E, Eshun-Wilson I, et al. How does facilitation in healthcare work? Using mechanism mapping to illuminate the black box of a meta-implementation strategy. Implement Sci Commun. 2023;4(1):1–12. https://doi.org/10.1186/s43058-023-00435-1.

26. Khodyakov D, Ridgely MS, Huang C, DeBartolo KO, Sorbero ME, Schneider EC. Project JOINTS: what factors affect bundle adoption in a voluntary quality improvement campaign? BMJ Qual Saf. 2015;24(1):38–47. https://doi.org/10.1136/bmjqs-2014-003169.

27. Nix M, McNamara P, Genevro J, et al. Learning collaboratives: insights and a new taxonomy from AHRQ’s two decades of experience. Health Aff. 2018;37(2):205–12. https://doi.org/10.1377/hlthaff.2017.1144.

28. Antoine C, Young BK. Cesarean section one hundred years 1920–2020: the good, the bad and the ugly. J Perinat Med. 2021;49(1):5–16. https://doi.org/10.1515/jpm-2020-0305.

29. Martin JA, Hamilton BE, Osterman MJK, Driscoll AK, Drake P. Births: final data for 2016. Natl Vital Stat Rep. 2018;67(1):1–55.

30. Caughey AB, Cahill AG, Guise JM, Rouse DJ. Safe prevention of the primary cesarean delivery. Am J Obstet Gynecol. 2014;210(3):179–93. https://doi.org/10.1016/j.ajog.2014.01.026.

31. Yin RK. Case Study Research: Design and Methods. Sage Publications, Inc; 2008.

32. Creswell JW, Plano Clark VL. Designing and Conducting Mixed Methods Research. SAGE; 2011.

33. Palinkas LA, Mendon SJ, Hamilton AB. Innovations in mixed methods evaluations. Annu Rev Public Health. 2019;40(1):423–42. https://doi.org/10.1146/annurev-publhealth-040218-044215.

34. Kim H, Sefcik JS, Bradway C. Characteristics of qualitative descriptive studies: a systematic review. Res Nurs Health. 2017;40(1):23–42. https://doi.org/10.1002/nur.21768.

35. Damschroder L, Aron D, Keith R, Kirsh S, Alexander J, Lowery J. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci. 2009;4(1):50.

36. Dogherty EJ, Harrison MB, Graham ID. Facilitation as a role and process in achieving evidence-based practice in nursing: a focused review of concept and meaning. Worldviews Evid Based Nurs. 2010;7(2):76–89. https://doi.org/10.1111/j.1741-6787.2010.00186.x.

37. Damschroder LJ, Reardon CM, Widerquist MAO, Lowery J. The updated consolidated framework for implementation research based on user feedback. Implement Sci. 2022;17(1):75. https://doi.org/10.1186/s13012-022-01245-0.

38. Damschroder LJ, Reardon CM, Sperber N, Robinson CH, Fickel JJ, Oddone EZ. Implementation evaluation of the Telephone Lifestyle Coaching (TLC) program: organizational factors associated with successful implementation. Transl Behav Med. 2017;7(2):233–41. https://doi.org/10.1007/s13142-016-0424-6.

39. Fetters MD, Curry LA, Creswell JW. Achieving integration in mixed methods designs—principles and practices. Health Serv Res. 2013;48(6 Pt 2):2134–56. https://doi.org/10.1111/1475-6773.12117.

40. Salvati ZM, Rahm AK, Williams MS, et al. A picture is worth a thousand words: advancing the use of visualization tools in implementation science through process mapping and matrix heat mapping. Implement Sci Commun. 2023;4(1):43. https://doi.org/10.1186/s43058-023-00424-4.

41. Miles MB, Huberman AM. Qualitative Data Analysis: An Expanded Sourcebook. Sage; 1994.

42. StataCorp. Stata Statistical Software: Release 18. StataCorp LLC; 2023.

43. Baloh J, Zhu X, Ward MM. Types of internal facilitation activities in hospitals implementing evidence-based interventions. Health Care Manage Rev. 2018;43(3):229–37. https://doi.org/10.1097/HMR.0000000000000145.

44. Connolly SL, Sullivan JL, Ritchie MJ, Kim B, Miller CJ, Bauer MS. External facilitators’ perceptions of internal facilitation skills during implementation of collaborative care for mental health teams: a qualitative analysis informed by the i-PARIHS framework. BMC Health Serv Res. 2020;20(1):1–10. https://doi.org/10.1186/s12913-020-5011-3.

45. Ritchie MJ, Parker LE, Kirchner JE. From novice to expert: a qualitative study of implementation facilitation skills. Implement Sci Commun. 2020;1(1):25. https://doi.org/10.1186/s43058-020-00006-8.

46. Ritchie MJ, Kirchner JE, Townsend JC, Pitcock JA, Dollar KM, Liu CF. Time and organizational cost for facilitating implementation of primary care mental health integration. J Gen Intern Med. 2020;35(4):1001–10. https://doi.org/10.1007/s11606-019-05537-y.

47. Callaghan-Koru J, Farzin A, Ridout E, Curran G. Integrating implementation science with quality improvement to improve perinatal outcomes. Clin Perinatol. 2023;50(2):343–61. https://doi.org/10.1016/j.clp.2023.01.002.

48. Vaughn VM, Saint S, Krein SL, et al. Characteristics of healthcare organisations struggling to improve quality: results from a systematic review of qualitative studies. BMJ Qual Saf. 2019;28(1):74–84. https://doi.org/10.1136/bmjqs-2017-007573.

49. VanGompel ECW, Perez SL, Datta A, Carlock FR, Cape V, Main EK. Culture that facilitates change: a mixed methods study of hospitals engaged in reducing cesarean deliveries. Ann Fam Med. 2021;19(3):249–57. https://doi.org/10.1370/afm.2675.

50. Moniz MH, Bonawitz K, Wetmore MK, et al. Implementing immediate postpartum contraception: a comparative case study at 11 hospitals. Implement Sci Commun. 2021;2(1):42. https://doi.org/10.1186/s43058-021-00136-7.

51. Vamos CA, Thompson EL, Cantor A, et al. Contextual factors influencing the implementation of the obstetrics hemorrhage initiative in Florida. J Perinatol. 2017;37(2):150–6. https://doi.org/10.1038/jp.2016.199.

Acknowledgements

The authors would like to thank Eddie Okojie and Abby Farmer for assistance with data curation and management.

Funding

Research reported in this publication was supported by the Eunice Kennedy Shriver National Institute of Child Health and Development of the National Institutes of Health under award number R03HD096397. The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health.

Author information

Contributions

JCK: Conceptualization; Methodology; Project Administration; Data Curation; Validation; Formal Analysis; Visualization; Writing—Original Draft RB: Methodology; Validation; Formal Analysis; Writing—Review & Editing BDP: Project Administration; Methodology; Writing—Review & Editing LH: Project Administration; Writing—Review & Editing GC: Methodology; Validation; Writing—Review & Editing.

Corresponding author

Correspondence to Jennifer A. Callaghan-Koru.

Ethics declarations

Ethics approval and consent to participate

This study was reviewed and approved by the Institutional Review Board of the University of Maryland, Baltimore County (protocol # Y19 JCK21055). All individuals who participated in qualitative interviews provided informed consent.

Consent for publication

Not applicable.

Competing interests

Jennifer Callaghan-Koru has received consulting payments and speaking fees from the Alliance for Innovation on Maternal Health, a federally-funded program at the American College of Obstetricians and Gynecologists. The authors have no other competing interests to declare.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Supplementary Material 1: Appendix 1. External facilitation activities of the Maryland Perinatal-Neonatal Quality Collaborative.

Supplementary Material 2: Appendix 2. Operational definitions of internal facilitation processes.

Supplementary Material 3: Appendix 3. Descriptive statistics for internal facilitation process ratings, by process.

Supplementary Material 4: Appendix 4. Examples of contextual determinants that influenced internal facilitation processes at participating hospitals.

Supplementary Material 5: Appendix 5. Black and white outcomes-predictor matrix with numeric ratings. Description of data: These appendices provide additional detail for the methods and results described in the text and main tables.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License, which permits any non-commercial use, sharing, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if you modified the licensed material. You do not have permission under this licence to share adapted material derived from this article or parts of it. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by-nc-nd/4.0/.

About this article

Cite this article

Callaghan-Koru, J.A., Breman, R.B., DiPietro, B. et al. Relationships between internal facilitation processes and implementation outcomes among hospitals participating in a quality improvement collaborative to reduce cesarean births: a mixed-methods embedded case study. Implement Sci Commun 6, 57 (2025). https://doi.org/10.1186/s43058-025-00735-8
