Applying the major system change framework to evaluate implementation of rapid healthcare system change: a case study of COVID-19 remote home monitoring services
Implementation Science Communications volume 6, Article number: 24 (2025)
Abstract
Background
A framework to evaluate implementation of Major System Change (MSC) in healthcare has been developed and applied to implementation of longer-term system changes. This was the first study to apply the five domains of the MSC framework to rapid healthcare system change. We aimed to: i) evaluate implementation of rapid MSC, using England's COVID-19 remote home monitoring services as a case study, and ii) consider whether and how the MSC framework can be applied to rapid MSC.
Methods
A mixed-methods rapid evaluation in England, across 28 primary and secondary healthcare sites (October 2020-November 2021; data collection: 4 months). We conducted 126 interviews (5 national leads, 59 staff, 62 patients/carers) and surveyed staff (n = 292) and patients/carers (n = 1069). Service providers completed cost surveys. Aggregated and patient-level national datasets were used to explore enrolment, service use and clinical outcomes. The MSC framework was applied retrospectively. Qualitative data were analysed thematically to explore key themes within each MSC framework domain. Descriptive statistics and multivariate analyses were used to analyse experience, costs, service use and clinical outcomes.
Results
Decision to change/Decision on model: Service development happened concurrently: i) early local development motivated by urgent clinical need, ii) national rollout using standard operating procedures, and iii) local implementation and adaptation.
Implementation approach: Services were tailored to local needs to consider patient, staff, organisational and resource factors.
Implementation outcomes: Patient enrolment was low (59% of services had enrolment rates below 10%). Service models and implementation approaches varied substantially.
Intervention outcomes: No associations were found between services and clinical outcomes. Patient and staff experiences were generally positive. However, there were barriers to delivery and engagement, with some groups finding it harder to engage.
Conclusions
Low enrolment rates and substantial variation due to tailoring services to local contexts meant it was not possible to conclusively determine service effectiveness. Process outcomes indicated areas for improvement. The MSC framework can be used to analyse rapid MSC. Implementation and factors influencing implementation may differ from non-rapid contexts (e.g. less uniformity, more tailoring). Our mixed-methods approach could inform future evaluations of large-scale rapid and non-rapid MSC in a range of conditions and services internationally.
Introduction
To ensure that patients and the public continually receive high-quality care, there has been a policy focus on pushing for change and innovation within health and care services internationally [1] and within the NHS [2, 3]. Some of these service innovations may be large and transformative, and termed ‘Major System Changes’ (MSC). MSC has been defined as “coordinated, systemwide change affecting multiple organisations and care providers, with the goal of significant improvements in the efficiency of healthcare delivery, the quality of patient care, and population-level patient outcomes” [4, p. 422]. Previous research outlined five conditions necessary for successful MSC: involving stakeholders from all levels (e.g. service leads, healthcare providers), establishing feedback loops, attending to the local history of Major System Change, engaging healthcare providers, and engaging patients and families [4, 5]. MSC has also been defined as a type of complex intervention with multiple goals and change processes covering a range of levels and settings [6].
There are different ways of implementing MSC. Top-down approaches are those which are prescribed, directed and centrally coordinated [7]. Bottom-up approaches are driven from the ground up by local clinicians developing and implementing new ways of working [7]. Both top-down and bottom-up approaches have benefits and drawbacks. For example, top-down approaches can be hindered by a lack of engagement and ownership from on-the-ground clinicians [7], whereas bottom-up approaches can be slow to take effect and adoption at scale can be low [7]. Previous research suggests that a combination of both top-down and bottom-up approaches may be beneficial [5, 7], combining centrally coordinated and resourced innovations with bottom-up engagement from local clinicians taking control and ownership [7, 8].
Implementation science theories can help researchers to understand and conceptualise how health care interventions and/or services have been implemented. A review [9] proposes that these serve three aims: process models which describe the translation of research into practice (e.g. the Knowledge-to-Action framework [10]), theoretical frameworks that seek to understand or explain factors influencing implementation outcomes (e.g. the Consolidated Framework for Implementation Research [11], Theoretical Domains Framework [12], COM-B model [13] and Normalisation Process Theory [14]), and evaluation frameworks (e.g. the RE-AIM framework [15]). However, these theories do not explicitly serve as frameworks to understand large system transformations, where local adoption and implementation decisions are connected to the co-ordination of systemic change. The MSC framework addresses this gap as it can be used to understand not only implementation processes across organisations and providers, but also relationships between stages of implementation and intervention outcomes [9]. The MSC framework [16] proposes five main inter-related domains of MSC that need to be evaluated: i) decision to change, influenced by drivers to change, governance and leadership of decision making, ii) decision on which intervention/service model to implement, iii) the implementation approach used (including context and approaches to facilitation), iv) implementation outcomes (including adoption, spread and fidelity), and v) intervention outcomes (including clinical outcomes, patient experience and cost effectiveness) [16] (see Appendix 1).
MSC occurring over several years has been evaluated using the MSC framework [16]. For example, the framework was developed and used when evaluating the reconfiguration of acute stroke services [5, 16], and it has also been used to evaluate MSC of specialist cancer services [17]. Earlier studies which applied this framework explored one type of MSC, namely the centralisation of services [5, 16, 17]. Within these evaluations, the centralisation of services happened across multiple years [5, 16, 17]. However, some MSCs may happen more rapidly. It is not yet known whether this framework is suitable as a tool to learn about implementation and sustainability of healthcare system changes that occur at pace (i.e. ‘rapidly’). For example, non-rapid MSC may take place over years, allowing time for detailed consideration of decisions to change and models to implement, and for consideration of relationships between the domains and clear mechanisms of impact. However, with rapid service change, these decisions may occur at pace and it is therefore not clear whether the domains of MSC and the relationships identified between them will apply.
Rapid system changes were particularly prominent during the emergency context of the COVID-19 pandemic [2, 18], when the healthcare system in England had to introduce large cross-sector MSCs and innovations at pace in order to cope with immediate challenges, such as the inability to deliver healthcare appointments face-to-face (due to lockdowns and risks of infection), the redeployment of workforces [19, 20], and the introduction of new services (e.g. vaccination programmes and remote home monitoring). Post-pandemic, the NHS continues to undergo large-scale transformations at pace to deal with workforce/capacity issues, such as backlogs, or to implement new technologies, such as artificial intelligence. However, these system changes often do not have existing evidence to support widespread adoption and implementation. There is an increasing need to develop approaches that can be used to build evidence and evaluate implementation of these rapid large-scale changes in real time.
The case study: COVID-19 remote home monitoring services
In England, COVID-19 remote home monitoring services were developed ad hoc in local services during the 1st wave of the pandemic, informed by local clinical need [21, 22]. Later, services were nationally rolled out in wave 2 of the pandemic to reduce pressure on hospitals and infection transmission, and to ensure that patients received appropriate care in the right place and were appropriately escalated as early as possible [23]. There were two approaches to COVID-19 remote home monitoring: community referral to remote home monitoring services (called COVID Oximetry @home – ‘CO@h’) and early discharge from hospital models (called COVID virtual wards – ‘CVW’). Appendix 2 summarises the care pathway for these services. We carried out a mixed-methods rapid evaluation of the services [23], which explored effectiveness [24, 25], cost [23, 26], implementation, staff [27, 28] and patient experience [28, 29] and inequalities [30].
Aims
Within this study, we aimed to evaluate the processes and outcomes of implementation of rapid MSC using COVID-19 remote home monitoring services as a case study. Our secondary aim was to consider whether and how the MSC framework can be applied in contexts of rapid MSC.
Methods
Setting
The study took place in England within primary and secondary healthcare organisations that delivered COVID-19 remote home monitoring services. The evaluation took place between October 2020 and November 2021, with data collection taking place between February and June 2021 (4 months).
The evaluation was conducted by a team of mixed-methods researchers from the NIHR funded Rapid Service Evaluation Team (RSET) [31] and NIHR funded Birmingham, RAND and Cambridge Evaluation Centre (BRACE) [32]. Methodological lessons drawn from this rapid evaluation are published elsewhere [33].
Design
We carried out a rapid multi-site evaluation of COVID-19 remote home monitoring services which included qualitative and quantitative approaches to analyse the implementation of the services for COVID-19 patients [17]. This manuscript draws on mixed-methods data from all aspects of the COVID-19 remote home monitoring evaluation, including national aggregated and patient-level data, cross-sectional survey data (staff, patients/carers and service costs), interview data (staff, patients/carers, national leads), and documentary analysis. Multi-level mixed-methods approaches were selected in order to fully evaluate the change processes involved in this MSC and the complexities of implementation [6]. Appendix 3 outlines how each type of data was used within this study.
Research setting
Twenty-eight purposively selected sites across England were recruited to our study and participated in staff and patient surveys. Twenty-six of these sites returned cost surveys. Seventeen of these sites were selected as case-study sites, in which interviews with patients and staff were conducted. Sites were representative of a range of regions across England, Clinical Commissioning Groups (CCGs, i.e., NHS organisations that organise the delivery of primary care services within a specified geographic area) and trusts, urban/rural mix, deprivation scores, ethnicity, and size of the population (see [35]).
Our evaluation of the effectiveness of CO@h included all CCG areas in England where there were complete data on the number of people enrolled onto the programme between 2 November 2020 and 21 February 2021. Our evaluation of the effectiveness of CVW included inpatient data from 123 hospital trusts whose CVW service start dates were known and used data on all discharges of patients with a confirmed or suspected COVID-19 diagnosis code between 17 August 2020 and 28 February 2021.
Recruitment
A range of stakeholders were recruited to this study, including national leads, staff involved in leading and delivering the COVID-19 remote home monitoring services, and patients and carers who had received the service.
Data collection
a. National data.
Aggregated data on new diagnoses of COVID-19 and mortality came from Public Health England (now the UK Health Security Agency). Data on enrolment to CO@h services were provided by NHS Digital (now part of NHS England). Data on the start dates of CVW services were provided by the Kent, Surrey and Sussex Academic Health Science Network (AHSN). We used Hospital Episode Statistics Admitted Patient Care data (HES APC) for patient-level data on hospital admissions, readmission, in-hospital mortality and length of stay. The CO@h analysis was restricted to adults aged 65 or over, while the CVW analysis included all ages.
b. Primary data collected.
We conducted semi-structured interviews with national leads (n = 5), surveys with patients and carers (n = 1069) and staff (n = 292), and interviews with patients and carers (n = 62) and staff (n = 59).
Surveys and topic guides were developed specifically for this study and were adapted for different audiences. The surveys focused on experiences of delivering (staff) or receiving (patients and carers) remote home monitoring services for patients with COVID-19. National lead topic guides focused on questions about the service development, leadership and governance, data and implementation. Staff interview topic guides focused on questions about the service, experiences of delivering services and views on patient engagement. Patient interviews focused on experiences of receiving services and engagement (see Appendix 4 for surveys and topic guides).
To explore costs per patient, all sites were asked to complete a cost survey (see Appendix 4) which included questions about the numbers of patients triaged, monitored, escalated due to deterioration, and who died, as well as questions about the staff and resources used for setting up and running the service.
Four sites which used both technology-enabled and analogue data submission modes were asked to provide more information about the time spent on each of the activities (i.e., patient triage/risk stratification; patient information and training; patient monitoring; patient data reporting; flagging patient deterioration; patient escalation processes; and patient discharge from the ward). This approach allowed thorough investigation of the time spent on specific activities as well as calculation of the cost per patient for all the activities, by data submission mode.
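To make the costing logic concrete, the sketch below shows one way a cost per patient, by activity and by data submission mode, could be derived from such a survey. It is a minimal illustration only: the function, field names, hourly cost and patient numbers are hypothetical assumptions, not the study's actual costing method or data.

```python
# Minimal sketch (not the study's costing code) of a per-patient cost by activity
# and data submission mode. All names and figures are hypothetical, for illustration.

ACTIVITIES = [
    "triage_risk_stratification", "information_and_training", "monitoring",
    "data_reporting", "flagging_deterioration", "escalation", "discharge",
]

def cost_per_patient(minutes_per_activity, hourly_staff_cost, patients_monitored):
    """Return (total cost per patient, cost per patient by activity).

    minutes_per_activity: dict mapping activity -> total staff minutes recorded
    hourly_staff_cost: assumed blended hourly cost of monitoring staff (GBP)
    patients_monitored: number of patients monitored by the service
    """
    by_activity = {
        activity: (minutes / 60) * hourly_staff_cost / patients_monitored
        for activity, minutes in minutes_per_activity.items()
    }
    return sum(by_activity.values()), by_activity

# Illustrative comparison of the two data submission modes for one hypothetical site.
tech_enabled = dict(zip(ACTIVITIES, [300, 200, 900, 150, 100, 120, 80]))
analogue = dict(zip(ACTIVITIES, [300, 250, 1500, 400, 150, 120, 100]))

for mode, minutes in [("tech-enabled", tech_enabled), ("analogue", analogue)]:
    total, breakdown = cost_per_patient(minutes, hourly_staff_cost=25.0, patients_monitored=50)
    print(f"{mode}: £{total:.2f} per patient")
    for activity, cost in breakdown.items():
        print(f"  {activity}: £{cost:.2f}")
```

In practice, resource and set-up costs would also feed into such a calculation; the sketch only illustrates how recorded staff time per activity can be converted into a per-patient figure for each submission mode.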
c. Documentary analysis.
Key documents relating to the COVID-19 remote home monitoring services, such as national Standard Operating Procedures or local pathways (where available), were collected and analysed.
Data analysis
We selected the MSC Framework [16] as a conceptual framework for our analysis. The MSC Framework was selected as it enables researchers to unpack the ‘black-box’ of outcomes to study potential mechanisms of change, by exploring and indicating inter-relationships between implementation processes and between implementation processes and outcomes [6, 23]. The MSC framework can be flexibly interpreted which makes it suitable for exploring implementation rapidly at scale. To explore whether the MSC Framework [16] can be used to learn about rapid implementation of services during emergency contexts, we retrospectively applied the MSC framework [16] to our data.
We made minor adaptations to the MSC Framework in terms of terminology: ‘fidelity’ and ‘adoption’ (referred to in [16]) did not accurately capture the nuance within the concepts that we were referring to. For example, ‘fidelity’ was not necessarily an appropriate term to use, as the services’ Standard Operating Procedures [35, 36] were designed to give local services flexibility to implement the services as appropriate and were not necessarily intended to be prescriptive. Therefore, we decided to use alternative terms (e.g. ‘variations in implementation’ and ‘enrolment’). In Appendix 3, we summarise how we analysed data relevant to each of the five MSC domains: i) Decision to change, ii) Decision on which model to implement, iii) Implementation approach, iv) Implementation outcomes, v) Intervention outcomes.
Results
Site and participant characteristics
We received 292 staff surveys (39% response rate) across 28 sites and 1069 patient and carer surveys (18% response rate) across 25 sites (see Appendix 5). Interviews were conducted with: national leads (n = 5), staff (n = 58 across 17 sites) and patients or carers (n = 62 across 17 sites) (see Appendix 5). Cost surveys were received from 26/28 sites.
Over the period of analysis, we judged that enrolment data were complete for 37 CCGs (27% of the total number of 135 CCGs across England) and hence our analyses of the effectiveness of the CO@h service were limited to these areas. There were no notable differences in mean age, proportions of non-white population or proportions resident in the most deprived areas between these CCGs and the remaining 98 that were not included, although included CCGs had a lower incidence of positive test results. There were also regional differences: no CCGs from the East NHS Region were included, and only one from the North East and Yorkshire. The South West, North West and Midlands were the best represented regions.
Our analysis of the CVW service used data on all live discharges of COVID-19 patients from 123 hospital trusts, covering 98% of all such discharges in England during the study period.
Evaluating implementation
Figure 1 shows an adapted MSC framework, building on this evaluation. This evaluation highlighted some amendments to the framework, including additional intervention process outcomes to evaluate (e.g. staff experience [23], patient engagement [24] and disparities [26]) and relationships between domains of implementation (reported in this manuscript) that must be considered when evaluating the implementation of MSC (see Fig. 1).
Fig. 1 Adaptation to the Major System Change Framework [16]
Findings relating to each domain are discussed below.
1&2. Decision to change & decision on which model to implement
The development of COVID-19 remote home monitoring services occurred rapidly, in three overlapping stages: i) Local development and implementation, ii) National development and roll-out, and iii) Local implementation. Therefore, the ‘Decision to change’ and ‘Decision on which model to implement’ domains often occurred concurrently for COVID-19 remote home monitoring services.
Local development and early adopters of implementation (bottom up implementation)
COVID-19 remote home monitoring services began in wave 1 of the pandemic (March–May 2020) as several local services were established on an ad hoc basis, motivated by attempts to mitigate silent hypoxia (very low oxygen saturations, often without breathlessness). Pulse oximetry was initially used to monitor patients in the community [21, 35]. During early stages of the pandemic, local clinical leaders within CCGs and secondary care trusts had identified a need for this service and were integral in facilitating the initial development, set-up and implementation of services, according to local service needs and infrastructure.
The local development and implementation of services gained interest from national stakeholders and pilot services were evaluated [22]. To support local implementation, two learning communities were established prior to the national roll-out: an informal community of practice and a national learning network. The community of practice (an informal group led by key national clinical and policy leads to support local services that were adopting pulse oximetry services) was instigated by local leaders; it grew rapidly and met online every couple of weeks. The national learning network was initiated through NHS @home (via the national team and regional medical directors) and set up by AHSN patient safety collaboratives to support clinical leads and provide resources. The conversations and shared learning from the community of practice and national learning network played a key role in shaping the basic pathway underpinning these services. Additionally, regular webinars were held and an online platform was set up by AHSNs to share learning and resources.
National development, standardisation and roll-out (top-down implementation) → national spread and scale-up of implementation locally (bottom-up implementation)
In England, between waves 1 (Spring 2020) and 2 (Autumn/Winter 2020/2021) of the pandemic, NHS England (the organisation that leads healthcare in England) gathered relevant information (e.g. research [21, 22], clinical consensus on how services should be developed, and safety-netting guidance) to inform the development of Standard Operating Procedures [35, 36] for COVID-19 remote home monitoring services that would be inclusive and enabled by technology. Once the service was approved at a national level (November 2020 for CO@h services and February 2021 for CVW services), the Standard Operating Procedures were published and sites that had not been early adopters began to implement these services. National roll-out was supported by regional launch events, and by funding of national and regional clinical leads to support local implementation (including funding to implement [34], purchase of pulse oximeters and support for tech-enabled platforms). National guidance indicated that local leads were responsible and accountable for their services (guided by SOPs), thus guiding local adaptation whereby local services allocated resources, designed staffing models, and distributed equipment and educational materials. Local services were supported by the community of practice and national learning network.
3. Implementation approach
There were a range of factors that influenced how local services implemented COVID-19 remote home monitoring services. These related to patients (patient demographics and disease profiles, digital access and literacy and patient engagement), staff (training, skill set, work environment, workload), the organisation delivering the programme (staffing models, cross-organisation collaboration, learning environment, engagement of senior management) and resources (staff availability, hardware, software) (see Appendix 6 for a summary).
4. Implementation outcomes
Enrolment
Patient enrolment to COVID-19 remote monitoring services was lower than expected. Within each of the 37 CCGs with complete data, the dates sites started enrolling patients to the CO@h service are plotted in Appendix 7a. Three of these CCGs were enrolling patients in October and all 37 were enrolling patients in the fortnight beginning 11 January 2021.
Enrolment across the 37 CCGs among people aged 65 or over, from when services started operating up until 4 April, is shown in Appendix 7b. The highest enrolment rate achieved by a CCG was more than twice the next highest. Twenty-two CCGs (59%) had enrolment rates below 10%. The overall enrolment rate over the period for this age group across all 37 CCGs was 8.7%.
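As an illustration of how such descriptive enrolment figures can be derived from aggregated data, the sketch below computes per-CCG enrolment rates, the share of CCGs below 10%, and a pooled overall rate. It is a hedged example only: the DataFrame, column names, denominator definition and numbers are hypothetical placeholders, not the study's dataset or analysis code.

```python
# Minimal sketch (not the study's analysis code) of the descriptive enrolment
# statistics reported above. All column names and values are hypothetical.
import pandas as pd

ccg_data = pd.DataFrame({
    "ccg": ["CCG A", "CCG B", "CCG C"],          # the study covered 37 CCGs
    "enrolled_65_plus": [800, 150, 300],         # patients aged 65+ enrolled to CO@h
    "eligible_65_plus": [5000, 4000, 6000],      # assumed denominator, e.g. eligible cases aged 65+
})

ccg_data["enrolment_rate"] = ccg_data["enrolled_65_plus"] / ccg_data["eligible_65_plus"]

share_below_10pct = (ccg_data["enrolment_rate"] < 0.10).mean()
overall_rate = ccg_data["enrolled_65_plus"].sum() / ccg_data["eligible_65_plus"].sum()

print(f"CCGs with enrolment rate below 10%: {share_below_10pct:.0%}")
print(f"Overall pooled enrolment rate: {overall_rate:.1%}")
```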
As noted above, data quality issues meant that we were not able to derive patient enrolment figures for the CVW services nationally. Appendix 7c presents the number of trusts with a CVW service by week; 14 of 123 trusts had no CVW service by the end of the study period. For seven hospital trust-based sites that returned a cost survey, we estimated that between 4% and 65% of discharged COVID-19 patients may have been enrolled to a CVW service.
Variation in implementation
We created a typology (classification system) and categorised sites according to seven domains (Appendix 8): 1) type of model (CO@h/CVW/Integrated), 2) sector leading services (primary care/secondary care/both), 3) type of monitoring (analogue-only (paper and telephone)/tech-enabled and analogue), 4) admission criteria (age and clinical vulnerability), 5) workforce (number and type of staff), 6) date service started, and 7) enrolment rates within the CCGs where the sites were located.
We found that the 28 local services implemented different types of COVID-19 remote home monitoring (see Appendix 8).
Findings indicated that the implementation of services varied between local sites, and also from national guidance [28, 29].
Service eligibility
Most services used age criteria of either 18 years or over, or 50 years or over, rather than the age criteria recommended within the SOP of 65 years or over. Many sites adapted and reduced their enrolment age throughout the period. Most sites used risk factors alongside age to enrol patients, but each site varied in the risk factors used.
Workforce
As shown in Appendix 8, there were large differences across sites regarding the number of staff involved in setting up (n = 2–20 +) and running (n = 2–70 +) the service, and with regard to the type of staff involved in monitoring. Most sites used clinical staff only to support monitoring.
Patient pathway
In line with national guidance, all sites involved key components of the patient pathway such as distributing pulse oximeters, asking patients to monitor blood oxygen saturation levels and submit readings daily. Even though all sites had these components, the way in which they were delivered varied substantially.
For example, the means of distributing oximeters varied: oximeters were delivered to patients (n = 27 sites), given to patients at a GP practice or hospital (n = 9 sites), or collected by patients or family members (n = 7 sites). Some patients described difficulties in collecting oximeters due to their poor health and isolation. Additionally, the way in which patients submitted readings (tech-enabled and/or analogue) and the level of interaction with staff varied across sites. Most services were tech-enabled, with telephone options offered when tech-enabled modes of submission were not possible. Members of staff were involved in taking and reporting readings at all sites; none of the services solely asked all of their patients to self-monitor and self-escalate care.
However, other aspects of the patient pathway were more variable, including the provision of information (n = 22/25 provided written or verbal information), and having processes in place for triage, escalation and discharge. In terms of triage, most sites had triage processes (n = 24/25), but these varied across different models (e.g. in terms of who checks against admission criteria, and the processes used). Escalation processes varied, with some tech-enabled solutions identifying patients for escalation (with the addition of phone or face-to-face assessment in some cases) and some escalation processes being manually initiated (with the addition of phone or face-to-face assessment in some cases). Many but not all sites had explicit flexible processes in place for patient discharge (n = 19/25)—most sites reported patients were discharged after 14 days, but that patients could be discharged earlier or kept on if needed. However, a quarter of patients were not aware of discharge processes and 31% of patients were not asked to return oximeters.
5. Intervention outcomes
A summary of findings for clinical outcomes [24, 25], cost [26], staff [27] and patient experience/engagement [29], disparities [30] and mode of service [28] is outlined in Table 1. Findings show that no associations were found between services and clinical outcomes, and that patient and staff experiences of the service were generally positive; however, staff faced some barriers to delivering these services and certain groups of patients found it harder to engage.
Discussion
Key findings
We demonstrate that the MSC Framework [16] can successfully be applied (with some minor amendments, see Fig. 1) as a tool to study the implementation of rapid healthcare system change, such as the changes which occurred during the pandemic. Using this framework, we identified that:
- COVID-19 remote home monitoring services were driven by bottom-up and top-down decision making.
- Many local factors influenced the implementation of COVID-19 remote home monitoring services, including patient, staff, organisational and resource factors.
- Patient enrolment was lower than expected, and services varied substantially from one another.
- There were no associations found between services and clinical outcomes; patient and staff experiences of the service were generally positive, with some areas for improvement regarding disparities, patient engagement and delivery.
- Implementation findings helped to interpret findings on effectiveness, cost and process outcomes (patient experience, engagement, staff experience, disparities). For example, low enrolment, variation in implementation and gaps in data collection created difficulties evaluating the true extent of effectiveness and cost effectiveness of COVID-19 remote home monitoring services.
How findings extend previous knowledge
Our findings support previous research on factors influencing implementation of MSC regarding the importance of stakeholder buy-in, support and cross-organisational relationships [5, 16, 17], and the significance of networks in supporting MSC [5, 16, 17]. However, patient factors [29, 30] and workforce capacity, training and resources [27] were also important influences that have not been previously emphasised in research on transformative change [5, 16, 17].
How do findings relating to MSC differ in rapid vs non-rapid contexts?
Our findings and previous research [5, 16, 17] indicate that there are occasions (in rapid and non-rapid implementation contexts) when drivers to change and decisions on the model to implement may occur concurrently (when a combination of top-down and bottom-up implementation approaches is used). Findings demonstrated that in rapid implementation contexts, intervention outcomes may not feed into the ‘decision to change’ and ‘decision on which model to implement’ domains. This is due to implementation being delivered at speed (perhaps in this example due to the global urgency of providing healthcare during the pandemic), and intervention outcomes being unavailable at points of initial decision-making. However, findings could be used to retrospectively adapt service models; for example, the findings may be used to support the implementation of future virtual ward services that are being rolled out for other conditions [37].
Findings provide insight into how factors influencing implementation may vary from non-rapid contexts. For example, while findings from the reconfiguration of stroke services outlined the importance of service specifications for increasing uniformity of implementation [5, 16], we did not find this to be the case in rapid transformation. Our study found that even with national standard operating procedures, there were substantial variations in implementation of COVID-19 remote home monitoring (as demonstrated in the findings outlining variation in implementation). This was because many sites had developed and implemented services as early adopters, prior to the publication of the standard operating procedures, and because adaptations were made at local levels to reflect local contexts; this supports previous research which suggests that top-down and bottom-up approaches may affect consistency of implementation [7]. Whilst some variation is encouraged nationally to ensure that services are adapted to meet the needs of local populations and services, there are questions around the point at which variation could become counterproductive, pose a risk to implementation at scale or no longer represent the service as intended. Where a degree of standardisation is required, findings indicate the importance of publishing standard operating procedures or service specifications as early as possible in implementation and communicating these to local services. However, this may not be feasible during rapid implementation of service change in certain emergency contexts (e.g. during COVID-19).
Our findings support the Path Dependency Process theory [38], which proposes that past events and decisions can limit future choices. For example, local decisions on the service set-up and specification were initially influenced by pandemic pressures together with local clinical need faced by early adopters. These early decisions influenced national decisions regarding standardisation. These early decision-making points, together with the factors influencing implementation (understood as ‘conjunctural conditions’), then led to local services considering further tailoring as part of scale-up and spread, resulting in services forging their own paths regarding aspects of implementation such as eligibility criteria.
The application of the MSC framework to rapid and other contexts
The MSC framework has previously been used in a limited range of settings, e.g., acute stroke services [5, 16], specialist cancer services [17]. Our application shows that the MSC framework has potential to be used much more widely, in terms of i) applicability, ii) scope, and iii) scale; with some adaptations to terminology. The MSC Framework [9] has previously been used to evaluate non-rapid MSC in healthcare contexts [5, 16, 17], to explore centralisation of services [5, 16, 17], in a small number of settings. However, the definition of MSC by Best et al. [4] indicates it is broader than reconfigurations to centralise care, and relates to all systemwide change that affects multiple organisations and providers and aims to improve care. Our study extends previous knowledge by demonstrating that the framework can successfully be applied to evaluate systemwide change that occurs on a national level, at a rapid pace in response to emergencies such as COVID-19. This provides an opportunity to use the framework to compare implementation across different geographical locations and service types. This has been highlighted as an important consideration in previous literature [6]. Furthermore, this evaluation outlines additional intervention process outcomes to consider when evaluating implementation of MSC, including patient engagement [29], staff experience [27] and disparities [30].
Previous research has mostly used a combination of case study methodology, qualitative interviews with national and local stakeholders and documentary analysis to explore implementation aspects of the MSC framework [5, 6, 16, 17]. We extend previous research by including patient and carer perspectives and surveys (staff/patients/carers), in addition to national and local staff interviews and documentary analysis, to develop a comprehensive understanding of care pathways and compare this with standard operating procedures to determine whether actual implementation of services varied from planned implementation. Furthermore, the mixed-methods approach used within this manuscript and previous research [17] could be used by future researchers to explore concepts such as adoption/enrolment (e.g. using national data to assess how well services were adopted, as opposed to individual adoption of interventions) and fidelity and implementation at service and system levels (e.g. how whole services were delivered in comparison to standard operating procedures, rather than measuring fidelity at individual levels). We focused on variation in implementation rather than ‘fidelity’ for the reasons outlined earlier: given the emergency context, many sites had developed and implemented the service prior to any national service specifications. However, this approach enabled us to better understand how services were delivered locally, which in turn helped us to consider and understand local variation in implementation in relation to the overall intervention outcomes (e.g. effectiveness findings, and differences in staff and patient experience).
Previous research has highlighted the limitations of carrying out data collection after the service change has occurred and that future research should aim to collect data at the same time as the service change [5, 16]. As this was a rapid study evaluating rapid MSC, data collection and service implementation happened concurrently. Whilst researchers had to exercise flexibility to adapt to changes in service implementation, concurrent evaluation meant that findings were able to reflect implementation of rapid MSC in real time.
Whilst previous research used the framework from inception of the study [5, 16, 17], our evaluation retrospectively applied the MSC framework; indicating that the framework can be used flexibly, depending on constraints of the evaluation.
Strengths and limitations
Due to the rapid timeframe of our evaluation, we retrospectively applied the MSC Framework to our findings instead of applying it at the point of study design. However, applying the framework retrospectively was straightforward and feasible, so it is likely that it could also be applied at the time of analysis in future rapid and non-rapid evaluations, indicating that the framework can flex to the constraints of an evaluation.
This study used mixed-methods data from a wide range of sources. Therefore, the findings provide a comprehensive picture of the development, coverage and implementation of COVID-19 remote home monitoring services through wave 2 of the pandemic.
Given the rapid implementation of services, we are unable to say whether the degree of variation we witnessed is a result of the rapid roll-out or whether it would have been observed if lengthier implementation had followed.
Our assessment of service enrolment within CCGs relied on their enrolment data being complete. To judge completeness, we relied on assessments made by individual CCGs; however, as CCGs were not necessarily the service providers, and were therefore removed from the data-entry process, we could not be sure of the accuracy of this information in all cases. We were able to cross-check against the data provided by the 28 study sites themselves and found a reasonable match for most, but not all; we were unable to verify the data from CCGs that were not study sites in the same way.
Implications
These findings indicate that the MSC Framework can be used to support and evaluate rapid service transformation in the NHS. Given the current continual rapid transformation of the NHS (e.g. to deal with backlogs and to implement new technologies such as artificial intelligence), large-scale healthcare system changes are often being implemented at pace. These large-scale system changes often do not have existing evidence or guidance in place at the point of inception and therefore evaluation and evidence-building need to take place concurrently. Therefore, it is important to develop approaches that can be used to evaluate rapid large-scale system change. Our manuscript outlines a framework for other studies to use when evaluating the implementation of rapid MSC (for example the wider roll out of virtual wards [37], and innovations such as Patient Initiated Follow up which will help with NHS recovery efforts and dealing with backlogs [39]).
Our findings have implications for interpretation of quantitative outcomes of the evaluation. Our findings on low rates of enrolment and variation in implementation can be used to help interpret findings relating to effectiveness and cost of the services [24,25,26]. However, we were not able to look at how specific variation linked with particular service outcomes or enrolment more generally.
Future research
Future research should consider using the MSC Framework to explore how similar urgent and/or rapid nationally rolled-out programmes are implemented, as well as when organisations are asked to implement and deliver multiple services at a single point in time. Further evidence will help us to draw conclusions about the use of the MSC Framework in rapid and non-rapid contexts.
Conclusions
Findings show that the MSC framework has the potential to be used more widely than previously indicated, as it is applicable in rapid MSC contexts, can be applied to different types of MSC and can be used to understand MSC on national and local levels without much adaptation. We outline a method that can be followed by other researchers to explore MSC using a range of mixed methods approaches at a large scale. Findings provide insight into how factors influencing implementation may vary across rapid and non-rapid contexts. Our findings demonstrating substantial variation irrespective of service specifications indicated that service specifications may not be as impactful for informing change in rapid MSC as previously suggested for non-rapid MSC, but that tailoring services to the local context in terms of patient needs, and existing staffing and resources was key.
Data availability
The data that support the findings of this study are available on request from the corresponding author. The data are not publicly available due to privacy or ethical restrictions.
References
Guarcello C, de Vargas ER. Service innovation in healthcare: A systematic literature review. Latin American Business Review. 2020;21(4):353–69. https://doi.org/10.1080/10978526.2020.1802286
Department of Health & Social Care. Policy paper. Integration and innovation: Working together to improve health and social care for all 2021. Retrieved 22/11/2022 from https://www.gov.uk/government/publications/working-together-to-improve-health-and-social-care-for-all/integration-and-innovation-working-together-to-improve-health-and-social-care-for-all-html-version#delivering-for-patients-citizens-and-local-population
National Health Service. Innovation into action. Supporting delivery of the NHS Five Year Forward View. 2015. Retrieved 22/11/2022 from https://www.england.nhs.uk/wp-content/uploads/2015/10/nhs-inovation-into-action.pdf
Best A, Greenhalgh T, Lewis S, Saul JE, Carroll S, Bitz J. Large-system transformation in health care: a realist review. Milbank Q. 2012;90(3):421–56. https://doi.org/10.1111/j.1468-0009.2012.00670.x.
Turner S, Ramsay A, Perry C, Boaden R, McKevitt C, Morris S, et al. Lessons for major system change: centralization of stroke services in two metropolitan areas of England. J Health Serv Res Policy. 2016;21(3):156–65. https://doi.org/10.1177/1355819615626189.
Turner S, Goulding L, Denis JL, McDonald R, Fulop NJ. Essay 6: Major system change: a management and organisational research perspective. In: Raine R, et al. Challenges, solutions and future directions in the evaluation of service innovations in health care and public health. Health Serv Deliv Res. 2016;4(16). https://doi.org/10.3310/hsdr04160
Ogunlayi F, Britton P. Achieving a ‘top-down’ change agenda by driving and supporting a collaborative ‘bottom-up’ process: case study of a large-scale enhanced recovery programme. BMJ Open Qual. 2017;6(2):e000008. https://doi.org/10.1136/bmjoq-2017-000008.
Ham C, Berwick D, Dixon J. Improving quality in the english NHS: a strategy for action. London: The King’s Fund; 2016.
Nilsen P. Making sense of implementation theories, models, and frameworks. Implement Sci. 2020;3:53–79. https://doi.org/10.1186/s13012-015-0242-0.
Wilson KM, Brady TJ, Lesesne C, NCCDPHP Work Group on Translation. An organizing framework for translation in public health: the knowledge to action framework. Preventing Chronic Disease. 2011;8(2). Accessed [16/12/2024] from: https://pmc.ncbi.nlm.nih.gov/articles/PMC3073439/
Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci. 2009;4:1–5. https://doi.org/10.1186/1748-5908-4-50.
Cane J, O’Connor D, Michie S. Validation of the theoretical domains framework for use in behaviour change and implementation research. Implement Sci. 2012;7:1–7. https://doi.org/10.1186/1748-5908-7-37.
Michie S, Atkins L, West R. The behaviour change wheel. A guide to designing interventions. 2014;1:1003–10.
May C, Finch T. Implementing, embedding, and integrating practices: an outline of normalization process theory. Sociology. 2009;43(3):535–54. https://doi.org/10.1177/00380385091032.
Glasgow RE, Vogt TM, Boles SM. Evaluating the public health impact of health promotion interventions: the RE-AIM framework. Am J Public Health. 1999;89(9):1322–7. https://doi.org/10.2105/AJPH.89.9.1322.
Fulop NJ, Ramsay AI, Perry C, Boaden RJ, McKevitt C, Rudd AG, Turner SJ, Tyrrell PJ, Wolfe CD, Morris S. Explaining outcomes in major system change: a qualitative study of implementing centralised acute stroke services in two large metropolitan regions in England. Implement Sci. 2015;11(1):1–3.
Fulop NJ, Ramsay AI, Vindrola-Padros C, Clarke CS, Hunter R, Black G, Wood VJ, Melnychuk M, Perry C, Vallejo-Torres L, Ng PL. Centralisation of specialist cancer surgery services in two areas of England: the RESPECT-21 mixed-methods evaluation. Health Soc Care Deliv Res. 2023;11(2). https://doi.org/10.3310/QFGT2379
Ramalingam B, Prabhu J. Innovation, development and COVID-19: Challenges, opportunities and ways forward. OECD; 2020. Retrieved 22/11/2022 from https://www.oecd.org/coronavirus/policy-responses/innovation-development-and-covid-19-challenges-opportunities-and-ways-forward-0c976158/
Oxtoby K. Covid-19: “Life on hold” for NHS patients needing musculoskeletal care. BMJ. 2021;373:n1616. https://doi.org/10.1136/bmj.n1616.
NHS England. Responding to new challenges and opportunities n.d. Retrieved 22/11/2022 from https://www.england.nhs.uk/ournhspeople/online-version/challenges-and-opportunities/
Vindrola-Padros C, Singh KE, Sidhu MS, Georghiou T, Sherlaw-Johnson C, Tomini SM, et al. Remote home monitoring (virtual wards) for confirmed or suspected COVID-19 patients: a rapid systematic review. EClinicalMedicine. 2021;37:100965. https://doi.org/10.1016/j.eclinm.2021.100965.
Vindrola-Padros C, Sidhu MS, Georghiou T, Sherlaw-Johnson C, Singh KE, Tomini SM, et al. The implementation of remote home monitoring models during the COVID-19 pandemic in England. EClinicalMedicine. 2021;34:100799. https://doi.org/10.1016/j.eclinm.2021.100799.
Fulop NJ, Walton H, Crellin N, Georghiou T, Herlitz L, Litchfield I, et al. A rapid mixed-methods evaluation of remote home monitoring models during the COVID-19 pandemic in England. Health Soc Care Deliv Res. 2023;11(13). https://doi.org/10.3310/FVQW4410
Sherlaw-Johnson C, Georghiou T, Morris S, Crellin N, Litchfield I, Massou E, et al. The impact of remote home monitoring of people with COVID-19 using pulse oximetry: a national population and observational study. EClinicalMedicine. 2022. https://doi.org/10.1016/j.eclinm.2022.101318.
Georghiou T, Sherlaw-Johnson C, Massou E, Morris S, Crellin N, Herlitz L, Sidhu M, Tomini S, Vindrola-Padros C, Walton H, Fulop N. The impact of post-hospital remote monitoring of COVID-19 patients using pulse oximetry. EClinicalMedicine. 2022;48:101441. https://doi.org/10.1016/j.eclinm.2022.101441
Tomini SM, Massou E, Crellin NE, Fulop NJ, Georghiou T, Herlitz L, Litchfield I, Ng PL, Sherlaw-Johnson C, Sidhu MS, Walton H, Morris S. Cost evaluation of COVID-19 remote home monitoring services in England. PharmacoEconomics Open (in press).
Sidhu M, Walton H, Crellin N, Ellins J, Herlitz L, Litchfield I, Massou E, Tomini SM, Vindrola-Padros C, Fulop NJ. Staff experiences of training and delivery of remote home monitoring services for patients diagnosed with COVID-19 in England: A mixed-methods study. J Health Serv Res Policy. 2023;28(3):171–80. https://doi.org/10.1177/135581962311725.
Herlitz L, Crellin N, Vindrola-Padros C, Ellins J, Georghiou T, Litchfield I, Massou E, Ng PL, Sherlaw-Johnson C, Sidhu MS, Tomini SM. Patient and staff experiences of using technology-enabled and analogue models of remote home monitoring for COVID-19 in England: A mixed-method evaluation. International Journal of Medical Informatics. 2023:105230. https://doi.org/10.1016/j.ijmedinf.2023.105230
Walton H, Vindrola-Padros C, Crellin NE, Sidhu MS, Herlitz L, Litchfield I, Ellins J, Ng PL, Massou E, Tomini SM, Fulop NJ. Patients’ experiences of, and engagement with, remote home monitoring services for COVID-19 patients: A rapid mixed-methods study. Health Expect. 2022;25(5):2386–404. https://doi.org/10.1111/hex.13548.
Crellin N, Herlitz L, Sidhu MS, Ellins J, Georghiou T, Litchfield I, Massou E, Ng PL, Sherlaw-Johnson C, Tomini SM, Vindrola-Padros C, Walton H, Fulop N. Examining disparities relating to service reach and patient engagement with COVID-19 remote home monitoring services in England: a mixed methods rapid evaluation. medRxiv. 2021. Retrieved 07/06/2022 from: https://doi.org/10.1101/2022.02.21.22270793
Nuffield Trust. Rapid Service Evaluation Team. 2024. Retrieved 16/12/2024 from: https://www.nuffieldtrust.org.uk/rset-rapid-evaluations-of-new-ways-of-providing-care
University of Birmingham. NIHR BRACE Rapid Evaluation Centre. 2024. Retrieved 16/12/2024 from: https://www.birmingham.ac.uk/research/centres-institutes/brace-rapid-evaluation-centre
Walton H, Crellin NE, Sidhu MS, Sherlaw-Johnson C, Herlitz L, Litchfield I, Georghiou T, Tomini SM, Massou E, Ellins J, Sussex J. Undertaking rapid evaluations during the COVID-19 pandemic: Lessons from evaluating COVID-19 remote home monitoring services in England. Front Sociol. 2023;8:982946. https://doi.org/10.3389/fsoc.2023.982946.
NHS England and Improvement. Supporting general practice – additional £150 million of funding from NHS England (official letter). 2020. Retrieved 11/01/2022 from https://www.england.nhs.uk/coronavirus/wp-content/uploads/sites/52/2020/03/C0828_GP-funding-letter-_second-wave_9novreb.pdf
National Health Service. Novel coronavirus (COVID-19) standard operating procedure: COVID Oximetry @home. v1.2. Retrieved 10/11/2021 from https://www.england.nhs.uk/coronavirus/wp-content/uploads/sites/52/2020/11/C1396-sop-covid-oximetry-@home-v2-september-21.pdf
National Health Service. Novel coronavirus (COVID-19) standard operating procedure: COVID Virtual Ward. V1.0. Retrieved 10/11/2021 from https://www.england.nhs.uk/coronavirus/wp-content/uploads/sites/52/2021/01/C1042-sop-discharge-covid-virtual-ward-13-jan-21.pdf
NHS England. (2022). "2022/23 priorities and operational planning guidance." Retrieved 22/11/2022, from https://www.england.nhs.uk/wp-content/uploads/2022/02/20211223-B1160-2022-23-priorities-and-operational-planning-guidance-v3.2.pdf.
Wilsford D. Path dependency, or why history makes it difficult but not impossible to reform health care systems in a big way. J Publ Policy. 1994;14(3):251–83. https://doi.org/10.1017/S0143814X00007285.
Reed S, Crellin C. Patient-initiated follow up: does it work, why it matters, and can it help the NHS recover? 2022. Accessed [08/02/2023] from: https://www.nuffieldtrust.org.uk/resource/patient-initiated-follow-up-does-it-work-why-it-matters-and-can-it-help-the-nhs-recover
Acknowledgements
Thank you to Dr Angus Ramsay for feedback on this manuscript, and framing the findings in relation to the wider Major System change evidence base.
We are indebted to all of the services who participated in this study and to all of the patients and carers who participated in our surveys and interviews. Thank you to the following: Dr Jennifer Bousfield for supporting with study design and data collection, Simon Barnes for supporting with data entry; Steve Morris and Cecilia Vindrola-Padros for advice given throughout the project; our NIHR BRACE and NIHR RSET public patient involvement members for feedback throughout the study and to Raj Mehta for commenting on a draft of the manuscript; the NIHR 70@70 Senior Nurse research Leaders for providing feedback on the development of our study; Russell Mannion for peer-reviewing our study protocol; and the NIHR Clinical Research Networks for supporting study set up and data collection.
We thank the NHS Digital CO@h Evaluation Workstream Group chaired by Professor Jonathan Benger for facilitating and supporting the evaluation, and to the other two evaluation teams for their collaboration throughout this evaluation: i) Institute of Global Health Innovation, NIHR Patient Safety Translational Research centre, Imperial College London and ii) the Improvement Analytics Unit (Partnership between the Health Foundation and NHS England).
Many thanks to our Clinical Advisory Group for providing insights and feedback throughout the project (Dr Karen Kirkham (whose previous role was the Integrated Care System Clinical Lead, NHSE/I Senior Medical Advisor Primary Care Transformation, Senior Medical Advisor to the Primary Care Provider Transformation team), Dr Matt Inada-Kim (Clinical Lead Deterioration & National Specialist Advisor Sepsis, National Clinical Lead—Deterioration & Specialist Advisor Deterioration, NHS) and Dr Allison Streetly (Senior Public Health Advisor, Deputy National Lead, Healthcare Public Health, Medical Directorate NHS England)).
Funding
This is independent research funded by the National Institute for Health Research, Health and Social Care Delivery Research programme (RSET Project no. 16/138/17; BRACE Project no. 16/138/31) and NHSE. NJF is an NIHR Senior Investigator. The views expressed in this publication are those of the authors and not necessarily those of the National Institute for Health Research or the Department of Health and Social Care.
Author information
Authors and Affiliations
Contributions
All authors were responsible for the study conception, design, and data collection throughout the study (HW, NC, IL, CSJ, TG, EM, MS, SMT, LH, JE, PLN, NJF). HW, NC and IL led data analysis for the qualitative aspects, TG/CSJ led data analysis for the quantitative aspects and EM/ST led data analysis for cost effectiveness aspects. HW and NC drafted the manuscript with contribution from all authors (IL, CSJ, TG, EM, MS, SMT, LH, JE, PLN, NJF). All authors (HW, NC, IL, CSJ, TG, EM, MS, SMT, LH, JE, PLN, NJF) commented on drafts of the manuscript and approved the final version. NJF was principal investigator for the study.
Corresponding authors
Ethics declarations
Ethics approval and consent to participate
For this evaluation, the research was divided into two separate protocols. A protocol covering effectiveness, cost and staff elements received ethical approval from the University of Birmingham Humanities and Social Sciences ethics committee (ERN_13-1085AP39) and was categorised as a service evaluation by the HRA decision tool and UCL/UCLH Joint Research Office.
The patient experience element (survey and case study interviews) was reviewed and given favourable opinion by the London-Bloomsbury Research ethics committee (REC reference: 21/HRA/0155). This patient experience study was categorised as an urgent public health study by NIHR.
Consent for publication
Not applicable.
Competing interests
The authors declare no competing interests.
Additional information
Publisher’s Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License, which permits any non-commercial use, sharing, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if you modified the licensed material. You do not have permission under this licence to share adapted material derived from this article or parts of it. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by-nc-nd/4.0/.
About this article
Cite this article
Walton, H., Crellin, N., Litchfield, I. et al. Applying the major system change framework to evaluate implementation of rapid healthcare system change: a case study of COVID-19 remote home monitoring services. Implement Sci Commun 6, 24 (2025). https://doi.org/10.1186/s43058-025-00707-y
DOI: https://doi.org/10.1186/s43058-025-00707-y