Abstract
We study the long-term effects on hospital activity of a three-year national integration programme. We use administrative data spanning from 24 months before to 22 months after the programme, to estimate the effect of programme discontinuation using difference-in-differences method. Our results show that after programme discontinuation, emergency admissions were slower to increase in Vanguard compared to non-Vanguard sites. These effects were heterogeneous across sites, with greater reductions in care home Vanguard sites and concentrated among the older population. Care home Vanguards showed significant reductions beginning early in the programme but falling away more rapidly after programme discontinuation. Moreover, there were greater reductions for sites performing poorly before the programme. Overall, this suggests the effects of the integration programme might have been lagged but transitory, and more reliant on continued programme support.
Similar content being viewed by others
Introduction
Pilot policy experiments are often natural predecessors of large-scale implementation [1]. The “success” of pilot experiments generally involve complex judgments over evolving policy objectives [2], with effects that are often not immediate and may take time to emerge. This is especially true if there is a learning period following implementation or lags in full implementation [3, 4]. However, the question of whether the beneficial impacts (if any) of a pilot experiment could persist after its discontinuation has received limited attention in policy evaluation literature. Undertaking impact evaluation, well beyond programme duration, can be informative about underlying mechanisms and circumstances that lead to permanent changes [5, 6].
We examine this in the context of integrated health care programmes. Health care systems around the world are being re-designed with a focus on delivering care in a resource efficient manner while ensuring timely and quality care to the patient. The focus within high-income countries is upon elderly population and/or those with complex health needs, whereas low-income countries are gradually moving towards addressing emerging challenges of dual (communicable and non-communicable) disease burden [7]. Among high-income countries, several previous integration programmes have remained localised to facilitate significant change at grassroot levels [8]. However, there are also working models of system wide integration efforts [9].
In this paper, we undertake impact evaluation for the Vanguard integrated health and care programme in England. This pilot programme included a mix of models that targeted a ‘high-risk’ group as well as broad population-based approaches. It was aimed at delivering care through an integrated system developed via enhanced coordination between general practitioners, communities, hospitals and social care services.
This programme is relevant for at least three reasons. Firstly, for its scale: it was a flagship National Health Service (NHS) England programme, running from 2015 to 2018, costing about \(\pounds\)389 million and covering a population of around 5 million (around 9% of the entire population in England). It was congruent to some extent with previous integrated care programmes piloted in England in terms of target population-older and those with complex conditions [10]. Second, for its scope: it was aimed at developing new models of care that would be sustainable within and beyond the Vanguards [11]. Third, for its policy relevance: The NHS Long-Term Plan [12, p. 13] announced the commitment to spread the innovative practices piloted with the Vanguard initiative across England.
In the initial stages, the Vanguards were allowed to set their own objectives with some guidelines from NHS England. But by the final year, the funding of the sites were linked to demonstration of reduction in emergency admissions and hospital bed days [13]. Previously, Morciano et al. [14] documented how the Vanguard programme slowed the persistent rise in hospital emergency admissions observed in England [15] during the programme period. However, the overall modest net reductions in the emergency hospital admissions of Vanguard sites largely occurred in the final year of the programme. They were also heterogeneous across initiatives and among sites differently exposed to previous integration initiatives [16]. However, legacy effects of the programme are not yet known.
In the field of medicine, legacy effects of a therapy are treatment effects that persist or emerge some time after treatment ends [17]. In a narrative review, Folz and Laiteerapong [17] show that the duration of follow-up period to be examined can vary from 2–5 years to until decades after. There are examples from other fields such as public economics wherein, Roper and Hewitt-Dundas [18] examined the legacy effects of public subsidies on private innovation 4–6 years after the initial subsidy. In the case of policy experiments, we examine whether changes adopted during the Vanguard initiatives were integrated into general capabilities of the institution, and therefore evaluate legacy effects.
This paper builds on Morciano et al. [14] in two ways. First, we extended the period of analysis to assess whether the beneficial effects of the Vanguard programme persisted after the programme finished using a standard difference-in-differences setting. Our follow-up period spans from the end of the Vanguard programme to the start of the COVID-19 pandemic. Doing so we are able to distinguish between short-term effects (during the treatment period itself) and long-term effects of the programme (post-programme discontinuation). Second, we use conditional quantile regressions to assess whether the programme led to heterogeneous outcomes among treated sites during and after the programme compared to the levels observed in the pre-Vanguard period for untreated sites.
Theoretically, the effect on outcomes post-programme discontinuation may be ambiguous. Any sustainable organisational, managerial and/or technological changes made during the programme period might lead to persistent effects on outcomes.Footnote 1 But on the other hand, the support Vanguard sites received may have been pivotal to generating the beneficial effects seen at the time. Therefore, the effects may not persist without continued funding. Further, a stability (slow down) in net outcomes might also lead to an appearance of convergence (divergence) of trends between Vanguards and non-Vanguard sites. However, such convergence itself may come from well-performing or poor-performing sites. From a policymaker’s perspective, this insight is critical to knowing when to measure impact and when to discontinue investment.
Our difference-in-differences estimates show that after the end of the Vanguard programme average emergency admission rates were slower to increase among Vanguard relative to non-Vanguard sites. Furthermore, we find that the net reductions were greater at the upper end of the distribution (i.e., for sites with high admission rates). However, the net reduction in the post-Vanguard period became smaller and non-significant towards the end of the period we have covered, suggesting that the effects were lagged rather than permanent.
Vanguard programme: details
The genesis of the Vanguard programme came about in 2014 in the NHS England’s Five Year Forward View (FYFV) which recognised that instead of structural reform involving a ‘one size fits all’ model, new ways of working may need to be developed to improve care delivery [21]. Thus, the core objective of the Vanguard programme was to create integrated systems that join up different arms of health and care services through innovative models. There were two ‘population-based’ Vanguard schemes (Multi-speciality Community Providers (MCPs) and Primary and Acute Care Systems (PACS)). Population-based vanguards were aimed at moving specialist care for the general population out of hospitals and into the community by fostering closer collaboration between GPs, hospitals, communities and social care services. There was also the ‘care home’ Vanguard scheme (The Enhanced Care in Care Homes (ECH)) aimed at improving the quality and coordination of health, care and rehabilitation services for care home residents by increasing the medical support available and by promoting collaboration between the NHS, local authorities, the voluntary sector, carers and relatives [22, 23]. These were aimed at delivering integrated care in the community involving primary, secondary, social and community care. There were other Vanguards focused on improving coordination among hospitals and emergency services as well.Footnote 2 In all, 50 local areas were selected to act as Vanguards for the five proposed models. Subsequently, a support programme was devised to help develop and spread these new models of care within and beyond the Vanguards, which included a national lead for each model, support to develop logic models for local schemes, local account managers, learning and networking events, etc [16, 24]. The Vanguard programme also received substantial funding to support service changes within eligible sites. 
The total costs estimated by NAO include, direct costs at \(\pounds\)329 million and another \(\pounds\)60 million for national support and monitoring [25].
Data and descriptives
We received data from NHS England on monthly counts of emergency admissions from 01 April 2013 until just ahead of the pandemic, 01 January 2020.Footnote 3 The time horizon spans over 24 months before the introduction of the Vanguard programme, 36 months of the programme and 22 months after its termination. We focus on two ‘population-based’ Vanguard schemes as well as ‘care home’ Vanguard scheme.
As in Morciano et al. [14], the analysis is aggregated at site level. A treated site is defined as a set of practices within a Clinical Commissioning Group (CCG) that were exposed to the Vanguard programme. All practices in a CCG not exposed to the programme or a part of a CCG where some practices are not exposed to the programme, are classified as control sites.Footnote 4 We therefore observe 24 sites involved in ‘population-based’ models (PACS and MCP combined), five sites exposed to ‘care home’ (ECH), and 175 not exposed sites that form our control group.Footnote 5 Accordingly, our sample comprises 16,728 observations.
We measure hospital activity through Emergency Admissions (EA) which are those with a ‘specific acute’ treatment function code. Better integrated care in the community might plausibly affect (preventable) emergency route into hospital, less plausibly elective admissions. To account for different population sizes, we analysed EA rate per 1000 persons.
In Fig. 1, we report a time series plot of monthly EA rates observed in the treated and control groups. Emergency admissions were higher for the treated sites in the pre-intervention period. The population-based (PACS/MCP) sites follow a similar pre-intervention trend to the control groups, except just before the call for expressions of interest in the Vanguard programme was issued (November–December 2014). Emergency admission rates in the care home sites rose faster than the control sites just before the Vanguard programme started. However, we will later show in “Results”, through various parallel trends checks, that the overall pre-Vanguard trends across treated and control group were similar.
In line with what has already been reported [14], Fig. 1 shows Vanguard initiative slowed the rise in EA rates observed in England during the programme period in the treated groups, especially for care home Vanguard sites, closing the initial gap in EA rates with the non-Vanguard sites. After programme discontinuation, EA rates for care home Vanguard sites rose again, with the re-emergence of the initial gap. On the other hand, for population-based sites the converging trends which emerged in year 3 persisted in the post-Vanguard period.
One way to assess convergence in EA rates, evidenced by a reduction in dispersion or compression in the EA rates distribution over time, is by looking at trends in the 25th and 75th percentiles of logged EA rates by groups.Footnote 6
Among better performers (25th percentile, Fig. 2, panel A), an upward trend is found for both the treated and the control groups throughout the programme duration, which continues after its end. A slowdown in the rising trend is observed for care home Vanguards in the first year of the programme and after its termination. The better performing population-based sites had lower EA rates than non-Vanguards from the third year of the programme and after its end.
Poorest performing (75th percentile, Fig. 2, panel B) control sites also experienced rising trends. In comparison, care home Vanguards experienced a steady reduction in EA rates throughout the three years of the programme, before rising again after its end. Poorly performing population-based Vanguards experienced a slight increase in EA rates at the start of the programme, followed by a reduction around the final year and continues to remain stable for most of the post-Vanguard period.
These graphical representations indicate that convergence in EA rates emerges due to reductions in the poorest performing Vanguard sites. Regression analysis in the following section sheds further light upon these trends.
Empirical specification
To examine the net impact of Vanguard on hospital activity, we employ a two-way (site and month) fixed-effect OLS estimator in a difference-in-differences setup, using the following specification:
\(ln(Y_{it})\) identifies the logged outcome of interest (EA rates) for site i and month t. \(\alpha\) and \(\beta\) identify the site and month fixed effects, respectively. To account for factors that vary over time within site, we add controls (X) for site-level population structure as the monthly proportion of population by age-groups (0–24; 25–64; 65 and older). V identifies three groups: the control group of non-Vanguard sites (\(j = 0\)) and the two treated groups of sites exposed to population-based (\(j = 1\)) and care home (\(j = 2\)) Vanguards. P identifies programme timing in quarters: the pre-Vanguard period (\(k<0\)), quarter 0 to quarter 12 of the Vanguard programme (\(k=0,\ldots ,12\)) and the post-Vanguard period (\(k=13,\ldots ,18\)). The key parameters of interest are those associated with the interaction terms \(V_{i}\) and \(P_{t}\), \(\eta _{jk}\). Specifically, they measure the net change among population-based Vanguards in each quarter k (\(\eta _{1k}\)) of the programme and in the follow-up period compared to non-Vanguard sites (\(j = 0\)), compared to the gap between them in the pre-Vanguard period (\(k<0\)). Similarly, the net change among care home Vanguards is captured by \(\eta _{2k}\) for each quarter k.
A focus on the mean net impact of the programme may mask meaningful heterogeneous treated sites’ responses to the programme. We therefore present a model for conditional quantile regressions to estimate the effect of programme status on emergency admissions distribution. There are several merits to doing this. Firstly, the conditional mean is more prone to distorting effects of outliers, to which conditional quantiles are more robust. Secondly, conditional quantiles provide more valuable information about the full distributional impact of the programme.
Our approach is based on Machado and Silva [26]. Their model allows for additive fixed effects and multiple treatment groups both of which are relevant for our set up. As before, the treatment variable (\(V_{i}\)) refers to population-based or care home Vanguard sites versus a non-Vanguard site. The \(\tau\)-quantile distribution of our outcome of interest \(-ln(Y_{it})\), \(Q_{ln(Y_{it})}\) is defined as:
In the present paper, \(X_{it}\) includes time varying variables such as the proportion of population by age group and the interaction terms of Vanguard type and treatment period. \(Q_{ln(Y_{it})}(\tau |X_{it})\) is the quantile distribution of logged emergency admissions conditional on the location of \(X_{it}\). Whereas \(W_{t}\) indicates time fixed effects. Z is a k-vector of known differentiable transformations of X with element l, \(Z_{l}=Z_{l}(X)\). \((\alpha _{i} +\delta _{i}q(\tau ))\) is the scalar coefficient that provides an estimate of the fixed effect at quantile \(\tau\) for a given unit i. This represents the effects of time invariant unit specific characteristics which have variable impacts across different regions of the conditional distribution of outcome variable [26]. Accordingly, \(\alpha _{i}\) can be interpreted as the average effect for unit i. We use robust standard errors, and we did not cluster based on Abadie et al. and Roth et al. [27, 28]. We explain this further in “Appendix 3: Clustering standard errors”.
To infer about distributional effects of Vanguard programme to individual sites, we need to be able to assume rank preservation, i.e. the ranks of outcomes are same across treatment states. In the present context, this implies that better performing sites in the pre-Vanguard period, remain better performers having been selected into the programme. A less strict assumption that is often made in the literature on quantile treatment effects is one of rank similarity, which requires that there are no systematic deviations between distribution of outcomes across treatment states.Footnote 7 In the presence of rank preservation/similarity, quantile regressions would inform us of the following; (a) whether observed convergence is due to those at the upper or lower end of the distribution; (b) whether the effect of the programme varies across the distribution of the outcome variable. However, there are conditions under which rank similarity may not always hold.Footnote 8 Moreover, this is not always testable if systematic deviations may be caused by unobservables. In the absence of rank preservation, we can still make meaningful inferences about the effect of the programme on the overall distribution of the outcome variable [31].
Results
For simplified presentation, estimates are reported by quarters from/to programme’s start (quarter 0) in an event study format in Figs. 3 and 4. This representation allows the common trend assumption to be easily checked for quarters \(<0\). Moreover, it helps in detecting how the net impact of the programme evolves over time among treated sites versus the control group. It is evident from Fig. 3 that the parallel trend assumption holds in pre-Vanguard period for each Vanguard group and for all periods.Footnote 9
For the care home Vanguard group, there was a significant net decline in emergency admission rates from the later quarters of year 1, which persisted in year 2 (except quarter 7) and further declined in year 3. In the post-Vanguard period, the decrease appears to slow down. A net impact among population-based sites emerged only in year 3 (except quarters 10 and 12) and remained significant for most of the post-Vanguard period (except quarters 14 and 17).Footnote 10
Table 1 reports results from the quantile regressions for the 10th, 25th, 50th, 75th and 90th quantiles estimated using Eq. 2. For population-based sites, at the end of year 3 (11th and 12th quarter), there appear heterogenous effects of increasing magnitude at higher quantiles, with the trend reversing/remaining stable in the post-Vanguard period (quarters 13, 15, 16, 18). Similarly for care home Vanguards, in the initial years the net effects are higher when moving to higher quantiles, but in the third year the differences across quantiles are stabilised or reversed. Towards the end of the post-Vanguard period, the increasing net impact across quantiles appeared again.
These results suggest that the effect of the programme is heterogenous with the effect being greater for poor-performing sites (for most of the programme duration). For care home vanguards, the net reduction in emergency admissions emerged mainly from early improvements in poor-performing sites. Under the assumption of rank preservation this would imply that the Vanguard programme had a greater positive effect upon pre-existing poorer performers than those that were already performing well in the pre-Vanguard period.
Figure 4 reports on graphical presentation of estimates for the 25th (panel A) and 75th (panel B) percentiles from Table 1. At the 25th percentile, we found no significant differences among treated sites in the pre-Vanguard period. Further, we observe no significant impact of population-based or care home Vanguard initiatives for the first two years of the programme (except a transient significant effect in quarter 2 of year 1 for population-based Vanguards and quarter 4 for care home Vanguards). Significant net impacts emerged from the beginning of year 3 and remained significant at 95% level up to three quarters of the post-Vanguard period.
At the 75th percentile, pre-Vanguard parallelism holds for care home Vanguards and for population-based Vanguards with the exception of the 5th quarter in the pre-Vanguard period. We found a significant net reduction in EA rates for care home vanguards that started around the mid of year 1 and remained more or less soFootnote 11 for the entire programme duration. On the other hand, population-based sites register a net reduction for most of year 3, with a lagged effect that continues for most of the post-Vanguard period.Footnote 12
We do an additional sensitivity check to establish how the programme affected sites that were better/poor performers before the start of the programme. This was done by assigning site membership into quantiles based on monthly values of emergency admissions from April 2013-March 2014 (first year of pre-vanguard period). Given the transient movement across quantiles, we focused on sites that were consistently in the 25th (75th) quantile for at least 6 months (50%) in that year.Footnote 13 The dependent variable was computed by taking the difference in the mean observed log of emergency admission rates at a point of time beginning April 2014 until January 2020, between sites that were pre-determined to be in the top and bottom quartiles. The estimation was carried out using a feasible generalised least squares method and imposing an error structure with heteroskedastic panels.
The results from quantile regressions (Table 1) slightly differ in magnitude (but the statistical significance of the parameters of interest diluted significantly) from the estimates obtained when quantile membership is pre-determined according pre-vanguard EA rates (results available upon request). There are two main reasons for this. First, there are methodological differences. Computing average treatment effects for pre-determined quantiles through ordinary least squares involves minimising sum of squared residuals, whereas quantile regression computes estimates by minimising the sum of absolute residuals. Second, since assigning membership on pre-vanguard data severely restricts our sample (6 population-based sites and 2 care home sites), thereby leading to small sample/aggregation bias seen in “Appendix 1: Testing aggregation bias”. For this reason, we preferred to draw our inferences from the analysis of disaggregated data.
Age group analysis
We next present differences of policy effects by age group. We do this by estimating our original difference-in-differences regression as per Eq. 1 with the dependent variable as total EA rates per 1000 persons in each of the three age groups. We control for site level-population structure in other age groups. The results are presented in Fig. 5.
These results show that among the youngest age group (0–25), the effect of care home Vanguards upon emergency admissions reduction is significant for a few quarters (4, 6, 11, 12) during the programme and to a limited extent in the post-Vanguard period (13 and 15 quarters). However, there are no significant effects of the population-based vanguards on emergency admissions amongst the youngest cohort. In case of the oldest population group, significant effects of care home Vanguard on emergency admissions are evident early in the programme (from quarter 2). While the effects in the 6th and 7th quarters are not significant, there is a downward trend afterwards until the end of the programme. After programme discontinuation, the reductions in emergency admissions tend to remain significant and fade only slightly in the last three quarters. In contrast, the effect of population-based vanguards among the oldest cohort is not so clear. During the programme the effects appear insignificant (except quarter 9). Though in the post Vanguard period, the lower admissions rates appear to persist until the last two quarters. For the adult age cohort (25–64), the effects of care home Vanguard on emergency admissions appear similar to the oldest group in the early stages of the programme, with a clear downward trend after the 7th quarter. In the post-programme period, the effects remain significant and the EA rates appears to become slightly higher but remain below pre-Vanguard levels. For population-based vanguards, significant reductions in emergency admissions for adults appear only at the end of the programme (quarters 9 and 11) and persist at the same levels in the post Vanguard period. Detailed regression output is in Table 2.
Overall, our results suggest that persistent effects come from care home Vanguards benefiting the oldest cohorts the most. This might explain why the average effect on care homes in the post Vanguard period appears to disappear more rapidly (in Fig. 1), since it is likely that the reductions for the older cohorts were muted by sharper increases in EA rates among the younger cohort. This also supports our understanding that care homes have more older residents and/or those more vulnerable to emergency admissions, and thus the targeted nature of interventions produce effects early on and may persist for longer for this group.
Limitations
There are a few limitations to this study worth noting. Our present results show the aggregate effect of integration upon emergency admissions without accounting for changes in case mix. Beyond reducing the number of hospital admissions, integration programmes may have led to less severe and/or shorter hospital days. While we do not have access to additional data on these aspects of emergency admissions, we expect that these factors would vary little over the study period. As such, these may be absorbed to some extent within area fixed effects and common time effects that we have included in our estimation.
Another limitation might come from confounding effects of other policies concomitant during the Vanguard period, implying that the legacy might not be from Vanguard alone. These effects to some extent may be captured in our econometric approach, since we include month fixed effects and if such policies targeted Vanguard sites, then they would be absorbed by site fixed effects. As an example of such a policy, we consider the Pioneer programme, aimed at promoting horizontal integration between health and social care systems [16], which ran from 2013 to 2018 (for details, see “Appendix 2: Effects of confounding policies”). We exploit the partial overlapping of the Pioneer programme with the Vanguard by running our main specification (1) using the sub-sample of sites that were not involved in the Pioneer programme. The results are confined in Table 4 in “Appendix 2: Effects of confounding policies”. We found that non-pioneer sites had detectable effects that continued after the programme termination, although the magnitude of the effects becomes smaller and becomes non-significant at the end of the sample period.
We report evidence of legacy effects for two years after the end of the programme, but before the onset of COVID-19 which is known to have altered activities especially in confined environments like care homes [33]. Whether initiatives such as Vanguard have helped institutions to be more resilient to pandemic is difficult to assess and beyond the scope of this paper.
Finally, we are also unable to specify the precise mechanism through which these legacy effects may be generated. This is because it is difficult to ascertain how funding was utilised to make the necessary changes due to the lack of financial accountability of spending patterns for the Vanguard initiatives [25].Footnote 14
Discussion
We examined the follow-up effects of Vanguard—an integrated care pilot programme that was active in England between 2015 and 2018. Since we report (in “Data and descriptives” section) higher emergency admissions among the treated sites in the pre-intervention period, it may then be expected, that with improved coordination in delivery of care, better care may reduce the likelihood of admissions within these sites. While confirming previous results in Morciano et al. [14], our expanded analysis reported lagged effects in the six quarters following the end of the programme. Care home sites are vulnerable to high levels of emergency admissions, given their residents are mainly older people [35, 36]. Therefore, focused interventions on a ‘high-risk’ population, living in confined environments such as care homes, are likely to produce detectable effects upon hospital activities quicker than wider population-based interventions. Our age group analysis supports this hypothesis.
Consistent with this, our analysis demonstrated that care home Vanguards showed significant reductions beginning early in the programme but falling away more rapidly after programme discontinuation. As care home sites catered to a more vulnerable category of the population, continued funding and integration support may have been critical to sustaining the effects seen during the programme. Moreover, care homes being smaller organisations might be less able to invest the integration funds towards making lasting changes, and so may have been more reliant on continued funding support.
Moreover, our analysis showed that for most of the programme, and to some extent in the post-Vanguard period, the reductions in hospital admission rates were greater in magnitude among poor-performing sites. These outcomes may have been influenced by the non-pecuniary support associated with the programme.Footnote 15
Our results indicate that the programme has enabled reductions in hospital admission rates which have persisted beyond the programme period, supporting the thesis of legacy effects. More importantly, we found that the net reductions tended to fade away over time after the end of the programme and towards the end of the period we have covered. This suggests that integration efforts may have had lagged effects, but the effects were transitory.
Earlier evaluations have demonstrated that given the numerous hurdles in the integration process, it could take 5–6 years for any meaningful impacts of these efforts to show up [10]. Longer term engagements may allow enough time for stabilisation of processes. Our study has been able to demonstrate that some effects persist in the immediate follow-up period. From the perspective of policy evaluation, there may be a case for a longer term horizon to arrive at any clear conclusions about policy impact.
Finally, though the initial vision was to scale up all Vanguards to the national level, however this was implemented for only one of the models—ECH [12]. This is said to have been enabled as a direct result of Vanguard’s legacy in fostering strong relationships across service partner organisations [37]. But also since at the outset, populations in care homes are well-defined, homogeneous and services in this area were more underdeveloped to start with [14]. This indicates more effective population-based approaches to integrated care delivery may need to be explored again in the future.
Data availability and materials
The data that support the findings of this study are not publicly available but can be requested through NHS England. Program codes are available upon request.
Notes
These were 8 Urgent and Emergency Care Vanguards, aimed at improving coordination and reducing pressure on Accident and Emergency departments, and 13 Acute Care Collaborations, working to link hospitals and improve both clinical and financial viability.
We received data up to March 2020, but drop the last month because of the possibility of recording discrepancies. Moreover, using pre-pandemic data avoids pandemic-related bias in our estimation of the effect of programme discontinuation.
Where a CCG contains a Vanguard site alongside other practices not exposed to the programme, the same CCG has two entries. These entries differ in population structure, corresponding to the participating and non-participating areas.
The raw data included 184 sites coded as non-Vanguards. We dropped 5 sites with missing population values and a further 4 sites with an unbalanced number of observations. Note that these sites represent groups of practices; we do not observe data at the individual care home or practice level.
The advantage of this approach over synthetic measures of inequality computed from the underlying distribution (e.g. the interquartile range, standard deviation or Gini coefficient) lies in the possibility of assessing graphically whether convergence occurred mainly because of changes among the best (25th percentile) or the worst (75th percentile) performing sites.
We may assume this condition holds if pre-determined variables are unlikely to be affected by treatment [29]. In our context, this implies that if variables such as area-level population structure, level of deprivation and disease prevalence are unaffected by treatment, then rank similarity may not be violated.
For example, Frandsen and Lefgren [30] show this in the context of a class-size assignment experiment on student test scores, as the distribution of students with lower scores is not invariant to predetermined family income levels. Therefore, treatment (assignment to smaller classes) may benefit struggling students from low-income backgrounds more than those from high-income backgrounds.
The null hypothesis of parallel trends was not rejected at the 5% significance level for the population-based (p value = 0.94) or care home (p value = 0.07) Vanguard sites, nor by the event history analysis reported in Fig. 3. Note, though, that the parallel trends test for care home Vanguards is marginally rejected at the 10% level. To check the extent to which our results are sensitive to violations of post-treatment parallel trends, we follow the method proposed by Rambachan and Roth [32] and discuss this in “Appendix 4: Parallel trends in the pre-Vanguard period”. As an additional robustness check, we exclude the initial quarter (April–July 2013) and report the resulting event history estimates in Fig. 6 in the same appendix.
The full list of results from this estimation is presented in column 1, Table 3 in “Appendix 1: Testing aggregation bias”. We also contrast these results with those obtained from aggregating data at Vanguard group level and discuss the resultant aggregation bias.
The effect is insignificant at the end of year 2 and beginning of year 3 (quarters 7, 8 and 9).
The effect is insignificant for quarter 10 in year 3. In the post-Vanguard period, the effect is significant for all, except quarters 14 and 17.
As mentioned before, the absence of any movements would be evidence of rank preservation in its strictest sense.
An alternative is to consider funding allocation across Vanguard sites. However, funding itself was endogenous to the performance of any single Vanguard site, with sites deemed not to be performing at the end of year one denied funding for year two [11]. Qualitative research confirms that additional funding was considered crucial and many initiatives were downgraded/withdrawn when funding was withdrawn [34].
Previous qualitative research [11] has also indicated that local support structures that were put in place had valuable effects in terms of enhancing trust and enthusiasm among participating sites. However, given the unmeasurable nature of these inputs, we were not able to quantify their impact upon outcomes for Vanguard sites. Nevertheless, these are likely to have reduced between-group variation while being more beneficial to the worst performing sites, in line with what we have found with the quantile regression analysis.
Roth [39] notes that most previous studies use the significance of individual coefficients in the pre-treatment period as the criterion for parallel trends tests.
References
Jowell, R.: Trying it out: the role of ‘pilots’ in policy making: Report of a review of government pilots. Technical Report, Cabinet Office, Strategy Unit, United Kingdom (2003)
Checkland, K., Hammond, J., Coleman, A., Macinnes, J., Mikelyte, R., Croke, S., Billings, J., Bailey, S., Allen, P.: ‘Success’ in policy piloting: process, programmes and politics. Public Admin. 2021, 1 (2021)
Rocks, S., Berntson, D., Gil-Salmerón, A., Kadu, M., Ehrenberg, N., Stein, V., Tsiachristas, A.: Cost and effects of integrated care: a systematic literature review and meta-analysis. Eur. J. Health Econ. 21(8), 1211–1221 (2020)
Tsiachristas, A., Stein, K.V., Evers, S., Rutten-van Mölken, M.: Performing economic evaluation of integrated care: Highway to hell or stairway to heaven? Int. J. Integr. Care 16(4), 3 (2016)
Quimbo, S., Wagner, N., Florentino, J., Solon, O., Peabody, J.: Do health reforms to improve quality have long-term effects? Results of a follow-up on a randomized policy experiment in the Philippines. Health Econ. 25(2), 165–177 (2016)
Celhay, P.A., Gertler, P.J., Giovagnoli, P., Vermeersch, C.: Long-run effects of temporary incentives on medical care productivity. Am. Econ. J. Appl. Econ. 11(3), 92–127 (2019)
Mounier-Jack, S., Mayhew, S.H., Mays, N.: Integrated care: learning between high-income, and low- and middle-income country health systems. Health Policy Plan. 32(suppl-4), 46–412 (2017)
Wodchis, W.P., Dixon, A., Anderson, G.M., Goodwin, N.: Integrating care for older people with complex needs: key insights and lessons from a seven-country cross-case analysis. Int. J. Integr. Care 15, 1 (2015)
Pearson, C., Watson, N.: Implementing health and social care integration in Scotland: renegotiating new partnerships in changing cultures of care. Health Soc. Care Commun. 26(3), e396–e403 (2018)
Lewis, R.Q., Checkland, K., Durand, M.A., Ling, T., Mays, N., Roland, M., Smith, J.A.: Integrated care in England—What can we learn from a decade of national pilot programmes? Int. J. Integr. Care 21(4), 1 (2021)
Checkland, K., Coleman, A., Billings, J., Macinnes, J., Mikelyte, R., Laverty, L., Allen, P.: National evaluation of the vanguard new care models programme. interim report: understanding the national support programme. Technical Report, University of Manchester, Manchester, England (2019)
NHS: The NHS long term plan. Technical Report, National Health Service England. Retrieved on 3 December 2021 from https://www.longtermplan.nhs.uk/wp-content/uploads/2019/08/nhs-long-term-plan-version-1.2.pdf (2019)
NHS: Next steps of the NHS five year forward view. Technical Report, National Health Service England. Retrieved on 20 September 2023 from https://www.england.nhs.uk/wp-content/uploads/2017/03/NEXT-STEPS-ON-THE-NHS-FIVE-YEAR-FORWARD-VIEW.pdf (2017)
Morciano, M., Checkland, K., Billings, J., Coleman, A., Stokes, J., Tallack, C., Sutton, M.: New integrated care models in England associated with small reduction in hospital admissions in longer-term: a difference-in-differences analysis. Health Policy 124(8), 826–833 (2020)
Deeny, S., Thorlby, R., Steventon, A.: Briefing: Reducing Emergency Admissions: Unlocking the Potential of People to Better Manage Their Long-Term Conditions. The Health Foundation, London (2018)
Morciano, M., Checkland, K., Durand, M.A., Sutton, M., Mays, N.: Comparison of the impact of two national health and social care integration programmes on emergency hospital admissions. BMC Health Serv. Res. 21(1), 1–10 (2021)
Folz, R., Laiteerapong, N.: The legacy effect in diabetes: are there long-term benefits? Diabetologia 64, 2131–2137 (2021)
Roper, S., Hewitt-Dundas, N.: The legacy of public subsidies for innovation: input, output and behavioural additionality effects. ERC Research Paper, 21 (2016)
Gertler, P.J., Martinez, S.W., Rubio-Codina, M.: Investing cash transfers to raise long-term living standards. Am. Econ. J. Appl. Econ. 4(1), 164–92 (2012)
Baird, S., McIntosh, C., Özler, B.: When the money runs out: Do cash transfers have sustained effects on human capital accumulation? J. Dev. Econ. 140, 169–185 (2019)
NHS: NHS five year forward view. Technical Report, National Health Service England. Retrieved on 9 August 2023 from https://www.england.nhs.uk/wp-content/uploads/2014/10/5yfv-web.pdf (2014)
NHS: The framework for enhanced health in care homes. Technical Report, National Health Service England. Retrieved on 20 September 2023 from https://www.england.nhs.uk/wp-content/uploads/2016/09/ehch-framework-v2.pdf (2016)
NHS: The New Care Models: Vanguards-developing a blueprint for the future of NHS and care services. Technical Report, National Health Service England. Retrieved on 1 March 2022 from https://www.england.nhs.uk/wp-content/uploads/2015/11/new_care_models.pdf (2016)
NHS: The forward view into action: new care models: Support for the vanguards. Technical Report, National Health Service England. Retrieved on 11 August 2023 from https://www.england.nhs.uk/wp-content/uploads/2015/12/acc-uec-support-package.pdf (2015)
NAO: Developing new care models through NHS vanguards. Technical Report, National Audit Office, United Kingdom (2018)
Machado, J.A., Silva, J.S.: Quantiles via moments. J. Econ. 213(1), 145–173 (2019)
Abadie, A., Athey, S., Imbens, G.W., Wooldridge, J.M.: When should you adjust standard errors for clustering? Q. J. Econ. 138(1), 1–35 (2023)
Roth, J., Sant’Anna, P.H., Bilinski, A., Poe, J.: What’s trending in difference-in-differences? A synthesis of the recent econometrics literature. J. Econ. 235(2), 2218–2244 (2023)
Schiele, V., Schmitz, H.: Quantile treatment effects of job loss on health. J. Health Econ. 49, 59–69 (2016)
Frandsen, B.R., Lefgren, L.J.: Testing rank similarity. Rev. Econ. Stat. 100(1), 86–91 (2018)
Callaway, B., Li, T.: Quantile treatment effects in difference in differences models with panel data. Quant. Econ. 10(4), 1579–1618 (2019)
Rambachan, A., Roth, J.: A more credible approach to parallel trends. Rev. Econ. Stud. 90(5), 2555–2591 (2023)
Morciano, M., Stokes, J., Kontopantelis, E., Hall, I., Turner, A.J.: Excess mortality for care home residents during the first 23 weeks of the COVID-19 pandemic in England: a national cohort study. BMC Med. 19, 1–11 (2021)
Checkland, K., Coleman, A., Croke, S., Billings, J., Mikelyte, R., Macinnes, J., Allen, P., Morciano, M., Jones, K., Malisauskaite, G., Sutton, M.: National evaluation of the vanguard new care models programme: final report. Technical Report, University of Manchester, Manchester, England (2022) (in press)
Smith, P., Sherlaw-Johnson, C., Ariti, C., Bardsley, M.: Focus on: hospital admissions from care homes. Quality Watch: The Health Foundation, The Nuffield Trust: London, UK (2015)
Wolters, A., Santos, F., Lloyd, T., Lilburne, C., Steventon, A.: Emergency admissions to hospital from care homes: how often and what for? Health Foundation London (2019)
MacInnes, J., Billings, J., Coleman, A., Mikelyte, R., Croke, S., Allen, P., Checkland, K.: Scale and spread of innovation in health and social care: insights from the evaluation of the New Care Model/Vanguard programme in England. J. Health Serv. Res. Policy 13558196221139548. PMID: 36631723 (2023)
Garrett, T.A.: Aggregated versus disaggregated data in regression analysis: implications for inference. Econ. Lett. 81(1), 61–65 (2003)
Roth, J.: Pre-test with caution: event-study estimates after testing for parallel trends. Am. Econ. Rev. Insights 4(3), 305–22 (2022)
Crawford, R., Stoye, G., Zaranko, B.: Long-term care spending and hospital use among the older population in England. J. Health Econ. 78, 102477 (2021)
Acknowledgements
VW did the statistical analysis and led the design and writing of the paper with MM overseeing all phases of this research. KC and MS contributed to the design and contributed to writing the paper. The authors are grateful to the wider team members for their suggestions in all phases of the research. The authors read and approved the final manuscript.
Funding
This research was funded by the National Institute for Health and Care Research (NIHR) Policy Research Programme, through the ‘National evaluation of the Vanguard New Care Models Programme’, PR-R16-0516-22001 (VW, MM, KC and MS); the NIHR Research for Social Care within the Research for Patient Benefit (RfPB) Programme, through the ‘Supporting the spread of effective integration models for older people living in care homes: A mixed method approach’ project, NIHR201872 (MM, KC); and the NIHR Applied Research Collaboration for Greater Manchester, NIHR200174 (MS and MM). The views expressed are those of the author(s) and not necessarily those of the NIHR or the Department of Health and Social Care.
Ethics declarations
Conflict of interest
The authors have no competing interests to declare that are relevant to the content of this article.
Ethical approval
Not applicable
Additional information
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Appendices
Appendix 1: Testing aggregation bias
We checked for aggregation bias by comparing the overall results from Eq. 1 with those obtained from group-level estimates. The latter involves obtaining mean values of emergency admissions for the \(V_{j}\) groups at each point in time. Since there are 3 groups (2 treatment and 1 control) spanning 81 months, we have 243 observations. Given this method of aggregation, we do not include time fixed effects in the estimations. With a short panel and a relatively long time series, we implemented a feasible generalised least squares (FGLS) estimation. Table 3 contrasts our main estimates displayed in the paper (column 1) with group-level estimates obtained with and without assuming potential heteroskedasticity across panels (columns 2 and 3).
We found that the panel-level heteroskedasticity assumption (column 2 vs column 3) has no impact on our estimates of interest. The group-level analysis confirms the magnitude of the findings reported in the paper: mean emergency admission rates slowed for care home Vanguards in the early stages of the programme, whereas a net reduction emerged mainly in the third and final year for population-based Vanguards. The group-level analysis also confirms the presence of legacy and/or lagged effects after programme discontinuation. However, the information loss from aggregation may have reduced the precision of the estimated parameters and their statistical significance; this may be linked to higher standard errors in the aggregated equation due to correlation of residuals across the disaggregated regressions [38]. As such, in the group-level analysis the Vanguard programme shows little to no significant effect for either Vanguard type during the programme. In the post-Vanguard period, the slowdown appears to continue for care home Vanguards, although the magnitude of the effect becomes smaller. For population-based sites, we found a consistent net reduction over the post-Vanguard period, which is estimated to be significant in quarters 13, 15 and 16.
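To illustrate the aggregation step, the following is a minimal numpy sketch on simulated data — not the paper's data or code, and a simple two-by-two comparison of means rather than the FGLS specification above. It shows why, with a balanced panel, averaging sites into group-level time series leaves the difference-in-differences point estimate unchanged, so aggregation affects precision rather than the estimand.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated balanced panel: 30 sites (10 treated), 24 months,
# treatment begins at month 12 with a true effect of -2.0.
n_sites, n_months, t0, effect = 30, 24, 12, -2.0
site_fe = rng.normal(0, 1, n_sites)
month_fe = rng.normal(0, 1, n_months)
treated = np.arange(n_sites) < 10

y = (site_fe[:, None] + month_fe[None, :]
     + effect * (treated[:, None] & (np.arange(n_months) >= t0))
     + rng.normal(0, 0.5, (n_sites, n_months)))

# Site-level DiD estimate (two-by-two of means).
did_site = ((y[treated][:, t0:].mean() - y[treated][:, :t0].mean())
            - (y[~treated][:, t0:].mean() - y[~treated][:, :t0].mean()))

# Group-level estimate: average sites into group time series first.
g_treat = y[treated].mean(axis=0)
g_ctrl = y[~treated].mean(axis=0)
did_group = ((g_treat[t0:].mean() - g_treat[:t0].mean())
             - (g_ctrl[t0:].mean() - g_ctrl[:t0].mean()))

print(np.isclose(did_site, did_group))  # True: point estimates coincide
```

What changes under aggregation is the effective sample: 243 group-month cells carry less information than the site-level panel, which is consistent with the larger standard errors reported in columns 2 and 3 of Table 3.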
Appendix 2: Effects of confounding policies
An integrated care initiative that closely preceded the Vanguard programme, the Integrated Care and Support Pioneer programme, ran from 2013 to 2018. The two programmes therefore overlapped in time and, in some instances, in place. Morciano et al. [16] previously compared the effects of the Pioneer and Vanguard programmes and found that emergency admission rates grew less over time in sites that participated in both.
In our sample, 66 sites (including Vanguards and non-Vanguards) were also exposed to the Pioneer programme. Since participation in the Pioneer programme may have produced stronger legacy effects, we re-estimate our main model (Eq. 1) on the sites that never participated in it. In Table 4, we report the results from our main sample and compare them with the ‘no Pioneers’ sample, which contains 13 population-based Vanguards and 3 care home Vanguards. For population-based Vanguards, the effects previously seen in the last few quarters before the end of the programme (first column) are no longer significant, but the legacy effects remain evident (particularly strong in quarters 15 and 18).
For care home sites, the effects in the ‘no Pioneers’ sample follow a similar pattern to the main sample, with legacy effects that remain significant until the last two quarters of the sample. The magnitude of the effect appears slightly larger during the programme, shrinks after the end of the programme, and becomes non-significant in the last two quarters.
Our analysis suggests that, possibly due to their head start in integrated care initiatives, Pioneer sites may have contributed to more stable and longer-lasting effects. However, our evidence also indicates that non-Pioneer sites had detectable effects (particularly for care home Vanguards) that continued up to four quarters after the end of the programme.
Appendix 3: Clustering standard errors
Recent econometric research has debated whether and when clustering of standard errors is appropriate. Abadie et al. [27] suggest that the decision to cluster should be based on treatment assignment or data sampling. In our study, we have total population data, but treatment assignment happens at the site level (i.e., several sites applied to be Vanguards and 50 were selected). This leaves us with few treated clusters: 24 population-based sites and 5 care home sites. When there are few clusters, Abadie et al. [27] note that clustering leads to conservative estimates, and Roth et al. [28] indicate that this may lead to under-rejection of the null of parallel trends.
We cluster at the site level and report the estimates in Table 5 to show how clustered standard errors compare with robust standard errors.
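The mechanics behind this comparison can be sketched with a small numpy implementation on simulated data. This is an illustration, not the code behind Table 5; the data-generating process (a shock shared within each cluster) is an assumption chosen to make the within-cluster correlation visible.

```python
import numpy as np

rng = np.random.default_rng(1)

# 20 clusters (sites), 30 observations each; a cluster-level shock
# makes residuals correlated within site.
G, n = 20, 30
cluster = np.repeat(np.arange(G), n)
x = rng.normal(size=G * n)
u = rng.normal(0, 1, G)[cluster] + rng.normal(0, 1, G * n)
y = 1.0 + 0.5 * x + u

X = np.column_stack([np.ones_like(x), x])
beta = np.linalg.solve(X.T @ X, X.T @ y)
resid = y - X @ beta
XtX_inv = np.linalg.inv(X.T @ X)

# Heteroskedasticity-robust (HC0) variance: each observation
# contributes its own squared residual.
V_hc0 = XtX_inv @ (X.T * resid**2) @ X @ XtX_inv

# Cluster-robust variance: sum score outer products by cluster, so
# within-cluster covariance of residuals is retained.
meat = np.zeros((2, 2))
for g in range(G):
    Xg, ug = X[cluster == g], resid[cluster == g]
    s = Xg.T @ ug
    meat += np.outer(s, s)
V_cl = XtX_inv @ meat @ XtX_inv

se_hc0 = np.sqrt(np.diag(V_hc0))
se_cl = np.sqrt(np.diag(V_cl))
print(se_hc0, se_cl)
```

With positive within-cluster correlation the clustered standard errors are substantially larger than the robust ones, which is the conservative pattern noted by Abadie et al. [27] when clusters are few.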
Appendix 4: Parallel trends in the pre-Vanguard period
To establish the causal effect of the programme, a crucial assumption is that the trends of the treatment and control groups would have been parallel in the post-treatment period in the absence of the intervention. Since this is untestable, the alternative is to test whether the trends were parallel in the pre-treatment period. In Fig. 3, we observed that the interaction terms estimated for the pre-treatment period (quarter \(<0\)) were not significant for either population-based or care home Vanguards (Footnote 16).
As a robustness check, we drop the initial quarter (April–July 2013) instead of the first quarter before treatment and report the estimates from the event history analysis in Fig. 6. We find no significant difference at the 5% level in the estimated parameters for the two treated groups in the pre-treatment period.
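The event-study normalisation underlying this check can be sketched for a simple two-group case on simulated data. This is an illustration only, not the paper's full specification (which includes site and time fixed effects): with coefficients normalised to the last pre-treatment quarter, pre-period estimates near zero support parallel trends, while the post-period estimates recover the treatment effect.

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulated two-group panel: 12 pre-treatment and 8 post-treatment
# quarters; a true effect of -1.5 applies only from quarter 0 onwards.
T_pre, T_post, effect = 12, 8, -1.5
quarters = np.arange(-T_pre, T_post)
n = 500  # sites per group
common = rng.normal(0, 1, quarters.size)  # shared quarterly shocks
y_ctrl = common + rng.normal(0, 1, (n, quarters.size))
y_treat = (common + np.where(quarters >= 0, effect, 0.0)
           + rng.normal(0, 1, (n, quarters.size)))

# Event-study coefficients, normalised to the last pre-treatment
# quarter (q = -1), as in a figure like Fig. 3.
base = quarters.tolist().index(-1)
coef = ((y_treat.mean(axis=0) - y_treat.mean(axis=0)[base])
        - (y_ctrl.mean(axis=0) - y_ctrl.mean(axis=0)[base]))

pre = coef[quarters < 0]
post = coef[quarters >= 0]
print(pre.round(2))   # near zero: consistent with parallel pre-trends
print(post.round(2))  # close to the true effect of -1.5
```

Dropping an early quarter, as in the robustness check above, amounts to recomputing these coefficients on a shortened pre-period and verifying the pre-treatment estimates remain insignificant.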
Earlier in the paper (in “Results”), we noted that the p value of the parallel trends test for care home Vanguards is marginal. We therefore follow Rambachan and Roth [32] to show the extent to which our results are robust to violations of parallel trends in the post-Vanguard period. There may be cause for concern if confounders, in the form of other trends in the health and/or social care sector, systematically affect one group more than the other (for example, trends in social care funding may affect care home sites independently of the Vanguard programme). We carry out a sensitivity check that imposes smoothness restrictions, i.e. bounds on the extent to which the slope of the difference in trends can vary across consecutive periods. We compute this for the first post-Vanguard period (quarter \(=12\)) for care home Vanguards, shown in Fig. 7. The plot shows that the significant results for care home Vanguards are robust up to \(M\approx 0.002\). The confidence intervals allowing for linear violations of parallel trends (\(M\approx 0\)) are similar to OLS, but they widen as we allow larger deviations from linearity. Overall, our results remain robust provided deviations from linearity do not exceed 0.002 across consecutive periods.
To put these results in context, we return to possible confounders of the underlying linear trends, such as changes in social care funding. Earlier studies of the effect of cuts to long-term care funding on hospital use among the elderly in England, such as Crawford et al. [40], found no significant effects on mean emergency admissions, but did find significant effects for short-stay emergency admissions (less than 3 days). Their estimates suggest that a \(\pounds\)100 decrease in per capita long-term care spending led to an increase of 0.002 in emergency admissions lasting less than 1 day and an increase of 0.004 in emergency admissions lasting 3 days or less. If we interpret this as the effect of changes in underlying trends, then our sensitivity checks suggest that the core results are robust to the slope of the differential trend changing by the equivalent of the full effect on admissions lasting less than 1 day, and up to 50% of the effect on admission spells lasting up to 3 days.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
About this article
Cite this article
Wattal, V., Checkland, K., Sutton, M. et al. What remains after the money ends? Evidence on whether admission reductions continued following the largest health and social care integration programme in England. Eur J Health Econ (2024). https://doi.org/10.1007/s10198-024-01676-0