We conducted a feasibility study to assess how the programme’s impact could be robustly evaluated. The feasibility study ran between May 2022 and March 2023 with 15 local authorities (LAs) that had been awarded funding to implement Staying Close. We carried out a range of qualitative and quantitative work and drew on findings from both strands to make design recommendations for the next stage of evaluation, due to start in October 2023. This project was supported by the Department for Education (DfE), which developed Staying Close to provide support for young people leaving residential care and transitioning to independence.
The feasibility study concluded that the evaluation of the programme should focus on not in education, employment or training (NEET) status and accommodation outcomes. These outcomes fall outside the expertise of Foundations, the What Works Centre for Children and Families.
Social connectedness, which falls within Foundations’ area of expertise, was ruled out as a main outcome for the evaluation because historical data is required to conduct a quasi-experimental design. To ensure the most robust evaluation, and given its expertise in measuring outcomes similar to EET status and accommodation, Foundations concluded that the Centre for Homelessness Impact (CHI) would be best placed to continue the evaluation of Staying Close, and the evaluation is now being run by CHI.
The main objectives of the feasibility study were:
- To improve theoretical understandings of Staying Close and explore how LAs may vary in their implementation of the programme
- To identify the most suitable research questions and evaluation methods that can be used in future impact evaluations
How we went about it
The feasibility study involved two strands of work. The first strand involved theory-building, and focused on reviewing Staying Close pilot evaluations and bids from LAs already implementing the programme, and on interviews with staff. These activities helped identify variations in implementation based on each LA’s unique context, and which programme activities were linked to desired outcomes (also known as mechanisms of change). We were also able to identify barriers and facilitators to implementation through this work.
Our second strand focused on design testing. We systematically ranked potential design options using set criteria and identified the most suitable methods for a future impact evaluation. We also conducted data scoping work, in which we identified data sources containing the variables of interest and tested collecting administrative and survey data from LAs.
Through our first strand of work (theory-building), we found:
- Local variations in Staying Close implementation were found in accommodation, relationships, wellbeing, independent living skills, and education, employment or training (EET) support.
- Across LAs, the mechanisms through which programme activities achieved outcomes were care leavers’ awareness of the programme, their experience of the programme and the take-up of the programme within an LA (all influencing the extent of engagement with programme activities). We also identified that stable and suitable accommodation, trusted relationships with staff and improved wellbeing acted as both mechanisms and short-term outcomes for young people.
- Barriers to implementation included limited housing availability, high staff turnover, unfilled specialist staff roles, poor engagement with care leavers in programme development and suboptimal matching of different young people in shared accommodation. Facilitators included suitable accommodation being available within the LA, staff training in trauma-informed care and co-production with care leavers.
Through our second strand of work (design testing), we found:
- The most suitable methods for a future impact evaluation would be a randomised controlled trial (RCT) or, if randomisation were not possible, a difference-in-differences (DiD) design. An RCT involves randomly assigning LAs to implement Staying Close or not. For a DiD, young people receiving Staying Close are compared with young people with similar characteristics from LAs expected to show the same outcomes as the treatment LAs had the programme not been implemented.
- While administrative data was of sufficient quality and reliability, potential barriers to securing it include low staff capacity, internal miscommunication and high turnover within teams. National databases would also not be compatible with DfE reporting requirements.
- Collecting survey data for primary outcomes would pose substantial risks to the evaluation, but this risk can be offset by using administrative data for primary outcomes and including formative questions to identify effective data collection strategies.
As mentioned, future evaluations will be handed over to CHI. Based on our findings, we make the following recommendations for a trial design that can robustly evaluate Staying Close:
- An RCT is not feasible given the extra funding awarded to Staying Close, which means the programme is now being rolled out more widely and a control group would not exist. We therefore recommend that the evaluation be considered a pilot DiD evaluation.
- We propose that the evaluation runs from October 2023 to October 2024. If LAs begin delivery after October 2023, we recommend allowing for 12 months’ delivery, up to December 2024. This means that participants will be included if they start receiving the intervention by April 2024, or six months before the endpoint for their LA’s delivery period.
- The DiD will involve identifying comparator LAs based on historical trends in the outcomes of interest (before the introduction of the programme). Differences in these trends in the two sets of LAs after the programme is delivered are then assumed to be the result of the intervention (the “parallel trends assumption”). We recommend accessing individual-level data from the comparator LAs and matching individuals with young people receiving the programme to estimate the effect of Staying Close on outcomes.
- We recommend accommodation changes and not in education, employment or training (NEET) status as primary outcomes for the evaluation, given their centrality to the programme’s theory, data access and characteristics, outcome maturation, stakeholder interest and suitability for cost analysis.
- For the formative aspect, we recommend that CHI explore other variables such as social connectedness, wellbeing and homelessness. Additionally, we recommend using this work to establish the most effective data collection strategies and allow for exploratory comparisons between Staying Close and control sites.
- The formative work can also explore mechanisms, subgroup effects and local effects using qualitative approaches. An implementation process evaluation (IPE) will supplement the impact evaluation and focus on programme acceptability, sustainability and fidelity.
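The DiD logic described in these recommendations can be illustrated with a toy calculation. All figures below are hypothetical and purely illustrative; they are not drawn from the study:

```python
# Toy illustration of the difference-in-differences (DiD) estimator.
# Hypothetical mean outcome (e.g. % of care leavers in stable accommodation)
# before and after programme delivery, in treatment and comparator LAs.
treatment_before, treatment_after = 60.0, 72.0
comparator_before, comparator_after = 58.0, 63.0

# Change over time within each group.
treatment_change = treatment_after - treatment_before      # 12.0
comparator_change = comparator_after - comparator_before   # 5.0

# Under the parallel trends assumption, the comparator change estimates what
# would have happened in treatment LAs without the programme, so the
# difference between the two changes is the estimated programme effect.
did_estimate = treatment_change - comparator_change
print(did_estimate)  # 7.0
```

In practice the evaluation would estimate this with individual-level matched data and regression adjustment rather than simple group means, but the underlying comparison of changes is the same.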