Our YIF Evaluation Approach
Five types of data
It is no coincidence that there has been very little impact evaluation of open access provision: it presents particular challenges to traditional or established approaches to impact measurement. Open access, non-formal youth provision can include a wide range of different types of interventions and activities. It is underpinned by the building of trusted relationships and voluntary engagement on the part of the young person, rather than by activities or journeys with a defined beginning, middle and end. It may not have pre-defined outcomes, and different young people are likely to engage in very different ways or have very different experiences of provision.
The Youth Investment Fund Learning Project is taking a broad approach to data collection. We are focusing on five types of data (see below), which will allow us to develop a comprehensive picture of a young person’s experience of open access youth provision. Our ambition is to link the different types of data (attendance, feedback, quality and outcomes) and look for predictive links between them. For example: does provision that is rated high quality have higher levels of youth engagement? Do young people have better outcomes if the provision is rated highly by young people and is high quality?
Our shared evaluation approach includes the collection of the first four data types by all Youth Investment Fund projects. Outcomes data will be collected by approximately one third of grant holders, providing a large enough sample to generate robust findings.
1) User and attendance data
We collect data on what activities young people attend, and how often, to help understand patterns of youth engagement with different types of open access provision. Evidence already suggests that more sustained engagement with youth provision is likely to generate greater impact for young people. In addition, we are asking grant holders to record young people’s age, gender, ethnicity and the date they first attended, which will enable us to make comparisons between different groups of young people. This will help us develop a clearer picture of the profile of young people organisations are reaching, how young people engage with provision, and how this relates to impact.
2) User feedback
We believe that young people are able to explain their relationship with, and experience of, attending their open access provider, and should have systematic opportunities to do so. We have developed two separate question banks of 18 anonymous feedback questions each: one aimed at younger age groups (10–12 years old) and the other at those aged 13+. Organisations have the flexibility to choose up to six questions, with the option to include some of their own.
The majority of these questions are drawn from the mechanisms of change in our YIF theory of change, so they align closely with features of good quality open access provision and also help organisations to identify areas where services can be improved.
3) Quality of provision
Too often, outcomes are explored in isolation, with little or no systematic attention devoted to understanding, measuring and improving the quality of provision. The approach to quality we are rolling out as part of the YIF is designed to measure the quality of provision systematically and in detail, at the point of interaction with young people. It applies an approach to measuring, understanding and improving quality that has been developed successfully in the US since the 1990s by the David P. Weikart Center. The approach is embedded in a continuous quality improvement cycle.
The approach begins with an assessment of provision through both self-assessment (by grant holders) and external assessment (by the evaluation team). The assessments are undertaken through a two-step process of peer observation and scoring. The self-assessment is based on multiple observations of sessions by team members, who then meet to score provision against the 70 individual items on the Social and Emotional Learning Program Quality Assessment, scoring each item high (5), medium (3) or low (1).
It is not an organisational self-assessment, nor is it an assessment of an individual session or an individual member of staff. Instead, it is assessment at the provision level (e.g. sports provision would include football, rugby and basketball sessions). Each full cycle takes six months, meaning most grant holders will be able to complete four full cycles by the end of their YIF funding.
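The scoring step can be sketched as a simple roll-up from item ratings to domain averages. The domain names and item counts below are illustrative assumptions; the real Social and Emotional Learning Program Quality Assessment has 70 items, and only the high (5) / medium (3) / low (1) scale is taken from the text above:

```python
# Map the three rating levels to their numeric scores (from the text above).
ITEM_SCORES = {"high": 5, "medium": 3, "low": 1}

# Hypothetical item-level ratings, grouped into illustrative domains.
observations = {
    "safe_environment": ["high", "high", "medium"],
    "supportive_environment": ["medium", "medium", "low"],
    "interaction": ["high", "medium", "high"],
}

def domain_average(ratings):
    """Convert high/medium/low ratings to 5/3/1 and average them."""
    return sum(ITEM_SCORES[r] for r in ratings) / len(ratings)

for domain, ratings in observations.items():
    print(f"{domain}: {domain_average(ratings):.2f}")
```

Averaging item scores within a domain gives a provision-level profile that can be revisited each improvement cycle.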
If we can connect the quality of youth work to the change experienced by young people, the use of quality markers in provision could allow us to predict the change that young people experience.
4) Outcomes data
We are also developing a shared outcomes framework for YIF open access youth provision, based on some of the intermediate outcomes identified in our Youth Investment Fund shared theory of change. Approximately 1/3 (n=30) of grant holder organisations will collect young people’s outcome surveys for both their new and existing youth members at three or four time points over a 12-month period (baseline, 3, 6 and 12 months). Our outcomes approach is focused on measuring individual young people’s skill development in their personal, social and emotional learning over time. It has been informed by existing frameworks such as Inspiring Impact’s Journey to Employment, the Life Effectiveness Questionnaire (LEQ) and the Review of Personal Effectiveness & Locus of Control (ROPELOC). By collecting outcomes consistently across a range of provision and organisations, we will develop a stronger evidence base about the impact of open access youth work on young people’s development.
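Repeated surveys make it possible to track each young person's change from baseline. The sketch below is illustrative only: the survey scores, scale and participant identifiers are invented, and only the wave timing (baseline, 3, 6 and 12 months) comes from the text above:

```python
# Survey waves taken from the text above; scores below are hypothetical.
waves = ["baseline", "3m", "6m", "12m"]
surveys = {
    "yp1": {"baseline": 2.8, "3m": 3.0, "6m": 3.3, "12m": 3.6},
    "yp2": {"baseline": 3.5, "3m": 3.4, "6m": 3.7, "12m": 3.9},
}

def change_from_baseline(scores):
    """Difference between each follow-up wave and the baseline score."""
    return {w: round(scores[w] - scores["baseline"], 2) for w in waves[1:]}

for yp, scores in surveys.items():
    print(yp, change_from_baseline(scores))
```

Aggregating these individual trajectories across the participating organisations is what would build the shared evidence base described above.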
5) Counterfactual impact data
The hardest part of assessing the impact of any project is knowing ‘what would have happened anyway?’. We are working with Bryson Purdon Social Research to commission a counterfactual study, in which a sample of young people from across England who don’t attend YIF provision will be matched as closely as possible to our YIF sample of young people and invited to complete the same outcome survey. This is due to take place between March 2019 and March 2020. By measuring the same outcomes among a comparable sample of young people who didn’t access youth provision, we anticipate a greater understanding of the causality and contribution of Youth Investment Fund projects.
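Matching a comparison sample to a study sample can be illustrated with a simple nearest-neighbour sketch. Everything below is an assumption for illustration: the covariates (age, gender), the distance function and the data are invented, and the commissioned study's actual matching method is not described in this document:

```python
# Hypothetical study sample and comparison pool.
yif_sample = [
    {"id": "y1", "age": 14, "female": 1},
    {"id": "y2", "age": 16, "female": 0},
]
comparison_pool = [
    {"id": "c1", "age": 13, "female": 1},
    {"id": "c2", "age": 16, "female": 0},
    {"id": "c3", "age": 12, "female": 1},
]

def distance(a, b):
    # Simple weighted distance: a gender mismatch dominates an age gap.
    return abs(a["age"] - b["age"]) + 10 * abs(a["female"] - b["female"])

# For each YIF young person, pick the closest comparison-pool member.
matches = {}
for yp in yif_sample:
    best = min(comparison_pool, key=lambda c: distance(yp, c))
    matches[yp["id"]] = best["id"]

print(matches)
```

Comparing outcome-survey scores between each matched pair is what lets the study estimate what would have happened without the provision.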
Qualitative impact data
As part of our Youth Investment Fund evaluation approach, we will include a process evaluation strand. Between March and December 2019, researchers from NPC and CYI will visit four to six case study grant holders to undertake qualitative investigation with young people, youth workers and other stakeholders, exploring the processes through which open access youth services (in their various forms) work to generate change in young people’s lives.
We will explore the importance and influence of how open access youth services are delivered, identifying the features of provision that are most effective in sustaining engagement and relationships and that align with good quality provision. We will also explore how services are experienced and how this influences positive change for young people over time.