Understanding and improving our impact in young people's lives. 
News

Asking Good Questions: Insights from the February survey – and the March survey is now live!
 
Last month we launched our very first Asking Good Questions: The Survey – a regular, anonymous survey created by and for organisations working with young people. As with anything new, there is some reviewing and refining to do, but we are pleased with the response we’ve had so far. Our intention is to draw insights from the survey to inform our work, representing the voice of the sector on all things related to evaluation, quality and impact.
 
We received over 140 responses to our February survey and are looking forward to growing a regular respondent base as organisations get to know what the Asking Good Questions survey is all about. We’ll be drawing insights from the data over the next few weeks, but wanted to share some top-level findings now. So – what is the sector saying?
 
Q1: Do you think evaluation helps your organisation to achieve its mission for young people?
 

 
There is a pretty clear message here: 89% of participating organisations agree that evaluation contributes to their organisational aims for young people. This is good news. However, we cannot ignore that 11% either do not know, or do not feel, that evaluation is making a real difference. In reality, factoring in the likely bias in who we reached with our first survey, this figure is probably higher. It’s also notable that just under half of respondents (45%) feel that evaluation is ‘somewhat’ supporting them to achieve their mission, suggesting room for improvement. The following questions provide some initial insights into what might support this.
 
Q2: What is the number one thing that would make evaluation a really useful experience for you in your work? 


 
There are a number of things that organisations feel would make evaluation more useful to them. The key take-away message is that those working in the youth sector feel they need more space, resources and training in order to employ useful evaluation practices. We’ll be digging into this more in the March survey (take part here), but in the meantime, here are a few opportunities that might be helpful if you are looking for structured and supported ways to get more time to think and learn about evaluation:
  • Youth Programme Quality Intervention (YPQI) – a continuous quality improvement process designed to support organisations working with young people to improve the quality of what they do, and thus the impact of their work on the lives of young people.
  • Relationships Cohort – a two-year project, supported by City Bridge Trust, that will deepen understanding of evaluating the quality of relationships between staff and young people.
  • Regional Impact Network – take part in regional meetings, events and training to develop your skills and confidence related to quality, evaluation and impact measurement.
  • Impact Accelerator – a 12-month programme designed to foster a culture of learning within youth programme delivery, build organisational capacity for evidence-led improvement, and establish a common approach to understanding and improving impact.
Q3: If evaluation is to have the greatest impact on practice in the youth sector, which of the following should be the top two priorities?
 

 
Finally, we asked what the sector needs to enable impactful and useful evaluation. In short, no one thing will do. What is evident is that youth sector organisations vary in what they see as the priorities for embedding evaluation that has a genuine impact on their practice. Some of this clearly relates to Q2: there is a need for better tools and approaches to measurement (and a better understanding of them), as well as a commitment from funders to support good and effective evaluation practices. But the data also points to a desire for wider, cross-sector learning around understanding ‘what works’, and the role of quality youth provision in outcomes for young people.
 
We know we have only dipped a toe into the potential of this survey, and we will share more detailed findings – for example, looking at regional variation – over the coming weeks. Moving forward, we want this to be a collaborative project, so if you have ideas for questions you’d like to ask your colleagues across the sector, or topics you think other stakeholders need to hear practitioner views on, fill out this Google form or get in touch at hello@youthimpact.uk.
 
If you've already taken part in the survey, you'll be receiving a personalised link via email. If you haven't yet taken part, you can find the March edition of Asking Good Questions: The Survey here, and keep up to date with insights by following #goodquestions. Make sure you have your say!

An update on the Youth Programme Quality Intervention (YPQI)
Last week, we launched a second cohort in England for the UK-wide pilot of the YPQI, a ground-breaking quality improvement process for youth practitioners. Staff representing 17 different organisations came together to learn how to use the programme quality assessment (PQA) tool and will now be ‘taking it back’ to their teams in preparation for the first of two cycles of Assess, Plan and Improve.
 
In response to feedback from existing participants, we recently launched an updated, shortened version of the PQA tool which we will be transitioning everyone to over the next few months. As before, the tool is focused on social and emotional learning (SEL). 

We’re looking forward to working with this new cohort and are delighted to be bringing them into a network of organisations already engaging with the YPQI across England, Scotland, Northern Ireland, and Wales. You can find more information about what’s been happening with the YPQI UK pilot here.
 
Reflections from the Second Practice Development Day
In this blog, Mary McKaskill, our Practice Development Manager, shares her thoughts from the Centre’s recent Practice Development Day and initial reflections on the feedback that was shared with us afterwards.
 
Act for Change: New research on youth-led social change
The Act for Change Fund provides resources for young people to challenge social injustice, find ways of overcoming inequality and give voice to issues they are experiencing. It is a £3.6 million joint initiative between Paul Hamlyn Foundation and Esmée Fairbairn, in partnership with the National Lottery Community Fund. Both foundations are acting as match funders and are awarding grants on behalf of the #iwill Fund.
 
Renaisi and the Centre for Youth Impact are pleased to share our latest report, in which we present a picture of the current context of youth-led social change activities, focusing on the following key questions:
  • How does social change fit within the social action agenda?
  • What does engagement in social change look like?
  • What factors drive young people’s engagement in social change?
  • What does a supportive infrastructure for social change look like?
  • What is the impact on individuals and the community when young people play a critical role in leading change?
The research draws on a number of sources, including academic research and emerging findings from the #iwill Fund, and will inform the development of Act for Change. Find the full report here, or click here to read more about the Act for Change Fund.

Ask the Centre
This is a new column in our newsletter: every month, we will respond to a question we receive about evaluating youth work and provision for young people. Few things in this area are straightforward or have a single solution, but we hope that this column can offer guidance on some of the key considerations in grappling with common evaluation challenges.
 
This month, we’ll address the issue of selecting a sample of young people to take part in qualitative evaluation.
 
Question: Qualitative evaluation methods, like interviews and focus groups, require quite a bit of time and resources. We can’t hold interviews and focus groups with all the young people we work with, so how should we select a sample of young people to speak to?
 
When selecting a sample of young people to speak to, be clear about what you want to learn and what you want others to learn. As with all evaluation, having a focused question that you’re setting out to answer will guide you towards choosing the right approach to collecting data. You may want to select a sample that represents a particular experience or community to explore their views in depth, or a sample that is broadly representative of all the young people you work with. You will also need to think about whether the sample is ‘self-selecting’ – that is, mainly made up of young people who volunteer or opt in, rather than a group that includes young people who wouldn’t normally do so. The number of young people you include in your sample will also influence how confident you can be about your findings, and how representative they may be of your total group.
 
When you have your sample of young people, consider how they represent the whole of the group that you’ve worked with and be upfront about any biases that could be present as a result of your sample. 
 
A stratified sample involves selecting young people who had a range of positive and negative experiences with your provision, and can lead to a balanced data set where you can learn both about what went well and about how your provision could be improved.
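As a rough, purely illustrative sketch (not a required part of the approach described above): if you kept a simple list of participants with an experience rating – the list structure, field names and quota here are hypothetical assumptions – a stratified selection could be drawn along these lines in Python:

import random
from collections import defaultdict

def stratified_sample(participants, per_group=3, seed=1):
    # Group participants by their recorded experience
    # (e.g. 'positive', 'mixed', 'negative').
    groups = defaultdict(list)
    for person in participants:
        groups[person["experience"]].append(person)

    # Draw the same number of young people from each group,
    # taking everyone if a group has fewer members than the quota.
    random.seed(seed)
    sample = []
    for members in groups.values():
        sample.extend(random.sample(members, min(per_group, len(members))))
    return sample

# Hypothetical usage: three young people from each experience group.
participants = [
    {"name": "A", "experience": "positive"},
    {"name": "B", "experience": "mixed"},
    {"name": "C", "experience": "negative"},
    # ...the rest of your cohort
]
print(stratified_sample(participants))

The same principle applies however you hold your records: decide on your groups first, then select within each group, rather than simply taking whoever is easiest to reach.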
 
Often, the practicalities of working with young people mean that you end up talking to those who are available and willing to speak with you. These samples tend not to be representative of the whole group, because you’re likely to hear mainly from ‘helpful’ young people and from those who have had a positive experience and are excited to talk about it. If your sample means that you are unlikely to hear ‘negative voices’, you may want to take an appreciative inquiry approach to your interviews and focus groups.
 
Appreciative inquiry involves speaking with people who had very positive experiences with your service. This approach can help you gain insight about key mechanisms of change and promising practice. However, it’s important to recognise that this will be limited in terms of learning about what didn’t work or how provision may be experienced by unrepresented groups of young people.
 
Have a question you’d like us to respond to in this column? Send us an email at hello@youthimpact.uk and put ‘Ask the Centre’ in the subject line! 
 
The Adult Rating of Youth Behaviour (ARYB): Pilot
We are launching the next phase of our pilot programme for an observational approach to collecting evidence of positive outcomes for young people. The Adult Rating of Youth Behaviour (ARYB) is the first tool of its kind to be trialled in the UK. 
 
We are looking for organisations that are working very intentionally to support young people’s social and emotional learning (this doesn’t necessarily mean working to pre-defined outcomes, but that interactions with and experiences for young people are deliberately created to develop social and emotional skills), and that have the capacity to try out the tool over the course of 12 months, from April 2020. Find out more here.
 

Our Thoughts


In this section of the newsletter, our team members set out what’s currently occupying their thoughts. This month, Catherine Mitchell, Project Manager at the Centre, reflects on the role that questioning and listening plays in our work with organisations across different projects.
 
A fair chunk of my time at the Centre is spent getting to know organisations as they begin to engage with our various streams of work. The way in which we do this is multifaceted and varies across projects. Whilst it will almost always start with a conversation, it will sometimes involve more in-depth methods such as the Impact Accelerator’s Confidence Framework – where organisations reflect on the behaviours and processes that support the design, delivery, evaluation and improvement of high quality youth provision – or an expression of interest and application process, such as for the Enterprise Development Programme (EDP). Whatever the approach, more dialogue will always follow, and the ultimate goal is for us to get a good sense of where an organisation is ‘at’ and then tailor our work together accordingly.
 
We are about to embark on the next phase of the EDP, where our role as a Sector Partner will be to provide effective training, resources and guidance to youth organisations that are looking to make a transition to new enterprise models in order to build their organisational resilience. Recently, when digging into what information we will need to help shape this package of support – and, therefore, what questions we will need to be asking – it struck me that, whilst contexts are different, there are many parallels with the questions we already ask organisations taking part in other projects, such as the Impact Accelerator or the YPQI. These ‘getting to know you’ questions might explore an organisation’s goals and intentions, their previous experience, existing tools and processes, and the level of internal support, engagement and capacity. They might also include more objective information, such as funding or governance models.
 
Our intention in this ‘getting to know you’ process is to inform better, more efficient decision-making in our supporting role. The organisations within our community are busy, and we know that we need to be thoughtful in our questioning. We also need to be supportive in our conversations, encouraging trust and a frankness that will make our work together more effective. How can we ask these questions – which we need to do in order to do a good job – without making them over-burdensome or intrusive? What is the appropriate tone and format, and what do we do with the information once we’ve got it? Funnily enough, these are a lot of the questions that our partner organisations might find themselves asking as they reflect on the role that questioning plays in their own work.
 
These questions are feeding into other streams of work here at the Centre. Our Data Manager, Josef, is currently busy bringing more structure to the way that we manage this type of information (and everything else). Our Research, Design and Insight team recently launched Asking Good Questions: The Survey, to enable those working with and for young people to share their thoughts on evaluation quickly and easily, with us and with each other.

For all of our questions, there will be a lot of answers, and we need to be listening as much as (if not more than) we are asking. I’m excited to be doing lots of this over the next few months, and to be able to adapt our work and approaches based on what the organisations we work with are saying. We’ve been doing this most recently with the Impact Accelerator, as we work to expand and align the programme with our wider activities. Through calls and feedback sessions with past and current participants, we’ve been listening to what’s more and less helpful, where value is felt, and the things that might increase this value. All of this provides a great opportunity to demonstrate our values in action: to be considered and collaborative in our conversations, and to be challenging and supportive in how we take these forward. I’d love to hear from you if you have thoughts and suggestions on this – feel free to drop me a line at catherine.mitchell@youthimpact.uk.
 

What we're reading


This report, from the Alliance for Useful Evidence, delves into the role ‘experiments’ can play in developing social and public policy. It provides a framework for thinking about the choices available to a government, funder or delivery organisation that wants to experiment more effectively.   
 
This short accessible blog from Rachel Rank of 360Giving sets out the organisation’s understanding of what having a ‘data culture’ within an organisation looks and feels like. Rachel notes that she doesn’t think 360Giving has always talked enough about the culture shift involved in using data to support decision making and learning. We often feel the same about our own work at the Centre. As Rachel says, “opening up data isn’t enough if people don’t know how they can use it meaningfully, or struggle to make it relevant to their organisation’s wider work”. 
 
Interesting new research into the impact of youth work continues to emerge from Scotland. This new report from Dumfries and Galloway Council uses the Transformative Evaluation approach, developed by Dr Sue Cooper at Marjon University, to gather the voices of young people as they reflect on the influence of community-based, universal youth work in their lives. The research reinforces other important studies that highlight youth work’s role in developing social and emotional skills, health and wellbeing, and active citizenship. It also explores how youth work provides this support.
 
We enjoyed this excellent blog from Sally McManus of NatCen, writing for the What Works Centre for Wellbeing. It highlights six complexities that really matter when thinking about evaluation and policy change to support wellbeing. They all feel highly relevant to our work at the Centre, and tackle some subjects that don’t often get an airing, including whether being ‘satisfied’ with services is a useful measurement, and the importance of considering societal context when we ask people how they feel. 
 
This blog from US network Measuring SEL got us very excited. It opens by asking: ‘can we measure social and emotional skills without using an assessment?’ Recognising the challenges (logistical, philosophical and technical) of measuring social and emotional skills, the blog suggests that measurement may actually be getting in the way of learning. We concur! It goes on to share findings from recent research exploring the relationship between young people’s behaviours, interactions and social and emotional skills, primarily through observing behaviours alone.
 
A new paper from the Tamarack Institute focuses on building ‘evaluation literacy’, with a particular focus on participatory and collaborative approaches. The author, Pamela Teitelbaum, argues that ‘often, simply having the language, concepts and basic understanding of evaluation approaches helps decision-makers discuss and identify how participatory and collaborative approaches to evaluation are not only valuable, but also how these may respond to the unique contexts in which evaluations are being planned’. The paper is thorough, detailed, theoretical and accessible, all at the same time.
 
And finally, a nice little blog on the relationship between evaluation and vision. We particularly appreciated the idea that evaluation is more than just ‘social accounting’. 
 

Network News and Events


South West Regional Impact Network
Friday 6 March, 10:30am – 3:30pm, Tiverton, Devon.
Contact Gill Millar to attend.
 
West Midlands Regional Network Meeting
Tuesday 10 March, 10:30am, The Factory, Longbridge.  
Contact Ruth Rickman-Williams to attend.
 
North West Youth Work Conference
Thursday 12 March, Our Place Youth Facility, Huyton.
Contact Stuart Dunne to attend.
 
For more information on the regional networks, or to get in touch with your local network lead, please contact Steve Hillman.
 
More from the Centre for Youth Impact
Subscribe to this newsletter | Visit our website | Follow us on Twitter
 
Copyright © 2018 Centre for Youth Impact, All rights reserved. 

Want to change how you receive these emails?
You can update your preferences or unsubscribe from this list.





