Understanding and improving our impact in young people's lives. 
News

We’re hosting another Practice Development Day!
Following our first ever Practice Development Day last November, which sold out in a couple of weeks, we are hosting another day-long workshop at our offices in Hackney, East London, on Monday 2 March.
 
The Practice Development Day is an intensive one-day introduction to the Centre’s approach to evaluation. It’s best suited to those working in organisations or agencies that work directly with young people, and who are looking to embed evaluation into their practice for continuous learning and improvement. Read Ed Anderton’s reflections on the first Practice Development Day that took place as part of our 2019 Gathering.
 
Tickets will go on sale next Monday 13 January and will cost £50, or £35 concession for organisations with a turnover of less than £100k. We will announce the sign-up link on our Twitter page and our website.
 
Youth Programme Quality Intervention (YPQI) Pilot: Register your interest for training (February 2020, Midlands)
We are recruiting for the next English cohort for the UK pilot of the YPQI, a ground-breaking quality improvement process for youth organisations. This will be the second cohort of English youth organisations, who will join cohorts across Scotland, Wales and Northern Ireland currently testing the approach. This cohort will focus on organisations based in the Midlands. An introductory training session will be held locally in late February or early March. The date and venue will be confirmed by mid-January. If you would like to register your interest or find out more, please contact us on ypqi@youthimpact.uk. There’s also a lot more information about the YPQI on our website.
 
Informal Learning Group – the Quality Impact Protocol (QuIP)
Last month we published a blog from our Data Manager, Josef Fischer, reflecting on recent training he’d attended on the 'Quality Impact Protocol’ (QuIP) research methodology. The QuIP uses contribution analysis to address the question of attribution of impact in complex environments: in other words, it looks at how we as practitioners understand the relationship between the impact that we measure and our specific activities, drawing directly on the words of the people and communities we support. At the Centre we’re excited to consider how we could use the QuIP methodology to get a better understanding of how young people see and experience youth provision in their lives and how they think this provision affects them.
 
Following the blog post we’ve already received feedback from others who are keen to share their experiences of QuIP - and similar research methods - in a UK context, and so we’re considering setting up an informal learning group. If you’re interested in being involved or you’d like to know more about this work, get in touch with Josef Fischer.
 
Find out more about our work on our website
Across the last 12 months, we’ve announced several new projects for the Centre. If you are curious about any aspect of our work, you can visit this updated section of our website where we share information about the projects we’re working on across our two interconnected areas: Practice Development and Research & Learning. So, whether you want to find out what the ARYB stands for, what we’re learning from the Youth Investment Fund, or how we work with our networks, you can now find all the information you need in one place. Each page also provides contact details of a team member if you’d like to find out more. We’ll be reviewing other sections of our website this year, so please get in touch if there’s anything else you’d like to see us add.
 

Our Thoughts


In this section of the newsletter, our team members set out what’s currently occupying their thoughts. This month, Steve Hillman, our Chief Operating Officer, reflects on the concept of an ‘enablement mindset’ and how it is relevant for moving towards a different kind of evaluation practice across the youth sector.

At the second panel discussion at the Gathering 2019, our annual conference last November, Nadine Smith from the Centre for Public Impact talked about how examining evidence of ‘what works’ can get in the way of enabling public servants to create the conditions for success. This was also a theme in the first panel discussion, where Geethika Jayatikala from Chance UK talked about the ‘paralysis’ engendered in her organisation when undergoing a three-year randomised controlled trial (RCT) study.
 
The Centre for Public Impact talk about the ‘enablement mindset’, and contrast it with the ‘delivery mindset’ that imposes centrally set performance targets and KPIs, defining in advance the outcomes to be measured and the metrics for them. We are all familiar with the negative consequences of this approach: frontline staff can feel disillusioned and that their expertise is being ignored; providers can ‘cherry-pick’ the easiest cases to work with, ‘game’ the system to skew the data, and under-report unfavourable results.
 
The enablement mindset, by contrast, uses data to inform practice in situ in a process of continuous quality improvement. According to the Centre for Public Impact, there are three essential conditions for the enablement mindset to flourish:
  • Firstly, subsidiarity, or the devolution of decision-making rights to the lowest possible level in a system;
  • Secondly, localism in terms of accountability and decision-making;
  • Thirdly, the importance of place, and the recognition that solutions might look very different in different places.

The two other speakers on the panel with Nadine provided us with concrete examples of this process in action. Charles Smith from QTurn defined what he and his colleagues in the US help youth organisations to implement as ‘citizen science’, giving the tools and resources to staff working at the ‘point of service’ so that they can create immediate, concrete service quality improvements based on the observation of staff and young people’s behaviours. And Valerie Threlfall of Ekoute talked about Listen4Good, a feedback project that gathers perspectives, feelings and opinions from programme participants to assess fidelity of implementation and inform decision-making around programme improvement.  Both alluded to the ‘equity effect’ of their practice – Charles Smith in terms of the immediate effect of even a modest improvement in service quality on social and emotional skill level improvement in young people with adverse childhood experiences, and Valerie Threlfall in terms of perceptual feedback enabling the identification of inequities in service delivery in real time.
 
The Centre for Public Impact describe the enablement mindset as ‘viewing public systems more like a garden that requires cultivation rather than control’, as opposed to the delivery mindset that views the state as ‘a giant machine ready to be optimised’. A gardener can readily see which plants are thriving and which require additional support in terms of fertilisation, water or light. But people need closer observation, and you need to ask them what they think; otherwise you can’t be sure that you know.
 
It is interesting that the Centre for Public Impact draw on Buurtzorg as an example of the enablement mindset in action, since it is one of the key examples of a Teal Organisation in Laloux’s Reinventing Organisations. For Laloux, Teal organisations are the next stage in organisational evolution and are characterised by self-organisation and self-management, with a decentralised structure consisting of small teams that take responsibility for their own governance and how they interact with other parts of the organisation. The work of QTurn and Listen4Good exemplifies the role that evaluation data can play in the evolution of organisational form and function.

 
 

What We're Reading


Two posts from the American Evaluation Association 365 Blog have caught our eye this month. Firstly, in this blog Sebastian Lemire considers the varied types of theory we use to underpin evaluation practice. Following a review of over 100 published theory-based evaluations, Sebastian shares some key insights into how we use theory in evaluation, and introduces some useful and relevant concepts such as ‘theory knitting’ and ‘theory labelling’. Next, Sarah Mason uses this blog to discuss the importance of context in evaluation. Sarah unpacks what we mean when we talk about ‘context’, and the many different ways it can influence evaluation, emphasising that there is no universal ‘right way’ for evaluation practice to take place as it is always dependent on the surrounding circumstances.
 
On a similar note, this guide, entitled ‘Putting Data to Work for Young People’, emphasises the wide variety of ways that evaluation data can be collected, and it suggests that many organisations waste time and energy “collecting the wrong data in the wrong way”. The report was produced by Every Hour Counts, a US network of organisations working with young people in out-of-school settings. A detailed ten-step guide is provided to support the collection of useful data that are accurate, complete, and captured at the right scale.
 
At the Centre, through our work on the YPQI UK Pilot, we’ve been thinking about the role of youth organisations in supporting young people to foster the characteristics associated with a ‘growth mindset’ – a belief that one’s basic qualities and abilities can be developed, rather than being fixed. As such, we were interested to read this article from Best Evidence in Brief, an online newsletter of educational research. The article draws on findings of a recent US academic study that tested whether a growth mindset intervention could improve academic performance in 14 to 15 year olds. The study concluded that the intervention had a positive impact on young people’s average grades, and that it changed their self-reported mindset, their attitudes towards failure, and their views on challenges.
 
This article from Maggie Koerth is based around an interview with two neuroscientists, and asks the unusual question: what can a dead fish teach you about statistics? Drawing on a scientific example that involves Atlantic salmon and MRI machines, Maggie explores how our biases and preconceptions deeply affect what we see. The article highlights the importance of having a specific hypothesis when doing research, because “if you don't know what you're looking for and you just want to see ‘what lights up’, then you're getting lots more chances to see things that could be just random”. We think this has important relevance when designing evaluations in the youth sector, to ensure that we are looking for evidence to back up the hypotheses we hold about the impact of our work, such as a theory of change.
 
We enjoyed reading this recent article on the idea of ‘collective impact’ from the Stanford Social Innovation Review. The authors argue that organisations are often confused about how to describe their work with other organisations, and so they frequently “identify themselves as using the popular model of ‘collective impact’, whether or not they adopt its tenets in practice”. The article suggests there is an overreliance on the concept of collective impact because social impact coalitions lack the language to describe other types of networks. They propose a detailed new framework to capture the variations of how organizations can work together to solve community problems. 


 

Network News and Events


Upcoming Network Meetings 
 
North West Regional Impact Network
Wednesday 16 January, 10am – 3pm, Youth Focus North West, Micklehead Business Village, Unit 6b, St Michaels Road, St. Helens, Merseyside, WA9 4YU
Contact Stuart Dunne to attend.
 
Yorkshire and Humberside Regional Impact Network
Wednesday 5 February, 1:00 – 4:00pm, venue to be confirmed (central Leeds or York).
Contact Patrick Ambrose or Charlee Bewsher to attend.
 
London Regional Impact Network
Monday 2 March, 2:00 – 5:00pm, London Youth, Pitfield Street, London N1 6DA.
Contact Hazel Robertson to be added to the invitation list.
 
More from the Centre for Youth Impact
Subscribe to this newsletter | Visit our website | Follow us on Twitter
 
Copyright © 2018 Centre for Youth Impact, All rights reserved. 

Want to change how you receive these emails?
You can update your preferences or unsubscribe from this list.






YouthImpactUK · Suite 222 · 254 Pentonville Road · London, London N1 9JY · United Kingdom
