Earlier this month the What Works Summit – Early Years in Belfast brought together local researchers, decision-makers and practitioners to meet four of the ten UK What Works Centres – Wales Centre for Public Policy (WCPP), Early Intervention Foundation (EIF), Education Endowment Foundation (EEF) and NICE – to discuss all things early years. Conversations sparked some questions around the use of evidence. What can What Works Centres do to make using evidence easier? What’s key to getting evidence used in real life? How can we best measure impact?
Hosted by Campbell Centre UK & Ireland in partnership with WCPP and the Alliance, the event aimed to explore how What Works Centres create, share and use high-quality evidence to inform decision-making in policy and practice. We invited local stakeholders to share knowledge and help shape the early years research agenda. The event is part of a series of summits in Northern Ireland, Scotland and Wales. Three lessons in using evidence emerged over the day.
1) Connectedness – Seána Talbot, manager of Saol Úr Sure Start, Belfast, explained that evidence suggests breastfeeding is best for the baby, yet this isn’t always put into practice. She told us about a local postnatal project which saw breastfeeding rates at 6 months of age increase from 4% in 2014 to 43% in 2017. Seána explained that connectedness between midwives and mothers was key to success. Local mothers often reported stopping breastfeeding before they wanted to, and found that the support of midwives gave them confidence to continue as long as they wished. The room agreed on the importance of good connections but noted the challenge of measuring effective relationships. The EEF has made a start in evidencing the value of relationships, but more needs to be done. This would place a greater value on relationships and help make the case to commissioners and managers for relationship building to be integral to service design and delivery.
2) Communication – “Unless someone translates ‘p’ numbers into real life, research data is meaningless”. This quote from Seána highlights the need for evidence to be communicated thoughtfully. For evidence to be useful it must be communicated in a way that is meaningful to target audiences. So, get to know your audience! This means asking what information they need, what’s the best way for it to be presented, and whether the language used makes sense. Perhaps we can learn from NICE’s evidence-based guidelines. James Jagroo, a NICE Implementation Facilitator, told us they are produced by consulting with stakeholders, ensuring they are accessible and useful. Tom McBride, Director of Evidence at EIF, explained that standards of evidence can also be a useful way to clearly communicate what is and isn’t working. The EIF Guidebook rates programmes based on the strength of evidence of impact and their relative costs.
3) Context – Evaluating in the early years context can be tricky. Matthew van Poortvliet from EEF noted that there is less experience of evaluating in early years settings, and that issues commonly encountered in evaluation, such as high mobility of service users between settings or high staff turnover, are particularly challenging here. It’s a case of horses for courses. Yes, we need the appropriate research method to answer our question, but we should also consider the unique environment before evaluating. We can then know what challenges to expect and mitigate them.
So, what’s next for What Works, Belfast? Our next What Works Summit in May will explore learning and challenges in using evidence in the context of mental health. Follow @CampbellUKIRE to get involved.