Squaring the Circle

 

In this guest blog, Derrick Johnstone, author of the Alliance’s latest report, Squaring the Circle, moves the discussion on from the need for more robust standards of evidence and evaluation to what it takes to facilitate better use of evidence on the ground.

Much of the discussion around the Alliance for Useful Evidence has been taken up with the need for more robust standards of evaluation and the potential contribution of the new What Works Centres.

We have paid less attention to the actions needed to facilitate better use of evidence on the ground:

  • what more leaders can do to foster cultures which value evidence;
  • how access to good quality data for evaluation can be enhanced; and
  • how the skills that are needed can be developed and put to better use.

The latest report from the Alliance for Useful Evidence, Squaring the Circle – about how local authorities and their partners are using evidence to help them reconcile diminishing resources and increasing demands – begins to redress this balance.

Squaring the Circle shows how, for instance, better use of evidence has been informing hard decisions on priorities, how qualitative research has underpinned innovation, and how techniques such as Cost Benefit Analysis (CBA) are coming into play. Illustrations include Leicestershire’s Family Insight Review, Birmingham City Council’s use of Randomised Controlled Trials (RCTs) in evaluating early intervention programmes, and the use of CBA by Greater Manchester partners in determining how to make better use of their resources on shared priorities such as alternatives to custody and integrated social care.

The search for more robust evidence for decision-making places great demands on staff and highlights associated skill gaps. These are evident not only amongst analysts, who acknowledge learning needs around evaluation methods and modelling demand for services, but also amongst evidence users in interpreting data, commissioning research inputs, and judging what is likely to work in local circumstances. There are shining lights (e.g., CBA training in Greater Manchester and London, Project Oracle training for commissioners), but many critical needs are going unaddressed.

Organisational leaders must recognise needs around resources, skills and culture, and their own role in creating the conditions for better use of evidence. The Alliance’s work with SOLACE has pointed to the merits, for instance, of adopting planning and decision-making models which drive research and analysis, of demanding more from analysts in bringing findings to life, and of ensuring that data gaps on service costs are plugged. Senior managers taking part in the SOLACE Summit last year acknowledged that they could do more to reflect self-critically on their own skills and comfort in promoting a more evidence-based culture.

Developments at national level can foster better use of evidence. The What Works Centres can offer a great deal in mobilising the evidence base and raising standards – though their role is not to build skills and capacity directly. The new Knowledge Navigator service, supported by the ESRC, the Local Government Association and SOLACE, is being launched to tap the potential of relationships with universities, which now have to pay much greater attention to the impact of their research.

LARIA is seeking to strengthen its role as a professional body for local research and intelligence, developing regional branches, introducing new skill standards and providing hard-edged examples of where member activities have made a real difference. There is scope for local authorities to be more proactive in collaborating where there is a need to strengthen evidence on particular issues, to generate stronger ‘what works’ evidence and to lend weight to their dealings with central government on public sector reform. Such developments may be stimulated by City Deal negotiations or through the new Public Service Transformation Network, intended to spread innovation from the Community Budget pilots.

Improvements in access to administrative data are vital: they make evaluations more cost-effective to undertake and, indeed, enable some evaluations which would not otherwise be possible. Access might take the form of anonymised data for identifying control or comparison groups, for example, or of aggregated outcome data. More use can be made of secure data labs such as the ESRC-backed HM Revenue & Customs Data Lab, which allows access by approved researchers, while the new Justice Data Lab, which packages outcome data for voluntary organisations and social enterprises, points the way for other departments such as DWP to follow. Lack of progress may jeopardise the success of Payment by Results schemes or Social Impact Bonds, where there are significant challenges in setting baselines and establishing how much of a difference the interventions are really making.

 

Derrick Johnstone
Derrick is a director at Educe, an organisation which specialises in policy, partnerships and performance improvement. His background is in economic development and skills, widened through extensive strategy development and evaluation work across sectors and fields of policy. His experience includes a series of projects providing technical assistance for the better use of evidence at a local level, including advisory and research work on the ground with local authorities and their partners. Published reports from projects for the Department for Communities and Local Government include ‘Supporting Local Information and Research: Understanding Demand and Improving Capacity’, ‘Supporting Evidence for Local Delivery: National Research and Evaluation’ and ‘Seeking the Lessons: Neighbourhood Renewal Skills and Knowledge Programme Evaluation’. As part of these and other local and regional projects, he has researched analytical skills and capacity. Derrick has a particular interest in related organisational and partnership development issues, including data sharing.

(The views are the author’s own and do not necessarily represent those of the Alliance for Useful Evidence)