The NFER blog

Evidence for excellence in education

How do you solve a problem like impact?


One of the more positive consequences of the darkening clouds of economic downturn for charities such as NFER has been a greater emphasis on impact: understanding it, measuring it, communicating it. In the face of reductions in funding, it becomes ever more important to be able to demonstrate that precious resources are being directed in the best possible way.

So these clouds come with a silver lining: they are enabling us to re-focus on our core purpose, to provide evidence to improve education and the lives of learners. But for NFER, any discussion about impact is in danger of becoming confusing and circular. What does impact look like for an organisation that, among other activities, itself helps others to evaluate their impact?

Evaluation: what’s the point?

Consider, then, the possible impact of an evaluation (many of the same principles also apply to NFER’s other activities). The key lies in understanding what clients, partners and others do with our research. Unless it helps to inform policy and spending decisions, or to improve service delivery and frontline practice, it becomes a purely academic exercise.

With this in mind, any evaluation must have a clear purpose. This is one of the messages from our forthcoming evaluation policy. For example, an impact evaluation is intended to measure whether a particular policy, activity or intervention is having the desired effect and therefore whether it should continue in its current form (and there are a variety of quantitative and qualitative methods to test this). A process evaluation will focus on why and how an impact is being achieved, and could be used to ensure that an intervention is rolled out with the important features intact, or to inform further development and improvement of the existing policy or practice (and again can be tested using a range of quantitative and qualitative methods).

In the same way, measuring any charity’s impact should be just the beginning – measurement comes into its own when it is used as a tool to increase that impact further (analogous to the role of formative assessment in the classroom). As a charity it is not enough simply to measure your impact; what matters is acting on that information to ensure it grows.

Theories of change

Two really useful (and related) tools for conducting evaluation are theories of change and logic models. These are different ways of capturing the objectives of an intervention for its stakeholders, the resources and activities involved in its delivery, and the mechanisms linking the two. In my view some sort of description along these lines is essential: a clear articulation of what impact looks like for your organisation is a prerequisite both for effective measurement and delivery of impact.

This is why last year we at NFER developed our own theory of change (you can find it on page 8 of last year’s impact review), and why we now have a dedicated impact team tasked with making it a reality. It highlights in particular the need for us to work more closely with a variety of partners to ensure that our research findings are useful, relevant, trusted, timely and engaging, so that they can support and inform positive changes in policy and practice.

This theory-of-change thinking can also be applied to the individual projects we carry out. By understanding more clearly the policy or practice objectives of a piece of research, and by articulating the mechanisms by which it can meet these objectives, we are in a much better position to ensure the research creates an impact. This is why impact planning is becoming an increasingly important part of how we develop our research projects. It may affect the way in which we design and deliver a research study, or the way in which findings are communicated and shared.

Here are some recent examples:

  • Our evaluation of DfE’s summer schools programme had some key messages for schools looking to run similar activities. In addition to the main report, we therefore also produced this short top tips guide for schools.
  • NFER has a long track record of research into the role of effective careers guidance. This has formed the basis for our recent thinkpiece, Careers guidance: if not an annual careers plan – then what?, designed to inform policy, and we are now working with partners across the sector to support schools further in improving frontline provision.
  • Another major area of expertise for us is curriculum and assessment. We have brought this to bear in a series of Practical Guides to the 2014 Curriculum in order to support schools adapting to the new programmes of study (note that, while the above two resources are free of charge, there is a small charge for the curriculum guides).

By applying to ourselves the tools we use to help others think more smartly about impact, we should move closer to realising our vision: evidence for excellence in education.

If you’ve used any research, tools, products or services from NFER to make a difference to learners, I would love to hear from you at b.durbin@nfer.ac.uk, or you can find me on Twitter @benpdurbin. Feel free also to get in touch if you’d like to find out more about how some of these ideas might apply to measuring and increasing the impact of your own organisation.

Ben Durbin leads a new team at NFER tasked with increasing our impact on education and learners.

Author: Ben Durbin

Ben Durbin is Head of Impact at the National Foundation for Educational Research
