We have extensive experience of evaluating all types of employment and workplace initiatives and programmes for both policy bodies and employers.

Our approach synthesises findings from a range of evidence sources to identify and elaborate the implications and lessons for policy development and delivery.

Our multidisciplinary staff includes qualitative researchers employing a wide range of theory-based designs, and a group of specialist research economists focusing on impact evaluation and other forms of quantitative analysis.

We use both experimental and non-experimental designs to assess the economic and wider impacts of employment interventions, as well as more focussed approaches to determine what works, for whom, in what circumstances and why.

One of our specialisms is the analysis of large-scale data using state-of-the-art parametric and non-parametric econometric and benchmarking methods.

Read about our range of evaluation methods below.

Contact: Becci Newton

Option appraisal

We seek to predict both the quantitative and qualitative impacts of initiatives by linking information on programme design, structure and sequence with secondary research and simulations of outcomes and impacts, a process often referred to as ex-ante analysis and appraisal.

These evaluations can also include assessments of the potential economic benefits of programmes, which can help those consulting on policy and programme design.

Linking to the theory of change

Designers of policies and initiatives often have a model for how a particular intervention is expected to work and achieve impact with the intended target groups. This is known as the theory of change.

We are skilled in leading evaluations that closely focus on the theory of change or logic model behind the initiative and test its key assumptions in practice. This can comprise both qualitative and quantitative data collection and analysis, and often involves an emphasis on behavioural effects and change, which may be framed in the context of behavioural insights.

Impact evaluation

We have a strong research record in the quantitative analysis of the impacts of social programmes and policies, comparing observed outcomes with estimated counterfactual outcomes: what would have happened in the absence of an intervention.

Our theory-based quantitative designs include experimental and non-experimental (econometric) estimators, such as randomised controlled trials, propensity score matching and difference-in-differences designs.
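The logic of a difference-in-differences estimator can be sketched in a few lines. This is a minimal illustration with hypothetical figures, not IES's own model: the estimated effect is the change in outcomes for the treated group minus the change for a comparison group over the same period, which nets out trends common to both groups.

```python
# Difference-in-differences: a minimal sketch with illustrative figures.
# The estimated treatment effect is the treated group's change in the
# outcome minus the comparison group's change over the same period.

def diff_in_diff(treated_pre, treated_post, control_pre, control_post):
    """Estimate the average effect of an intervention on the treated."""
    return (treated_post - treated_pre) - (control_post - control_pre)

# Hypothetical mean employment rates (%) before and after a programme.
effect = diff_in_diff(treated_pre=55.0, treated_post=63.0,
                      control_pre=54.0, control_post=58.0)
print(effect)  # 4.0 percentage points attributable to the programme
```

In practice this comparison is usually run as a regression with controls rather than a simple four-cell subtraction, but the identifying assumption (parallel trends between the two groups) is the same.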

Impact evaluation also uses qualitative designs relying on in-depth interviews, institutional analysis, focus groups and field visits.

Economic evaluation

We have extensive expertise in leading economic evaluations of employment and training-based initiatives. These link initiatives’ impacts to their costs and benefits to estimate the net monetary value (most often in terms of skills and human capital) to individuals, organisations and society.

Evidence is often expressed in return-on-investment ratios, including social returns, which can help key decision-makers, from company managers to policy advisors, understand the economic value of initiatives when planning their budgets.
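At its simplest, a return-on-investment ratio divides an initiative's monetised benefits by its costs. The sketch below uses purely hypothetical figures; real economic evaluations also discount future benefits and account for deadweight and displacement.

```python
# Return-on-investment ratio: a minimal sketch with hypothetical figures.

def roi_ratio(total_benefits, total_costs):
    """Monetised benefits generated per unit of cost."""
    return total_benefits / total_costs

# Hypothetical training programme: £1.2m of monetised benefits
# (e.g. earnings gains) against £0.4m of delivery costs.
ratio = roi_ratio(1_200_000, 400_000)
print(f"£{ratio:.2f} returned per £1 invested")  # £3.00 returned per £1 invested
```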

Process evaluation

We have a long and well-established track record in understanding how programmes are implemented. Such work relies primarily on qualitative research methods such as in-depth interviews with programme participants and experts (including the Delphi method), focus groups, forums and discussion groups, case studies and field visits.

All process work is theory-based, using clear evaluation frameworks established at the outset of projects to obtain systematic implementation evidence, which can be linked to programme and policy design and configuration to learn how outcomes and impacts can be improved.


Benchmarking

IES is leading the development of benchmarking as an analytical tool for policy and organisational learning. We generally start by identifying a coherent framework of indicators that directly links clear performance measures to strategic objectives, whether at the level of policy decision-making or for employers.

We often use this approach in our public policy and international work, for example when evaluating progress made with employment policy and programmes. It also helps to promote mutual learning among policy (and other) decision-makers in Europe by feeding back tailored quantitative evaluation evidence to improve strategies, policy objectives and concrete programmes and initiatives.
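The indicator-framework idea can be illustrated as follows. This is a hypothetical sketch (the indicator names and figures are invented for illustration): each performance measure is expressed relative to its benchmark so that progress across different measures can be compared on a single scale.

```python
# Benchmarking indicators: a hypothetical sketch. Each indicator is
# expressed as a share of its benchmark value, so 1.0 means "on target"
# and values above/below 1.0 show over-/under-performance.

def benchmark_index(value, benchmark):
    """Indicator value as a share of the benchmark (1.0 = on target)."""
    return value / benchmark

# Invented (observed, benchmark) pairs for two illustrative indicators.
indicators = {
    "employment_rate_pct": (72.0, 75.0),
    "training_completion_pct": (88.0, 80.0),
}
for name, (observed, target) in indicators.items():
    print(name, round(benchmark_index(observed, target), 2))
```

Putting indicators on a common scale like this is what allows tailored quantitative evidence to be fed back for comparison and mutual learning across policies or organisations.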

IES experts


Becci Newton
Deputy Director
Jonathan Buzzeo
Senior Research Fellow
Rosie Gloster
Senior Research Fellow

Related projects

Evaluating mandatory maths and English training for young jobseekers: 18-21 Work Skills Pilot
Department for Business, Innovation and Skills

Evaluation of the Apprenticeship Trailblazers
Department for Business, Innovation and Skills

Employer Investment Fund (EIF) and Growth Investment Fund (GIF) Programme Level Evaluation
UK Commission for Employment and Skills (UKCES)

Evaluation of the Employer Ownership Pilot Round 2
Department for Business, Innovation and Skills