Monitoring and evaluation: how we look at impact

Measuring and consolidating results, learning from activities to inform future strategic directions, and being accountable to donors and partners: IIEP’s continuous improvement process is based on a strong monitoring and evaluation system. Find out more about our methodological approach and access the latest evaluations of our programmes and major projects.

Our approach to monitoring results

IIEP regularly conducts in-house data collection to monitor its programmes and activities. Progress is measured through 10 key performance indicators, defined in the Monitoring, Evaluation and Learning Strategy (also available in Spanish) of the 11th Medium-Term Strategy.

To centralize and consolidate this data, IIEP relies on a range of practices supported by three information systems, including a project portfolio management solution. The Institute's annual report to its Governing Board includes both qualitative and quantitative assessments of results. The Institute is also subject to an annual financial audit by UNESCO's External Auditor.

Our evaluation strategy

How do IIEP's training courses concretely change the professional practices of the people who take part in them, and the capacities of the organizations where they work? To what extent do our technical support missions and research programmes influence governments' policy decisions and budgetary trade-offs?

The evaluation process set up at the Institute allows us to:

  • look at our practices and better understand the impact of our activities,
  • draw lessons for the development of our Medium-Term Strategy and the design of our programmes,
  • better target IIEP's support to UNESCO Member States in achieving Sustainable Development Goal 4.

Conducted in-house or entrusted to external evaluators, these evaluations either provide a full assessment of a specific project or take a broader look at our work in a given field of expertise or geographical area.


EVALUATION REPORTS OF SPECIFIC PROJECTS

2022 - IIEP, Review of Advanced Training Program Outcomes, 2014-2018
Internal evaluation

☍ 2022 - Evaluation of IIEP/UNICEF Online Course: ‘Foundations of Disability-Inclusive Education Sector Planning’ South Asia

☍ 2022 - Outcome harvesting evaluation - Improving education through planning and strategic steering, Haiti - Part 1 and Part 2

☍ 2021 - Evaluation of IIEP-UNESCO/UNICEF Online Course: ‘Foundations of Disability-Inclusive Education Sector Planning’

☍ 2020 - Support to the Education Sector Plans in Jordan and Guinea: Outcome harvesting evaluation
Internal outcome harvesting evaluation, endorsed by external experts

☍ 2020 - Evaluation of the Training Program on Educational Policies for Early Childhood in Latin America, 2019
Internal evaluation

☍ 2018 - Capacity Development for Educational Planning and Management in Afghanistan, 2015-2018
External evaluation

☍ 2016 - Crisis-Sensitive Education Sector Planning: UNESCO-IIEP Support in South Sudan
Evaluation by UNESCO's Internal Oversight Service

☍ 2014 - Capacity Development Project in Education Sector Planning in Afghanistan, 2011-2013
External evaluation

EVALUATION REPORTS OF CROSS-CUTTING PROGRAMMES

☍ 2021 - IIEP, Research Programme of the 9th Medium-Term Strategy, 2014-2017
External evaluation

☍ 2018 - IIEP-Pôle de Dakar (in French only)
External evaluation

☍ 2017 - IIEP Paris, Technical Cooperation Programme and Management Response
External evaluation

☍ 2013 - IIEP Paris, Cost and Effectiveness of Training Modalities
External study

☍ 2013 - IIEP-UNESCO Global Review
Evaluation by UNESCO's Internal Oversight Service

☍ 2012 - Evaluation of IIEP’s Research
External evaluation

CURRENT AND UPCOMING EVALUATIONS

2021 - IIEP, Audience analysis, 2020
Internal evaluation

2021 - IIEP, Crisis Sensitive Planning Capacity Building Project
External evaluation commissioned by the European Union's Service for Foreign Policy Instruments (FPI)

Our three evaluation methods

IIEP has opted for an evaluation approach adapted to each of its main areas of expertise and intervention.

Training: Kirkpatrick's model

IIEP uses Kirkpatrick's model to assess the quality of training courses and programmes delivered online or face-to-face. This four-level framework examines the impact of the learning experience, from participant satisfaction and immediate learning outcomes through to longer-term changes in behaviour and performance within the beneficiary organization.
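To make the four levels concrete, here is a minimal sketch in Python of how results at each Kirkpatrick level might be recorded; the class and field names (TrainingEvaluation, Level) are hypothetical illustrations, not IIEP's actual tooling.

```python
# Minimal sketch (hypothetical names): recording evidence at each of
# Kirkpatrick's four levels for one training course.
from dataclasses import dataclass, field
from enum import Enum


class Level(Enum):
    REACTION = 1   # participant satisfaction with the course
    LEARNING = 2   # knowledge and skills acquired
    BEHAVIOUR = 3  # changes in professional practice back on the job
    RESULTS = 4    # longer-term performance of the beneficiary organization


@dataclass
class TrainingEvaluation:
    course: str
    scores: dict = field(default_factory=dict)  # Level -> normalized score 0.0-1.0

    def record(self, level: Level, score: float) -> None:
        """Store an aggregated score, e.g. a normalized survey average."""
        self.scores[level] = score

    def is_complete(self) -> bool:
        """True once evidence exists at all four levels."""
        return set(self.scores) == set(Level)


# Levels 1-2 are typically measured right after delivery; levels 3-4
# require follow-up with participants and their organizations later on.
ev = TrainingEvaluation("Education sector planning (online)")
ev.record(Level.REACTION, 0.92)
ev.record(Level.LEARNING, 0.81)
print(ev.is_complete())  # False until behaviour and results are assessed
```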

Technical assistance: the Outcome Harvesting method

In addition to traditional evaluation approaches, IIEP now uses Outcome Harvesting, a method suited to evaluating complex programmes or projects in environments marked by high uncertainty. Outcomes are defined broadly as changes in the behaviour, actions, relationships, policies, or practices of actors. Rather than measuring progress against predetermined indicators and targets, the method collects evidence of actual changes and then, where appropriate, works backwards to determine how IIEP's intervention contributed to them.
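The harvest-first, attribute-later logic can be sketched as follows; the Outcome structure and all example content are hypothetical, for illustration only.

```python
# Minimal sketch (hypothetical structure): outcome harvesting records
# observed changes first, then works backwards to document contribution.
from dataclasses import dataclass


@dataclass
class Outcome:
    actor: str               # who changed, e.g. a ministry of education
    change: str              # observed change in behaviour, policy, or practice
    evidence: str            # source substantiating that the change happened
    contribution: str = ""   # filled in afterwards: how the intervention contributed


# Step 1: harvest evidence of actual change, with no predefined indicators.
harvest = [
    Outcome(
        actor="Ministry of Education",
        change="Adopted a crisis-sensitive education sector plan",
        evidence="Published sector plan",
    ),
]

# Step 2: working backwards, substantiate the intervention's contribution,
# ideally validated by external informants.
harvest[0].contribution = "Technical support to the planning team during drafting"
```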

Research & Development (R&D): IIEP's theory of change

How do the publications and tools developed by IIEP lead to positive change in educational planning and management practices? How do they contribute to improving education systems? IIEP's theory of change seeks to provide objective answers to these questions. This reference framework, informed by the specialized literature, establishes the expected relationships between:
- the R&D activities carried out by IIEP,
- the R&D products, such as publications and methodologies,
- the results and, ultimately, the impact of each of the Institute's R&D projects.
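Read in order, the chain makes the expected causal links explicit; the sketch below illustrates this with entirely hypothetical example content.

```python
# Minimal sketch: the theory-of-change chain as ordered stages, each expected
# to lead to the next. All example entries are hypothetical.
theory_of_change = {
    "activities": ["comparative research on teacher allocation"],      # what IIEP does
    "products": ["policy brief", "diagnostic methodology"],            # what it produces
    "results": ["methodology applied in a national sector analysis"],  # observed uptake
    "impact": ["more equitable teacher deployment"],                   # system-level change
}

for stage, examples in theory_of_change.items():
    print(f"{stage}: {', '.join(examples)}")
```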

IIEP's evaluation strategy is aligned with the broader UNESCO Evaluation Policy.