GSDRC

Governance, social development, conflict and humanitarian knowledge services

Using Evaluation for a Change: Insights from humanitarian practitioners

Library
Alistair Hallam, Francesca Bonino
2013

Summary

This paper seeks to motivate and encourage humanitarian evaluators by highlighting and discussing concrete ways to address the challenge of poor or ineffective use of evaluation. Insights and case studies from evaluation colleagues in the humanitarian sector reinforce the sense that much is possible. The framework is intended to strengthen the capacity of individuals, teams and organisations to become better-informed commissioners, users and consumers of evaluative information and knowledge.

It does this in three main ways: highlighting the key issues around improving the use of evaluation of humanitarian action; presenting a framework for analysis, along with practical suggestions for improving the capacity of agencies to fund, commission, support, carry out, and meaningfully use humanitarian evaluations; and further illuminating the issues through case studies and insights from evaluators that explore what has worked in practice.

This study is the result of a two-year consultation and peer-to-peer exchange with ALNAP Members, and an extensive literature review. It presents a simple framework for strengthening humanitarian evaluation capacities, which allows individuals, teams and organisations to structure a debate around the elements that shape humanitarian agencies’ ability to commission, conduct, and use evaluations more effectively.

Key Findings:

  • The framework is hierarchical, with the most important and fundamental issues of leadership, culture, structure and resources appearing in Capacity Area 1. Clarifying purpose, demand and strategy is also important, though less fundamental, and so appears in Capacity Area 2. Capacity Area 3, ‘Evaluation processes and systems’, focuses on processes and systems that, while useful in their own right, are considered less likely to bring about fundamental change on their own, without changes made elsewhere.
  • The three Capacity Areas are mutually reinforcing. Some of the easy wins from a ‘lower’ capacity area may help catalyse interest in evaluation, which can promote change in more challenging capacity areas. Small changes in one area may trickle up and spill over into others.
  • No ‘one-size-fits-all’ approach could be derived from this exploration of humanitarian evaluation capacities. This is because increasing the impact of humanitarian evaluations is also about constructing pathways for the evaluation findings to make a difference within the organisation. Performing a good-quality evaluation is only the first step. The lessons then have to be absorbed, taken forward, and implemented in practice before organisational learning is achieved.

Recommendations:

Each capacity area includes its own series of recommendations on how to better facilitate evaluation in humanitarian assistance. These include:

  • Ensure leadership is supportive of evaluation and monitoring.
  • Create organisational structures that promote evaluation.
  • Develop a strategic approach to selecting what should be evaluated.
  • Disseminate findings effectively.
  • Strengthen follow-up and post-evaluation processes including linking evaluation to wider knowledge management.

Source

Hallam, A. and Bonino, F. (2013). Using Evaluation for a Change: Insights from humanitarian practitioners. ALNAP Study. London: ALNAP/ODI.
