GSDRC

Governance, social development, conflict and humanitarian knowledge services


The Evaluation of Politics and the Politics of Evaluation

Chris Roche, Linda Kelly
2012

Summary

How can development programmes that are thinking and working politically be assessed? This paper proposes that using a mixture of assessment methods, with an appreciation of the strengths and limitations of each, provides the most effective solution to the assessment challenge. It outlines a range of evaluation methods and illustrates their usefulness for different processes and interventions. It notes that single methods are more likely to privilege the voice and interests of particular groups and focus on only part of the developmental outcomes.

Political change is likely to be long-term, with intermediate steps that are important, but possibly intangible – such as changes in process, relationships and power. The context for such change is also likely to be dynamic and complex, making linear attribution between donor inputs and the results achieved even more difficult.

Quantitative methods such as randomised controlled trials can contribute to an understanding of the causal effects of a limited number of variants of an intervention over relatively short time periods. Qualitative methods such as outcome mapping or action-reflection approaches can help explain why change happened and how donor inputs influenced that change or supported key actors.

Used together, with an understanding of the value of the data that they produce, quantitative and qualitative methods can provide a more comprehensive picture of the changes that have happened, and the reasons for those changes, across short, medium, and longer-term time frames. This is particularly the case for complex programmes: a mixed methods approach provides different perspectives that together facilitate a more rounded view of non-linear changes.

However, the application of mixed methods requires skill and care. Good quality performance assessment starts with asking the right questions, using an appropriate mix of good quality data collection methods, and then bringing diverse information together. Particular skill is required to effectively synthesise and analyse the data.

Some donors have been reluctant to adopt a mixed methods approach. The authors argue that this is partly because development practitioners are poorly informed about the various methodologies and their limitations. Further, evaluation itself is a political process: political interests, such as a focus on short-term results, can encourage reliance on a single assessment method.

Development agency staff can manage the methodological and political nature of monitoring and evaluation processes more effectively when they:

  • Know the various purposes of their assessment systems, and how to ensure that all are attended to over time.
  • Know the value of the information provided by particular methodologies, and the limitations inherent in any single methodological approach.
  • Recognise the political nature of evaluative processes: by choosing single methods they are more likely to privilege the voice and interests of particular groups and focus on only part of the developmental outcomes.
  • Are given incentives to assess the quality of assessment processes and are empowered to resist pressures to narrowly define ‘results’ and ‘value for money’. They need to be able to ensure space for longer, wider, and multiple processes of change and assessment systems to capture the complexity of these processes.

Source

Roche, C. and Kelly, L. (2012) The Evaluation of Politics and the Politics of Evaluation. DLP Background Paper 11, Developmental Leadership Program

Related Content

  • Lessons from stabilisation, statebuilding, and development programming in South Sudan – Helpdesk Report, 2020
  • Doing research in fragile contexts – Literature Review, 2019
  • Designing, Implementing and Evaluating Public Works Programmes – Helpdesk Report, 2018
  • Indicators and Methods for Assessing Entrepreneurship Training Programmes – Helpdesk Report, 2018
