GSDRC

Governance, social development, conflict and humanitarian knowledge services

Evaluating digital citizen engagement: A practical guide

Library
Digital Engagement Evaluation Team
2016

Summary

This guide provides practical steps for assessing the extent to which digital tools have contributed to citizen engagement, and for understanding the impact that the introduction of technology has had on the engagement process. The guide draws on examples and lessons from case studies in Brazil, Uganda, Cameroon and Kenya. It can be used at many stages: to inform project design, as a tool for continual learning and improvement, and for undertaking mid-term or post-hoc evaluations. The guide was written primarily for practitioners – including task team leaders at the World Bank Group, project or program delivery staff at civil society organizations, and internal or external evaluators or consultants throughout the project cycle – but it is also a helpful resource for anyone seeking to better understand the role of digital technology in citizen engagement.

The guide presents five ‘lenses’ through which digital citizen engagement (DCE) interventions might be viewed while undertaking an evaluation:

  • Objective: What are the goals of the initiative, and how well is the project designed to achieve those goals?
  • Control: Which actors exert the most influence over the initiative’s design and implementation, and what are the implications of this?
  • Participation: Which individuals participate in the initiative, and to what extent is their participation in line with their needs and expectations?
  • Technology: How appropriate was the choice of the technology, and how well was the technology implemented?
  • Effects: What effects did the project have, and to what extent can those effects be attributed to the technology?

The guide is structured around the five stages of an evaluation lifecycle:

  • Scoping: This stage lays the groundwork for the design and implementation of the evaluation by investing time and resources in understanding the project and its context, the operating environment, and recent developments and insights from the DCE evaluation field.
  • Designing: This stage involves agreeing on the focus, goals and objectives, designing the evaluation questions, and deciding on an appropriate approach and method to achieve those goals.
  • Planning & Implementing: This stage takes the design to a more detailed level: deciding which tools to use within the broad method, whether or not to use digital tools to collect new data, and how data collection can be implemented.
  • Analysing: This stage discusses how DCE data can be analysed and provides pointers for quantitative, qualitative and mixed methods of analysis.
  • Sharing, Reflecting & Learning: This final stage covers testing the findings, writing up the results and analysis of a DCE evaluation, sharing findings (including opening up evaluations and their data), and reflecting and learning from the lessons of the evaluation.

The guide offers two toolkits:

  • DCE evaluation bank: examples of primary (assessment/analysis) and supplementary (information gathering) evaluation questions – grouped by lens – and some ‘satisfaction’ questions.
  • Using the lenses in scoping and design: a set of considerations and questions that an evaluator might ask during the scoping and design stages, again grouped by lens.

Source

Digital Engagement Evaluation Team. (2016). Evaluating digital citizen engagement: A practical guide. Washington, DC: World Bank.

Related Content

  • Responding to popular protests in the MENA region (Helpdesk Report, 2020)
  • Government Communication Capacity and Media Freedom (Helpdesk Report, 2019)
  • Strengthening the Financial Independence of Independent Media Organisations (Helpdesk Report, 2018)
  • Youth initiatives supporting citizen engagement with government (Helpdesk Report, 2017)


