GSDRC

Governance, social development, conflict and humanitarian knowledge services


Sticking to the Numbers: Performance Monitoring in South Africa, 2009-2011

Jonathan Friedman
2011

Summary

What can be learned from the South African government’s introduction of a service delivery monitoring system? This study examines the efforts of the Ministry of Performance Monitoring and Evaluation, established by Jacob Zuma in 2009. It shows that the ministry’s leading officials succeeded in encouraging departments to set measurable performance targets, but that as political support waned, the sustainability of the system was put in doubt. Nevertheless, some officials believe the system changed the culture of planning, monitoring and evaluation of policies in South Africa, embedding data-based processes.

In the 15 years since the end of apartheid, South Africa had made strides in extending basic services to previously underserved communities, but frustration with the slow pace of progress boiled over in early 2009. President Jacob Zuma came to power that year amid a wave of demonstrations protesting at the government’s poor record in delivering basic services. During his first month in office, Zuma established a Ministry of Performance Monitoring and Evaluation to improve service delivery by ministries.

Two key officials in the new ministry, Ketso Gordhan and Ronette Engela, identified three major reasons for the government’s poor performance: (1) a lack of accountability at the upper levels of ministries, (2) decentralised and often ad hoc policy planning and (3) poor inter-ministerial coordination. They devised a system that reorganised ministries around 12 policy goals and set data-based performance targets for ministers and departments. Zuma signed performance agreements in April 2010 with his 34 ministers, who together framed 12 delivery agreements during the following months.

Performance and delivery agreements succeeded in their main purpose of specifying delivery targets against which ministers and departments could be assessed. The most important contribution of the outcomes approach was the use of data in formulating and assessing policies. In addition:

  • The approach made the process of formulating policies more coherent and more closely coordinated across ministries.
  • It created opportunities to improve policies within and across ministries, and incentivised partners to the delivery agreements to plan strategically, both for their own portfolios and targets and amongst themselves.
  • Some ministries, including those dealing with health and education, embraced the system without pressure from the presidency. The minister of health even made some targets more ambitious than those suggested in order to motivate staff.
  • Other ministries did not embrace the outcomes approach with the same vigour.

Although implementation of the outcomes approach was uneven across departments, it was thought that successful implementation in some ministries would provide leverage in persuading other ministers to increase their participation. Nevertheless, the outcomes approach had several shortcomings:

  • Some delivery agreements were better than others. Although some assigned specific responsibilities to particular officials, others delegated tasks vaguely to entire departments.
  • Complications were caused by a lack of accurate baseline data in some sectors.
  • Without a legislative mandate, enforcement of performance and delivery agreements relied on Zuma’s sustained commitment to hold ministers and departments accountable.
  • The president’s support began to weaken after the politically bruising process of completing the agreements.
  • The designers were accused of failing to develop ministries’ ownership of the outcomes approach.

Source

Friedman, J., 2011, 'Sticking to the Numbers: Performance Monitoring in South Africa, 2009-2011', Innovations for Successful Societies, Princeton University

