This article is taken from the August 2001 Phatlalatsa newsletter

 

S&T design a national monitoring framework for South Africa

S&T recently completed a major national study of monitoring and evaluation practices at national and provincial level, and designed a national monitoring framework for South Africa. David Everatt describes the project and its outputs.

Introduction

In 2000, S&T joined a consortium with MXA, Simeka and Khanya, and successfully tendered for a major project: developing a national monitoring framework. The tender was issued by the Department of Provincial and Local Government. The project was large and ambitious, involving over twenty researchers working across the country. As ever, timelines and budgets were tight, and the team worked flat out to meet the deadline.

The project had two phases:

  1. An assessment of monitoring systems and evaluation strategies in 15 national departments, 6 provinces and 4 parastatals, followed by
  2. Designing a national monitoring framework for South Africa

The research phase

The project was based on questionnaires designed to elicit relevant information from Directors-General, Chief Directors and staff directly involved in monitoring and/or evaluation. The focus was on anti-poverty and development programmes, and related M&E activities. Researchers from the consortium interviewed over 120 government officials. Using a structured questionnaire, we probed how poverty was defined and which programmes counted as ‘anti-poverty programmes’; we identified monitoring systems and analysed their structure, IT platform, data quality and, above all, the extent to which monitoring data were both analysed and used as a management tool. We also asked about evaluation strategies and the extent to which evaluation results influenced management decisions. Each researcher then produced a report analysing M&E in the department they had focused on. We also included non-government institutions with a monitoring function, including the Human Rights Commission, the Independent Development Trust and others.

All of this data - some quantifiable, much of it qualitative - had to be captured on computer and then analysed. Doreen Atkinson of MXA was responsible for this part of the report. Her findings did not always make for comforting reading. While some departments and programmes had functional monitoring systems, many were the victims of high-tech/low-value systems that were unstable or inoperable. Others had good systems but rarely verified their data. Very few could report that monitoring data were regarded and used as a management tool: for too many, monitoring and evaluation were administrative requirements with little relation to their day-to-day work.

A national monitoring framework

Once Doreen had produced the status quo report, S&T Senior Partner David Everatt had to develop a national monitoring framework. The starting point for such a framework is the conceptual approach one adopts. One can start by identifying the nuts and bolts (existing M&E systems) and trying to tie them together. Alternatively – as in this study – one identifies the purpose of the monitoring framework and works backward from there.

Much has been written about performance measurement, locally and internationally. The literature was useful in identifying the key partners in most national monitoring frameworks, although much of it focused on developing countries with smaller economies and weaker public services, where donor funds are far more significant. While ploughing through the international literature, it was difficult to stop the phrase ‘donor driven’ from circulating in the mind.

The brief required us to devise a framework that would link existing monitoring systems and evaluation strategies and route their data to a central point where (with additional data) it could be analysed and reported on at cross-sectoral and other levels. The challenge was to make such an endeavour affordable and enabling – not another ‘good idea’ or piece of ‘international best practice’ that in reality is not implementable, or that merely adds work to already over-worked public servants.

For this project, accountability was regarded as the over-arching goal of a national monitoring framework. The purpose of monitoring – that is, regularly collecting and analysing performance-related data – is to measure performance against targets. When this occurs at government level, it is the citizenry to whom government must account for how tax monies have been spent.

A central data bank

For a national monitoring framework to function, it needs good input data (the M&E systems of programmes and departments, backed up with additional data specifically tailored to fill knowledge gaps); rigorous analysis; and reporting within government and between government and the citizenry. At the heart of the national framework there must be a well-resourced and properly staffed unit that collates data from existing systems; commissions additional data to fill knowledge gaps; analyses the composite data set; and feeds its findings into a reporting and communication strategy.

However, South Africa does not have such a unit. During the fieldwork phase of the project, it became clear that many delivery departments were wary of the unit being located in any single department; and none felt that institutions such as Stats SA or the HSRC were able to play such a role.

The ‘digital nerve centre’

As a result, the report recommended that the unit be merged with the ‘digital nerve centre’ that is meant to form part of the Integrated Sustainable Rural Development Strategy. The ‘digital nerve centre’ is not yet in place, and as such can be shaped and moulded to play an expanded role. Moreover, it will be located in the Presidency, giving it a supra-departmental status and allowing accountability to be structured into performance agreements signed by Ministers and the Deputy President.

The way forward

The national monitoring framework report was presented to the DPLG, which was also working on the Medium Term Strategic Framework (see story on the front page). The two processes are closely related, and the DPLG is now responsible for steering the process through to completion.
