This article is taken from the November 2001 Phatlalatsa newsletter

 

Reviewing M & E for the Gauteng CBPWP directorate

S&T was commissioned to review the monitoring system and evaluation strategy of the Community Based Public Works Programme Gauteng Directorate. Senior Partner Moagi Ntsime, who managed the project, outlines the project and its main findings.

The Community Based Public Works Programme (CBPWP) directorate within the Gauteng Provincial Department of Public Transport, Roads and Works (DPTRW) undertook various policy initiatives intended to realign the programme for the 2000/2001 year. Most of these initiatives took into consideration strategic developments within the realigned national CBPWP.

Key activities and changes in the province included the following:

  • realignment of delivery mechanisms in the province;
  • adoption of an integrated approach to infrastructure delivery;
  • review of financial procedures;
  • adoption of a new provincial targeting strategy, taking into consideration the current demarcation boundaries in the province; and
  • reviewing the monitoring and evaluation system within the directorate.

The need for an M & E review

Some of the activities outlined above had a higher priority than others. The M & E review was considered urgent, if not a necessity: the impact of the programme had to be measured both regularly and accurately.

S&T was commissioned by the Gauteng CBPWP Directorate to review its current monitoring and evaluation requirements and the capacity required to ensure the efficient functioning of the system.

Purpose of the review

The monitoring and evaluation review focused on the following aspects of the system:

  • review the reporting requirements and data path from project level to directorate;
  • assess capacity requirements of users as well as data use for management purposes;
  • assess the frequency and accuracy of data capturing, reporting and analysis;
  • assess whether there is a uniform M & E reporting system for the directorate; and
  • based on the findings of the review, design an M & E framework for the Directorate.

Methodology

The review used two methods: documentation review and in-depth interviews. For the first, we relied mostly on key strategic documents within the directorate. For the second, we developed structured guidelines about current M & E practices and needs in consultation with the Directorate. These were used to conduct in-depth interviews with key players within the process.

Findings

Firstly, monitoring was taking place within the Directorate, albeit in a less systematic fashion than required. We also noted challenges where M&E was concerned, some of which are outlined below.

We found a lack of uniformity in terms of what project co-ordinators (departmental officials responsible for project monitoring) were expected to do and/or monitor within the existing framework. As a result, in many instances a great deal of the information collected at project level did not serve the monitoring function.

Also, while some of the information was reported on regularly, responsibility for reporting lay outside the directorate, with project managers (who were independent consultants). As a result, most of the monitoring data were held not within the directorate but by these outside service providers. This raised questions about strengthening and developing the capacity of the directorate to function better and take ownership of the data.

Inaccuracy of data was fairly common. While some of the service providers tried to report on the actual performance of the projects, others did not update their data; often the data reported on were inaccurate and inconsistent. If the programme needed to be moved in a particular direction to maximise the impact or deal with unforeseen problems, it would be extremely difficult - and misleading - to do this on the basis of inaccurate and inconsistent data.

Furthermore, we noted that data analysis and tracing emerging trends in the programme did not seem to take place. The only analysis that we were informed about related to minutes and scanty reports written by outside consultants, which often did not focus on key impact issues relating to the programme.

Recommendations

We noted three key areas or levels of reporting and monitoring within the directorate. The first was pre-implementation or baseline data: business plan data that needed to be separated from the ongoing, regular monitoring of project progress and performance. This mostly referred to information captured on a once-off basis, for example, the envisaged number of beneficiaries, the intended number of jobs to be created, the overall project budget and so on. We stressed the importance of recording such targets and using them as the baseline against which performance can be measured (and for the use of programme evaluators).
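
To make the distinction concrete, the sketch below (in Python, with hypothetical field and function names, since the review prescribed a framework rather than a data format) separates once-off business plan targets from regularly updated progress data and expresses performance as a percentage of target:

    from dataclasses import dataclass

    # Baseline targets are captured once, at business plan stage,
    # and not edited thereafter (hence frozen).
    @dataclass(frozen=True)
    class ProjectBaseline:
        project_id: str
        envisaged_beneficiaries: int
        intended_jobs: int
        total_budget: float

    # Progress data are updated through regular, ongoing monitoring.
    @dataclass
    class ProjectProgress:
        project_id: str
        beneficiaries_reached: int
        jobs_created: int
        budget_spent: float

    def performance_against_target(baseline, progress):
        """Express each monitored value as a percentage of its baseline target."""
        return {
            "beneficiaries": 100 * progress.beneficiaries_reached / baseline.envisaged_beneficiaries,
            "jobs": 100 * progress.jobs_created / baseline.intended_jobs,
            "budget": 100 * progress.budget_spent / baseline.total_budget,
        }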

The second level we identified was the project level. We specified indicators to be reported against and thus monitored at that level: for example, the regular administration of a wage register to capture the number of people employed and their demographics, days worked, wage rates and so on.
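
As an illustration only (the review specified the indicators, not an implementation; the names below are hypothetical), a wage register entry and the project-level indicators derived from it might look like this:

    from dataclasses import dataclass

    @dataclass
    class WageRegisterEntry:
        worker_id: str
        gender: str          # demographic fields as required by the indicators
        age: int
        days_worked: int
        daily_wage_rate: float

    def project_indicators(register):
        """Derive the project-level monitoring indicators from the wage register."""
        return {
            "people_employed": len({e.worker_id for e in register}),
            "person_days": sum(e.days_worked for e in register),
            "wages_paid": sum(e.days_worked * e.daily_wage_rate for e in register),
        }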

The third level was the directorate and management level. The frequency of reporting, the type of information reported on, the report format, and the analysis and management of the programme using M&E outputs were among the issues on which we made recommendations to the directorate.
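
Continuing the hypothetical sketch above, directorate-level reporting would roll the project-level indicators up across all projects at an agreed frequency:

    def directorate_summary(projects):
        # Aggregate project-level indicators, keyed by project, into a single
        # directorate-wide summary (e.g. for a quarterly report).
        summary = {"people_employed": 0, "person_days": 0, "wages_paid": 0.0}
        for register in projects.values():
            for key, value in project_indicators(register).items():
                summary[key] += value
        return summary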

Way forward

It is important that monitoring and evaluation be viewed as an integral part of the entire programme cycle. It should not be viewed by those implementing the programme as an additional administrative burden, separate from the programme outputs and deliverables.

As such, for the recommendations to be realised it is important that the following activities be carried out by the Directorate:

  • Firstly, conduct a workshop with all key members of the Directorate responsible for project implementation, management and monitoring, so that they understand the ethos and culture of M & E and have the space and scope to make informed inputs.
  • Secondly, design a simple, user-friendly and uniform tool for recording and analysing data for the Directorate. This system would assist the Directorate in producing regular reports pertinent to its needs and competencies.
  • Linked to the above is the need to ensure that any monitoring system introduced does not lose its social and developmental component. For this to succeed, the process of developing and designing the M & E system must be an integrated one, involving M & E specialists as well as (but not driven by) IT/software expertise.
  • It is important to set a timeframe for the entire process of designing and developing an M & E system. Sufficient resources need to be sourced for the development and capacity building processes.
  • Finally, implementing a fully developed M & E system would require a training component. Therefore, it would be important for the M & E specialists to develop an M & E training manual that would be used by those involved in the implementation and monitoring of the programme. This would contain generic M & E principles as well as specific elements of the system.
 
