Evaluation for Development
S&T partner Nobayethi Dube recently attended an evaluation training workshop conducted by Prof Patton, which she describes.
Evalnet, a South African evaluation company, facilitated a training workshop conducted by the well-known Professor MQ Patton. The training workshop was aimed at programme developers, government officials and evaluators. The training took three days and was attended by about 180 people.
Day 1 (Basic evaluation concepts)
Day one of the workshop was spent discussing basic evaluation concepts. The focus was on evaluators and how to ensure that they maintain focus when conducting evaluations. It was stressed that evaluators should understand and always keep in mind what the evaluation is trying to accomplish - the importance of keeping their eyes on the main prize.
The point is that evaluators cannot do or look at everything, and must resist the temptation to do so - even where clients want them to find out everything. Also stressed was the importance for those involved in the evaluation of doing 'reality testing' - identifying the key lessons from the evaluation and how they can be used in future evaluations. This included the importance of selecting indicators that signal when things go wrong and when action can be taken.
Day 2 (Reflective patterns)
On the second day participants were grouped into sectors and areas of interest. Prof Patton gave assignments to the groups. One of the assignments was for participants to discuss personal or professional encounters, draw out common patterns, and identify lessons learned from those encounters.
The exercise sought to emphasise the importance of reflection as evaluators progress with an evaluation. Reflection can help draw conclusions from experiential data and the patterns in it. Evaluators can then determine the lessons learned and test those lessons in future evaluations. Reflective patterns also ensure ongoing lesson-learning during evaluations.
Day 3 (Indicators)
Day three was spent discussing the usefulness of indicators in informing action. If robust indicators are in place, it is easier to find out why an indicator has suddenly changed. It is, however, important to note that useful indicators are sometimes difficult to put in place, and they may need to be adjusted as time goes on.
Establishing a South African Evaluation Network
One highlight of the workshop was a questionnaire given to participants to find out if they would be interested in helping establish a South African Evaluation Network. Most respondents indicated that they were, and thus far it looks like the network will be in place by the end of 2002.