In theory, it should be easy to convert evaluation teachings and theories into practice, but often it is not. As Mastercard Foundation Young Impact Associates (YIAs) on a learning exchange program with Cloneshouse, we were tasked with conducting a rapid evaluation of the organization’s online courses from 2020 to 2022. The Online Basic Monitoring and Evaluation (OBME) and Online Results-Based Monitoring and Evaluation (ORBME) courses are part of Cloneshouse’s strategy to enhance the practice of M&E in Africa.
Armed with our theoretical knowledge of evaluation, we set out using the Utilization-Focused approach, which centers on the primary intended users of the evaluation’s findings and learnings. We conducted a desk review of the online course materials and curriculum, and administered an online survey to those who participated in the training from 2020 to 2022 to glean insights about the courses. The process of conducting the evaluation, however, taught us some lessons, and here are four of them.
- Not-so-rapid evaluation: The rapid evaluation was supposed to be done in three weeks, but it took us about four months to complete the evaluation and report the findings. The data collection period coincided with the holidays, and we also had trouble reaching an acceptable response rate. There was also the back-and-forth with the commissioner over the evaluation products – a slide deck, a blog, and an evaluation report. This taught us that, as evaluators, we need to be flexible: the evaluation process may not always go as planned and might need to be adjusted along the way.
- There are many ways to skin a cat: As part of the rapid evaluation process, we administered questionnaires to participants who had enrolled in either the OBME or ORBME course during the period under study (2020–2022). We shared the questionnaires via email and initially recorded a low response rate despite several reminders. It seemed impossible to improve the response rate until we reached out to Oludotun Babayemi, the commissioner, who introduced a follow-up strategy that increased it. This experience showed that there are often alternatives worth exploring and that, as evaluators, when everything we know fails, it is best to reach out to people with experience before concluding that a task is impossible.
- Ongoing communication with the commissioner: While conducting the rapid evaluation, we communicated constantly with the commissioner, who in this case was also our supervisor. This improved the evaluation process and showed that, as evaluators, it is important to keep your commissioner in the loop so that they are aware of your progress and can help mitigate issues before they escalate.
- Data, data, data: While conducting the document review, we requested data from the Cloneshouse training team and met with them to discuss our data requirements. Although they did a great job providing the data, the commissioner revealed in one of the final meetings that some data could have been shared but was not. This taught us that even when evaluators do their best to request all the required data, whether they receive everything they need is ultimately up to the commissioner. As evaluators, we need to probe and persuade commissioners to meet as many of our data needs as possible.
Conducting this rapid evaluation showed us some of the issues that can arise during the evaluation process and the practical steps that can be taken to mitigate them. While evaluations are not one-size-fits-all, these key lessons can be applied to future evaluations to fine-tune the process. In the meantime, could you share with us, using the comment box, some of your own stories of carrying out an evaluation? We will be glad to read them.
Written by Seember Ishuh, Godwin Kwaghngee, Khadija Yahaya Muhammad, Enitan Sophie Oluwa, Asheadzi Yusuf-Wasuku and Ochanya Okoh.
ABOUT THE AUTHORS
The authors are members of the MCF Young Impact Associates’ Programme, currently participating in a six-month exchange program with Cloneshouse. As part of their program, the authors conducted a rapid evaluation of the online courses offered by Cloneshouse.