The world is at a reset. As I write this piece, the COVID-19 virus has affected 199 countries, and a third of the global population is on lockdown. Some professions have been affected since the first case of the virus was announced on November 17, 2019 in Wuhan, China. (Kindly roll over the map below to view total cases versus deaths as of March 27, 2020.) The evaluation profession is not immune to the devastating effects of COVID-19. At Cloneshouse Nigeria, some of our field visits and in-person interviews are being redesigned to reflect current realities; evaluation plans that never anticipated movement restrictions are being reviewed; and the restructuring of our evaluation team is being considered. In this article, I will highlight some possible changes to the methods we use in conducting evaluations as we adapt to a new world order. If you or your organization are approaching this in new ways, please feel free to share your methods in the comment box below.
To ensure the safety of our teams, over the last two weeks we have been reviewing the terms of reference of our colleagues, especially our enumerators based in over 27 states of the country. Their work will now consist largely of desk reviews and remote analysis. Of course, this means deliverables will be revised and costs reduced as well. But how do you conduct desk reviews when beneficiaries haven't provided such information by default, or when the commissioner gave it no thought at first? I am currently working on a British Council M&E system set up for beneficiaries, and it has been difficult for the beneficiaries to provide the information needed for remote analysis and desk research.
The profession will adapt itself to new and evolving evaluation approaches, methods, technologies and tools that are needed to continue our work during this era.
Data can be collected remotely, but this can be tedious when your evaluation targets will not readily provide the information you need. Pre-coronavirus, M&E tools that allow evaluators to collect data from wherever they are had become common, but because of their limitations, not many evaluation plans considered remote data collection. In a just-concluded engagement on a DFID-supported programme, we used phone interviews with 42 respondents. It took some time before we could agree with the commissioners that phone interviews would be suitable for the beneficiaries. Nonetheless, the method has its downsides. For example, beneficiaries can be wary of unfamiliar phone calls, which we experienced in a few cases. But this is the coronavirus era, and evaluators will need to adapt to this methodology to collect information from beneficiaries. Will it be convenient? Maybe yes, maybe no, but giving beneficiaries convincing reasons before initiating the calls might help. And what can be more convincing at this time of our lives!
The coronavirus will have a profound impact on the methods and key indicators used in evaluation plans. For example, are we still going to be interested in the number of participants that attended a programme at this period? Most face-to-face programmes have been canceled, and some are now trying to go online. Our ongoing work on ascertaining the extent to which fake news is shared among citizens during the COVID-19 pandemic is documented using a geo-referenced Ushahidi map. The evaluand will need to revise and/or review indicators earlier outlined in M&E plans, while those that cannot be achieved during this period, owing to the challenges of working remotely or conducting events online, will be put on hold.
As the world resets, evaluators can leverage their networks and voluntary associations to stay connected. Online tools will be prominent among the methodologies of this period as evaluators work remotely. Whether these approaches form a rigorous methodology for evaluating programmes is something to think about in the near future.