Sometimes I think being an evaluator teaches you as much about people as it does about data. You start out believing your job is to track progress, measure outcomes, and write reports that help others learn. But somewhere along the way, you realize you are learning too, about patience, about how change actually unfolds, and about how numbers rarely tell the whole story.
Evaluation humbles you. You see how much effort goes into projects that do not always land as planned. You watch people stretch limited budgets, negotiate expectations, and still find creative ways to make things work. You sit in meetings where evidence meets emotion, and you start to understand that the journey is as important as the result.
Three years in, I have stopped seeing Monitoring and Evaluation (M&E) as a purely technical process and started seeing it as a practice of curiosity, one that keeps asking, “What are we really learning here?” Some of these lessons came quietly, others are hard truths, but they have all shaped how I now see the field and my place in it.
Here are four that stand out:
Lesson 1: Results-Based M&E is easier said than done.
We hear it everywhere: “results-based,” “impact-driven,” “evidence-informed.” It is like the anthem of every development project. But the truth is, practising results-based M&E is hard. I have noticed that some NGOs stop at tracking activities and outputs: “number of trainings held” or “participants reached.” And to be fair, it is not entirely their fault. Doing deeper, outcome-level tracking requires resources, capacity, and, most importantly, time. Donors often want quick numbers. So, in the rush to meet reporting deadlines, we miss opportunities to actually understand what worked and why. Every time I read an annual report that lists activities as results, I am reminded how wide the gap still is between what we preach and what we practise.
Lesson 2: Maybe we need to hire differently.
Here is something that has been on my mind lately: most M&E job descriptions are designed for superhumans. They expect one person to design logical frameworks, manage data collection, clean and analyze datasets, build dashboards, conduct evaluations, and still have time to write polished reports. Data analysis, in particular, is a serious skill that deserves its own focus. I know M&E officers are supposed to understand their data inside out, but if the budget allows, hiring a dedicated data analyst could really improve quality and efficiency. Imagine how much stronger our insights would be if each part of the M&E chain had someone specialized in it. (And yes, I can already hear someone saying “but budgets!” Please do not come for my head.)
Lesson 3: Facilitation is the quiet superpower.
If you asked me three years ago, I would have said M&E was mostly technical. Frameworks, indicators, databases, and all. But lately, I have realized that facilitation is becoming one of the most valuable skills in this field. Being able to lead learning sessions, facilitate training workshops, or simplify complex data for non-technical audiences? That is what makes your work actually useful. It is how evidence turns into understanding. And honestly, it is a skill that opens doors beyond M&E.
Lesson 4: AI is the new baseline.
And finally, AI. I cannot pretend it is not changing everything. From quick data cleaning to drafting reports or analyzing trends, AI tools are slowly becoming part of our everyday workflow. Gaining AI literacy is not just about staying trendy; it is becoming the new baseline for relevance, both in development and beyond.
Looking back, these lessons have shaped how I see my work and what it means to grow in this field. They have reminded me that the field of M&E is not static; it moves with the people, the tools, and the times. And I have learned that staying relevant is about staying curious, asking the right questions, and never losing sight of the purpose behind the data.
About the Author
Rachael Okoronkwo is a development professional and program manager with over five years of experience designing, implementing, and evaluating social impact projects for government and non-governmental organizations. She brings a versatile blend of expertise in program coordination, results-based management, and evidence generation, applying data-driven insights to strengthen systems and improve development outcomes.
She holds a PMD-Pro certification and has supported and led programs across sectors, including health, education, gender, and youth development. Her experience spans project design and delivery, stakeholder engagement, and monitoring and evaluation of large-scale initiatives, ensuring that interventions remain inclusive, impactful, and sustainable.
Rachael is also a skilled facilitator and trainer, having designed and delivered virtual and in-person sessions for government institutions and NGOs. Her sessions emphasize participatory learning and practical application, empowering teams to adopt adaptive management and evidence-based practices in their work.
A Fellow of Friedrich Ebert Stiftung (FES) Nigeria, Rachael is an active member of the EvalYouth Global Network, where she advocates for youth inclusion in evaluation. She remains passionate about advancing gender equity, leveraging technology for development, and promoting programs that create meaningful change.