Hello! My name is Su Muhereza, and I am the Monitoring and Evaluation (M&E) Manager at the Internet Society Foundation. I have been in the monitoring, evaluation and learning (MEL) world for over five years now, mainly working in the democracy, rights and governance space, and most of my MEL life has revolved around tracking results and reporting to funders. At the Foundation, my goal is to balance the need to demonstrate impact with testing our strategies for successfully delivering programs. Some of the lessons I am using as guiding principles in this effort are:
Recognizing that change takes time
As a new Foundation, we are in the early stages of determining what problems we want to solve in the world and how best our grant making can help to solve those problems. A lot of the world’s big problems require a long-term approach and perspective, and we have developed five core funding areas that we believe will strengthen the Internet in function and reach so that it can effectively serve all people. In practical terms, we have set ourselves a learning agenda with a set of questions at the program and portfolio level, along with periodic prompts to check in on what success looks like and what changes we might need to make to ensure that we are solving the problems we have identified.
Listening and a true commitment to learning
In my past experience running programs and doing MEL on the grantee side, there was an overemphasis on doing as the funder wanted. There was a hesitation around dialogue with funders over targets, unforeseen challenges and outright project failures. Working for a grantmaking organization during a global pandemic, I have noticed a significant shift in this dynamic, and find there is a willingness to discuss not just the what of project implementation but also the how. The Foundation program team is holding reflection calls with grantee cohorts in which we encourage conversations around the reality of how the work gets done. We use feedback from our grantees and our learning agenda to ask ourselves: Are we doing the right thing? Are we achieving our expected outcomes? Is our current programming what we should be doing in the future? We then use the answers to these questions to inform our decision making and program planning.
Thinking appropriately about impact
Recognizing the principles mentioned above, that 1) change takes time and 2) we are committed to listening to our grantees and learning from their feedback, leads us to a modest approach to impact, which is broadly understood as a positive, linear change resulting from our intervention. While we are certainly aiming for a world in which the Internet is for Everyone, as a new Foundation in its first year of active programming, it is premature for us to talk meaningfully about impact.
We recognize and accept that we might not see the highest-level impact during the life of a single grant project, so we are deliberate about the language we use to communicate our expectations to grantees, for example by asking: What do we hope that individuals and communities will be able to do differently as a result of this intervention? In this way, we communicate an expectation to see change facilitated by our grantmaking, but rather than assigning a “good” or “bad” value to that change, we also care about how the change comes about and what it might teach us about certain types of projects. We are looking at results.
In my experience, MEL is usually seen as a rigid process, slightly removed from daily program implementation. However, I believe that while program teams are the true content experts, MEL specialists and managers complement their role by facilitating learning and applying an evaluative lens to program thinking and approach. In subsequent posts, I will share some of our thinking on making MEL an agile and adaptive process, facilitating a culture of internal learning, and recognizing the power dynamics created by MEL processes.