For quite a long time now (nearly nine years), I have been engaged with many organisations to deliver programme monitoring and evaluation. In the early days (as early as 2006), when I first got involved with, or rather first heard about, Monitoring & Evaluation (commonly spoken of as M&E), I knew little about it. It was not something I had been taught in my postgraduate days, nor did I have the option of an internship programme. The journey began with something as simple as reading a booklet on M&E to get familiar with the concepts, the terminology and, most importantly, the thinking, something I found unusual in those early days. Who would have thought, almost a decade back, that terms like logframe, data collection tools, indicators, outputs, goal, frequency, means of verification, baseline, mid-term and end-line surveys would become an everyday affair? Even if angels had whispered it, I would not have believed it. And so the journey began.

It took me quite some time to memorise these terms, and initially a nodding head was a common gesture when someone used them, while internally my confidence was turbulent. Eventually, I came to understand not just the words but their meaning and their role in programme design and implementation. While this challenge was slowly fading, others were in the queue: database management, cloud computing, the use of SharePoint and so on. These emerged because, by then, organisations (including my then employer) had been talking for quite some time about standardising monitoring & evaluation tools, methods and practices, and needed to customise these to local contexts. Through many discussions over coffee hours (and webinars with colleagues that grew into a dedicated community of practice), in-house database designs and cloud computing options (for low-budget programmes) were developed and adopted to cater to individual country programme needs, where data collection was large and widespread.
This was a great innovation for its time and gained wide acceptance across agencies and programmes. Even as these practices were being adopted, the demand for new innovation did not stop; it was part of a changing global phenomenon. In simple words, this was the time to make M&E solutions simple, cater to higher data demand, work at scale and deliver real-time solutions run on mobile devices, whether a phone or a tablet. With cloud computing as the constant, the demand for infographics, real-time data collection and evidence-based reporting grew more popular, also because mobile technology had matured enough to make daily life easy yet efficient; the big value that emerged was time. Just as the human race progressed from locomotive engines to cars now running on rechargeable lithium batteries, with journeys driven by innovation, so did other spheres. Specifically, to make monitoring & evaluation real-time, innovations had to be set in motion, and this was quite easily possible; M&E practitioners simply needed to think of out-of-the-box solutions while working under budgetary constraints. Changing technology brought relief for everyone, M&E practitioners included: they introduced mobile applications like CommCare for data collection, Google Forms and Sheets, MS Power BI for dashboards, open-source solutions for infographics and factsheets, and WhatsApp for the evidence building needed on a daily basis. All of this is a tiny piece of what M&E practitioners use across agencies, given their diverse programmes and data needs.
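To give a flavour of the real-time reporting described above, here is a minimal, hypothetical sketch (not taken from CommCare, Power BI or any tool named here; the record fields and indicator names are assumptions for illustration) of how raw data-collection records might be tallied into dashboard-ready indicator counts:

```python
from collections import Counter
from datetime import date

# Hypothetical raw records, as a mobile data-collection app might export them:
# each record notes the survey date and the output indicator it contributes to.
records = [
    {"date": date(2016, 5, 1), "indicator": "trainings_held"},
    {"date": date(2016, 5, 1), "indicator": "participants_reached"},
    {"date": date(2016, 5, 2), "indicator": "participants_reached"},
]

def indicator_counts(records):
    """Tally how many records contribute to each output indicator."""
    return Counter(r["indicator"] for r in records)

summary = indicator_counts(records)
print(summary["participants_reached"])  # 2
```

In practice a dashboarding tool would do this aggregation for you; the point is only that real-time reporting reduces to re-running a tally like this as new records arrive.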
For all of us, the demand for evidence-based data has now reached a point where thinking the extra mile becomes inevitable. For example, under the Indian Public Library Movement (IPLM), hosted by NASSCOM Foundation (part of the Global Libraries programme of the Gates Foundation), the idea behind a comprehensive M&E system is to help librarians understand how users use public access computers in public libraries, while generating data for indicators across seven issue areas: digital inclusion, culture & leisure, education, communication, economic development, health, and government & governance, as part of the Common Impact Measurement System (CIMS), the measurement practice of the Global Libraries programme. Under IPLM, we introduced a pop-up survey tool, which generates data for selected indicators across these seven issue areas, and the CommCare mobile app for real-time data collection on output-level indicators, the two complementing each other. Being quick, these keep feeding management decisions and also build evidence for policy change. The rationale is that the more frequently evidence-based data is generated, the higher the value proposition of that data for sharing and interacting with stakeholders on policy change and partnership building.
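To make the pop-up survey idea concrete, here is a minimal, hypothetical sketch of how such a tool on a public access computer might record a user's responses against the seven CIMS issue areas. The issue-area names come from the text above; the function, file name and data layout are assumptions for illustration, not the actual IPLM implementation:

```python
import csv
from datetime import datetime

# The seven CIMS issue areas named in the text.
ISSUE_AREAS = [
    "digital inclusion", "culture & leisure", "education", "communication",
    "economic development", "health", "government & governance",
]

def record_response(selected_areas, path="popup_responses.csv"):
    """Append one pop-up survey response: a timestamped row per issue
    area the user says their session touched."""
    timestamp = datetime.now().isoformat(timespec="seconds")
    with open(path, "a", newline="") as f:
        writer = csv.writer(f)
        for area in selected_areas:
            if area in ISSUE_AREAS:  # ignore anything outside the seven areas
                writer.writerow([timestamp, area])

# Example: a user reports using the computer for education and health.
record_response(["education", "health"])
```

The accumulated rows could then be rolled up per issue area, in the same spirit as the indicator tallies described earlier, to feed management decisions and evidence for policy change.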