Understanding the Value of Programme Impact Learnings

The loose, common understanding of Monitoring and Evaluation (M&E) is that it is the systematic tracking of implementation and outputs to measure the effectiveness of programmes. This determines whether or not a programme is on track and helps anticipate programme changes and shifts in timelines. M&E reports, the standard outputs of an M&E process, help us to determine the impact of programmes and interventions and their value for money, and to decide on necessary changes to the way in which programmes are designed.

The business support environment in South Africa has been increasingly adopting the “MERL” approach to programme impact measurement. MERL stands for Monitoring, Evaluation, Research and Learning, and its main goal is to build rapid learning into the programme implementation and measurement process, enabling swift, timely adaptations to the implementation itself. This approach also incorporates external learnings into the process to strengthen the rigour of programmes and better ensure high impact.

The consensus seems to be that it is not possible to determine whether a programme will be impactful and worth the money without its design being informed, its implementation being monitored, and its outputs and impact being measured. In the current economic climate, this is all the more relevant given the major efforts being made to open up access to finance and markets for SMMEs without common standards by which to do and measure this.

Where we still need to find consensus is on a more collaborative approach to our MERL. We still operate in silos, with unique definitions and uncommon metrics, which makes it difficult for us to set a common tone and standard. We have no way of commonly defining good programme design and implementation, never mind being able to define, from there, what “better” or “quality” programme design would look like.

SAMEA, a key M&E body in South Africa, defines their vision this way: “SAMEA strives to cultivate a vibrant community that will support, guide and strengthen the development of monitoring and evaluation (M&E) as an important discipline, profession and instrument for empowerment and accountability in South Africa.”

Their goal is to make M&E more of a profession and discipline than it currently is, and this points to the recognition of the need for a more intentional and standardised approach to M&E in this country. We all claim to have the interests of SMMEs at heart. We all claim the desire to see our small business community boosting our ailing economy. Now, we just need to find a way to commonly define and measure our efforts.

One of the key elements of C4G’s M&E platform is that it enables business support programmes to be monitored and evaluated against common metrics, regardless of industry and cognisant of programme type. The ability to measure across different programme types lets us see the differences in impact between them and better align programme design and participants to the desired impact. This growing community’s data allows for measurement of the ecosystem at large, and gives programmes the ability to benchmark themselves (anonymously) against that ecosystem. This collective effort will prove effective in pooling our learnings and attracting funder resources to quality, high-impact programmes.

Nomsa Langa
