The principle is unassailable: Canadians have the right to know whether their foreign aid money is being effectively spent.
To meet this challenge, the Canadian International Development Agency (CIDA) uses a sophisticated results-based management (RBM) methodology that, in principle, shows how each project’s inputs (money) produce outputs (x teachers trained) that translate into outcomes (percentage increase in enrolment).
Much time and effort goes into preparing, reporting on, and evaluating these schematics. But do they reveal whether our foreign aid money is being well spent? Numerous obstacles preclude an unequivocal “yes.” RBM systems are anything but simple, typically requiring substantial investment and years of refinement.
In practice, RBM reports to donors have significant credibility problems. One reason is overload: development projects typically place significant time demands on counterparts above and beyond those of their day jobs. The incentives to treat the labour-intensive reporting requirements as anything more than a pro forma obligation are minimal. In one particularly egregious case, district administrators in Uganda were reporting monthly on as many as 2,000 performance indicators.
Another obstacle is technical. A single project is just one influence among many. The desired outcome will inevitably be affected by numerous actions and actors outside the confines of the project. This is particularly true when RBM enthusiasts overreach as they move up the results chain. Projects supporting a modest number of exchanges between two governments somehow miraculously, at least in their results matrix, lay claim to increased GDP.
Relevance can also be problematic. To be meaningful, monitoring, reporting and evaluation must inform decision making. Project progress reports and assessments can easily become lost in an internal review cycle involving consultants and mid-level government and donor officials, and never come near a real decision maker. Although the determination of outcomes and impacts tends to occur in the medium term, the majority of evaluations are performed at a project’s conclusion, before most of the intended outcomes have been realized.
What then is to be done? The opposite is hardly enticing: abandon RBM, explain to Canadians that development results are just too hard to measure, and accept that Homer Simpson was right when he said “trying is the first step towards failure.” In fact, the results game must be played.
The answer lies in how rigorously the rules are applied; think of it as quantitative easing.
Common-sense accommodations might include: focusing on project outputs the government has already identified (often in budget submissions); weighing “collectability” and “reporting burden” when choosing measures; linking reporting to the government’s own reporting cycles; using common indicators when multiple donors work in the same area; being realistic about the project’s reach (it will not single-handedly deliver a Millennium Development Goal commitment); deferring the project evaluation until a few years after its conclusion; and considering the political relevance of what is to be reported and presenting conclusions in a format suited to decision makers (we would never expect a Canadian minister to read a 50-page report or decipher a log frame).
Most important, recognize that RBM is not the only results game in town.
Beyond RBM lies the world of ethics, relationships and networks. Harried senior managers in developing countries will generally tolerate projects, but will embrace them only when it is clear that they offer value for time. True value does not come from a PowerPoint presentation, but from the personal rapport and trust built with one’s counterpart over a sustained period of time.
This is not to suggest that RBM frameworks should devise performance indicators to capture the degree to which project managers suss out, befriend and support influential change agents. Tangible outputs will still need to be identified and monitored.
But back to the main question: will we ever know if our foreign aid money is being effectively deployed? Well, probably not. But to the degree that the Canadian public and private sectors offer good models and people willing to engage counterparts at a professional and personal level, the likelihood that something positive will result, however unquantifiable, remains strong.
Gordon Evans is an international consultant who has worked in over 25 countries focusing on government decision making, strategic planning and policy formulation. He co-authored Helping Governments Keep Their Promises for the World Bank.
Progress through relationships
In the late 1990s, a project supporting the introduction of Ontario’s business planning system in newly independent Lithuania veered outside its terms of reference when it agreed to assist the deputy speaker of Lithuania’s parliament in reforming the parliamentary committee system. A good rapport developed between the project team and the deputy speaker. In 1999, the deputy speaker joined a senior Lithuanian delegation that saw, first-hand, how Ontario’s planning and decision-making system functioned. Highly enthused, he concluded that this was something his country badly needed. In 2000, the former deputy speaker, now Prime Minister, introduced an Ontario-style strategic planning and cabinet committee system. The Lithuanian model has since been widely praised by the World Bank and remains in place today.