Canadian Government Executive - Volume 23 - Issue 1
[Chart 2 – Importance of the institutional arrangements to connect M, E, RBM and the deliverology process: TBS (Monitoring (M); Evaluation (E)); PCO (Results & Delivery Unit); Political Support; Deliverology]

8 / Canadian Government Executive // January 2017

…that in many cases where deliverology has been introduced, a capacity for systematic evaluation either did not exist or was in its infancy). There is a potential for conflicts between departmental managers and the requirements imposed by deliverology that may adversely affect the building of M and E into the organizational culture and operations and, by extension, into decision-making and planning processes. Thirdly, the potential burden imposed by the deliverology process on the departments targeted for government priorities could impair the effort. Finally, there may be a perceived lack of openness.

What does Canada need to do to maximize the positives and minimize the potential negatives?

Some Considerations in Going Forward

Chart 2 depicts the relationships being established among the major players implicated in the introduction of deliverology in Canada – PCO; TBS; departments/agencies; and the political support of the government. Their smooth functioning, however, is not guaranteed, and so some considerations addressing practical implementation issues are offered. They address both the tools to measure results and the institutional arrangements, including roles, responsibilities and coordination across the major players.

(i) Recognizing the importance of E in measuring and understanding ‘results’: What the lengthy experience with both M and E on the Canadian federal scene has demonstrated is that the expectation for M, as a tool to measure ‘results,’ is likely overstated. To a large extent, ‘results’ are still not being measured by M, in spite of a significant investment and a concerted level of effort to develop monitoring schemes.
This is a function, in part, of unrealistic expectations regarding the ability of M to deliver a cost-effective approach to measuring outcomes.

(ii) Working with the current M and E infrastructure, adjusting as needed: As noted above, unlike most previous experiences with deliverology, Canada offers a relatively mature M and E infrastructure, though it is not without its issues. The new Policy on Results is in theory addressing these, but this is also an opportunity to find the right balance between ‘learning’ and ‘accountability’ in demanding and using results/M and E information.

(iii) Incorporating evaluative thinking and adaptive management into the measurement and use of results information: Associated with the above, more flexibility, rather than command and control, would create an enabling environment for evidence-based learning and innovation, i.e. regularly using information on results to improve delivery.

(iv) Collaboration and coordination between PCO (RDU) and TBS (Results Policy) – aligning and streamlining the measurement and reporting requirements: Both PCO and TBS have an interest in measuring ‘results.’ PCO is the central authority behind deliverology, focusing on a few priorities of government; TBS has a broader scope as it rolls out its new Policy on Results. Both are imposing new requirements insofar as results measurement and reporting are concerned. Ongoing collaboration and coordination between the two central agencies, along with the occasional ‘health check’ on individual departments and on the system in general, would help avoid the new schemes becoming overly burdensome.

(v) Addressing cross-boundary collaboration: Since government priorities likely involve several programs across several departments (and, possibly, different levels of government), accelerating cross-boundary collaboration could be an important entry point for deliverology.
The Canadian experience of linking E to policy development (in departments and centrally) or to horizontal initiatives of government has for the most part not been strong. The realities of results measurement typically find programs at different levels of maturity, M and E readiness, and data availability. These factors underline the importance of cooperation and coordination across several players.

(vi) Learning and adjusting as needed: Given the importance of implementation to the success of deliverology, a ‘formative’ type of study would be useful in the short term, offering advice on adjustments as needed. In the medium term, the system would benefit from a broader assessment of the cost and effectiveness of deliverology, conducted by the Auditor General or the Parliamentary Budget Officer to ensure objectivity and transparency.

While the above considerations will not guarantee success for the deliverology process – that would involve factors far broader than the issues of measurement and organization – they do represent a fundamental starting point in its implementation, and so working from the outset to get it right ought to be paramount.

Robert Lahey is the founding Head of Canada’s Centre of Excellence for Evaluation, the federal government’s policy centre for evaluation, and over three decades has headed the evaluation function in four government departments and agencies in Canada. He is a member of the Canadian Evaluation Society’s (CES) Credentialing Board.

Chart 2: Instituting Deliverology in the Canadian Context