Canadian Government Executive - Volume 25 - Issue 03

August/September 2019 | INNOVATION

data beyond activities and outputs to immediate, intermediate and ultimate outcomes. The second box looks back over time to understand what did or did not work and, further, how progress was hindered. A key public administration function charged with this type of work is evaluation (along with other assurance functions such as internal audit). Typically, programs are evaluated on a five-year cycle in order to inform decisions regarding their renewal. Often, the planning and analytical work involved in a formal evaluation is lengthy, partly due to insufficient or poor-quality data, which requires retrospectively re-creating the analytical foundation needed to analyze outcomes. It is in this evidence-based process that an in-depth understanding of program effectiveness is gained: what worked, for whom, and in which conditions. By knowing what worked and what did not, one can move forward to box three. Without eliminating approaches that did not work in the past, progress is fundamentally hindered.

Applying what has been learned from evaluation

Strategic work takes place in boxes two and three, as organizations need to recognize barriers to progress in order to advance. Evaluation, as a strategic function, is about identifying how various elements of policies, programs and service delivery perform in general and, specifically, for particular populations, communities, sectors, and so on. Presenting evidence that identifies attitudes, habits, or practices that limit advancement can be challenging, as organizations are often dominated by Box 1 thinking and working.

The body of evaluation work in the federal public service has identified key learnings related to program management. For example, program complexity drives administrative costs: if one compares two similar programs, in general, the less complex one will have lower costs. This was borne out in numerous evaluations of public pension programs, namely the Canada Pension Plan and Old Age Security. While recognizing that a number of factors are involved in determining the cost per beneficiary of a program, the difference between the CPP ($66.20 in 2014-15) and OAS ($39 in 2014-15) is certainly related to varying complexity.

Along similar lines, the more policy changes are made to a program, the more complexity is introduced, as there are knock-on effects for communicating changes to citizens, information technology systems, delivery officer training, and more. The in-depth analysis of Employment Insurance service delivery showed that as policy changes were made, significant impacts ensued on other aspects of administration, such as staff training and information technology.

Finally, collecting program data retrospectively, via surveys for example, can be costly, subject to dated information (e.g., contact information), and less reliable as memory recall deteriorates over time.

In applying the framework, organizations need to be cognizant of which box their various activities fall into. From there, a clear distinction can be drawn between Box 1 day-to-day monitoring activities and Boxes 2 and 3, which are strategic in the sense described here. By drawing on knowledge of what worked and for whom, organizations are able to move in a strategic direction towards Box 3 work and non-linear advances, or fundamental innovation. Organizations can ask themselves: Are we organized to focus on strategic efforts? Do our people spend time on Box 3 activities? Is there support for engaging in non-linear ways that may or may not succeed? What is the incentive to put effort into Box 3? What are the risks? What are the rewards?

References

1. The views presented in the article are those of the author and do not represent those of the Treasury Board of Canada Secretariat or the Government of Canada.
2. https://www.canada.ca/en/innovation-hub/services/reports-resources/experimentation-direction-deputy-heads.html
3. Vijay Govindarajan (2016) The Three Box Solution: A Strategy for Leading Innovation. Harvard Business Review Press. http://www.3boxsolution.com/
4. https://www.tbs-sct.gc.ca/pol/doc-eng.aspx?id=31306
5. Employment and Social Development Canada (2017) Evaluation of the Canada Pension Plan Retirement Pension and Survivor Benefits. https://www.canada.ca/en/employment-social-development/corporate/reports/evaluations/2016-summative-cpp-retirement-pension-survivor-benefits.html
6. Employment and Social Development Canada (2018) Evaluation of the Old Age Security Program, Phase I. https://www.canada.ca/en/employment-social-development/corporate/reports/evaluations/oas-program-phase-01.html
7. Employment and Social Development Canada (2016) Evaluation of Employment Insurance Automation and Modernization. https://www.canada.ca/en/employment-social-development/corporate/reports/evaluations/2016-employment-insurance-automation-and-modernization.html (see Annex C)

Christine Minas is Director, Results Division, at the Treasury Board of Canada Secretariat.
