Within the last year, the Treasury Board of Canada Secretariat (TBS) released two new requirements for federal departments relating to efficiency metrics. This focus on efficiencies is not new: through strategic reviews, departments have been asked to report on program efficiencies, and these are certainly core to conducting proper program evaluations. What is new, however, is the TBS effort to better embed efficiency tracking into departments’ management practices and reporting mechanisms.
To this end, in late 2012 TBS released a set of proposed efficiency measures for internal services. In addition, as part of the evolution of the Policy on Management, Resources and Results Structures (MRRS), TBS has asked departments to develop efficiency measures for programs in their 2014-15 Performance Measurement Frameworks.
To help stimulate collaborative discussion on the topic, the Performance and Planning Exchange hosted a facilitated, interactive learning event on developing effective efficiency measures in January 2013.
A pre-session survey and the discussions at the tables raised a number of interesting questions and issues that helped participants identify common challenges in developing program efficiency indicators. Among these were:
• What costs should be included? More work is needed on standard approaches to costing, as well as on considerations such as direct versus indirect costs, short- versus long-term costs, central (internal services) versus program costs, and costing for mature versus nascent programs. Without prescribed methodologies that all departments follow to ensure consistency, department-driven methodologies will differ, and those differences will make any department-to-department comparison difficult.
• Comparing apples to oranges: With the understanding that federal departments and their programs might eventually be compared based on efficiency data, there was widespread concern about how to ensure consistency in program definition within departments and across similar departments (e.g., science departments). Closely linked to the question of what costs should be included, participants noted that comparability also depends on looking at similar program requirements, outcomes and delivery mechanisms, not just similar program outputs.
• Measuring the unmeasurable: The MRRS guidance describes measuring efficiencies in terms of a basic costing exercise. However, this requires a tangible, quantifiable output. How does one make this calculation in programs where it is difficult to measure the output, such as policy or communications? How do we define what is or is not tangible? What is the impact of “lifecycle costing” in the context of long-term outputs (e.g., science programs that deliver on 5- or 10-year or longer time horizons)?
• Perverse incentives: When costing outputs, it must be remembered that efficiency does not equate to effectiveness. Could a focus on efficiency indicators create a perverse incentive for departments to compromise effectiveness and service quality in order to meet efficiency targets or benchmarks against other organizations?
• Conveying the whole performance story: Efficiency measurement needs meaningful context to support informed decision-making. Despite the quantitative nature of efficiency measures, departments must continue to bring a program’s important qualitative elements into focus for a more holistic decision-making process.
• Building awareness, engagement, capacity and support: Ensuring that senior management is aware of and engaged in the process is paramount. In addition, it is critical that the right people from functions across the department (e.g., evaluation, finance, planning or economic analysis) all collaborate in the development of good quality efficiency indicators. Finally, success requires sufficient investment in people and technology so that the right data and processes are consistently available for analysis and reporting.
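The costing concerns above can be illustrated with a small, purely hypothetical sketch. The figures and the `unit_cost` helper below are invented for illustration and are not part of any TBS guidance; the point is only that two departments applying different conventions (here, whether internal-services overhead is included) report different "efficiency" figures for an otherwise identical program.

```python
# Hypothetical illustration of the comparability problem: the same program
# costed under two departmental conventions. All numbers are invented.

def unit_cost(direct, indirect, outputs, include_indirect=True):
    """Cost per output; 'indirect' stands in for internal-services overhead."""
    total = direct + (indirect if include_indirect else 0)
    return total / outputs

# Department A excludes overhead from program costing; Department B includes it.
narrow = unit_cost(direct=900_000, indirect=300_000, outputs=10_000,
                   include_indirect=False)   # 90.0 per output
broad = unit_cost(direct=900_000, indirect=300_000, outputs=10_000)  # 120.0 per output

# The 33% gap between the two figures reflects costing methodology alone,
# not any real difference in program performance.
```

Under these assumptions, a naive inter-departmental comparison would rank Department A as markedly more "efficient" even though the underlying program is identical, which is precisely why participants called for prescribed, consistent costing methodologies.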
In summary, the road to building strong, comparable intra- and inter-departmental efficiency indicators will be a long one. However, the destination may not be as important as the journey. Focusing attention on the resources required to deliver program outputs, and ultimately program outcomes, should strengthen an organization’s understanding of what it delivers and how it meets stakeholder expectations.
For more information on this event, including presentation downloads and links to ongoing discussion forums on the topic, please visit www.ppx.ca.