On December 3, 2013, the Performance and Planning Exchange held a workshop entitled “Putting Performance Reporting to the Test: The 2013 DPR Rating Bee.” It was an opportunity for students in the Diploma in Policy and Program Evaluation program at Carleton University to put their evaluative skills to the test.
They were asked to assess five Departmental Performance Reports (DPRs) through two lenses: first, two of the 13 criteria (reliability and validity, and relevance) from the Guide to Preparing Public Performance Reports (May 2007) developed by the Public Sector Accounting Board of Canada (PSAB); and second, whether, overall, they thought the DPRs told a coherent performance story, judged against John Mayne’s excellent discussion paper for the Office of the Auditor General of Canada and the guidance provided by the September 2010 Report of the Standing Committee on Public Accounts, in which the committee made recommendations for improving departmental performance reports.
Mayne sets out five elements of a coherent performance story: i) sufficient context to allow readers to understand the report, including a results chain and the risks faced; ii) a description of planned spending linked to expected outputs and outcomes; iii) a comparison of expected results with actual results; iv) an indication of lessons learned and how the organization will benefit from these in future iterations of programming; and v) an indication that steps are in place to ensure the quality of the data used in the report.
The students made several important observations; here are some of their major themes.
First, in terms of Mayne’s first element, there is no clear sense of purpose for these reports. The result, according to one student, is an “incoherent jumble of data streams that are not linked in any understandable way to plain language outcomes.” The reports ought to inform public debate about governmental action, in addition to reporting on the expenditure of public funds under the Financial Administration Act.
Second, the reports do not present a sense of vision and mission that tells the reader where programming is aimed. According to another student, commenting on Mayne’s second element, the reports “provided a ton of information on political tactics…which is not all that important to [citizens]. We need to start talking about high level policy goals and objectives that are relevant and agreed upon by all stakeholders.” The reports demonstrate little coherence on overall progress toward public policy objectives such as reducing poverty or crime.
Third, the reports lack evidentiary rigour. The measures tend to be at the activities level and do not shed light on results or outcomes. According to one student commenting on Mayne’s third and fourth elements, “a review of the performance indicators…does not yield much useful information on how [the department] used the information to inform decision-making and shape programs.” That is, the links between activities and results are unclear, which also leads one to question whether the links between planned and actual spending are reliable and valid. The lack of results chains, logic models, or links to vision statements shows a clear divergence between departmental activities and collective progress toward coherently expressed results.
Fourth, given the political climate of the day, all of the reports examined present a good-news story that is unsupported by the limited evidence. According to a student commenting on Mayne’s fifth element, “most of these reports spend more time describing the few specific things they are doing well, while purposely avoiding shedding any bad light on what they are not doing well.”
The students put forward several improvements, including:
• Prepare separate sections and messages for specific audiences, including citizens;
• Make greater use of third-party assessors, within and outside government, to gauge report accuracy;
• Provide coherent guidance for departments on how to prepare DPRs in a standard format;
• Focus DPRs more on results and on progress toward long-term policy outcomes in a coordinated, multi-jurisdictional way; and
• Use plain language supported by valid evidentiary performance stories.
If students trained in evaluation have difficulty with the limited rigour, coherence and readability of these reports, one can only imagine what the general public must think. The assumption is that Canadians want to engage in public policy conversations in an informed way. That we have a vehicle such as the DPR is a positive contribution, but in their current state these reports are of questionable value for democratic discussion. If this workshop told us anything, it is that we can collectively do much better.