CGE Vol.13 No.6 June 2007

Is it 1995? Sometimes I think so. With internet use rates hovering around 80%, it is just a matter of time before all Canadians use the internet. (It is inevitable that tweens will become teens, teens will become adults, and so on.) Moving beyond the circle of life, YouTube, wikis, MySpace and iGoogle are everyday examples of how the internet is evolving. It would be nice to say that the lion’s share of government online consultations is moving along as well – even at a fraction of the pace.

I am not suggesting government move at the same pace as teenage online whims. But is posting a 20-page discussion paper and asking five questions that five people will answer the best way to engage citizens, stakeholders or experts? If you answered yes, read on. If you answered no, then stick around to hear examples of how some government departments are doing things differently. I am not asking for a miracle, but I think that we all need to consider, just consider, moving beyond the 1995 approaches.

Online consultations shouldn’t be driven by the shiniest new tool. Take the wiki. Wikis are great, but they are not appropriate for the large majority of public consultations. Wikis are collaborative tools that work best when there is a pre-existing relationship between users – Wikipedia is the behemoth exception to this rule.

Wikis are great for a community of practice to collaborate on an issue, but I can’t think of too many government departments that are willing to blue-sky a policy and leave it to the masses to shape. It is neither palatable nor responsible for a government to do this. Think of Rick Mercer’s Stockwell Day/Doris Day referendum stunt. That is roughly the light in which to view an ill-conceived government-sponsored wiki experiment I just heard about in the UK.

Are there alternatives? Yes, but before we go much further we need to clarify what is wrong with a 20-page discussion paper and five questions. The first challenge is for the potential contributors – expert, stakeholder, citizen or otherwise – to find the consultation on the fifth sub-site of the third directorate of the departmental website. This is the case even when they have been made aware of the opportunity to contribute. To be fair, good and bad in-person and online consultations alike are often cursed by atrociously small communications budgets.

Getting back to the point, many of the discussion documents are incomprehensible and needlessly inaccessible. If 500 Canadians can make a meaningful online contribution to the future of the Canadian Geospatial Data Infrastructure, I would not hesitate to say that any issue is “consultable.”

What about the discussion questions? How good are they? One of my all-time favourites is, “Tell us your thoughts?” When five people answer, this is manageable. How does it work with 50, 500 or 5,000 respondents, especially when a suggested word length is not communicated? Even great qualitative analysis tools like NVivo (rebranded from the original NUD*IST software – you have to love Australians) are hard-pressed to help you out with a 1,000-page manifesto.

The real reason the discussion document approach is a letdown is the signal it conveys: “We really don’t want to hear from you, but we have to ask you, so here you go.” These token engagements, which serve only as the “consultation checkmark,” create almost the same level of cynicism as the phrase “mission accomplished.”

Let’s take a scenario that we are all familiar with and see if we can apply a new approach. A program is in the last year of its funding, but it is going to rise like a phoenix into a new and improved program. At this stage of the policy lifecycle there is a lot of breathing room, but common questions might be: what currently works and what doesn’t; what’s missing; what do you think about these options; and what are your priorities?

A market research approach might be to ask 25 survey questions to get at these answers. This presupposes a complete understanding of the issue, as well as of the government’s perspective.

What are the alternatives? A properly designed consultation (online or otherwise) provides a baseline of information so that participants can make an informed contribution. This is not 50 pages in a discussion document, but carefully condensed and balanced information or scenarios that allow a participant to work through an issue. Online consultations give you an opportunity not only to engage stakeholders, experts and citizens, but also to inform or even educate them.

How do you do this online? What’s worked for HRSDC, Health Canada, Agriculture and Agri-Food Canada, NRCan, Environment Canada, the Senate, the House of Commons and a host of others is a consultation workbook or choicebook. (Workbook sounds like work; choicebook sounds more fun.) It allows participants to work through the issue, considering background information and facts, with questions sprinkled in along the way. There don’t always have to be discrete choices for it to work, and any policy, program or piece of legislation can be broken down into smaller, bite-sized pieces. Keep in mind that the need to distill is a major reason why PowerPoint is the most popular piece of software in the GoC. Can anyone say version 42?
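For the technically inclined, here is a minimal sketch of how a choicebook might be represented in code – background first, questions sprinkled in, one bite-sized section at a time. The section titles, background text and questions below are my own invented placeholders, not material from any actual departmental consultation.

```python
# Minimal sketch of a choicebook structure: condensed background comes before
# the questions, and the issue is broken into bite-sized sections.
# All content below is illustrative only.

from dataclasses import dataclass, field

@dataclass
class Question:
    prompt: str
    choices: list[str] = field(default_factory=list)  # empty list = open-ended

@dataclass
class Section:
    title: str
    background: str            # carefully condensed, balanced context
    questions: list[Question]

choicebook = [
    Section(
        title="What currently works?",
        background="A plain-language summary of the program's results to date.",
        questions=[Question("Which of these outcomes matters most to you?",
                            ["Outcome A", "Outcome B", "Outcome C"])],
    ),
    Section(
        title="Priorities going forward",
        background="Two or three condensed scenarios for the renewed program.",
        questions=[Question("What is missing from these scenarios?")],
    ),
]

# Present background before questions, section by section.
for section in choicebook:
    print(section.title)
    print(" ", section.background)
    for q in section.questions:
        kind = "choose one" if q.choices else "open-ended"
        print(f"  Q ({kind}): {q.prompt}")
```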

How is this any different than a web survey, you ask? It is a survey on “steroids,” because it provides information to consider before the questions. Does this make a difference? When properly designed, completion rates hover around 80%. The average web survey has a completion rate of around 15 to 20% – of everyone who starts one, only 15 to 20% finish. Contrast this with 80% for the workbook and you can see the difference. These workbooks take about 25 to 30 minutes to complete on average. In a recent initiative with the CMA, 4,000 doctors completed a 45-minute choicebook. Doctors are busy people, but with an appropriate and meaningful online approach, participants are willing to make that investment of their time.

The quantitative workbook model has a qualitative partner – the share-your-story or share-your-idea tool. Some participants, especially stakeholders, want more flexibility in their contribution. So a structured means is provided that allows a participant to pick a theme or topic, create a title, and enter their content. This may sound overly simplistic, but by asking for these elements you have completed the first stage of sorting your qualitative data. A rule of thumb is a limit of 250 to 500 words. Stakeholders might be given the option of uploading a 10-page document. If their document runs to 12 pages you might let it through, but setting an expectation on length allows for analytical responsibility.
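Here, too, a small sketch may help show why capturing theme and title up front does the first sorting pass for you. The theme list, word limit and function name are illustrative assumptions, not the actual tool used by any department.

```python
# Sketch of a structured share-your-idea submission with a length check.
# Themes, limits and names are invented for illustration.

THEMES = ["Funding", "Delivery", "Eligibility", "Other"]   # participant picks one
WORD_LIMIT = 500                                           # rule-of-thumb upper bound

def accept_submission(theme: str, title: str, content: str) -> dict:
    """Validate and pre-sort a share-your-idea contribution."""
    if theme not in THEMES:
        raise ValueError(f"Unknown theme: {theme}")
    word_count = len(content.split())
    if word_count > WORD_LIMIT:
        # In practice you might warn rather than reject -- a 12-page stakeholder
        # upload may still be accepted, as noted above.
        raise ValueError(f"Submission is {word_count} words; the limit is {WORD_LIMIT}.")
    # Theme and title captured up front = the first stage of sorting is done.
    return {"theme": theme, "title": title, "content": content, "words": word_count}

idea = accept_submission("Delivery", "Keep the regional offices open",
                         "Our community relies on in-person service because ...")
print(idea["theme"], "-", idea["title"], f"({idea['words']} words)")
```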

The workbook and the share-your-idea tools are simple baby steps toward holding an effective online consultation. The tools are not enough on their own, however. Communications and outreach are essential to getting the word out. More and more departments are creating stakeholder databases so that they can reach the interested public. Email is the best way to get people to participate. Just think: when was the last time you took a piece of paper, walked to your computer and typed in a URL?

These are the basics. We haven’t even covered discussion forums, chats, or small group dialogues, to name a few. They are happening, though. Not too long ago Minister Flaherty held a real-time Q&A with citizens that, from all accounts, was a success. We also haven’t discussed the feedb