Tag Archives: public service sector renewal

The PM’s Advisory Committee on the Public Service: The Good, The Bad, The Hopeful

On February 25th Paul Tellier and David Emerson – two men whose understanding of Ottawa I have tremendous respect for – released The Fourth Report of the Prime Minister’s Advisory Committee on the Public Service. It is a document worth diving into, as these reports will likely serve as reference points for (re)thinking about how to renew government for the foreseeable future.

The Bad:

On the rough side, I have a single high-level comment: These reports are likely to be as close as we are going to get in Canada to Australia’s Government 2.0 Taskforce (on which I served as part of the international reference group) or Britain’s Cabinet Office Power of Information Taskforce Report (which would have been tremendous to have been involved in).

To be clear, this is not the fault of the committee. Its terms of reference appear to be much broader. This has two predictable consequences. First, relatively little time is dedicated to the general reorganizing of society being prompted by the now 40-year-old internet revolution; it carves out only a small role in the report. The committee is thus not able to dive into any detail on how the changing role of information in society, open data, the power of self-organization, or the rising power and influence of social media could and should reshape the public service.

Second, much more time is dedicated to thinking about problems around HR and pay. These are important issues. However, since the vision of the public service remains broadly unchanged, my sense is the reforms, while sometimes large, are ultimately tweaks designed to ensure the continuation of the current model – not prompting a rethink (or the laying of groundwork) for a 21st century public service which will ultimately have to look different to stay relevant.

The best example of the implications of this limited scope can be found under the section “Staying Relevant and Connected.” Here the report has two recommendations, including:

The Public Service must take full advantage of collaborative technologies to facilitate interaction with citizens, partners and stakeholders.

The Public Service should adopt a structured approach to tapping into broad-based external expertise. This includes collaboration and exchanges with universities, social policy organizations, think tanks, other levels of government and jurisdictions, private sector organizations and citizens.

These are good! They are also pretty vague and tame. This isn’t so much renewal as it is a baseline for a functioning 20th century public service. More importantly, given some of the other pieces in the report, these appear to be recommendations about how the government can engage in fairly traditional ways (exchanges). Moreover, they are externally focused. The main problem with the public service is that its members aren’t even allowed to use collaborative technologies to interact among themselves, so how can they possibly be asked to collaborate externally? As I say in my OCAD lecture and my chapter in Open Government: Collaboration, Transparency, and Participation in Practice – a digital citizenry isn’t interested in talking to an analogue government. The change required is first and foremost internal. But advocating for such a change is a major effort – one that will require significant culture and process change – which I haven’t found so far in the report and which is probably beyond its scope.

The Good:

That said, when the report does talk about technology and/or collaboration, it broadly says the right things. For example, in the section Creating A Modern, Enabled Workplace the report says:

creating a workplace that will attract, retain and empower public servants to innovate, collaborate and be responsive to the public. Among other things, this must include the adoption of collaborative technologies that are increasingly widespread in other sectors.

And, perhaps more importantly, under the section Strengthening Policy Capacity: A Relevant and Connected Public Service the report states:

A public service operating in isolation runs the risk of becoming irrelevant. We believe that the quality of policy thinking must be enhanced by additional perspectives from citizens, stakeholders and experts from other jurisdictions and other sectors (e.g. business, academia, non-governmental organizations). We believe sound government policy should be shaped by a full range of perspectives, and policy makers must consistently reach beyond the National Capital Region for input and advice.

Furthermore, the Public Service now has an opportunity to engage Canadians, especially younger ones, through the use of Web 2.0 collaborative technologies such as wikis, blogs and social networking. These offer an excellent way for the Public Service to reach out and connect.

Again, great stuff. That said, my concerns from above bear repeating. A networked public service is one that will need new norms, as it will function very differently. The task force has little to say about this (again, because of its expansive purview and not through any fault of its own). But this issue must be addressed in full. I frequently argue that one reason public servants are so stressed is that they live double lives. They already live in a networked workplace and play by network rules in order to get their job done; however, they are perpetually told they live in a hierarchy and have to pretend they abide by that more traditional rule set. Double lives are always stressful…

The Hope:

As the committee moves forward it says it will:

…consider and advise on new business models for the Public Service with a view to creating an innovative and productive workforce that continues to deliver increasing value for money by taking advantage of new tools and technology;

I hope that open data, open systems and some of the ideas around networked government that I, along with numerous others, have been advocating and talking about get in front of the committee – these all represent building blocks for a significantly more flexible, innovative and productive public service.

Withholding FOI requests: In the Private Sector, that's fraud

It was with enormous interest that I read on the Globe’s website about a Conservative ministerial aide “unreleasing” a document requested by The Canadian Press through an Access to Information request (the Access to Information Act ensures that citizens can request information about the government’s activities).

A federal cabinet minister’s aide killed the release of a sensitive report requested under freedom-of-information in a case eerily similar to a notorious incident in the sponsorship scandal.

What I find fascinating is that neither the minister (now at Natural Resources Canada) nor the aide has been asked to resign.

Let’s be abundantly clear, if this were the private sector and a CEO were caught deliberately withholding material information from a shareholder… that would constitute fraud and/or a violation of whichever provincial securities laws he or she was bound by. Moreover, such a crime could carry a prison sentence.

And yet here, in the most cavalier manner, one of the most basic trusts that ensure accountability in our system is violated with almost no repercussions.

The story does have its dark humour (and an embarrassingly feeble attempt at an excuse):

Mr. Paradis’s current communications director said Mr. Togneri’s intervention was to suggest the Access to Information section offer fewer pages to the requester without charge rather than the entire 137 pages for a fee of $27.40, which had already been paid.

“He went through and thought that a huge section of a very big report wasn’t relevant and that you should be given the option of paying to get it or get the (smaller) chapter” without charge, Margaux Stastny said in an interview. “No one can overrule Access officers.”

The options were never provided to the requester, however. Instead, the department simply sent the censored report and refunded the fee.

Yes, I too am always comforted to know that my government is thinking of me and trying to save me a few pennies by ensuring I don’t see information they know I need not waste my time on.

I, of course, have another solution for how the photocopying money could be saved: what about emailing a digital copy of the report? Of course, Access to Information requests (ATIP here, FOI for those in the US) are always handed out on paper, just to ensure you can’t do anything too useful with them… oh, and to help ensure that they are delivered late.

So while, in this case, the Minister’s staff has committed an enormous gaffe – one that should have (and yet probably won’t have) political implications – it is also a window into a broader problem:

FOI = broken.

I belong to a generation that gets information in 0.3 seconds (the length of a Google search). If you take 80 days to get my request to me (and edit and censor it along the way), you are a bug I will route around. This isn’t just the end of accountability in government; this is the end of the relevancy of government.

Eaves.ca Blogging Moment #3 (2009 Edition): Australia Likes eaves.ca

Back in 2007 I published a list of top ten blogging moments – times I felt blogging resulted in something fun or interesting. I got numerous notes from friends who found it fun to read (though some were not fans) so I’m giving it another go. Even without these moments it has been rewarding, but it is nice to reflect on them to understand why spending so many hours, often late at night, trying to post 4 times a week can give you something back that no paycheck can offer. Moreover, this is a chance to celebrate some good fortune and link to people who’ve made this project a little more fun. So here we go…

Eaves.ca Blogging Moment #3 (2009 Edition): Australia Likes eaves.ca

Perhaps one of the biggest surprises of the year was an email from the chair of the Australian Government’s Government 2.0 Taskforce asking me if I would sit on their International Reference Group.

Fascinating to see a government wrestle with how it can reinvent itself and to ask for thoughts and ideas. I hope my own country contemplates doing something along similar lines soon. Also exciting to be able to help review and edit the final report, offer advice and feedback and better understand the challenges and opportunities as their government sees them.

You can download the report here. It is a great read.

Eaves.ca Blogging Moment #8 (2009 Edition): Blogging leads to Book Chapters!

Back in 2007 I published a list of top ten blogging moments – times I felt blogging resulted in something fun or interesting. I got numerous notes from friends who found it fun to read (though some were not fans) so I’m giving it another go. Even without these moments it has been rewarding, but it is nice to reflect on them to understand why spending so many hours, often late at night, trying to post 4 times a week can give you something back that no paycheck can offer. Moreover, this is a chance to celebrate some good fortune and link to people who’ve made this project a little more fun. So here we go…

Eaves.ca Blogging Moment #8 (2009 Edition): Blogging leads to Book Chapters!

First, my blogging, writing, work, consulting and public speaking on public service sector renewal earned me the opportunity to write a chapter in O’Reilly Media’s upcoming book on Open Government.

Needless to say, I’m excited.

(Shameless plug within a shameless list: I’ll be giving a talk about openness, technology, social change and the future of government – some of the themes covered in the chapter – at the Ontario College of Art and Design on January 14th. Details and tickets here; 200 are gone, about 60 are left.)

Second, after passing it under the noses of numerous magazine editors who expressed interest but ultimately passed on it, Taylor and I decided to simply publish Missing The Link: Why Old Media still doesn’t get the Internet as a website.

The sad news: we wrote it three years ago, and I think it is just as relevant today.

The good news: it looks like an academic publisher is very interested and will be turning it into a chapter for a book. Hurray for just putting stuff out there.

Eaves.ca blogging moment #10 (2009 Edition): The CPSR Rat Pack

Back in 2007 I published a list of top ten blogging moments – times I felt blogging resulted in something fun or interesting. I got numerous notes from friends who found it fun to read (though some were not fans) so I’m giving it another go. Even without these moments it has been rewarding, but it is nice to reflect on them to understand why spending so many hours, often late at night, trying to post 4 times a week can give you something back that no paycheck can offer. Moreover, this is a chance to celebrate some good fortune and link to people who’ve made this project a little more fun. So here we go…

#10 Finding the Canadian Public Service Sector Renewal Rat Pack

Through blogging and twitter I discovered a community of public service sector renewal geeks who are equally driven by passion and a belief in the importance of a vibrant, successful and modern public service.

I’ve met some of this crew in the flesh; others I know only through email, comments, reading their blogs or Twitter. But whether I’ve met them or not, they have been a real community – a group of people I can bounce ideas off of and explore new thoughts with. More importantly, we support one another.

Without my blog, I’m not sure I’d have found them or them me.

So three cheers to the Rat Pack of Public Service Sector Renewal. Everyone should be so lucky as to find peers like these.

Some of the Rat Pack members include: Nick Charney, Etienne Laliberte, Peter Cowan, Thomas Kearney, Laura Wesley, Chelsea Edgell, David Hume, Doug Bastien, Tariq Piracha, Jeff Braybrook, Richard Smith, Stephanie Hayes and Bowen Moren.

MuniForge: Creating municipalities that work like the web

Last month I published the following article in the Municipal Information Systems Association’s journal Municipal Interface. The article was behind a paywall, so now that the month has gone by I’m throwing it up here. Basically, it makes the case that if governments applied open source licenses to the software they developed (or paid to develop), they could save hundreds of millions – more likely billions – of dollars a year. I’ve already received a couple of emails from municipal IT professionals across the country.

MuniForge: Creating Municipalities that Work like the Web

Introduction

This past May the City of Vancouver passed what is now referred to as “Open 3”. This motion states that the City will use open standards for managing its information, treat open source and proprietary software equally during the procurement cycle, and apply open source licenses to software the city creates.

While a great deal of media attention has focused on the citizen engagement potential of open data, the implications of the second half of the motion – the part relating to open source software – have gone relatively unnoticed. This is all the more surprising since last year the Mayor of Toronto also promised his city would apply an open source license to software it creates. This means that two of Canada’s largest municipalities are set to apply open source licenses to software they create in house. Consequently, the source code and the software itself will be available for free under a license that permits users to use, change, improve and redistribute it in modified or unmodified forms.

If capitalized upon, these announcements could herald a revolution in how cities procure and develop software. Rather than having thousands of small municipalities collectively spend billions of dollars each reinventing the same wheel, the open sourcing of municipal software could weave Canada’s municipal IT departments into one giant network in which expertise and specialized talent drive up quality and security to the benefit of all, while simultaneously collapsing the costs of development and support. Most interestingly, while this shift would benefit larger cities, its impact could be most dramatic and positive among the country’s smaller cities (those with populations under 200,000). What is needed to make it happen is a central platform where the source code and documentation for software that cities wish to share can be uploaded and collaborated on. In short, Canada needs a SourceForge – or better, a GitHub – for municipal software.

The cost

For the last two hundred years one feature has dominated the landscape for the majority of municipalities in Canada: isolation. In a country as vast and sparsely populated as ours, villages, towns and cities have often found themselves alone. For citizens, the railway, the telegraph, then the highway and the telecommunications system eroded that isolation, but if we look at the operations of cities this isolation remains a dominant feature. Most Canadian municipalities are highly effective, but ultimately self-contained, islands. Municipal IT departments are no different. One city’s IT department rarely talks to another’s, particularly if they are not neighbours.

The result is that in many cities across Canada, IT solutions are developed in one of two ways.

The first is the procurement model. Thankfully, when the product is off the shelf or easily customized, deployment can occur quickly; this, however, is rarely the case. More often, large software vendors and expensive consulting firms are needed to deploy such solutions, frequently leaving them beyond the means of many smaller cities. Moreover, from an economic development perspective, the dollars spent on these deployments often flow out of the community to companies and consultants based elsewhere. On the flip side, local, smaller firms, if they exist at all, tend to be untested and frequently lack the expertise and competition necessary to provide a reliable and affordable product. Finally, regardless of the firm’s size, most solutions are proprietary and so lock a city into the solution in perpetuity. This not only holds the city hostage to the supplier, it eliminates future competition and, worse, should the provider go out of business, saddles the city with an unsupported system that will be painful and expensive to upgrade out of.

The second option is to develop in-house. For smaller cities with limited IT departments this option can be challenging, but it is often still cheaper than hiring an external vendor. Here the problem is that any solution is limited by the skills and talents of the city’s IT staff. A small city, even with a gifted IT staff of two to five people, will be hard pressed to effectively build and roll out all the IT infrastructure city staff and citizens need. Moreover, keeping pace with security concerns, new technologies and new services poses additional challenges.

In both cases, the IT services a city can develop and support for staff and citizens are limited by either the skills and capacity of its team or the size of its procurement budget. In short, the collective purchasing power, development capacity and technical expertise of Canada’s municipal IT departments are lost because we remain isolated from one another. With each city IT department acting like an island, the result is enormous constraint and waste. Software is frequently recreated hundreds of times over as each small city creates its own service or purchases its own license.

The opportunity

It need not be this way. Rather than a patchwork of isolated islands, Canada’s municipal IT departments could be a vast interconnected network.

If even two small communities in Canada applied an open source license to a piece of software they were producing, allowed anyone to download it and documented it well, the cost savings would be significant. Rather than having two entities create what is functionally the same piece of software, the cost would be shared. Once available, other cities could download it and write patches that would allow the software to integrate with their own hardware and software infrastructure. These patches would also be open source, making it easier for still more cities to use the software. The more cities participate in identifying bugs, supplying patches and writing documentation, the lower the costs to everyone become. This is how Linus Torvalds started a community whose operating system – Linux – would become world class. It is the same process by which Apache came to dominate web servers, and it is the same approach used by Mozilla to create Firefox, a web browser whose market share now rivals that of Internet Explorer. The opportunity to save municipalities millions, if not billions, in software licensing and/or development costs every year is real and tangible.
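To make the mechanics of that sharing concrete, here is a minimal, hypothetical sketch of the kind of small, self-contained module a city might release under an open source licence. Everything in it – the file name, the fields, the 311-style example – is invented for illustration; the point is simply that a second city can adopt the code and contribute a patch (say, support for its own date format) rather than rewriting the whole thing.

```python
# muni_service_requests.py
# Hypothetical example of a small module a city IT department might release
# under an open source licence (e.g. MIT) so other municipalities can reuse
# and patch it. All field names and formats are invented for illustration.

import csv
import io
from dataclasses import dataclass
from datetime import datetime


@dataclass
class ServiceRequest:
    """A single 311-style service request (pothole, broken streetlight, etc.)."""
    request_id: str
    category: str
    opened: datetime
    status: str


# Date formats vary between cities; a patch from a second city can simply
# add its own format here instead of forking the whole module.
DATE_FORMATS = ["%Y-%m-%d", "%d/%m/%Y"]


def parse_date(value: str) -> datetime:
    """Try each known municipal date format until one matches."""
    for fmt in DATE_FORMATS:
        try:
            return datetime.strptime(value, fmt)
        except ValueError:
            continue
    raise ValueError(f"Unrecognized date format: {value!r}")


def load_requests(csv_text: str) -> list[ServiceRequest]:
    """Parse a CSV export of service requests into typed records."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return [
        ServiceRequest(
            request_id=row["id"],
            category=row["category"],
            opened=parse_date(row["opened"]),
            status=row["status"],
        )
        for row in reader
    ]


if __name__ == "__main__":
    sample = "id,category,opened,status\n42,pothole,2009-11-30,open\n"
    for req in load_requests(sample):
        print(req)
```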

What would such a network look like and how hard would it be to create? I suspect that two pieces would need to be in place to begin growing a nascent network.

First and foremost, there need to be a handful of small projects. Often the most successful open source projects are those that start collaboratively. This way the processes and culture are, from the get-go, geared towards collaboration and sharing. This is also why smaller cities are the perfect place to start collaborating on open source projects. The world’s large cities are happy to explore new models, but they are too rich, too big and too invested in their current systems to drive change. The big cities can afford Accenture. Small cities are not only more nimble, they have the most to gain. By working together and using open source they can provide a level of service comparable to that of the big cities, at a fraction of the cost. An even simpler first step would be to ensure that when contractors sign on to create new software for a city, they agree that the final product will be available under an open source license.

Second, MISA, or another body, should create a SourceForge clone for hosting open-sourced municipal software projects. SourceForge is an American-based open source software development website that provides services to help people build and share software with coders around the world. It presently hosts more than 230,000 software projects and has over 2 million registered users. SourceForge operates as a sort of marketplace for software initiatives: a place where one can locate software one is interested in and then download it and/or become part of a community to improve it.

A SourceForge clone – say, MuniForge – would be a repository for software that municipalities across the country could download and use for free. It would also be the platform upon which collaboration around developing, patching and documenting would take place. MuniForge could also offer tips, tools and learning materials for those new to the open source space on how to effectively lead, participate in and work within an open source community. That said, if MISA wanted to keep costs even lower, it wouldn’t even need to create a SourceForge clone; it could simply use the actual SourceForge website and lobby the company to create a new “municipal” category.

And herein lies the second great opportunity of such a platform: it could completely restructure the government software business in Canada. At the moment Canadian municipalities must choose between competing proprietary systems that lock them into a specific vendor. Worse still, they must pay for both the software development and ongoing support. A MuniForge would allow for a new type of vendor modeled after Red Hat – the company that offers support to users who adopt its version of the free, open source Linux operating system. While vendors couldn’t sell software found on MuniForge, they could suddenly offer support for it. Cities would now have the benefit of outsourcing support without having to pay for the development of a custom, proprietary software system. Moreover, if they are not happy with their support they can always bring it in house, or even ask a competing company to provide support. Since the software is open source, nothing prevents several companies from supporting the same piece of software – enhancing service, increasing competition and driving down prices.

There is another, final, global benefit to this approach to software development. Over time, a MuniForge could begin to host all of the software necessary to run a modern-day municipality. This has dramatic implications for cities in the developing world. Today, thanks to rapid urbanization, many towns and villages in Asia and Africa will be tomorrow’s cities and megacities. With only a fraction of the resources, these cities will need to offer the services that are today commonplace in Canada. With MuniForge they could potentially download all the infrastructure they need for free – enabling precious resources to go towards other critical pieces of infrastructure such as sewers and drinking water. Moreover, a MuniForge would encourage small local IT support organizations to develop in those cities, providing jobs and fostering IT innovation where it is needed most. Better still, over time, patches and solutions would flow the other way, as more and more cities help improve the code base of projects found on MuniForge.

Conclusion

The internet has demonstrated that new, low-cost models of software development exist. Open source software development has shown how loosely connected networks of coders and users from across a country, or even around the world, can create world-class software that rivals and even outperforms software created by the largest proprietary developers. This is the logic of the web – participation, better development and lower costs.

The question cities across Canada need to ask themselves is: do we want to remain isolated islands, or do we want to work like the web, collaborating to offer better services, more quickly and at a lower cost? If even only some cities choose the latter, an infrastructure to enable collaboration can be put in place at virtually no cost, while the potential benefits and the opportunity to restructure the government software industry would be significant. Island or network – which do we want to be?

If I could start with a blank sheet of paper… (part 2)

The other week Martin Stewart-Weeks posted this piece on the Australian Government’s Web 2.0 Taskforce blog. In it he asked:

“…imagine for a moment it was your job to create the guidelines that will help public servants engage online. Although you have the examples from other organisations, you are given the rare luxury to start with a blank sheet of paper (at least for this exercise). What would you write? What issues would you include? Where would you start? Who would you talk to?”

Last week I responded with this post which explained why my efforts would focus on internal change. This week I want to pick the thread back up and talk about what applications I would start with and why.

First, Social Networking Platform (this is essential!):

An inspired public service shouldn’t ban Facebook, it should hire it.

A government-run social networking platform, one that allowed public servants to list their interests, current area of work, past experiences, contact information and current status, would be indispensable. It would allow public servants across ministries to search out and engage counterparts with specialized knowledge, relevant interests or similar responsibilities. Moreover, it would allow public servants to set up networks, where people from different departments, but working on a similar issue, could keep one another abreast of their work.

In contrast, today’s public servants often find themselves unaware of, and unable to connect with, colleagues in other ministries or other levels of government who work on similar issues. This is not because their masters don’t want them to connect (although this is sometimes the case) but because they lack the technology to identify one another. As a result, public servants drafting policy on interconnected issues — such as the Environment Canada employee working on riverbed erosion and the Fisheries and Oceans employee working on spawning salmon — may not even know the other exists.

If I could start with a blank sheet of paper… then I’d create a social networking platform for government. I think it would be the definitive game changer. Public servants could finally find one another (saving millions of hours and dollars in external consultants, redundant searches and duplicated capacity). Moreover, if improving co-ordination and the flow of information within and across government ministries is a central challenge, then social networking isn’t a distraction, it’s an opportunity.
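To make this concrete, here is a minimal sketch (with invented names, departments and fields) of the kind of searchable profile directory such a platform would rest on – enough for the Environment Canada employee working on riverbed erosion to find the Fisheries and Oceans employee working on spawning salmon.

```python
# A hypothetical sketch of the core of a government staff directory / social
# networking platform: profiles listing interests and current work, plus a
# simple cross-department search. All names and data are invented.

from dataclasses import dataclass, field


@dataclass
class Profile:
    name: str
    department: str
    current_work: str
    interests: set[str] = field(default_factory=set)


def find_counterparts(profiles: list[Profile], topic: str) -> list[Profile]:
    """Return everyone, in any department, whose work or interests touch a topic."""
    topic = topic.lower()
    return [
        p for p in profiles
        if topic in p.current_work.lower()
        or topic in {i.lower() for i in p.interests}
    ]


if __name__ == "__main__":
    directory = [
        Profile("A. Analyst", "Environment Canada", "riverbed erosion", {"hydrology", "salmon"}),
        Profile("B. Biologist", "Fisheries and Oceans", "spawning salmon", {"salmon", "habitat"}),
    ]
    for match in find_counterparts(directory, "salmon"):
        print(match.name, "-", match.department)
```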

Second, Encourage Internal Blogs

I blogged more about this here.

If public servants feel overwhelmed by information, one of the main reasons is that they have no filters. There are few, if any, bloggers within departments writing about what they think is important and what is going on around them. Since information is siloed, everybody has to rely either on informal networks to find out what is actually going on (all that wasted time having coffee and calling friends to find out gossip) or on formal networks, getting into structured meetings with other departments or one’s boss to find out what their boss’s boss’s boss is thinking. What a waste of time and energy.

I suspect that if you allowed public servants to blog, you could cut down on rumours (they would be dispelled more quickly), email traffic and, more importantly, meetings (which are a drain on everybody’s time) by at least 25%. Want to know what my team is up to? Don’t schedule a meeting. First, read my blog. Oh, and search the tags to find what is relevant to you. (You can do that on my blog too; if you are still reading this piece it probably means you are interested in this tag.)

Third, Create a Government Wide Wiki

The first reason to create a wiki is that it would give people a place to work collectively on documents, within their departments or across ministries. Poof, siloes dissolved. (Yes, it really is that simple, and if you are middle management, that terrifying).

The second reason is that it would provide some version control. Do you realize most governments don’t have version control software (or do, but nobody uses it because it is terrible)? A wiki, if nothing else, offers version control. That’s reason enough to migrate.
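A minimal sketch of what that built-in version control looks like, assuming a MediaWiki-style wiki (the API endpoint and page title below are placeholders, not real government addresses): every edit to a page is recorded and can be queried – who changed it, when, and why.

```python
# Query a page's revision history from a MediaWiki-style wiki.
# Requires the third-party "requests" library; the URL and title are
# hypothetical placeholders for illustration only.

import requests

API_URL = "https://wiki.example.gc.ca/api.php"  # placeholder endpoint


def recent_revisions(title: str, limit: int = 5) -> list[dict]:
    """Return the most recent revisions (timestamp, user, edit summary) for a page."""
    params = {
        "action": "query",
        "prop": "revisions",
        "titles": title,
        "rvlimit": limit,
        "rvprop": "timestamp|user|comment",
        "format": "json",
    }
    data = requests.get(API_URL, params=params).json()
    pages = data["query"]["pages"]
    # The API keys results by page id; we only asked about one title.
    return next(iter(pages.values())).get("revisions", [])


if __name__ == "__main__":
    for rev in recent_revisions("Draft policy on open data"):
        print(rev["timestamp"], rev["user"], rev["comment"])
```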

The third reason though is the most interesting. It would change the information economics, and thus culture, of government. A wiki would slowly come to function as an information clearing house. This would reduce the benefits of hoarding information, as it would be increasingly difficult to leverage information into control over an agenda or resource. Instead the opposite incentive system would take over. Sharing information or your labour (as a gift) within the public service would increase your usefulness to, and reputation among, others within the system.

Fourth, Install an Instant Messaging App

It takes less time than a phone call. And you can cut and paste. Less email, faster turn-around, quicker conversations. It isn’t a cure-all, but you’ve already got young employees who are aching for it. Do you really want to tell them to not be efficient?

Finally… Twitter

Similar reasons to blogs. Twitter is like a custom newspaper. You don’t read it every day, and most days you just scan it – you know – to keep an eye on what is going on. But occasionally it has a piece or two that you happen to catch that are absolutely critical… for your file, your department or your boss.

This is how Twitter works. It offers peripheral vision into what is going on in the areas or with the people that you care about or think are important. It allows us to handle the enormous flow of information around us. Denying public servants access to Twitter (or not implementing it, or blogs, internally) is essentially telling them that they must drink the entire firehose of information that is flowing through their daily life at work. They ain’t going to do it. Help them manage. Help them tweet.

Toronto Innovation Summit on Open Government

Today I’m at Toronto City Hall doing a panel on Open Government for the Innovation Showcase. If you are reading this before 10am EST you can catch a webcast of the panel at the above link.

I’ve pasted in my slides for those who would like to follow along. Down below I’ve included a few links that those who are new to my site (or who haven’t read my writing on government 2.0) might find interesting.

Some of my favourite posts on open government, open data and gov 2.0:

The Three Laws of Open Government Data

Open Data: USA vs Canada

Create the Open Data Bargain in Cities

Globe and Mail Op-Ed: Don’t Ban Facebook

If I could start with a blank sheet of paper… (written for the Australian Government’s Web 2.0 Taskforce)

Mapping Government 2.0 against the Hype Curve

Feeding the next economy – Give us a stimulus that stimulates, not placates

Why the Government of Canada needs bloggers

Why StatCan could be like Google

The Public Service as Gift Economy

Public Service Sector Renewal and Gen Y: Don’t be efficient

Public Service Sector Renewal: Starting at the APEX

Open Data – USA vs. Canada

When it comes to Open Data in Canada and the United States, things appear to be similar. Both countries have several municipalities with Open Data portals: Washington, D.C., San Francisco, and now New York City in the US; Vancouver and Nanaimo in Canada, with Toronto, Edmonton, Calgary and Ottawa thinking about or initiating plans.
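As an aside, here is a minimal, hypothetical sketch of why these portals matter to developers: once a dataset is published in a machine-readable format like CSV, a few lines of code are enough to start analyzing it. The file name and column names below are invented for illustration.

```python
# A hypothetical sketch of consuming an open data release: count building
# permits per neighbourhood from a CSV export downloaded from a city portal.
# The file and its columns are invented for illustration.

import csv
from collections import Counter


def permits_by_neighbourhood(path: str) -> Counter:
    """Tally building permits per neighbourhood from a portal's CSV export."""
    with open(path, newline="", encoding="utf-8") as f:
        return Counter(row["neighbourhood"] for row in csv.DictReader(f))


if __name__ == "__main__":
    for neighbourhood, count in permits_by_neighbourhood("building_permits.csv").most_common(5):
        print(f"{neighbourhood}: {count}")
```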

But the similarities end there. In particular, there is a real, yawning gap at the federal level. America has data.gov, but here in Canada there is no movement on the Open Data front. There are some open data sets, but nothing comprehensive, and nothing dedicated to following the three laws of open data. No data.gc.ca in the works. Not even a discussion. Why is that?

As esoteric as it may sound, I believe the root of the issue lies in the two countries’ differing political philosophies. Let me explain.

It is important to remember that the United States was founded on the notion of popular sovereignty. As such, its sovereignty lies with the people, or as Wikipedia nicely puts it:

The American Revolution marked a departure in the concept of popular sovereignty as it had been discussed and employed in the European historical context. With their Revolution, Americans substituted the sovereignty in the person of the English king, George III, with a collective sovereign—composed of the people. Henceforth, American revolutionaries by and large agreed and were committed to the principle that governments were legitimate only if they rested on popular sovereignty – that is, the sovereignty of the people. (italics are mine)

Thus data created by the US government is, quite literally, the people’s data. Yes, nothing legally prevents the US government from charging for information and data, but the country’s organizing philosophy empowers citizens to stand up and say: this is our data, we’d like it please. In the United States the burden is on the government to explain why it is withholding that which the people own (a tradition that is admittedly far from perfect, as anyone alive from 2000 to 2008 will attest). But don’t underestimate the power of this norm. Its manifestations are everywhere, such as in the legal requirement that any document created by the United States government be published in the public domain (i.e. it cannot have any copyright restrictions placed on it) or in America’s vastly superior Freedom of Information laws.

This is a very different notion of sovereignty than the one that exists in Canada. This country never deviated from the European context described above. Sovereignty in Canada does not lie with the people; indeed, it resides with King George III’s descendant, the present-day Queen of England. The government’s data isn’t yours, mine, or “ours”. It’s hers. Which means it is at her discretion, or more specifically, the discretion of her government servants, to decide when and if it should be shared. This is the (radically different) context under which our government (both the political level and the public service), and its expectations around disclosure, have evolved. As an example, note that government documents in Canada are not public domain; they are published under Crown Copyright which, while less restrictive than standard copyright, nonetheless constrains reuse (no satire allowed!) and is a constant reminder that Canadian citizens don’t own what their tax dollars create. The Queen does.

The second reason why open data has a harder time taking root in Canada is the structure of our government. In America, new projects are easier to kick-start because the executive wields greater control over the public service. The Open Data initiative that started in Washington, D.C. spread quickly to the White House because its champion and mastermind, the District of Columbia’s CTO Vivek Kundra, was appointed Federal CIO by President Obama. Yes, Open Data tapped into an instinctual reflex to disclose that (I believe) is stronger down south than here, but it was executed because America’s executive branch is able to appoint officials much deeper into government (for those who care, in Canada Deputy Ministers are often appointed, but in the United States appointments go much deeper, down into the Assistant Deputy and even the Director General level). Both systems have merits, and this is not a critique of Canada’s approach, simply an observation. However, it does mean that a new priority, like open data, can be acted upon quickly and decisively in the US. (For more on these differences I recommend reading John Ibbitson’s book Open & Shut.)

These differences have several powerful implications for open data in Canada.

As a first principle, if Canadians care about open data we will need to begin fostering norms in our government, among ourselves, and in our politicians, that support the idea that what our government creates (especially in terms of research and data) is ours, and that we should not only have unfettered access to it but the right to analyze and repurpose it. The point here isn’t just that this is a right, but that open data enhances democracy, increases participation and civic engagement and strengthens our economy. Enhancing this norm is a significant national challenge, one that will take years to accomplish. But instilling it into the culture of our public service, our civic discourse and our political process is essential. In the end, we have to ask ourselves – in a way our American counterparts aren’t likely to (but need to) – do we want an open country?

This means, second, that Canadians are going to have to engage in a level of education of public servants – particularly senior ones – on open data that is much broader and more comprehensive than anything our American counterparts had to undertake. In the US, an executive fiat and appointment has so far smoothed the implementation of open data solutions. That will likely not work here. We have many, many, many allies in the public service who believe in open data (and who understand it is integral to public service sector renewal). The key is to spread that knowledge and support upwards, to educate senior decision-makers, especially those at the DG, ADM and DM level, to whom both the technology and the concept are essentially foreign. It is critical that these decision-makers become comfortable with and understand the benefits of open data quickly. If not, we are unlikely to keep pace with (or even follow) our American counterparts – something I believe is essential for our government and economy.

Third, Canadians are going to have to mobilize to push for open data as a political issue. Even if senior public servants get comfortable with the idea, it is unlikely there will be action unless politicians understand that Canadians want both greater transparency and the opportunity to build new services and applications on government data.

(I’d also argue that another reason why Open Data has taken root in the US more quickly than here is the nature of its economy. For a country that thrives on services and high tech, open data is a basic ingredient that helps drive growth and innovation. Consequently, there is increasing corporate support for open data. Canada, in contrast, with its emphasis on natural resources, does not have a corporate culture that recognizes these benefits as readily.)

Emergent Systems in Government: Let's put the horse before the cart

Yesterday Paul McDowall, Knowledge Management Advisor at the Government’s School of the Public Service and chairperson of the Interdepartmental Knowledge Management Forum, wrote the following comment in response to a blog post from several months ago entitled “How GCPEDIA will save the public service.”

I’ve posted his comment – feel free to read it or skip it and go straight to my analysis below. In summary, what makes McDowall’s comments interesting isn’t just the argument (or its reactionary nature) but the underlying perspective and assumptions that drive it. It serves as a wonderful example of the tension between the traditional hierarchical nature of the public service and some evolving emergent models that challenge this approach.

So first, McDowall:

Will GCPEDIA save the public service, or capture all the tacit knowledge that will walk out the door? No, of course not! To suggest otherwise is, frankly, naive hyperbole.

As great and as promising as GCPEDIA and other Web 2.0 tools are, tools will never save the public service. People are the public service and only people have the capacity to save the public service, and it will take a whole lot more to improve the weak areas of the public service than a tool. Things like leadership play a pretty important role in organizational effectiveness. There are many good Organizational Excellence models (I have researched this area) and they all include people and leadership as two elements, but funny enough, tools aren’t included. Why? Because it is not so much a tool issue as it is a craftsman issue.

With respect to your comment about tacit knowledge and social capital (not the same things by the way), I think it may be helfpul to brush up on what tacit knowledge is, and what Knowledge Management is.

It is unquestionably true that the public service continues to face a potential impact from demographic changes that are both extremely significant and yet unquantified. It is also unquestionably true that most public service organizations haven’t truly understood or addressed these potential impacts, to say nothing of the potential of improving their effectiness right NOW from better Knowledge Management (productivity, innovation, etc).

These issues need to be addressed by public service leaders in an intelligent and thoughtful manner. Tools can and certainly should help but only when wielded by craftsmen and women. For too long vendors have made grandiose and unrealizable promises about their ‘solutions’. I thought we had learned our lessons from all that experience.
Let’s not get the cart before the horse, shall we?

Paul McDowall
Knowledge Management Advisor and chairperson of the Interdepartmental Knowledge Management Forum

McDowall’s main concern appears to be that GCPEDIA doesn’t have a clear purpose and, more importantly, doesn’t serve a specific leadership objective. (If you are wondering how I gleaned that from the above, well, I cheated: I called McDowall to ask him more about his comment, since the nature of his concern wasn’t clear to me.) For those used to an era where IT projects were planned out from the beginning, everything was figured out in advance, and the needs of the leadership were the paramount priority, GCPEDIA would be disconcerting. Indeed, the very idea of unleashing people willy-nilly on a system would be anathema. In short, when McDowall says don’t put the cart before the horse, what he’s saying is, “you’ve rolled out a tool, and you don’t even know what you are going to use it for!”

This would appear to be a rational concern. Except many of the rules that underlie this type of thinking are disappearing. Indeed, had this type of thinking been adhered to, the web would not have developed.

First, the economics have changed. There was a time when IT projects necessarily cost tens of millions of dollars. But GCPEDIA was built on a (free) open source platform using a handful of internal FTEs (making McDowall’s comments about vendors even more confusing). Indeed, GCPEDIA has cost the public service virtually nothing to create. One invests in planning so as to avoid expensive or ineffective deployments. But if the costs of deployment are virtually zero and failure really isn’t that traumatic then… why waste time and years planning? Release, test, and adapt (or kill the project).

Second, when projects like this become cheap to deploy, another important shift takes place. Users – not their bosses or a distant IT overlord – decide a) whether they want to participate and b) what is useful, co-developing it as they go. This has powerful implications. It means that you had better serve a real (not perceived or mandated) need, and that, if successful, you’d better be prepared to evolve quickly. This, interestingly, is how that useful little tool called the World Wide Web evolved. Read the original proposal to create the World Wide Web. The IT departments of the world didn’t all collectively and suddenly decide that people should be made to use the web. No! It grew organically, responding to demand. In addition, there is very little in that proposal that talks about how we use the web today; users of the web (us!) have helped it evolve so that it serves us more effectively.

This is probably the biggest disconnect between McDowall and me. He believes GCPEDIA is problematic (or at least won’t do the things I think it will do) because it doesn’t serve the leadership. I think it will work because it does something much better: it serves actual users – public servants (and thus, contrary to his argument, is very much about people). This includes, critically, capturing tacit knowledge and converting it into formal – HTML-encoded – knowledge that helps build social capital (I do, actually, know the difference between the two).

Indeed, the last thing we need is a more leadership-oriented public service; what we need is an employee-centric public service. One that enables those who are actually doing the work to communicate, collaborate and work more effectively. In this regard, I think GCPEDIA is demonstrating that it is effective (although it is still very early days), with rapid growth: 8,000+ users and 200 more signing up every week (all with virtually no promotional budget). Clearly some public servants are finding it to be at worst interesting and at best deeply enabling.