
The State of Open Data in Canada: The Year of the License

Open Data is now an established fact in a growing list of Canadian cities. Vancouver, Toronto, Edmonton and Ottawa have established portals; Montreal, Calgary, Hamilton and several other cities are looking into launching their own; and a few provinces are rumored to be exploring open data portals as well.

This is great news and a significant accomplishment. While at the national level Canada is falling further behind leaders such as England, the United States, Australia and New Zealand, at the local and potentially provincial level, Canada could position itself as an international leader.

There is however, one main obstacle: our licenses.

The current challenge:

So far, most Open Data portals have adopted what has been termed the Vancouver License (it was created by Vancouver for its open data portal and has subsequently been adopted, with occasional minor changes, by virtually every other jurisdiction).

The Vancouver license, however, suffers from a number of significant defects. As someone who was involved in its creation, I can say these “bugs” were a necessary tradeoff. If we had held out for a perfect license that satisfied all stakeholders, I suspect we’d still be arguing about it and there would be no open data and no portal using the Vancouver license. Today, thanks in part to the existence of these portals, our politicians’, policy makers’ and government lawyers’ understanding of this issue has expanded. This, combined with a growing number of complaints about the license from non-profits and businesses interested in using open data, has fostered growing interest in adjusting it.

This is encouraging, and we must capitalize on the moment. I wish to be clear: until Canadian governments get the licensing issue right, Open Data cannot advance in this country. Open Data released by governments will not enjoy significant reuse, undermining one of the main reasons for doing Open Data in the first place.

There are a few things everyone agrees a new license needs to cover. It must establish that there is no warranty on the data and that the government cannot be held liable for any reuse. So let’s focus on the parts that governments most often get wrong.

Here, there are 3 things a new license needs to get right.

1. No Attribution


Nascar Jeff Gordon #24 by Dan Raustadt licensed CC-NC-ND

We need a license that does not require attribution. First, attribution gets messy fast – all those city logos crammed onto a map, on a mobile phone? It’s fine when you are using data from one or two cities, but what happens when you start using data from 10 different governments, or 50? Pretty soon you’ll have NASCAR apps that look ugly and are unusable.

More importantly, the goal of open data isn’t to create free advertising for governments, it’s to support innovation and reuse. These are different goals and I think we agree on which one is more important.

Finally, what government is actually going to police this part of the license? Don’t demand what you aren’t going to enforce – and no government should waste precious resources by paying someone to scour the internet to find websites and apps that don’t attribute.

2. No Share alike

One area where the Vancouver license falls down is the share-alike requirement in this clause:

If you distribute or provide access to these datasets to any other person, whether in original or modified form, you agree to include a copy of, or this Uniform Resource Locator (URL) for, these Terms of Use and to ensure they agree to and are bound by them but without introducing any further restrictions of any kind.

The last phrase is particularly problematic as it makes the Vancouver license “viral.” Any new data created through a mash-up that involves data under the Vancouver license must also use the Vancouver license. This will pretty much eliminate any private sector use of the data, since a company will want to be able to license any new data set it creates in a manner appropriate to its business model. It also has a chilling effect on those who would like to use the data but would need to keep the resulting work private, or restricted to a limited group of people. Richard Weait has an unfortunately named blog post that provides an excellent example of this problem.

Any new license should not be viral, so as to encourage a variety of reuses of the data.

3. Standardized

The whole point of Open Data is to encourage the reuse of a public asset. So anything a government does that impedes this reuse will hamper innovation and undermine the very purpose of the initiative. Indeed, the open data movement has, in large part, come to life because one traditional impediment to using data has disappeared: data can now usually be downloaded in open formats that anyone can use. The barriers to use have declined, so more and more people are interested.

But the other barrier to reuse is legal. If licenses are not easily understood then individuals and businesses will not reuse data, even when it is easily downloadable from a government’s website. Building a business or a new non-profit activity on a public asset to which your rights are unclear is simply not viable for many organizations. This is why every government should want its license to be easily understood – lowering the barriers to access means making data downloadable and reducing the legal barriers.

Most importantly, it is also why it is ideal to have a single license for the whole country, as this would significantly reduce transaction and legal costs for all players. This is why I’ve been urging Canada’s leading cities to adopt a single common license.

So, there are two ways of doing this.

The easiest is for Canadian governments to align themselves with one of the international standardized open data licenses that already exist. There are a variety out there. My preference is the Open Data Commons Public Domain Dedication and License (PDDL), although Open Data Commons also publishes the Open Database License (ODC-ODbL) and the Attribution License (ODC-By). There is also the Creative Commons CC0 license, which Creative Commons suggests using for open data (I actually recommend against all of these except the PDDL for governments, but more on that later).

These licenses have several advantages.

First, standardized licenses are generally well understood. This means people don’t have to educate themselves on the specifics of dozens of different licenses.

Second, they are stable. Because these licenses are managed by independent authorities and used by many people, they evolve cautiously and balance the interests of consumers and sharers of data or information.

Third, these licenses balance interests responsibly. The creators of these licenses have thought through all the issues that pertain to open data, and so give both consumers and distributors of data comfort in knowing that they have a license that will work.

A second option is for governments in Canada to align around a self-generated common license. Indeed, this is one area where the Federal Government could show some presently lacking leadership (although GeoGratis does have a very good license). This, for example, appears to be happening in the UK, where the national government has created an Open Government Licence.

My hope is that, before the year is out, jurisdictions in Canada begin to move towards a common license, or begin adopting standardized licenses.

Specifically, it would be great to see various Canadian jurisdictions either:

a) Adopt the PDDL (like the City of Surrey, BC). There are some references to European data rights in the PDDL, but these have no meaning in Canada and should not be an obstacle – and may even reassure foreign consumers of Canadian data. The PDDL is the most open and forward-looking license.

b) Adopt the UK government’s Open Government Licence. This license is the best created by any government to date (with the exception of simply making the data public domain, which, of course, is far more ideal).

c) Use a modified version of the GeoGratis license that adjusts the “3.0 PROTECTION AND ACKNOWLEDGEMENT OF SOURCE” clause to prevent the NASCAR effect from taking place.

What I hope does not happen is that:

a) More and more jurisdictions continue to use the Vancouver License. There are better options, and adopting one is an opportunity for a jurisdiction launching an open data policy to leapfrog the current leaders in the space.

b) Jurisdictions adopt a Creative Commons license. Creative Commons was created to help license copyrighted material. Since data cannot be copyrighted, the use of Creative Commons risks confusing the public about the inherent rights they have to data. This is, in part, a philosophical argument, but it matters, especially for governments. We – and our governments especially – cannot allow people to begin to believe that data can be copyrighted.

c) There is no change to the current licenses being used, or a new license, like the Open Database License (ODC-ODbL), which goes against the attributes described above, is adopted.

Let’s hope we make progress on this front in 2011.

A Response to an Ottawa Gov 2.0 Skeptic

So many, many months ago, a Peter R. posted this comment (segments copied below) under a post I’d written titled: Prediction, The Digital Economy Strategy Will Fail if it Isn’t Drafted Collaboratively on GCPEDIA. At first blush Peter’s response felt aggressive. I flipped him an email to say hi and he responded in a very friendly manner. I’ve been meaning to respond for months but life’s been busy. However, over the break (and my quest to hit inbox zero) I finally carved out some time. My fear is that this late response will sound like a counterattack – it isn’t intended as such, but rather as an effort to respond to a genuine question. I thought it would be valuable to post since many of the points may resonate with supporters and detractors of Gov 2.0 alike. Here are my responses to the various charges:

The momentum, the energy and the excitement behind collaborative/networked/web 2.0/etc is only matched by the momentum, the energy and the excitement that was behind making money off of leveraged debt instruments in the US.

Agreed, there is a great deal of energy and excitement behind collaborative networks, although I don’t think it is sparked – as the comparison implies – by something analogous to bogus debt instruments. People are excited because of the tangible results created by sharing and/or co-production networks: Wikipedia, Mozilla, Flickr, Google Search and Google Translate (whose results improve based on user data) and Ushahidi inspire people because of the tremendous results they are able to achieve with a smaller footprint of resources. I think the question of what these types of networks and this type of collaboration mean for government is an important one – and as people experiment there will be failures – but to equate the entire concept of Gov 2.0 and the above-cited organizations, tools and websites with financial instruments that repackaged subprime mortgages is, in my mind, fairly problematic.

David, the onus lies squarely with you to prove that policymakers across government are incapable of finding good policy solutions WITHOUT letting everyone and his twitting brother chime in their two cents.

Actually, the onus doesn’t lie squarely with me. This is a silly statement. In fact, for those of us who believe in collaborative technologies such as GCPEDIA or Yammer, this sentence is the single most revealing point in Peter’s entire comment. I invite everyone and anyone to add to my rebuttal, or to Peter’s argument. Even those who argue against me would be proving my point – tools like blogs and GCPEDIA allow ideas and issues to be debated with a greater number of people and a wider set of perspectives. The whole notion that any thought or solution lies solely with one person is the type of thinking that leads to bad government (and pretty much bad anything). I personally believe that the best ideas emerge when they are debated and contested – honed by having flaws exposed and repaired. Moreover, this has never been more important than today, when more and more issues cross ministerial divides. Indeed, the very fact that we are having this discussion on my blog, and that Peter deemed it worthy of comment, is a powerful counterpoint to this statement.

Equally important, I never said policymakers across government are incapable of finding good policy solutions. This is a serious misinterpretation of what I said. I did say that the Digital Economy Strategy would fail (and I’ll happily revise, and soften that to say, will likely not be meaningful) unless written on GCPEDIA. I still believe this. I don’t believe you can have people writing policy about how to manage an economy who are outside of and/or don’t understand the tools of that economy. I actually think our public servants can find great policy solutions – if we let them. In fact, most public servants I know spend most of their time trying to track down public servants in other ministries or groups to consult them about the policy they are drafting. In short, they spend all their time trying to network, but using tools of the late 20th century (like email), the mid 20th century (the telephone), or the mid 3rd century BC (the meeting) to do it. I just want to give them more efficient tools – digital tools, like those we use in a digital economy – so they can do what they are already doing.

For the last 8 years I’ve worked in government, I can tell you with absolute certainty (and credibility!) that good policy emerges from sound research and strategic instrument choice. Often (select) public consultations are required, but sometimes none at all. Take 3 simple and highly successful policy applications: seat belts laws, carbon tax, banking regulation. Small groups of policymakers have developed policies (or laws, regs, etc) to brilliant effect….sans web 2.0. So why do we need gcpedia now?

Because the world expects you to do more, faster and with less. I find this logic deeply concerning coming from a public servant. No doubt governments developed policies to brilliant effect before the wide adoption of the computer, or even the telephone. So should we get rid of those too? An increasing number of the world’s major corporations have set up, or are setting up, an internal wiki or collaboration platform, a social networking site, or even a microblogging service like Yammer to foster internal collaboration. These things help us do research and develop ideas faster and, I think, better. The question isn’t why we need GCPEDIA now. The question is why we aren’t investing to make GCPEDIA a better platform. The rest of the world is.

I’ll put this another way: tons of excellent policy solutions are waiting in the shared drives of bureaucrats across all governments.

I agree. Let’s at least put them on a wiki where more people can read them, leverage them and, hopefully, implement them. Sitting on a great idea that three other people in the entire public service have read isn’t a recipe for getting it adopted. Nor is it a good use of Canadian tax dollars. Socializing it is. Hence, social media.

Politics — being what it is — doesn’t generate progressive out solutions for various ideological reasons (applies equally to ndp, lib, con). First, tell us what a “failed” digitial economy strategy (DES) looks like. Second, tell us what components need to be included in the DES for it be successful. Third, show us why gcpedia/wikis offer the only viable means to accumulate the necessary policy ingredients.

For the last part – see my earlier post and above. As for what a failed digital economy strategy looks like – it will be one that is irrelevant, one that goes ignored by the majority of people who actually work in the digital economy. Of course, an irrelevant policy would still be better than a truly bad one, which, I suspect, is also a real risk based on the proceedings of Canada 3.0. (That conference seemed to be about “how do we save analog businesses that are about to be destroyed by the digital economy” – a link to one of my favourite posts on the subject.) And of course I have other metrics that matter to me. That all said, after meeting the public servant in charge of the process at Canada 3.0, I was really, really encouraged – she is very smart and gets it.

She also found the idea of writing the policy on GCPEDIA intriguing. I have my doubts that that is how things are proceeding, but it gives me hope.

Hello 2011! Coding for America, speaking, open data and licenses

Eaves.ca readers – happy new year! Here’s a little bit of an overview of how we are kicking off 2011.

Thank you

First, if you are a regular reader…. thank you. Eaves.ca has just entered its 4th year and it all keeps getting more and more rewarding. I write to organize my thoughts and refine my writing skills, but it is always most rewarding to see others benefit from, or find value in, these posts.

Code for America

I’ve just landed in San Francisco (literally! I’m sitting on the floor of the airport, catching the free SFO wifi) where I’ll be spending virtually all of January volunteering to help launch Code for America. Why? Because I think the organization matters, its projects are important and the people are great. I’ve always enjoyed hanging out with technologists with a positive agenda for change (think Mozilla & OpenMRS) and Code for America takes all that fun and combines it with government – one of my other great passions. I hope to update you on the progress the organization makes and what will be happening over the coming weeks. And yes, I am thinking about how Code for Canada might fit into all this.

Gov 2.0

I’ll also be in Ottawa twice in January. My first trip out is to present a paper about how to write collaboratively using a wiki. With a little bit of work, I’ll be repositioning this paper to make it about how to draft public policy on a wiki within government (think GCPEDIA). With luck I’ll publish it in Policy Options or something similar (maybe US-based?). I think this has the potential to be one of my most important pieces of the year and, needless to say, I’m excited and will be grateful for feedback, both positive and negative.

Open Data and Government

On my second trip to Ottawa I’ll be presenting on Open Data and Open Government to the Standing Committee on Access to Information, Privacy and Ethics. Obviously, I’m honored and thrilled they’ve asked me to come and talk and look forward to helping parliamentarians understand why this issue is so important and how they could make serious progress in short order if they put their minds to the task.

Licenses

For the last two years we’ve been working hard, with significant success, to get cities to do open data. This year may be the year that the Feds (more on that in a later post) and some provinces get in on the game, as well as a larger group of cities. The goal for them will be to build on, and take to the next level, the successes of first movers like Vancouver, Edmonton, Toronto and Ottawa. This will mean one thing: doing the licensing better. The Vancouver license, which has been widely copied, was a good starting point for governments venturing into unknown territory (e.g. getting their toes wet). But the license has several problems – and there are several significantly better choices out there (I’m looking at you, PDDL – which I notice the City of Surrey has adopted, nice work). So I think one big goal for 2011 will be to get governments to begin shifting to the PDDL (ideally) or, if necessary, something equivalent. On this front I’m feeling optimistic as well and will blog on this in the near future.

Lots of other exciting things going on as well – I look forward to sharing them here in the blog soon.

All in all, it’s hard not to be excited about 2011 and I hope you are too. Thank you so much for being part of all of this.

The False choice: Bilingualism vs. Open Government (and accountability)

Last week a disturbing headline crossed my computer screen:

B.C. RCMP zaps old news releases from its website

2,500 releases deleted because they weren’t translated into French

1) The worst of all possible outcomes

This is a terrible outcome for accountability and open government. When we erase history we diminish accountability and erode our capacity to learn. As of today, Canadians have a poorer sense of what the RCMP has stood for, what it has claimed and what it has tried to do in British Columbia.

Consider this. The Vancouver Airport is a bilingual-designated detachment. As of today, all press releases that were not translated have been pulled down. This means that any press release related to the national scandal that erupted after the death of Robert Dziekański – the Polish immigrant who was tasered five times by the RCMP – is no longer online. Given the shockingly poor performance the RCMP had in managing (and telling the truth about) this issue, this concerns me.

Indeed, I can’t imagine anyone thinks this is a good idea.

The BC RCMP does not appear to think it is a good idea. Consider their press officer’s line: “We didn’t have a choice, we weren’t compliant.”

I don’t think there are any BC residents who believe they are better served by this policy.

Nor do I think my fellow francophone citizens believe they are better served by this decision. Now no one – francophone or anglophone – can find these press releases online. (More on this below.)

I would be appalled if a similar outcome occurred in Quebec or a francophone community in Manitoba. If the RCMP pulled down all French press releases because they didn’t happen to have English translations, I’d be outraged – even if I didn’t speak French.

That’s because the one thing worse than not having the document in both official languages is not having access to the document at all. (And having it hidden in some binder in a barracks that I have to call or visit doesn’t even hint at being accessible in the 21st century.)

Indeed, I’m willing to bet almost anything that Graham Fraser, the Official Languages Commissioner – who is himself a former journalist – would be deeply troubled by this decision.

2) Guided by Yesterday, Not Preparing for Tomorrow

Of course, what should really anger the Official Languages Commissioner is an attempt to pit open and accountable government against bilingualism. This is a false choice.

I suspect that the current narrative in government is that translating these documents is too expensive. If one relies on government translators, this is probably true. The point is, we no longer have to.

My friend and colleague Luke C. pinged me after I tweeted this story saying “I’d help them automate translating those news releases into french using myGengo. Would be easy.”

Yes, myGengo would make it cheap at 5 cents a word (or 15 if you really want to overdo it). But even smarter would be to approach Google. Google Translate – especially between French and English – has become shockingly good. Perfect… no. Of course, this is what the smart and practical people on the ground at the RCMP were doing until the higher-ups got scared by a French CBC story that was critical of the practice. A practice that was ended even though it did not violate any policies.

The problem is there isn’t going to be more money to do translation – not in a world of multi-billion dollar deficits and in a province that boasts 63,000 French speakers. But Google Translate? It is going to keep getting better and better. Indeed, the more it translates, the better it gets. If the RCMP (or the Canadian government) started putting more documents through Google Translate and correcting them, it would become still more accurate. The best part is… it’s free. I’m willing to bet that if you ran all 2,500 of the press releases through Google Translate right now, 99% of them would come out legible and of a standard good enough to share (again, not perfect, but serviceable). Perhaps the CBC won’t be perfectly happy. But I’m not sure the current outcome makes them happy either. And at least we’ll be building a future in which they will be happy tomorrow.
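To make this concrete, here is a minimal sketch of what batch-translating a backlog of press releases could look like. It is an illustration only: the translate() stub stands in for whichever machine translation service one chooses (Google Translate, myGengo or anything else), and the folder names are assumptions, not a description of any actual RCMP system. The key point is that the machine output is saved as a clearly labelled draft for a human reviewer to correct, which is exactly the practice described above.

import pathlib

def translate(text: str, target_lang: str = "fr") -> str:
    """Placeholder for a call to a machine translation service
    (e.g. Google Translate or myGengo). Hypothetical stub."""
    # Swap in a real API call here; for now we just tag the text.
    return f"[machine draft - {target_lang} - pending human review]\n{text}"

def batch_translate(source_dir: str, output_dir: str) -> None:
    """Run every English press release through machine translation and
    write a French draft alongside it for a reviewer to correct."""
    out = pathlib.Path(output_dir)
    out.mkdir(parents=True, exist_ok=True)
    for release in sorted(pathlib.Path(source_dir).glob("*.txt")):
        english = release.read_text(encoding="utf-8")
        french_draft = translate(english, target_lang="fr")
        (out / release.name.replace(".txt", ".fr.txt")).write_text(
            french_draft, encoding="utf-8")

if __name__ == "__main__":
    # Hypothetical directory names, for illustration only.
    batch_translate("press_releases/en", "press_releases/fr_drafts")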

The point here is that this decision reaffirms a false binary: one based on 20th century assumptions, where translations were expensive and laborious. It holds us back and makes our government less effective and more expensive. Worse, it ignores an option that embraces a world of possibilities – the reality of tomorrow. By continuing to automatically translate these documents today we’d continue to learn how to use and integrate this technology now, and push it to get better, faster. Such a choice would serve the interests of both open and accountable government and bilingualism.

Sadly, no one at the head office of the RCMP – or in the federal government – appears to have that vision. So today we are a little more language, information and government poor.

Three asides:

1) I find it fascinating that the media can get mailed a press release that isn’t translated but the public is not allowed to access it on a website until it is – this is a really interesting form of discrimination, one that supports a specific business model and has zero grounding in the law, and indeed may even be illegal given that the media has no special status in Canadian law.

2) Still more fascinating is how the RCMP appears to be completely unresponsive to news stories about inappropriate behavior in its ranks – like, say, the illegal funding of false research to defend the war on drugs – but one story about language politics causes the organization to change practices that aren’t even in violation of its policies. It is sad to see still more evidence that the RCMP is one of the most broken agencies in the Federal government.

3) Thank you to Vancouver Sun Reporter Chad Skelton for updating me on the Google aspect of this story.

An Open Letter on Open Government to the Access to Information, Privacy & Ethics Parliamentary Committee

The other week I received an invitation from the Canadian Standing Parliamentary Committee on Access to Information, Privacy & Ethics to come and testify about open government and open data on February 1st.

The Committee has talked a great deal about its efforts to engage in a study of open government, and since February 1st is quite a ways away and I’d like to be helpful before my testimony, I thought I would draft up some thoughts and suggestions for the committee’s strategy. I know these are unsolicited, but I hope they are helpful and, if not, that they at least spark some helpful thoughts.

1. Establish a common understanding of the current state of affairs

First off, the biggest risk at the moment is that the Committee’s work might actually slow down the government’s efforts to launch an open data strategy. The Committee’s work, and the drafting of its report, is bound to take several months; it would be a shame if the government were to hold back launching any initiatives in anticipation of this report.

Consequently, my hope is that the committee, at its earliest possible convenience, requests to speak to the Chief Information Officer of the Government of Canada to get an update on the current status of any open government and open data initiatives, should they exist. This would a) create a common understanding regarding the current state of affairs for both committee members and witnesses; b) allow subsequent testimony and recommendations to take into consideration the work already done; and c) allow the committee to structure its work so as to not slow down any efforts that might already be underway.

2. Transform the Committee into a Government 2.0 Taskforce – similar to the Australian effort

Frankly, my favourite approach in this space has been the British one. Two governments, one Labour and one Conservative, have aggressively pursued an open data and open government strategy. This would be my hope for Canada. However, it does not appear that this is presently the case. So another model should be adopted. Fortunately, such a model exists.

Last year, under the leadership of Nicholas Gruen, the Australian government launched a Government 2.0 Taskforce, on which I had the pleasure of serving as part of the International Reference Group. The Australian Taskforce was non-partisan and was made up of policy and technical experts and entrepreneurs from government, business, academia, and cultural institutions. More importantly, the overwhelming majority of its recommendations were adopted.

To replicate its success in Canada I believe the Committee should copy the best parts of the Australian taskforce. The topic of Canadians’ access to their government is of central importance to all Canadians – to non-profits, to business interests, to public servants and, of course, to everyday citizens. Rather than non-partisan, I would suggest that a Canadian taskforce should be pan-partisan – which the Committee already is. However, like the Australian Taskforce, it should include a number of policy and technical experts from outside government. This full committee would then represent both a political cross-section and substantive knowledge in the emerging field of government 2.0. It could thus, as a whole, effectively and quickly draft recommendations to Parliament.

Best of all, because of step #1, this work could proceed in parallel to any projects (if any) already initiated by the government and possibly even inform such work by providing interim updates.

I concede such an approach may be too radical, but I hope it is at least a starting point for an interesting approach.

3. Lead by Example

There is one arena where politicians need not wait on the government to make plans: Parliament itself. Over the past year, while in conversations with the Parliamentary IT staff as well as the Speaker of the House, I have worked to have Parliament make more data about its own operations open. Starting in January, the Parliamentary website will begin releasing the Hansard in XML – this will make it much easier for software developers, like the creators of Openparliament.ca and howdtheyvote.ca, to run their sites, and for students, researchers and reporters to search and analyze our country’s most important public discussions. In short, by making the Hansard more accessible the Speaker and his IT staff are making Parliament more accessible. But this is only the beginning of what parliamentarians could do to make for a truly Open Parliament. The House and Senate’s schedules and agendas, along with committee calendars, should all be open. So too should both chambers’ seating arrangements. Members’ photos and bios should be shared with an unrestricted license, as should the videos of Parliament.
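As a small illustration of why publishing the Hansard in XML matters, here is a rough sketch of pulling speeches out of a Hansard-style XML file. The element names (speech, speaker, paragraph) are invented for the example – the actual schema Parliament publishes may well differ – but the point stands: once the data is structured, a few lines of code replace hours of scraping HTML.

import xml.etree.ElementTree as ET

def extract_speeches(path: str):
    """Parse a Hansard-style XML file and yield (speaker, text) pairs.
    The element names used here are assumptions for illustration;
    adapt them to whatever schema Parliament actually publishes."""
    tree = ET.parse(path)
    for speech in tree.getroot().iter("speech"):
        speaker = speech.findtext("speaker", default="Unknown")
        paragraphs = [p.text or "" for p in speech.iter("paragraph")]
        yield speaker, "\n".join(paragraphs)

if __name__ == "__main__":
    # "hansard.xml" is a hypothetical local file name.
    for speaker, text in extract_speeches("hansard.xml"):
        print(f"{speaker}: {text[:80]}")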

Leadership in this space would send a powerful message to both the government and the public service that Canada’s politicians are serious about making government more open and accessible to those who elect it. In addition, it could also influence provincial legislatures and even municipal governments, prompting them to do the same and so enhance our democracy at every level.

4. Finally, understand your task: You are creating a Knowledge Government for a Knowledge Society

One reason I advise the Committee to take on external members is because, laudably, many admit this topic is new to them. But I also want the committee members to understand the gravity of their task. Open Government, Open Data and/or Government 2.0 are important first steps in a much larger project.

What you are really wrestling with here is what government is going to look like in a knowledge economy and a knowledge society. How is it going to function with knowledge workers as employees? And, most importantly, how is it going to engage with knowledge citizens, many of whom can and want to make real contributions beyond the taxes they pay and don’t need government in order to self-organize?

In short, what is a knowledge based government going to look like?

At the centre of that question is how we manage and share information – the basic building block of a knowledge-driven society.

Look around, and you can see how the digital world is transforming how we do everything. Few of us can imagine living today without access to the internet and the abundance of information it brings us. Indeed, we have already become so used to the internet that we forget how much it has radically changed whole swaths of our life and economy, from the travel and music industries to the post, political fund-raising and journalism.

If today our government still broadly looks and feels like an institution shaped by the printing press, it is because, well, it is. Deputy Ministers and Ministers still receive giant briefing binders filled with paper. This is a reflection of how we deal with information and knowledge in government: we move it around (for good reasons) in silos, operating as though networks, advanced search, and other innovations don’t exist (even though they already do).

How our government deals with information is at the heart of your task. I’m not saying you have to re-invent government or dismantle all the silos and ministries. Quite the contrary, I believe small changes can be made that will yield significant benefits, efficiencies and savings while enhancing our democracy. But you will be confronting decades, if not centuries of tradition, culture and process in an institution that is about to go through the biggest change since the invention of the printing press. You don’t have to do it all, but even some small first steps will not come easily. I share this because I want you going into the task with eyes wide open.

At the very least we aren’t going first; our cousins across the Atlantic, across the Pacific and across our southern border have already taken the plunge. But this should add urgency to our task. We cannot afford to stand by while others renew their democratic institutions while simultaneously enhancing an emerging and critical pillar of a new knowledge economy and knowledge society.

Opening up parliament and getting government IT right

Last week I received two invitations to present.

The first was an invitation to present to the Parliamentary Standing Committee on Access to Information, Privacy and Ethics. They are preparing a report on Open Government and would like me to make a short presentation and then answer questions for a couple of hours. This is a ways out but obviously I’m treating it with a significant amount of seriousness – so if you have thoughts or comments on things you think I should share, please feel free to ping me or comment below.

(Speaking of parliament… as an aside, I want to again let developers out there know that, through some engagement I’ve been having with the parliamentary IT staff, I’ve been informed they will be releasing a number of data sets in January, including the Hansard.)

Second, next week I’ll be at the United Nations as part of the Expert Group Meeting on the 2012 e-Government Survey: Towards a More Citizen-Centric Approach. My main goal here is to get governments to stop comparing themselves to one another on how “successful” they are in delivering services and information online. With a few notable exceptions, most government websites are at best functional and, at worst, unnavigable. Consequently, comparing themselves to one another allows them to feel like all is okay, when really they are collectively trapped in a world of design mediocrity.

Yes, they aren’t pretty words, but someone has to say them.

So any thoughts on this subject are welcome as well.

More soon on the hackathon and the census.

The Open Data Debate Arrives in Ottawa

The Liberals are promising to create an open data portal – opendata.gc.ca – much like President Obama has done in the United States and both Gordon Brown and David Cameron have done in the United Kingdom.

It’s a savvy move.

In May 2010, when it launched a public consultation on the Digital Economy, the government invited the public to submit proposals and vote on them. Two of the top three most-voted ideas involved asking the government to open up access to government-collected data. Three months after the submissions closed, it appears the opposition has decided to act on Canadians’ wishes and release a 21st century open government strategy that reflects these popular demands.

I’ve discovered that today, at 1pm EST, the Liberals will announce that, if elected, they will adopt a government-wide directive in which “the default position for all departments and agencies will be for the release of information to the public, both proactively and responsively, after privacy and other legal requirements are met.”

There is much that both ordinary citizens and advocates of greater government transparency will like in the proposal. Not only have the Liberals mirrored the most aggressive parts of President Obama’s transparency initiatives, they are also promising some specific and aggressive policies of their own. In addition to promising to launch opendata.gc.ca to share government data, the document proposes the creation of accesstoinformation.gc.ca, where citizens could search past and current access to information requests as well as see response times. A third website, entitled accountablespending.gc.ca, is also proposed. It would allow government grants, contributions and contracts to be searched.

The announcement brings to the Canadian political debate an exciting issue that first gained broad notoriety in early 2009 when Tim Berners-Lee, the inventor of the world wide web, called on the world’s governments to share their data. By May of that year the United States had launched data.gov, and in September of 2009 the British Government launched data.gov.uk, both of which garnered significant domestic attention. In addition, dozens of cities around the world – including Vancouver, Edmonton and, most recently, Ottawa – have launched websites where they share information that local charities, non-profits, businesses and ordinary citizens might find useful.

Today, citizens in these jurisdictions enjoy improved access to government information about the economy, government spending, access to information requests, and statistical data. In the United States, developers have created websites that empower citizens by enabling them to analyze government data or see what government data exists about their community, while a British program alerts citizens to restaurants’ health inspection scores. The benefits, however, are not limited to improved transparency and accountability. An independent British study estimated that open data could contribute as much as £6 billion to the British economy. Canada’s computer developers, journalists and entrepreneurs have been left wondering: when will their government give them access to the data their tax dollars paid to collect?

One obvious intent of the Liberals is to reposition themselves at the forefront of a debate around government transparency and accountability. This is ground that has traditionally been Conservative but, with the cancellation of the long-form census, the single-source jet fighter contract and, more recently, allegations that construction contracts were awarded to Conservative party donors, is once again contestable.

What will be interesting to see is the Conservative response. It’s been rumored the government has explored an open data portal, but to date there has been no announcement. Open data is one area where support often exists across the political spectrum. In the United Kingdom, Gordon Brown’s Labour government launched data.gov.uk, but David Cameron’s Conservative government has pursued the project more aggressively still, forcing the release of additional and higher-value data to the public. A failure to adopt open data would be a tragedy – it would cause Canada to lag in an important space that is beginning to reshape how governments work together and how they serve and interact with citizens. But perhaps most obviously, open data and open government shouldn’t be a partisan issue.

Congratulations to Naheed & other fabulous people

(On a separate note, I’m giving a talk tomorrow at 3pm at UBC.)

For those who weren’t paying attention to the Calgary municipal election last night, Naheed Nenshi came out of third place and won the mayoral race. Of course, the articles are already focusing on the wrong things: he’s Muslim, he’s a minority, etc…

What really matters about Naheed is that he’s smart, he is about ideas and he’s progressive. That he’s managed to capture the imagination of a place like Calgary speaks volumes both about how hard he campaigned and about how cosmopolitan Canada’s urban centres are becoming.

But back to ideas. I first met Naheed way back when he served as lead author of Building Up: Making Canada’s Cities Magnets for Talent and Engines of Development for Canada25. Essentially, for as long as I’ve known him he’s cared about cities (and his passion predates my meeting him). There isn’t much more you could want from someone who is about to become your mayor. For me personally, his work became the template when I later worked as lead author, first on Canada25’s report written at the request of the Privy Council Office and then, of course, on From Middle to Model Power.

It also speaks volumes about the types of people I had the pleasure to meet through Canada25 and watch grow over the years. Indeed, yesterday I ended up having lunch with Chris Kennedy – another Canada25 alum – who as Superintendent of Schools with the West Vancouver School District is also driven by a sense of public service and policy. Alison Loat, Executive Director of Samara, is another passionate believer in public service and public policy. I’m not sure whether to be more impressed by her own work or simply grateful for her unfailing belief and support of me and my work. And Andrew Medd, who gave me what may have become the best advice about blogging when I first started eaves.ca years ago: “Write for yourself, as though no one will read it.” (advice that actually was fact for the first while – you should only blog if you’re prepared to be alone with your thoughts). Of course there are so many I’m not mentioning like Ross Wallace, Debbie Chachra, Mike Morgan…

Watching the celebrations taking place in Calgary, all I can think of is how lucky I was to get to meet some of these people early on and how much I can’t wait to watch them going forward.

On a separate note, it is very much worth looking at MasterMaq’s election website, powered by open election data from the city of Edmonton. From Naheed’s election (in which social media played a powerful role) to the coverage on Twitter (that’s how I followed the events), social media continues to evolve and have an impact, especially at the local level.

World Bank Discussion on Open Data – lessons for developers, governments and others

Yesterday the World Bank formally launched its Apps For Development competition and Google announced that in addition to integrating the World Bank’s (large and growing) data catalog into searches, it will now do it in 34 languages.

What is fascinating about this announcement and the recent changes at the bank is that it appears to be very serious about open data and even more serious about open development. The repercussions of this shift, especially if the bank starts demanding that its national partners also disclose data, could be significant.

This, of course, means there is lots to talk about. So, as part of the overall launch of the competition and in an effort to open up the workings of the World Bank, the organization hosted its first Open Forum, in which a panel of guests talked about open development and open data. The bank was kind enough to invite me, and so I ducked out of GTEC a pinch early and flew down to DC to meet some of the amazing people behind the World Bank’s changes and discuss the future of open data and what it means for open development.

Embedded below is the video of the event.

As a little backgrounder here are some links to the bios of the different panelists and people who cycled through the event.

Our host: Molly Wood of CNET.

Andrew McLaughlin, Deputy Chief Technology Officer, The White House (formerly head of Global Public Policy and Government Affairs for Google) (twitter feed)

Stuart Gill, World Bank expert, Disaster Mitigation and Response for LAC

David Eaves, Open Government Writer and Activist

Rakesh Rajani, Founder, Twaweza, an initiative focused on transparency and accountability in East Africa (twitter)

Aleem Walji, Manager, Innovation Practice, World Bank Institute (twitter)
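For developers eyeing the Apps for Development competition mentioned above, here is a rough sense of what building on the bank’s open data catalog can look like – a minimal sketch that pulls a single indicator series from the World Bank’s public data API. The URL pattern and the indicator code are my assumptions based on the bank’s published developer documentation; verify them against the current docs before relying on them.

import json
import urllib.request

# Assumed URL shape for the World Bank's public data API; check the bank's
# developer documentation before relying on it.
API = "https://api.worldbank.org/v2/country/{country}/indicator/{indicator}"

def fetch_indicator(country: str, indicator: str, per_page: int = 200):
    """Fetch one indicator series for one country as (year, value) pairs."""
    url = (API.format(country=country, indicator=indicator)
           + f"?format=json&per_page={per_page}")
    with urllib.request.urlopen(url) as resp:
        meta, rows = json.load(resp)  # response is [paging metadata, records]
    return [(row["date"], row["value"]) for row in rows
            if row["value"] is not None]

if __name__ == "__main__":
    # SP.POP.TOTL (total population) is used as an assumed example code.
    for year, value in fetch_indicator("TZ", "SP.POP.TOTL")[:5]:
        print(year, value)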

How Governments misunderstand the risks of Open Data

When I’m asked to give a talk about or consult on policies around open data I’ve noticed there are a few questions that are most frequently asked:

“How do I assess the risks to the government of doing open data?”

or

“My bosses say that we can only release data if we know people aren’t going to do anything wrong/embarrassing/illegal/bad with it”

I would argue that these questions are either flawed in their logic, or have already been largely addressed.

Firstly, it seems problematic to assess the risks of open data without also assessing the opportunity. Any activity – from walking out my front door to scaling Mount Everest – carries with it risks. What needs to be measured are not the risks in isolation but the risks balanced against the opportunity and benefits.

But more importantly, the logic of the question is flawed in another manner. It suggests that the government should only take action if every possible negative use can be prevented.

Let’s forget about data for a second – imagine you are building a road. Now ask: “What are the risks that someone might misuse this road?” Well… they are significant. People are going to speed and they are going to jaywalk. But it gets worse. Someone may rob a bank and then use the road as part of their escape route. Of course, the road will also provide more efficient transportation for thousands of people; it will reduce costs, improve access, help ambulances save people’s lives and do millions of other things. But people will also misuse it.

However, at no point in any policy discussion in any government has anyone said “we can’t build this road because, hypothetically, someone may speed or use it as an escape route during a robbery.”

And yet, this logic is frequently accepted, or at least goes unchallenged, as appropriate when discussing open data.

The fact is, most governments already have the necessary policy infrastructure for managing the overwhelming majority of risks concerning open data. Your government likely has provisions dealing with privacy – if applied to open data this should address these concerns. Your government likely has provisions for dealing with confidential and security related issues – if applied to open data this should address these concerns. Finally, your government(s) likely has a legal system that outlines what is, and is not legal – when it comes to the use of open data, this legal system is in effect.

If someone gets caught speeding, we have enforcement officials and laws that catch and punish them. The same is true with data. If someone uses it to do something illegal we already have a system in place for addressing that. This is how we manage the risk of misuse. It is seen as acceptable for every part of our life and every aspect of our society. Why not with open data too?

The opportunities of both roads and data are significant enough that we build them and share them despite the fact that a small number of people may not use them appropriately. Should we be concerned about those who will misuse them? Absolutely. But do we allow a small amount of misuse to stop us from building roads or sharing data? No. We mitigate the concern.

With open data, I’m happy to report that we already have the infrastructure in place to do just that.