Category Archives: public service sector renewal

The "I Lost My Wallet" Service – Doing Government Service Delivery Right

A couple of years ago I was in Portugal to give a talk on Gov 2.0 at a conference the government was organizing. After the talk I went for dinner with the country’s CIO and remember hearing about a fantastic program they were running that – for me – epitomized the notion of a citizen-centric approach. It was a help desk called “I Lost My Wallet.”

Genius.

Essentially, it was a place you went when… you lost your wallet. What the government had done was bring together all the agencies that controlled a document or card that was likely to have been in your wallet. As a result, rather than running around from agency to agency filling out your name and address over and over again on dozens of different forms, you went to a single desk and filled out one set of forms to get new copies of, say, your social insurance card, your driver’s license, healthcare card and library card.

But get this. From the briefing I had, my understanding was that this service was not limited to government cards; they’d also partnered with several private entities. For example, I notice that the service also works for replacing Portugal’s Automotive Club Card. In addition – if I remember correctly – I was told the government was negotiating with the banks so that you could also cancel and replace your ATM/bank card and Visa cards at this counter as well.

Now this is citizen-centric service. Here the government has literally molded itself – pulling together dozens of agencies and private sector actors around a single service – so that a citizen can simply and quickly deal with a high-stress moment. Yes, I’d love to live in a world where all these cards disappeared altogether and were simply managed by a single card of your choosing (like, say, your Oyster card in the UK – so that your subway fare card was also your healthcare card, government ID, and credit card). But we are a few years away from that still, and so this is a nice interim service.

But more importantly it shows a real ability to shed silos and build a service around a citizen/customer need. I believe they had a similar service for “I bought a house,” since this is a moment when a number of different government services become relevant simultaneously. I, of course, can imagine several others – most notably, a “my partner just died” service could be invaluable in helping people manage a truly terrible moment, when dealing with government bureaucracy is the last thing they want to be doing.

You can find the website for I Lost My Wallet here (it is, naturally, in Portuguese). You can also read more about it, as documented by the European Union, here. Lots of food for thought for those of you designing programs to serve citizens, be it in the public or private sector.

Mainstreaming The Gov 2.0 Message in the Canadian Public Service

A couple of years ago I wrote a Globe Op-Ed, “A Click Heard Across the Public Service,” that outlined the significance of the clerk using GCPEDIA to communicate with public servants. It was a message – or, even more importantly, an action – to affirm his commitment to changing how government works. For those unfamiliar, the Clerk of the Privy Council is the head of the public service for the federal government; a crude analogy would be that he is the CEO and the Prime Minister is the Chairman (yes, I know that analogy is going to get me in trouble with people…)

Well, the clerk continues to broadcast that message, this time in his Nineteenth Annual Report to the Prime Minister on the Public Service of Canada. As an observer in this space what is particularly exciting for me is that:

  • The Clerk continues to broadcast this message. Leadership and support at the top is essential on these issues. It isn’t sufficient, but it is necessary.
  • The role of open data and social media is acknowledged on several occasions.

And as a policy entrepreneur, what is doubly exciting is that:

  • Projects I’ve been personally involved in get called out; and
  • Language I’ve been using in briefs, blog posts and talks to public servants is in this text.

You can, of course, read the whole report here. There is much more in it than just talk of social media and rethinking the public service; there is obviously talk about the budget and other policy areas as well. But both the continued prominence given to renewal and technology, and the explicit statements about the failure to move fast enough to keep up with the speed of change in society at large, suggest that the clerk continues to be worried about this issue.

For those less keen to read the whole thing, here are some juicy bits that mattered to me:

In the section “The World in Which We Serve,” which basically provides context…

At the same time, the traditional relationship between government and citizens continues to evolve. Enabled by instantaneous communication and collaboration technologies, citizens are demanding a greater role in public policy development and in the design and delivery of services. They want greater access to government data and more openness and transparency from their institutions.

Under “Our Evolving Institution,” which lays out some of the current challenges and priorities, we find this as one of the four areas of focus mentioned:

  • The Government expanded its commitment to Open Government through three main streams: Open Data (making greater amounts of government data available to citizens), Open Information (proactively releasing information about Government activities) and Open Dialogue (expanding citizen engagement with Government through Web 2.0 technologies).

This is indeed interesting. The more this government talks about open in general, the more interesting it will be to see how the public reacts, particularly in regard to its treatment of certain sectors (e.g. environmental groups). Still more interesting is what appears to be a growing recognition of the importance of data (from a government that cut the long-form census). Just yesterday the Health Minister, while talking about a controversial multiple sclerosis vein procedure, stated:

“Before our government will give the green light to a limited clinical trial here in Canada, the proposed trial would need to receive all necessary ethical and medical approvals. As Minister of Health, when it comes to clinical issues, I rely on advice from doctors and scientists who are continually monitoring the latest research, and make recommendations in the best interests of patient health and safety.”

This is, to put it mildly, an interesting statement from a government that called doctors “unethical” because of their support for the Insite supervised injection site which, the evidence shows, is the best way to save lives and get drug users into detox programs.

For evidence-based policy advocates – such as myself – the adoption of the language of data is a shift that I think could help refocus debates onto more productive terrain.

Then, towards the bottom of the report, there is a call-out that mentions the Open Policy conference at DFAIT, which I had the real joy of helping to convene and for which I served as host and facilitator.

Policy Built on Shared Knowledge
The Department of Foreign Affairs and International Trade (DFAIT) has been experimenting with an Open Policy Development Model that uses social networking and technology to leverage ideas and expertise from both inside and outside the department. A recent full-day event convened 400 public and private sector participants and produced a number of open policy pilots, e.g., an emergency response simulation involving consular officials and a volunteer community of digital crisis-mappers.

DFAIT is also using GCConnex, the Public Service’s social networking site, to open up policy research and development to public servants across departments.

This is a great, much-deserved win for the team at DFAIT that went out on a limb to run this conference and was rewarded with participation from across the public service.

Finally, anyone who has seen me speak will recognize a lot of this text as well:

As author William Gibson observed, “The future is already here, it’s just unevenly distributed.” Across our vast enterprise, public servants are already devising creative ways to do a better job and get better results. We need to shine a light on these trailblazers so that we can all learn from their experiments and build on them. Managers and senior leaders can foster innovation—large and small—by encouraging their teams to ask how their work can be done better, test out new approaches and learn from mistakes.

So much innovation in the 21st century is being made possible by well-developed communication technologies. Yet many public servants are frustrated by a lack of access to the Web 2.0 and social media tools that have such potential for helping us transform the way we work and serve Canadians. Public servants should enjoy consistent access to these new tools wherever possible. We will find a way to achieve this while at the same time safeguarding the data and information in our care.

I also encourage departments to continue expanding the use of Web 2.0 technologies and social media to engage with Canadians, share knowledge, facilitate collaboration, and devise new and efficient services.

To give full attribution: the William Gibson quote, which I use a great deal, was something I first saw used by my friend Tim O’Reilly, who is, needless to say, a man with a real ability to understand a trend and explain an idea to people. I hope his approach to thinking is reflected in much of what I do.

What, in sum, all these call-outs really tell us is that the Gov 2.0 message in the federal public service is being mainstreamed, at the very least among the most senior public servants. This does not mean that our government is going to magically transform; it simply means that the message is getting through and people are looking for ways to push this type of thinking into the organization. As I said before, this is not sufficient to change the way government works, but it is necessary.

Going to keep trying to see what I can do to help.

Open Data Movement is a Joke?

Yesterday, Tom Slee wrote a blog post called “Why the ‘Open Data Movement’ is a Joke,” which – and I say this as a Canadian who understands the context in which Slee is writing – is filled with valid complaints about our government, but which I feel paints a flawed picture of the open data movement.

Evgeny Morozov tweeted about the post yesterday, thereby boosting its profile. I’m a fan of Evgeny. He is an exceedingly smart and critical thinker on the intersection of technology and politics. He is exactly what our conversation needs (unlike, say, Andrew Keen). I broadly felt his comments (posted via his Twitter stream) were both on target – we need to think critically about open data – and lacking in nuance: it is possible for governments to simultaneously become more open and more closed on different axes. I write all this confident that Evgeny may turn his ample firepower on me, but such is life.

So, a few comments on Slee’s post:

First, the insinuation that the open data movement is irretrievably tainted by corporate interests is so over the top it is hard to know where to begin to respond. I’ve been advocating for open data for several years in Canada. Frankly, it would have been interesting and probably helpful if a large Canadian corporation (or even a medium-sized one) had taken notice. Only now, maybe 4-5 years in, are they even beginning to pay attention. Most companies don’t even know what open data is.

Indeed, the examples of corporate open data “sponsors” that Slee cites are U.S. corporations, sponsoring U.S. events (the Strata conference) and nonprofits (Code for America – with which I have been engaged). Since Slee is concerned primarily with the Canadian context, I’d be interested to hear his thoughts on how these examples compare to Canadian corporate involvement in open data initiatives – or even foreign corporations’ involvement in Canadian open data.

And not to travel too far down the garden path on this, but it’s worth noting that the corporations that have jumped on the open data bandwagon in the US often have two things in common. First, their founders are bona fide geeks who, in my experience, are both interested in hard data as an end unto itself (they’re all about numbers and algorithms) and want to see government-citizen interactions – and internal governmental interactions, too – made better and more efficient. Second, of course they are looking after their corporate interests, but they know they are not at the forefront of the open data movement itself. Their sponsorship of various open data projects may well have profit as one motive, but they are also deeply interested in keeping abreast of developments in what looks to be a genuine Next Big Thing. For a post that Evgeny sees as being critical of open data, I find all this deeply uncritical. Slee’s post reads as if anything that is touched by a corporation is tainted. I believe there are both opportunities and risks. Let’s discuss them.

So, who has been advocating for open data in Canada? Who, in other words, comprises the “open data movement” that Slee argues doesn’t really exist – and that “is a phrase dragged out by media-oriented personalities to cloak a private-sector initiative in the mantle of progressive politics”? If you attend one of the hundreds of hackathons that have taken place across Canada over the past couple of years – like those that have happened in Vancouver, Regina, Victoria, Montreal and elsewhere – you’ll find they are generally organized in hackspaces and by techies interested in ways to improve their community. In Ottawa, which I think does the best job, they can attract hundreds of people, many of whom bring spouses and kids as they work on projects they think will be helpful to their community. While some of these developers hope to start businesses, many others try to tackle issues of public good, and/or try to engage non-profits to see if there is a way they can channel their talent and the data. I don’t for a second pretend that these participants are a representative cross-section of Canadians, but by and large the profile has been geek, technically inclined, leaning left, and socially minded. There are many who don’t fit that profile, but that is probably the average.

Second, I completely agree that this government has been one of the most – if not the most – closed and controlling in Canada’s history. I, like many Canadians, echo Slee’s frustration. What’s worse is that I don’t see things getting better. Canadian governments have been getting more centralized and controlling since at least Trudeau, and possibly earlier (indeed, I believe polling and television have played a critical role in driving this trend). Yes, the government is co-opting the language of open data in an effort to appear more open. All governments co-opt language to appear virtuous. Be it on the environment, social issues or… openness, no government is perfect and indeed, most are driven by multiple, contradictory goals.

As a member of the Federal Government’s Open Government Advisory Panel I wrestle with this challenge constantly. I’m trying hard to embed some openness into the DNA of government. I may fail. I know that I won’t succeed in all ways, but hopefully I can move the rock in the right direction a little bit. It’s not perfect, but then it’s pretty rare that anything involving government is. In my (unpaid, advisory, non-binding) role I’ve voiced that the government should provide the Access to Information Commissioner with a larger budget (they cut it) and that they should enable government scientists to speak freely (they have not so far). I’ve also advocated that they should provide more open data. There, they have delivered, including some data sets that I think are important – such as aid data (which is always at risk of being badly spent). For some, it isn’t enough. I’d like for there to be more open data sets available, and I appreciate those (like Slee – who I believe is writing from a place of genuine care and concern) who are critical of these efforts.

But, to be clear, I would never claim that open government data is tantamount to solving the problems of a restrictive or closed government (and have argued as much here). Just as an authoritarian regime can run on open-source software, so too might it engage in open data. Open data is not the solution for Open Government (I don’t believe there is a single solution, or that Open Government is an achievable state of being – just a goal to pursue consistently), and I don’t believe anyone has made the case that it is. I know I haven’t. But I do believe open data can help. Like many others, I believe access to government information can lead to better informed public policy debates and hopefully some improved services for citizens (such as access to transit information). I’m not deluded into thinking that open data is going to provide a steady stream of obvious “gotcha moments” where government malfeasance is discovered, but I am hopeful that government data can arm citizens with information that the government is using to inform its decisions so that they can better challenge, and ultimately help hold accountable, said government.

Here is where I think Evgeny’s comments on the problem with the discourse around “open” are valid. Open Government and Open Data should not be used interchangeably. And this is an issue Open Government and Open Data advocates wrestle with. Indeed, I’ve seen a great deal of discussion and reflection come as a result of papers such as this one.

Third, the arguments around StatsCan all feel deeply problematic. I say this as the person who wrote the first article (that I’m aware of) about the long-form census debacle in a major media publication and who has been consistently and continuously critical of it. This government has had a dislike for Statistics Canada (and evidence) since long before open data was in its vocabulary, to say nothing of a policy interest. StatsCan was going to be a victim of dramatic cuts regardless of Canada’s open data policy – so it is misleading to claim that one would “much rather have a fully-staffed StatsCan charging for data than a half-staffed StatsCan providing it for free.” (That quote comes from Slee’s follow-up post, here.) That was never the choice on offer. Indeed, even if it had been, it wouldn’t have mattered. The total cost of making StatsCan data open is said to have been $2 million; this is a tiny fraction of the payroll costs of the 2,500 people they are looking to lay off.
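To put “tiny fraction” in rough perspective, here is a back-of-the-envelope comparison. The $2 million figure is the reported cost cited above; the average cost per employee is purely my own assumption for illustration, not a number from the report:

```python
# Back-of-the-envelope comparison. The $2 million figure is the reported
# one-time cost of opening StatsCan data; the average cost per employee is
# an assumption made purely for illustration.
open_data_cost = 2_000_000             # reported cost of making StatsCan data open
positions_cut = 2_500                  # positions StatsCan was looking to eliminate
avg_annual_cost_per_employee = 80_000  # assumed fully loaded annual cost (my guess)

annual_payroll = positions_cut * avg_annual_cost_per_employee
print(f"Assumed annual payroll of cut positions: ${annual_payroll:,}")
print(f"Open data cost as a share of that payroll: {open_data_cost / annual_payroll:.1%}")
# With these assumptions: $2,000,000 against roughly $200,000,000, i.e. about 1%.
```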

I’d actually go further than Slee here, and repeat something I say all the time: data is political. There are those who, naively, believed that making data open would depoliticize policy development. I hope there are situations where this might be true, but I’ve never taken that for granted or assumed as much: quite the opposite. In a world where data increasingly matters, it is increasingly going to become political. Very political. I’ve been saying this to the open data community for several years, and indeed it was a warning I made in the closing part of my keynote at the Open Government Data Camp in 2010. All this has, in my mind, little to do with open data. If anything, having data made open might increase the number of people who are aware of what is, and is not, being collected and used to inform public policy debates. Indeed, if StatsCan had made its data open years ago it might have had a larger constituency to fight on its behalf.

Finally, I agree with the Nat Torkington quote in the blog post:

Obama and his staff, coming from the investment mindset, are building a Gov 2.0 infrastructure that creates a space for economic opportunity, informed citizens, and wider involvement in decision making so the government better reflects the community’s will. Cameron and his staff, coming from a cost mindset, are building a Gov 2.0 infrastructure that suggests it will be more about turning government-provided services over to the private sector.

Moreover, it is possible for a policy to have two different drivers. It can even have multiple contradictory drivers simultaneously. In Canada, my assessment is that the government doesn’t have this level of sophistication around its thinking on this file, a conclusion I more or less wrote when assessing its Open Government Partnership commitments. I have no doubt that the conservatives would like to turn government-provided services over to the private sector, and open data has so far not been part of that strategy. In either case, there is, in my mind, a policy infrastructure that needs to be in place to pursue either of these goals (such as having a data governance structure in place). But from a more narrow open data perspective, my own feeling is that making the data open has benefits for public policy discourse and public engagement, as well as economic benefits. Indeed, making more government data available may enable citizens to fight back against policies they feel are unacceptable. You may not agree with all the goals of the Canadian government – as someone who has written at least 30 op-eds in various papers outlining problems with various government policies, neither do I – but I see the benefits of open data as real and worth pursuing, so I advocate for it as best I can.

So in response to the opening arguments about the open data movement…

It’s not a movement, at least in any reasonable political or cultural sense of the word.

We will have to agree to disagree. My experience is quite the opposite. It is a movement. One filled with naive people, with skeptics, with idealists focused on accountability, developers hoping to create apps, conservatives who want to make government smaller and progressives who want to make it more responsive and smarter. There was little in the post that persuaded me there wasn’t a movement. What I did hear is that the author didn’t like some parts of the movement and its goals. Great! Please come join the discussion; we’d love to have you.

It’s doing nothing for transparency and accountability in government,

To say it is doing nothing for transparency seems problematic. I need only cite one data set now open to say that isn’t true. And certainly the publication of aid data, procurement data, voting records and the Hansard are examples of places where it may be making government more transparent and accountable. What I think Slee is claiming is that open data isn’t transforming the government into a model of transparency and accountability, and he’s right. It isn’t. I don’t think anyone claimed it would. Nor do I think the public has been persuaded that because it does open data, the government is somehow open and transparent. These are not words the Canadian public associates with this government no matter what it does on this file.

It’s co-opting the language of progressive change in pursuit of what turns out to be a small-government-focused subsidy for industry.

There are a number of sensible, critical questions in Slee’s blog post. But this is a ridiculous charge. Prior to the data being open, you had an asset that was paid for by taxpayer dollars, then charged for at a premium that created a barrier to access. Of course, this barrier was easiest to surmount for large companies and wealthy individuals. If there was a subsidy for industry, it was under the previous model, as it effectively had the most regressive tax for access of any government service.

Indeed, probably the biggest beneficiaries of open data so far have been Canada’s municipalities, which have been able to gain access to much more data than they previously could, and have saved a significant amount of money (Canadian municipalities are chronically underfunded). And of course, when looking at the most downloaded data sets from the site, it would appear that non-profits and citizens are making good use of them. For example, the 6th most downloaded was the Anthropogenic disturbance footprint within boreal caribou ranges across Canada, used by many environmental groups; number 8 was weather data; 9th was Sales of fuel used for road motor vehicles, by province and territory, used most frequently to calculate greenhouse gas emissions; and 10th the Government of Canada Core Subject Thesaurus – used, I suspect, to decode the machinery of government. Most of the other top downloaded data sets related to immigration, used, it appears, to help applicants. Hard to see the hand of big business in all this, although if open data helped Canada’s private sector become more efficient and productive, I would hardly complain.

If you’re still with me, thank you – I know that was a long slog.

Public Policy: The Big Opportunity For Health Record Data

A few weeks ago Colin Hansen – a politician in the governing party in British Columbia (BC) – penned an op-ed in the Vancouver Sun entitled Unlocking our data to save lives. It’s a piece both the current government and the opposition should read, as it is filled with some very promising ideas.

In it, he notes that BC has one of the best collections of health data anywhere in the world and that data mining these records could yield patterns – like longitudinal adverse effects when drugs are combined, or correlations between diseases – that could save billions as well as improve health care outcomes.

He recommends that the province find ways to share this data with researchers and academics in ways that ensure the privacy of individuals is preserved. While I agree with the idea, one thing we’ve learned in the last 5 years is that, as good as academics are, the wider public is often much better at identifying patterns in large data sets. So I think we should think bolder. Much, much bolder.

Two years ago the California-based Heritage Provider Network, a company that runs hospitals, launched a $3 million predictive health contest that will reward the team that, over three years, creates the algorithm that best predicts how many days a patient will spend in a hospital in the next year. Heritage believes that armed with such an algorithm, they can create strategies to reach patients before emergencies occur and thus reduce the number of hospital stays. As they put it: “This will result in increasing the health of patients while decreasing the cost of care.”

Of course, the algorithm that Heritage acquires through this contest will be proprietary. They will own it and can choose whom to share it with. But a similar contest run by BC (or say, the VA in the United States) could create a public asset. Why would we care if others made their healthcare system more efficient, as long as we got to as well? We could create a public good, as opposed to Heritage’s private asset. More importantly, we need not offer a prize of $3 million. Several contests with prizes of $10,000 would likely yield a number of exciting results. Thus, for very little money, we might help revolutionize BC’s, and possibly Canada’s and even the world’s, healthcare systems. It is an exciting opportunity.
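To make the contest idea concrete, here is a minimal sketch of what an entry might look like: a model trained on historical claims features to predict days spent in hospital the following year. The data file, column names and model choice are all my own illustrative assumptions – this is neither the Heritage data nor any winning approach.

```python
# Minimal sketch of a hospitalization-prediction model of the kind such a
# contest solicits. The CSV file and column names below are hypothetical.
import numpy as np
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

claims = pd.read_csv("claims_history.csv")  # hypothetical anonymized claims data

features = ["age", "num_claims_last_year", "num_chronic_conditions",
            "days_in_hospital_last_year", "num_emergency_visits"]
target = "days_in_hospital_next_year"

X_train, X_test, y_train, y_test = train_test_split(
    claims[features], claims[target], test_size=0.2, random_state=42)

model = GradientBoostingRegressor(random_state=42)
model.fit(X_train, y_train)

# The actual prize scored entries on a log-scaled error; plain RMSE shown here.
rmse = np.sqrt(mean_squared_error(y_test, model.predict(X_test)))
print(f"RMSE (days in hospital next year): {rmse:.2f}")
```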

Of course, the big concern in all of this is privacy. The Globe and Mail featured an article in response to Hansen’s op-ed (shockingly but unsurprisingly, it failed to link back to it – why do newspapers behave that way?) that focused heavily on the privacy concerns but was pretty vague about the details. At no point was a specific concern by the privacy commissioner raised or cited. For example, the article could have talked about the real concern in this space, what is called de-anonymization. This is when an analyst can take records – like health records – that have been anonymized to protect individuals’ identities and use alternative sources to figure out whose records belong to whom. In the cases where this occurs it is usually only a handful of people whose records are identified, but even such limited de-anonymization is unacceptable. You can read more on this here.

As far as I can tell, no one has de-anonymized the Heritage Health Prize data. But we can take even more precautions. I recently connected with Rob James – a local epidemiologist who is excited about how opening up anonymized health care records could save lives and money. He shared with me an approach taken by the US Census Bureau which is even more radical than simple anonymization. As outlined in this (highly technical) research paper by Jennifer C. Huckett and Michael D. Larsen, the approach involves creating a parallel data set that has none of the features of the original but maintains all the relationships between the data points. Since it is the relationships, not the data, that are often important, a great deal of research can take place with much lower risks. As Rob points out, there is a reasonably mature academic literature on these types of privacy-protecting strategies.
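To illustrate the general idea – and only the general idea; this is a deliberately simplified stand-in for the formal methods Huckett and Larsen describe – one can fit a simple statistical model to the real records and then release synthetic records drawn from that model: no released row belongs to a real person, yet the relationships between variables are approximately preserved. The columns below are invented.

```python
# Toy illustration of releasing a synthetic data set that preserves the
# relationships between variables without containing any real person's record.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n = 10_000

# Stand-in for the real, sensitive records (hypothetical columns).
age = rng.normal(55, 15, n)
real = pd.DataFrame({
    "age": age,
    "systolic_bp": 90 + 0.7 * age + rng.normal(0, 10, n),
    "annual_drug_cost": 500 + 30 * age + rng.normal(0, 400, n),
})

# "Fit" a model to the real data: keep only the means and covariance matrix.
mean = real.mean().to_numpy()
cov = real.cov().to_numpy()

# Sample synthetic records from that model. No row corresponds to a real
# person, but the correlation structure is approximately preserved.
synthetic = pd.DataFrame(rng.multivariate_normal(mean, cov, size=n),
                         columns=real.columns)

print(real.corr().round(2))
print(synthetic.corr().round(2))  # should look very similar
```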

The simple fact is, healthcare spending in Canada is on the rise. In many provinces it will eclipse 50% of all spending in the next few years. This path is unsustainable. Spending in the US is even worse. We need to get smarter and more efficient. Data mining is perhaps the most straightforward and accessible strategy at our disposal.

So the question is this: does BC want to be a leader in healthcare research and outcomes in an area the whole world is going to be interested in? The foundation – creating a high-value data set – is already in place. The unknown is whether we can foster a policy infrastructure and public mandate that allow us to think and act in big ways. It would be great if government officials, the privacy commissioner and some civil liberties representatives started a dialogue to find some common ground. The benefits to British Columbians – and potentially to a much wider population – could be enormous, both in money and, more importantly, in lives saved.

Canada's Action Plan on Open Government: A Review

The other day the Canadian Government published its Action Plan on Open Government, a high-level document that both lays out the Government’s goals on this file and fulfills its pledge to create tangible goals as part of its participation in next week’s Open Government Partnership 2012 annual meeting in Brazil.

So what does the document say and what does it mean? Here is my take.

Take Away #1: Not a breakthrough document

There is much that is good in the government’s action plan – some of which I will highlight later. But for those hoping that Canada was going to get the Gov 2.0 bug and try to leapfrog leaders like the United States or the United Kingdom, this document will disappoint. By and large this document is not about transforming government – even at its most ambitious it appears to be much more about engaging in some medium-sized experiments.

As a result the document emphasizes a number of things that the UK and US started doing several years ago, such as getting a license that adheres to international norms, posting government resource allocation and performance management information online in machine-readable forms, or refining the open data portal.

What you don’t see are explicit references to attempts to rethink how government leverages citizens’ experience and knowledge with a site like Challenge.gov, engages experts in innovative ways such as with Peer to Patent, or works with industries or provinces to generate personal open data, as the US has done with the Blue Button (for healthcare) or the Green Button (for utilities).

Take Away #2: A Solid Foundation

This said, there is much in the document that is good. Specifically, in many areas, it does lay a solid foundation for some future successes. Probably the most important statements are the “foundational commitments” that appear on this page. Here are some key points:

Open Government Directive

In Year 1 of our Action Plan, we will confirm our policy direction for Open Government by issuing a new Directive on Open Government. The Directive will provide guidance to 106 federal departments and agencies on what they must do to maximize the availability of online information and data, identify the nature of information to be published, as well as the timing, formats, and standards that departments will be required to adopt… The clear goal of this Directive is to make Open Government and open information the ‘default’ approach.

This last sentence is nice to read. Of course the devil will be in the details (and in the execution), but establishing a directive around open information could end up being as important (although admittedly not as powerful – an important point) as the establishment of Access to Information. Done right, such a directive could vastly expand the range of documents made available to the public, something that should be very doable as more and more government documentation moves into digital formats.

For those complaining about the lack of ATI reform in the document, this directive, and its creation, will be worth further exploration. There is an enormous opportunity here to reset how government discloses information – and the “default to open” line creates a public standard that we can try to hold the government to account against.

And of course the real test for all this will come in years 2-3, when it comes time to disclose documents around something sensitive to the government… like, say, the issue of the Northern Gateway Pipeline (or something akin to the Afghan prisoner issue). In theory this directive should make all government research and assessments open; when that moment happens we’ll have a real test of the robustness of any such new directive.

Open Government License:

To support the Directive and reduce the administrative burden of managing multiple licensing regimes across the Government of Canada, we will issue a new universal Open Government License in Year 1 of our Action Plan with the goal of removing restrictions on the reuse of published Government of Canada information (data, info, websites, publications) and aligning with international best practices… The purpose of the new Open Government License will be to promote the re-use of federal information as widely as possible...

Full Disclosure: I have been pushing (in an unpaid capacity) for the government to reform its license and helping out in its discussions with other jurisdictions around how it can incorporate the best practices and most permissive language possible.

This is another important foundational piece. To be clear, this is not about an “open data” license. This is about creating a license for all government information and media. I suspect this appeals to this government in part because it ends the craziness of having lawyers across government constantly re-inventing new licenses and creating a complex set of licenses to manage. Let me be clear about what I think this means: this is functionally about neutering Crown copyright. It’s about creating a licensing regime that makes very clear what the user’s rights are (which Crown copyright does not do) and that is as permissive as possible about re-use (which Crown copyright, because of its lack of clarity, is not). Achieving such a license is a critical step to doing many of the more ambitious open government and Gov 2.0 activities that many of us would like to see happen.

Take Away #3: The Good and Bad Around Access to Information

For many, I think the biggest disappointment may be that the government has chosen not to try to update the Access to Information Act. It is true that this is what the Access to Information Commissioners from across the country recommended it do in an open letter (recommendation #2 in their letter). Opening up the act likely carries a number of political risks – particularly for a government that has not always been forthcoming with documents (the Afghan detainee issue and the F-35 contract both come to mind) – however, I again propose that it may be possible to achieve some of the objectives around improved access through the Open Government Directive.

What I think shouldn’t be overlooked, however, is the government’s “experiment” around modernizing the administration of Access to Information:

To improve service quality and ease of access for citizens, and to reduce processing costs for institutions, we will begin modernizing and centralizing the platforms supporting the administration of Access to Information (ATI). In Year 1, we will pilot online request and payment services for a number of departments allowing Canadians for the first time to submit and pay for ATI requests online with the goal of having this capability available to all departments as soon as feasible. In Years 2 and 3, we will make completed ATI request summaries searchable online, and we will focus on the design and implementation of a standardized, modern, ATI solution to be used by all federal departments and

These are welcome improvements. As one colleague – James McKinney – noted, the fact that you have to pay with a check means that only people with Canadian bank accounts can make ATIP requests. This largely means just Canadian citizens. This is ridiculous. Moreover, the process is slow and painful (who uses checks?! The Brits are phasing them out by 2018 – good on ’em!). The use of checks creates a real barrier – particularly, I think, for young people.

Also, being able to search summaries of previous requests is a no-brainer.

Take Away #4: This is a document of experiments

As I mentioned earlier, outside the foundational commitments, the document reads less like a grand experiment and more like a series of small experiments.

Here the Virtual Library is another interesting commitment – certainly during the consultations the number one complaint was that people have a hard time finding what they are looking for on government websites. Sadly, even if you know the name of the document you want, it is still often hard to find. A virtual library is meant to address this concern – obviously it is all going to be in the implementation – but it is a response to a genuine expressed need.

Meanwhile the Advancing Recordkeeping in the Government of Canada and User-Centric Web Services feel like projects that were maybe already in the pipeline before Open Government came on the scene. They certainly do conform with the shared services and IT centralization announced by Treasury Board last year. They could be helpful but, honestly, these will all be about execution: these types of projects can harmonize processes and save money, or they can become enormous boondoggles that everyone tries to work around because they don’t meet anyone’s requirements. If they do go the right way, I can definitely imagine how they might help the management of ATI requests (I have to imagine it would make it easier to track down a document).

I am deeply excited about the implementation of the International Aid Transparency Initiative (IATI). This is something I’ve campaigned for and urged the government to adopt, so it is great to see. I think these types of cross-jurisdictional standards have a huge role to play in the open government movement, so joining one, figuring out what about the implementation works and doesn’t work, and assessing its impact, is important both for Open Government in general and for Canada, as it will let us learn lessons that, I hope, will become applicable in other areas as more of these types of standards emerge.

Conclusion:

I think it was always going to be a stretch to imagine Canada taking a leadership role in the Open Government space, at least at this point. Frankly, we have a lot of catching up to do just to draw even with places like the US and the UK, which have been working hard to keep experimenting with new ideas in the space. What is promising about the document is that it does present an opportunity for some foundational pieces to be put into play. The bad news is that real efforts to rethink government’s relationship with citizens, or even the role of the public servant within a digital government, have not been taken very far.

So… a C+?

 

Additional disclaimer: As many of my readers know, I sit on the Federal Government’s Open Government Advisory Panel. My role on this panel is to serve as a challenge function for the ideas that are presented to us. In this capacity I share with them the same information I share with you – I try to be candid about what I think works and doesn’t work around ideas they put forward. Interestingly, I did not see even a draft version of the Action Plan until it was posted to the website and was (obviously by inference) not involved in its creation. Just want to share all that to be, well, transparent about where I’m coming from – which remains that of a citizen who cares about these issues and wants to push governments to do more around Gov 2.0 and open gov.

Also, sorry for the typos, but I’m sick and it is 1am. So I’m checking out. Will proofread again when I awake.

Using BHAGs to Change Organizations: A Management, Open Data & Government Mashup

I’m a big believer in the ancillary benefits of a single big goal. Set a goal that has one clear objective but that, as a result, requires a bunch of other things to change as well.

So one of my favourite Big Hairy Audacious Goals (BHAGs) for an organization is to go paperless. I like the goal for all sorts of reasons. Much like a true BHAG, it is clear, compelling, and has an obvious “finish line.” And while hard, it is achievable.

It has the benefit of potentially making the organization more “green,” but what I really like about it is that it requires a bunch of other steps to take place that should position the organization to become more efficient, effective and faster.

This is because paper is dumb technology. Among many, many other things, information on paper can’t be tracked, changes can’t be noted, pageviews can’t be recorded, data can’t be linked. It is hard to run a lean business when you’re using paper.

Getting rid of it often means you have to get a better handle on workflow and processes so they can be streamlined. It means rethinking the tools you use. It means getting rid of checks and moving to direct deposit, moving off letters and onto email, getting your documents, agendas, meeting minutes, policies and god knows what else out of MS Word and onto wikis, and shifting from printed product manuals to PDFs or, better still, YouTube videos. These changes in turn require a rethinking of how your employees work together and the skills they require.

So what starts off as a simple goal – getting rid of paper – pretty soon requires some deep organizational change. Of course, rallying cries like “more efficient processes!” or “better understand our workflow!” have pretty limited appeal and can be hard for everyone to wrap their heads around. However, “getting rid of paper”? It is simple, clear and, frankly, something that everyone in the organization can probably contribute an idea towards achieving. And it will achieve many of the less sexy but more important goals.

Turns out some governments may be thinking this way.

The State of Oklahoma has a nice website that talks about all of its “green” initiatives. Of course, it just so happens that many of these initiatives – reducing travel, getting rid of paper, etc. – also happen to reduce costs and improve service, and are easier to measure. I haven’t spoken with anyone at the State of Oklahoma to see if this is the real goal, but the website seems to acknowledge that it is:

OK.gov was created to improve access to government, reduce service-processing costs and enable state agencies to provide a higher quality of service to their constituents.

So for Oklahoma, going paperless becomes a way to get at some larger transformations. Nice BHAG. Of course, as with any good BHAG, you can track these changes and share them with your shareholders, stakeholders or… citizens.

And behold! The Oklahoma go green website invites different state agencies to report data on how their online services reduce paper consumption and/or carbon emissions. Data that they in turn track and share with the public via the state’s Socrata data portal. This graph shows how much agencies have reduced their paper output over the past four years.

Notice how some departments have no data – if I were an Oklahoma taxpayer, I’m not too sure I’d be thrilled with them. But take a step back. This is a wonderful example of how transparency and open data can help drive a government initiative. Not only can that data make it easier for the public to understand what has happened (and so be more readily engaged) but it can help cultivate a culture of accountability as well as – and perhaps more importantly – promote a culture of metrics that I believe will be critical for the future of government.
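For anyone curious what working with such a portal looks like in practice: Socrata portals expose their data sets as JSON through the SODA API. The sketch below is generic – the dataset identifier and column names are placeholders I invented, not Oklahoma’s actual paper-reduction data set.

```python
# Generic sketch of pulling a data set from a Socrata-backed open data portal
# via its SODA API. The dataset identifier and column names are placeholders.
import requests

PORTAL = "https://data.ok.gov"   # the state's data portal, per the article
DATASET_ID = "abcd-1234"         # hypothetical Socrata dataset identifier

resp = requests.get(f"{PORTAL}/resource/{DATASET_ID}.json", params={"$limit": 5000})
resp.raise_for_status()
rows = resp.json()               # a list of dicts, one per row

# Hypothetical columns: "agency" and "sheets_of_paper_saved".
totals = {}
for row in rows:
    agency = row.get("agency", "unknown")
    totals[agency] = totals.get(agency, 0) + float(row.get("sheets_of_paper_saved", 0))

for agency, saved in sorted(totals.items(), key=lambda item: -item[1]):
    print(f"{agency}: {saved:,.0f} sheets of paper saved")
```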

I often say to governments “be strategic about how you use some of the data you make open.” Don’t just share a bunch of stuff; use what you share to achieve policy or organizational objectives. This is a great example. It’s also potentially a great example of organizational change in a large and complex environment. Interesting stuff.

 

 

Citizen Surveillance and the Coming Challenge for Public Institutions

The other day I stumbled across this intriguing article, which describes how a group of residents in Vancouver has started to surveil the police as they do their work in the Downtown Eastside, one of the poorest and toughest neighborhoods in Canada. The reason is simple. Many people – particularly those who are marginalized and most vulnerable – simply do not trust the police. The interview with the founder of Vancouver Cop Watch probably sums it up best:

“One of the complaints we have about District 2 is about how the Vancouver police were arresting people and taking them off to other areas and beating them up instead of taking them to a jail,” Allan told the Georgia Straight in a phone interview. “So what we do is that, when in the Downtown Eastside, whenever we see the police arresting someone, we follow behind them to make sure that the person makes it to the jail.”

In a world where many feel it is hard to hold government in general, and police forces specifically, accountable, finding alternative means of creating such accountability will be deeply alluring. And people no longer need the funding and coordination of organizations like Witness (which initially focused on getting video cameras into people’s hands in an effort to prevent human rights abuses). Digital video cameras and smartphones, coupled with services like YouTube, now provide this infrastructure for virtually nothing.

This is the surveillance society – predicted and written about by authors like David Brin – and it is driven as much by us, the citizens, as it is by government.

Vancouver Cop Watch is not the first example of this type of activity – I’ve read about people doing this across the United States. What is fascinating is watching the state try to resist and fail miserably. In the US the police have lost key battles in the courts. This after police arrested people for filming them, even while on their own property. And despite the rulings, people continue to be arrested for filming the police – a choice I suspect diminishes public confidence in the police and the state.

And it is not just the police getting filmed. Transit workers in Toronto have taken a beating of late as they have been filmed asleep on the job. Similarly, a scared passenger filmed an Ottawa bus driver being aggressive and swearing at an apologetic mentally ill passenger. A few years ago the public in Montreal was outraged when city crews were filmed repairing only a few potholes and taking long breaks.

The simple fact is, if you are a front-line worker – in either the private or, especially, the public sector – there is a good chance that at some point in your career you’re going to be filmed. And even when you are not being filmed, more data is going to be collected about what you do and how you do it.

Part of this reality is that it is going to require a new level of training for front-line workers. This will be particularly hard on the police, but they should expect more stories like this one.

I also suspect there will be two reactions to it. Some government services will clam up and try to become more opaque, fearing all public inquiry. Their citizens – armed with cameras – all become potential threats. Over time, it is hard not to imagine their legitimacy becoming further and further eroded (I’m thinking of you, RCMP) as a video here and an audio clip there shape the public’s image of the organization. Others will realize that anecdotal and chance views of their operations represent a real risk to their image. Consequently they may strive to be more transparent – sharing more data about their operations and their processes – in an effort to provide the public with greater context. The goal here will be to provide a counterpoint to any unfortunate incidents, trying to make a single negative anecdotal data point that happened to be filmed part of a larger, more complex set of data points.

Obviously, I have strong suspicions regarding which strategy will work, and which one won’t, in a democratic society but am confident many will disagree.

Either way, these challenges are going to require adaptation strategies, and it won’t be easy for public institutions averse to both negative publicity and transparency.

Want to Find Government Innovation? The US Military Is Often Leading the Way

When it comes to seeing what trends will impact government in 20-30 years, I’m a big fan of watching the US military. They may do a lot of things wrong but, when it comes to government, they are on the bleeding edge of being a “learning organization.” It often feels like they are less risk-averse, more likely to experiment, and (as noted) more likely to learn than almost any government agency I can think of (hint: those things may be interconnected). Few people realize that to rise above colonel in many military organizations, you must have at least a master’s degree. Many complete PhDs. And these schools often turn into places where people challenge authority and the institution’s conventional thinking.

Part of it, I suspect, has to do with the whole “people die when you make mistakes” aspect of their work. It may also have to do with the seriousness with which they take their mandate. And part of it has to do with the resources they have at their disposal.

But regardless of the cause, I find they are often at the cutting edge of ideas in the public sector. For example, I can’t think of a government organization that empowers the lowest echelons of its employee base more than the US military. Their network-centric vision of the world means those on the front lines (both literally and figuratively) are often empowered, trusted and strongly supported with tools, data and technology to make decisions on the fly. In an odd way, the very hierarchical system that the rest of government has been modeled on has evolved into something different: still very hierarchical but, at the same time, networked.

Frankly, if a 1-800 call operator at Service Canada or the guy working at the DMV in Pasadena, CA (or even just their managers) had 20% of the autonomy of a US sergeant, I suspect government would be far more responsive and innovative. Of course, Service Canada or the California DMV would have to have a network-centric approach to their work… and that’s a ways off, since it demands serious cultural change – the hardest thing to shift in an organization.

Anyways… long rant. Today I’m interested in another smart call the US military is making that government procurement departments around the world should be paying attention to (I’m especially looking at you, Public Works – pushers of Beehive over GCPEDIA). This article, “Open source helicopters trivialize Europe’s ODF troubles,” on Computer World’s Public Sector IT blog, outlines how the next generation of US helicopters will be built on an open platform. No more proprietary software that binds hardware to a particular vendor.

Money quote from the piece:

Weapons manufacturers and US forces made an unequivocal declaration for royalty-free standards in January through the FACE (Future Airborne Capabilities Environment) Consortium they formed in response to US Defence Secretary Leon Panetta’s call for a “common aircraft architecture and subsystems”.

“The FACE Standard is an open, nonproprietary technical specification that is publicly available without restrictive contracts, licensing terms, or royalties,” the Consortium announced from its base at The Open Group, the industry association responsible for the POSIX open Unix specification.

“In business terms, the open standards specified for FACE mean that programmers are freely able to use them without monetary remuneration or other obligation to the standards owner,” it said.

While business software producers have opposed governments that have tried to implement identical open standards policies with the claim it will handicap innovation and dampen competition, the US military is embracing open standards for precisely the opposite reasons.

So suddenly we are going to have an open source approach to innovation and program delivery (helicopter manufacturing, operation and maintenance) at a major scale. Trust me, if the US military is trying to do this with helicopters, you can convert your proprietary intranet to an open source wiki platform. I can’t believe the complexity is as great. But the larger point here is that this approach could be used to think about any system a government wants to develop, from earthquake-monitoring equipment to healthcare systems to transit passes. From a “government as a platform” perspective, this could be a project to watch. Lots of potential lessons here.

Access to Information, Open Data and the Problem with Convergence

In response to my post yesterday one reader sent me a very thoughtful commentary that included this line at the end:

“Rather than compare [Freedom of Information] FOI legislation and Open Gov Data as if it’s “one or the other”, do you think there’s a way of talking about how the two might converge?”

One small detail:

So before diving into the meat, let me start by saying I don’t believe anything in yesterday’s post claimed open data was better or worse than Freedom of Information (FOI, often referred to in Canada as Access to Information, or ATI). Seeing FOI and open data as competing suggests they are similar tools. While they have similar goals – improving access – and there may be some overlap, I increasingly see them as fundamentally different tools. This is also why I don’t see an opportunity for convergence in the short term (more on that below). I do, however, believe open data and FOI processes can be complementary. Indeed, I’m hopeful open data can alleviate some of the burden placed on FOI systems, which are often slow. Indeed, in Canada, government departments regularly violate rules around disclosure deadlines. If anything, this complementary nature was the implicit point in yesterday’s post (which I could have made more explicit).

The Problem with Convergence:

As mentioned above, the overarching goals of open data and FOI systems are similar – to enable citizens to access government information – but the two initiatives are grounded in fundamentally different approaches to dealing with government information. From my view, FOI has become a system of case-by-case review, while open data seeks an approach of “pre-clearance.”

Part of this has to do with what each system is reacting to. FOI was born, in part, out of a reaction to scandals in the mid 20th century which fostered public support for a right to access government information.

FOI has become a powerful tool for accessing government information. But the infrastructure created to manage it has also had some perverse effects. In some ways FOI has, paradoxically, made it harder to gain access to government information. I remember talking to a group of retired reporters who talked about how it was easier to gain access to documents in the pre-FOI era, since there were no guidelines and many public servants saw most documents as “public” anyways. The rules around disclosure today – thanks in part to FOI regimes – mean that governments can make closed the “default” setting for government information. In the United States the Ashcroft Memo serves as an excellent example of this problem. In this case the FOI legislation actually becomes a tool that helps governments withhold documents, rather than enable citizens to gain legitimate access.

But the bigger problem is that the process by which access to information requests are fulfilled is itself burdensome. While relevant and necessary for some types of information, it is overkill for others. And this is the niche that open data seeks to fill.

Let me pause to stress that I don’t share the above to disparage FOI. Quite the opposite. It is a critical and important tool and I’m not advocating for its end. Nor am I arguing that open data can – in the short or even medium term – solve the problems raised above.

This is why, over the short term, open data will remain a niche solution – a fact linked to its origins. Like FOI, open data has its roots in government transparency. However, it also evolved out of efforts to tear down antiquated intellectual property regimes in order to facilitate the sharing of data/information (particularly between organizations and governments). Thus the emphasis was not on case-by-case review of documents, but rather on clearing rights to categories of information, both already created and to be created in the future. In other words, this is about granting access to the outputs of a system, not access to individual documents.

Another way of thinking about this is that open data initiatives seek to leverage the benefits of FOI while jettisoning its burdensome process. If a category of information can be pre-cleared, in advance and in perpetuity, for privacy, security and IP concerns, then FOI processes – essential for individual documents and analysis – become unnecessary, and the transaction costs to citizens wishing to access the information drop.

Maybe, in the future, the scope of these open data initiatives could become broader, and I hope they will. Indeed, there is ample evidence to suggest that technology could be used to pre-clear or assess the sensitivity of any government document. An algorithm that assesses a mixture of who the author is, the network of people who reviewed the document and a scan of the words it contains could probably ascertain, in seconds rather than weeks, whether a document could be released in response to an ATIP request. At a minimum it could produce a risk profile and/or strip out privacy-related information. These types of reforms would be much more disruptive (in the positive sense) to FOI legislation than open data.
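
To make that concrete, here is a minimal sketch of how such a pre-clearance risk score might work. Every signal, weight and keyword in it is hypothetical and purely illustrative – a real system would presumably be trained on past ATIP decisions rather than hand-tuned – but it shows the shape of the idea: metadata and text go in, a risk score comes out, and only high-risk documents ever need a human ATIP officer.

```python
import re
from dataclasses import dataclass, field

# Hypothetical markers of sensitive content; a real system would likely use
# classifiers trained on past ATIP decisions rather than a hand-made keyword list.
SENSITIVE_TERMS = {"cabinet confidence", "solicitor-client", "in camera", "secret"}
SIN_PATTERN = re.compile(r"\b\d{3}[- ]?\d{3}[- ]?\d{3}\b")    # looks like a Social Insurance Number
EMAIL_PATTERN = re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b")    # personal contact information

@dataclass
class Document:
    author_is_senior_official: bool          # authorship signal, e.g. minister's office
    reviewer_departments: set = field(default_factory=set)
    text: str = ""

def preclearance_risk(doc: Document) -> float:
    """Return a 0-1 risk score: low scores are candidates for automatic release,
    high scores get routed to a human ATIP officer."""
    score = 0.0
    if doc.author_is_senior_official:
        score += 0.3                          # who wrote it
    if len(doc.reviewer_departments) > 2:
        score += 0.2                          # how widely it circulated before release
    lowered = doc.text.lower()
    if any(term in lowered for term in SENSITIVE_TERMS):
        score += 0.4                          # what the words suggest
    if SIN_PATTERN.search(doc.text) or EMAIL_PATTERN.search(doc.text):
        score += 0.3                          # privacy-related content to strip or withhold
    return min(score, 1.0)

# A routine report with no sensitive markers scores low and could be auto-released.
report = Document(author_is_senior_official=False,
                  reviewer_departments={"Environment Canada"},
                  text="Quarterly water quality summary prepared for public release.")
print(preclearance_risk(report))              # -> 0.0
```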

But all that said, just getting the current focus of open data initiatives right would be a big accomplishment. And, even if such initiatives could be expanded, there are limits. I am not so naive as to believe that government can be entirely open. Nor am I sure that would be an entirely good outcome. When trying to foster new ideas or assess how to balance competing interests in society, a private place to initiate and play with ideas may be essential. And despite the ruminations above, the limits of government IT systems mean there will remain a lot of information – particularly non-data information like reports and analysis – that we won’t be able to “pre-clear” for sharing and downloading. Consequently an FOI regime – or something analogous – will continue to be necessary.

So rather than replace or converge with FOI systems, I hope open data will, for the short to medium term, actually divert requests out of the FOI system – not because it competes, but because it offers a simpler and more efficient means of sharing certain types of information (for both government and citizens). That said, open data initiatives offer none of the protections or rights of FOI, and so that legislation will continue to serve as the fail-safe mechanism should a government choose to stop sharing data. Moreover, FOI will continue to be a necessary tool for documents and information that – for all sorts of reasons (privacy, security, cabinet confidence, etc.) – cannot fall under the rubric of an open data initiative. So convergence… not for now. But co-existence feels both likely and helpful for both.

Calculating the Value of Canada’s Open Data Portal: A Mini-Case Study

Okay, let’s geek out on some open data portal stats from data.gc.ca. There are three parts to this review: first, a look at how to assess the value of data.gc.ca; second, a look at the most downloaded data sets; and third, some interesting data about who is visiting the portal.

Before we dive in, a thank you to Jonathan C, who sent me some of this data the other day after requesting it from Treasury Board, the ministry within the Canadian government that manages the government’s open data portal.

1. Assessing the Value of data.gc.ca

Here is the first thing that struck me. Many governments talk about how they struggle to find methodologies to measure the value of open data portals/initiatives. Often these assessments focus on things like number of apps created or downloaded. Sometimes (and incorrectly in my mind) pageviews or downloads are used. Occasionally it veers into things like mashups or websites.

However, one fairly tangible value of open data portals is that they cheaply resolve some access to information requests – a point I’ve tried to make before. At the very minimum they give scale to some requests that previously would have been handled by slow and expensive access to information/freedom of information processes.

Let me share some numbers to explain what I mean.

The Canadian government is, I believe, only obligated to fulfill requests that originate within Canada. Drawing from the information in the charts later in this post, let’s assume there were a total of 2,200 downloads in January and that roughly a third of these originated from Canada – say 726 “Canadian” downloads. Thanks to some earlier research, I happen to know that the Office of the Information Commissioner has assessed the average cost of fulfilling an access to information request in 2009-2010 at $1,332.21.

So in a world without an open data portal, the hypothetical cost of fulfilling these “Canadian” downloads as formal access to information requests would have been $967,184.46 in January alone. Even if I’m off by 50%, the cost – again, just for January – would still sit at $483,592.23. Assuming this is a safe monthly average, over the course of a year the cost savings could be around $11,606,213.52 or $5,803,106.76, depending on how conservative you want to be about the assumptions.
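
For anyone who wants to stress-test this back-of-envelope math, it fits in a few lines of Python. The download count, the Canadian share and the per-request cost are the assumptions stated above; everything else follows from them:

```python
# Back-of-envelope estimate of ATIP costs avoided by the open data portal.
# Both inputs are assumptions from the post - tweak them to test other scenarios.
canadian_downloads = 726        # roughly a third of the ~2,200 January downloads
cost_per_request = 1332.21      # average cost of an ATI request, 2009-2010 (Information Commissioner)

monthly_savings = canadian_downloads * cost_per_request
print(f"January, all Canadian downloads as ATI requests: ${monthly_savings:,.2f}")        # $967,184.46
print(f"January, if the estimate is off by 50%:          ${monthly_savings * 0.5:,.2f}")  # $483,592.23
print(f"Annualized:                                      ${monthly_savings * 12:,.2f}")   # $11,606,213.52
print(f"Annualized, conservative:                        ${monthly_savings * 6:,.2f}")    # $5,803,106.76
```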

Of course, I’m well aware that not every one of these downloads would have been an information request in a pre-portal world – the ATI process is simply too burdensome. You have to pay a fee, and it has to be by check (who pays for anything by check anymore?), so many of these users would simply have abandoned their search for government information. Some of these savings would therefore not have been realized. But that doesn’t mean there isn’t value: the open data portal is able to cheaply reveal latent demand for data. In addition, only a fraction of the government’s data is presently on the portal, so all of these numbers could get bigger still. And finally, these estimates only count downloads that originated inside Canada.

So I’m not claiming that we have arrived at a holistic view of how to assess the value of open data portals – but even the narrow scope of assessment I outline above generates financial savings that are not trivial, and this is to say nothing of the value generated by those who downloaded the data – something that is much harder to measure – or of the value of increased access to Canadians and others.

2. Most Downloaded Datasets at data.gc.ca

This is interesting because… well… it’s just always interesting to see what people gravitate towards. But check this out…

Data sets like the Anthropogenic disturbance footprint within boreal caribou ranges across Canada may not seem interesting, but the groundbreaking agreement between the Forest Products Association of Canada and a coalition of environmental non-profits – known as the Canadian Boreal Forest Agreement (CBFA) – relies heavily on this data set to assess where the endangered woodland caribou are most at risk. There is no app, but the data is critical both to protecting this species and to finding a way to sustainably harvest wood in Canada. (Note: I worked as an adviser on the CBFA, so I am a) a big fan and b) not making this stuff up.)

It is fascinating that immigration and visa data tops the list. But it really shouldn’t be a surprise. We are, of course, a nation of immigrants. I’m sure that immigration and visa advisers – to say nothing of think tanks, municipal governments, social service non-profits and English-as-a-second-language schools – are all very keen on using this data to understand how they should be shaping their services and policies to target immigrant communities.

And there is, of course, weather – the original open government data set. Governments have been making this data open for hundreds of years: it is so useful and so important that you had to make it open.

And it is nice to see Sales of fuel used for road motor vehicles, by province and territory. If you wanted to figure out the carbon footprint of vehicles by province, I suspect this is a nice data set to have. It is probably also useful for analyzing gas prices, as it gives you a handle on demand. Economists probably like this data set.

All this to say, I’m less skeptical than I was about the data sets on data.gc.ca. With the exception of weather, these data sets aren’t likely to be useful to software developers – the group I tend to hear from most – but then I’ve always posited that apps would only ever be a tiny part of the open data ecosystem. Analysis is king for open data, and there do appear to be people out there finding data of value for the analyses they want to make. That’s a great outcome.

Here are the tables outlining the most popular data sets since launch and (roughly) in February.

Top 10 most downloaded datasets, since launch

#  | Dataset | Department | Downloads
1  | Permanent Resident Applications Processed Abroad and Processing Times (English) | Citizenship and Immigration Canada | 4730
2  | Permanent Resident Summary by Mission (English) | Citizenship and Immigration Canada | 1733
3  | Overseas Permanent Resident Inventory (English) | Citizenship and Immigration Canada | 1558
4  | Canada – Permanent residents by category (English) | Citizenship and Immigration Canada | 1261
5  | Permanent Resident Applicants Awaiting a Decision (English) | Citizenship and Immigration Canada | 873
6  | Meteorological Service of Canada (MSC) – City Page Weather | Environment Canada | 852
7  | Meteorological Service of Canada (MSC) – Weather Element Forecasts | Environment Canada | 851
8  | Permanent Resident Visa Applications Received Abroad – English Version | Citizenship and Immigration Canada | 800
9  | Water Quality Indicators – Reports, Maps, Charts and Data | Environment Canada | 697
10 | Canada – Permanent and Temporary Residents – English version | Citizenship and Immigration Canada | 625

Top 10 most downloaded datasets, for the past 30 days

#  | Dataset | Department | Downloads
1  | Permanent Resident Applications Processed Abroad and Processing Times (English) | Citizenship and Immigration Canada | 481
2  | Sales of commodities of large retailers – English version | Statistics Canada | 247
3  | Permanent Resident Summary by Mission – English Version | Citizenship and Immigration Canada | 207
4  | CIC Operational Network at a Glance – English Version | Citizenship and Immigration Canada | 163
5  | Gross domestic product at basic prices, communications, transportation and trade – English version | Statistics Canada | 159
6  | Anthropogenic disturbance footprint within boreal caribou ranges across Canada – As interpreted from 2008-2010 Landsat satellite imagery | Environment Canada | 102
7  | Canada – Permanent residents by category – English version | Citizenship and Immigration Canada | 98
8  | Meteorological Service of Canada (MSC) – City Page Weather | Environment Canada | 61
9  | Sales of fuel used for road motor vehicles, by province and territory – English version | Statistics Canada | 52
10 | Government of Canada Core Subject Thesaurus – English Version | Library and Archives Canada | 51

3. Visitor locations

So this is just plain fun. There is not a ton to derive from it – especially as IP addresses can occasionally be misleading, and this is page view data, not download data. But what is fascinating is that computers in Canada are not the top source of traffic to data.gc.ca. Indeed, Canada’s share of the traffic is actually quite low. In January, counting only the countries in the chart (and not the long tail of visitors), Canada accounted for just 16% of the traffic to the site. That said, I suspect downloads skewed much more heavily Canadian – although I have no hard evidence of this, just a hypothesis.
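
One detail worth flagging: that 16% is a share of the charted countries only, not of all traffic. As a tiny illustration of the calculation – the visit counts below are made up for the example; the real figures sit in the analytics report behind the chart – you sum the charted countries and take Canada’s slice:

```python
# Hypothetical visit counts for the countries shown in the chart
# (illustrative numbers only - not the actual analytics figures).
visits_by_country = {
    "United States": 5200,
    "Canada": 1600,
    "India": 1200,
    "United Kingdom": 900,
    "Other charted countries": 1100,
}

charted_total = sum(visits_by_country.values())            # ignores the long tail of visitors
canada_share = visits_by_country["Canada"] / charted_total
print(f"Canada's share of charted traffic: {canada_share:.0%}")   # -> 16%
```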

[Chart: data.gc.ca visitor locations – December visits]

• Total visits since launch: 380,276 user sessions