Category Archives: public policy

Unstructured Thinking on Open Data: A response to Tom Slee

apologies for any typos, I’d like to look this over more, but I’ve got to get to other work.

Tom Slee has a very well written blog post with a critical perspective on open data. I encourage you to go and read it – but also to dive into the comments below it, which I think reflect one of the finer discussions I’ve seen below a blog post and raise many of the critiques I would have made of Tom’s post, but more articulately than I could have (and frankly, how often have you heard someone say that about comments?).

I start with all this because a) I think Tom and the team at Crooked Timber should be congratulated for fostering a fantastic online environment, and b) I want to be clear, before I dive in and disagree with Tom, that disagreement risks being confused with not liking or respecting him – which is definitely not the case. I don’t know him, so I can’t speak to his character, but I sense nothing but honest and good intentions in his post, and I definitely have a lot of respect for his mind. What I particularly respect is the calm and good-natured way he responds to comments – especially when he concedes that a line of reasoning was flawed.

This is particularly refreshing given that Tom’s original piece on this subject – that the open data movement was a joke, of which this piece is a much more refined version – was fairly harsh and dismissive in its tone. That early piece lays bare some of the underlying assumptions that are also embedded in his newer piece. Tom has a very clear political perspective. He believes in a world where big companies are bad and smaller organizations are good, even if that comes at the expense of a more effective or efficient service. Indeed, as one commenter notes (and I agree), Tom seems uncomfortable with the profit motive altogether. I don’t judge Tom for his perspective, but his concerns aren’t purely about open data – they are about a broader political agenda which open data may, or may not, end up serving. And to his credit, I think Tom is pretty clear about this. Open data is not a problem in and of itself to him; it is only a problem when it can be used to support an agenda he opposes – enhancing freer and more open markets and/or putting the state’s role in certain activities at risk.

There is, however, I would like to note, no discussion of the cost of closed data – or of the powerful interests, the open data movement’s free-market doppelgängers, that like to keep it that way. There is no acknowledgement of the enormous inequalities embedded in the status quo, where government data is controlled by government agents and – more often than not – sold to those who can pay for it. I have no doubt that open data will create new winners and losers – but let’s not pretend the status quo doesn’t support many winners and create big losers too. Our starting point is not neutral. This sentiment came out in a rather unfortunately terse statement from one commenter that, if one strips the emotion out of it, raises a good point:

Um, so you don’t like open data, huh? At the very best, your analysis is radically incomplete for it doesn’t include any of the costs imposed by closed data. Anyone who has had to interact with the land record bureaucracy in places like India will tell you about the costs in time, bribes, lost work hours it takes to navigate a closed data environment. Also, there are plenty of stories in the Indian media of how farmers won disputes with authorities using GIS data and exposed land-grabs (because the data is open).

I remember Aneesh Chopra sharing a similar story (sorry, I couldn’t find a better link). What I find frustrating is that open data advocates get accused of being techno-utopians, praising technology when things work and blaming officials when things go wrong… but Slee seems to be doing the same in reverse. When people use bribery to hijack a GIS process to expropriate land, I blame corruption, not open data. And when the same GIS system allows a different set of poor farmers to securitize their land, get loans and stop using loan sharks, I praise the officials who implemented it and the legal system for being effective (an important prerequisite for open data). Slee claims techno-utopians fetishize open data; that’s true, some of them do (and I have occasionally been guilty). But he fetishizes the evils of private interests (and, by extension, of open data).

The more subtle and interesting approach Slee takes is to equate open data – and in particular standards – with creative cultural products.

To maintain cultural diversity in the face of winner-take-all markets, governments in smaller countries have designed a toolbox of interventions. The contents include production subsidies, broadcast quotas, spending rules, national ownership, and competition policy. In general, such measures have received support from those with a left-leaning outlook.

The reason to do this is that it allows Slee to link open data to a topic I sense he really dislikes – globalization – and to explicitly connect open data to an industrial policy which, from what I can gather, he believes should protect local developers from companies like Google. This leads to fairly interesting ideas like: “if Google is not going to pay or negotiate with all those transit agencies (#40) then that’s fine by me: perhaps that will let some of those apps being developed by hobbyists at hackathons gain some usage within their own transit area.” It is worth noting that a hobbyist, by definition, doesn’t make a sustaining wage from their work, defeating the cultural products/local economy link. I’m personally aware of this, as I’ve had a number of hobby open data projects.

But what’s worse is that the scenario he describes is unlikely to emerge. I’m not sure citizens, or their transit authorities, will want to rely on a hobbyist for a service that I (and many others) see as critical. More likely, one or two companies will become the app maker of choice for most transit authorities (so Google doesn’t win, just some other new big company), or many transit authorities will build something in-house that is, frankly, of highly varied quality, since this is not their core competency. This world isn’t hard to imagine, since it was the world we lived in before Google Maps started sharing transit results.

Moreover, Slee’s vision will probably not reward local firms or punish large companies – it will just reward different ones. Indeed, it is part of their strategy. Apple just announced that it won’t support the General Transit Feed Specification in its new iOS 6 maps and will instead push users to an app. This should be great news for hobbyist developers. The likely outcome, however – as Clay Johnson notes – is that transit authorities will have less incentive to do open data since they can force users to use “their” app. Hard to see how local hobbyists benefit. In any of the scenarios outlined above, it is harder still to see how citizens benefit. Again, our transit future could look a lot like 2005. Indeed, one need only look at a city like Vienna to see the future. So yes, Google hurts, but to whose benefit? And it is worth noting that with smartphone penetration in the Western world increasing – higher still among Blacks and Hispanics than whites, and rising fastest among those making less than $30K – it is not clear to me that it is the wealthy and privileged who are paying the price as they drive home.

But from my reading of Slee’s article this is still a good outcome, since local creators are always better than those who live far away. Indeed, he considers the price small, calling it a mere loss of consumer efficiency and not a loss of civic improvement. I couldn’t disagree more. If figuring out how to catch the bus becomes harder, it has a material impact on my sense of the civic infrastructure, and definitely on my capacity to utilize it. Indeed, when it comes to real-time data like that in GTFS version two, research tells us it actually attracts more people to the bus.

But what’s frustrating is that the choice Slee presents is false. Yes, open data can help foster bigger players. Google is a giant. But doing open data (and more importantly, standardizing it) has also enabled local and vibrant competitors to emerge much more easily. Some are hobbyists, others are companies. Indeed, the total range of innovation that is possible because of the standard is amazing. I would strongly encourage you to read David Turner’s rebuttals in the comments (such as this one), which are especially strong around standardization and don’t receive the attention I think they warrant.
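Part of what makes the GTFS standard so generative is how simple it is to build on: a feed is just a zip of CSV files with well-known column names. As a minimal sketch (the tiny feed excerpt below is made up, but the file layout and field names match the real stop_times.txt spec), here is how a hobbyist app might list departures at a stop:

```python
import csv
import io

# A tiny, hypothetical excerpt of a GTFS stop_times.txt file. Real feeds
# published by transit agencies use exactly this CSV layout, which is what
# lets any hobbyist or company build on them.
STOP_TIMES_CSV = """trip_id,arrival_time,departure_time,stop_id,stop_sequence
trip_1,08:00:00,08:00:00,stop_A,1
trip_1,08:07:00,08:07:00,stop_B,2
trip_2,08:15:00,08:15:00,stop_A,1
"""

def departures_at_stop(feed_text, stop_id):
    """Return (trip_id, departure_time) pairs for one stop, in time order."""
    reader = csv.DictReader(io.StringIO(feed_text))
    rows = [r for r in reader if r["stop_id"] == stop_id]
    # HH:MM:SS strings sort correctly as plain text.
    rows.sort(key=lambda r: r["departure_time"])
    return [(r["trip_id"], r["departure_time"]) for r in rows]

print(departures_at_stop(STOP_TIMES_CSV, "stop_A"))
# [('trip_1', '08:00:00'), ('trip_2', '08:15:00')]
```

That a useful departure board is a dozen lines of standard-library code is the point: the standard, not any one company, is what lowers the barrier to entry.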

Again, I strongly encourage you to go read Slee’s piece. It is filled with important insights that open data advocates need to digest. His opening story around the Map Kibera project and the Dalit’s claim is an important case study in how not to engage in public policy or run a technology project in the developing world. The caveat, however, is that these lessons would be true whether the data in those projects was going to be open or closed. His broader point – that there are corporate, free-market players seeking to exploit the open data movement – should be understood and grappled with as well. However, I’d invite him to consider the reverse too. There are powerful corporate interests that benefit from closed data. He has his doppelgängers too.

I also want to note that I’m comfortable with many of Slee’s critiques of open data because I both share and don’t share his political goals. I definitely want to live in a world where we strive for equality of opportunity and where monopolies are hard to form or stay entrenched. I’m also concerned about the internet’s capacity to foster big winners that could become uncontrollable. I’m less interested, however, in a world that seeks to support middlemen who extract value while contributing very little. So I’m looking for places where open data can serve both these goals. That may mean that there is data that never gets made open – and I’m comfortable with that. But more importantly, my sense is that – in many circumstances – the right way to deal with these problems is to keep the barriers to entry for new entrants as low as possible.

Help OpenNorth Raise 10K to Improve Democracy and Engagement thru Tech

Some of you may know that I sit on the board of directors of OpenNorth – a cool little non-profit that is building tools for citizens, governments and journalists to improve participation and, sometimes, just make it a little bit easier to be a citizen. Here’s a great example of a simple tool they created that others are starting to use – Represent – a database that allows you to quickly figure out all the elected officials that serve the place where you are currently standing.
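To give a flavour of how a developer might use Represent: the lookup is point-based, so a client just needs a latitude/longitude pair. The sketch below builds the query URL; the base URL and “point” parameter reflect the public API as I understand it, so treat the exact shape as an assumption and check the project’s documentation before relying on it.

```python
from urllib.parse import urlencode

# Hypothetical client sketch for Represent's point lookup. The endpoint
# and parameter name are assumptions based on my reading of the API.
BASE = "https://represent.opennorth.ca/representatives/"

def representatives_url(lat, lon):
    """Build the lookup URL for all elected officials covering a point."""
    return BASE + "?" + urlencode({"point": f"{lat},{lon}"})

url = representatives_url(45.5088, -73.5878)  # roughly downtown Montreal
print(url)
# One would then fetch this URL and read the returned JSON, e.g.:
#   import json, urllib.request
#   data = json.load(urllib.request.urlopen(url))
```

The appeal is the same as with GTFS above: a single simple, open interface that any journalist’s or citizen’s tool can build on.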

As a humble non-profit OpenNorth runs on a shoestring, with a lot of volunteer participation. With that in mind we’d like to raise $10,000 this Canada Day. I’ve already donated $100.

The reason?

To sponsor our next project – Citizen Writes – which, inspired by the successful Parliament Watch in Germany, would allow citizens to publicly ask questions both of candidates during elections and of representatives in office. The German site has, since 2004, posed over 140,000 questions from everyday citizens, of which 80% have been answered by politicians. More importantly, such a tool could empower backbenchers, rebalancing power that is increasingly centralized in Canada.

I encourage you to check out our other projects too – I think you’ll find we are up to all sorts of goodness.

You can read more at the OpenNorth blog, or donate by going here.

The End of the World: The State vs. the Internet

Last weekend at FooCamp, I co-hosted a session titled “The End of the World: Will the Internet Destroy the State, or Will the State Destroy the Internet?” What follows are the ideas I opened with during my intro to the session, and some additional thoughts I’ve had and that others shared during the conversation. To avoid some confusion, I’d also like to clarify a) I don’t claim that these questions have never been raised before; I mostly hope that this framing can generate useful thought and debate; and b) I don’t believe these are the only two or three possible outcomes; it was just an interesting way of framing some poles so as to generate good conversation.

Introduction

A while back, I thought I saw a tweet from Evgeny Morozov that said something to the effect: “You don’t just go from printing press to Renaissance to iPad; there are revolutions and wars in between you can’t ignore.” Since I can’t find the tweet, maybe he didn’t say it or I imagined it… but it sparked a line of thinking.

Technology and Change

Most often, when people think of the printing press, they think of its impact on the Catholic Church – about how it enabled Martin Luther’s complaints to go viral and how the localization of the Bible cut out the need for a middleman – the priest – to connect and engage with God. But if the printing press undermined the Catholic Church, it had the opposite impact on the state. To be fair, heads of state took a beating (see French Revolution et al.), but the state itself was nimbler and made good use of the technology. Indeed, it is worth noting that the modern notion of the nation state was not conceivable without the printing press. The press transformed the state – scaling up its capacity to demand loyalty from citizens and mobilize resources, which, in turn, had an impact on how states related to (and fought) one another.

In his seminal book Imagined Communities, Benedict Anderson outlined how the printing press allowed the state to standardize language and history. In other words, someone growing up in Marseilles 100 years before the printing press probably had a very different sense of history, and spoke a markedly different dialect of French, than someone living in Paris during the same period. But the printing press (and more specifically, those who controlled it) allowed a dominant discourse to emerge (in this case, likely the Parisian one). Think standardized dictionaries, school textbooks and curricula, to say nothing of history and entertainment. This caused people who might never have met to share a common imagined history, language and discourse. Do not underestimate the impact this had on people’s identity. As this wonderful quote from the book states: “Ultimately it is this fraternity that makes it possible, over the past two centuries, for so many millions of people, not so much to kill, as willingly to die for such limited imaginings.” In other words, states could now fully dispense with feudal middle managers and harness the power of larger swaths of the population directly – a population that might never actually meet, but could nonetheless feel connected to one another. The printing press thus helped create the modern nation state by providing a form of tribalism at scale: what we now call nationalism. This was, in turn, an important ingredient for the wars that dominated the late 19th and first half of the 20th century – think World War I and World War II. This isn’t to say that without the printing press you don’t get war – we know that isn’t true – but the type of total war between 20th century nation states does have a direct line to the printing press.

So yes, the techno-utopian world of: printing press -> Renaissance -> iPad is not particularly accurate.

What you do get is: printing press -> Renaissance -> state evolution -> destabilization of international order -> significant bloodshed -> re-stabilization of international system -> iPad.

I raise all this because, if this is the impact the printing press had on the state, it raises a new question: what will be the impact of the internet on the state? Will the internet be a technology the state can harness to extract more loyalty from its citizens… or will the internet destroy the imagined communities that make the state possible, leaving it replaced by a more nimble, disruptive organization better able to survive the internet era?

Some Scenarios

Note: again, these scenarios aren’t absolutes or the only possibilities, they are designed to raise questions and provoke thinking.

The State Destroys the Internet

One possibility is that the state is as adaptive as capitalism. I’m always amazed at how capitalism has evolved over the centuries. From mercantilism to free market to social market to state capitalism, as a meme it readily adapts to new environments. One possibility is that the state is the same – sufficiently flexible to adapt to new conditions. Consequently, one can imagine the state grabbing sufficient control of the internet to turn it into a tool that at best enhances – and at worst doesn’t threaten – citizens’ connection to it. Iran, with its attempt to build a state-managed internal network that would allow it to closely monitor its citizens’ every move, is a scary example of the former. China – with its great firewall – may be an example of the latter. But one need not pick on non-Western states.

And a networked world will provide states – especially democratic ones – with lots of reasons to seize greater control of their citizens’ lives. From organized crime to terrorism to identity theft, governments find lots of reasons to monitor their citizens. This is to say nothing of advanced persistent threats, which create a state of continual online warfare – a sort of modern-day phoney (or phishy) war – between China, the United States, Iran and others. This may be the ultimate justification.

Indeed, as a result of these threats, the United States already has an extensive system for using the internet to monitor its own citizens and even my own country – Canada – tried to pass a law last year to significantly ramp up the monitoring of citizens online. The UK, of course, has just proposed a law whose monitoring provisions would make any authoritarian government squeal with glee. And just last week we found out that the UK government is preparing to cut a blank check for internet service providers to pay for installing the monitoring systems to record what its citizens do online.

Have no doubts, this is about the state trying to ensure the internet serves – or at least doesn’t threaten – its interests.

This is, sadly, the easiest future to imagine, since it conforms to the world we already know – one where states are ascendant. However, this future represents, in many ways, a linear projection of the present – and our world, especially our networked world, rarely behaves in a linear fashion. So we should be careful about confusing familiarity with probability.

The Internet Destroys the State

Another possibility is that the internet undermines our connection with the state. Online we become increasingly engaged with epistemic communities – be it social, like someone’s World of Warcraft guild, or professional, such as an association with a scientific community. Meanwhile, in the physical world, local communities – possibly at the regional level – become ascendant. In both cases, regulations and rules created by the state feel increasingly like an impediment to conducting our day to day lives, commerce and broader goals. Frustration flares, and increasingly someone in Florida feels less and less connection with someone in Washington state – and the common sense of identity, the imagined community, created by the state begins to erode.

This is, of course, hard for many people to imagine – especially Americans. But for many people in the world – including Canadians – the unity of the state is not a carefree assumption. There have been three referenda on breaking up Canada in my lifetime. More to the point, this process probably wouldn’t start in places where the state is strongest (such as in North America); rather, it would start in places where it is weakest. Think Somalia, Egypt (at the moment) or Belgium (which has basically functioned for two years without a government, and no one seemed to really notice). Maybe this isn’t a world with no state, but one with lots of little states (which, I think, breaks to a certain degree with our mold of what we imagine the state to be), or maybe some new organizing mechanism emerges – one which leverages local community identities but can co-exist with a network of diffuse but important transnational identities. Or maybe the organizing unit gets bigger, so that greater resources can be called upon to manage new network-based threats.

I, like most people, find this world harder to imagine. This is because so many of our assumptions suddenly disappear. If not the state, then what? Who or what protects and manages the internet infrastructure? What about other types of threats – corporate interests, organized and cyber-crime, etc.? This is true paradigm-shifting stuff (apologies for use of the word), and frankly, I still find myself too stuck in my Newtonian world and its rules to imagine, or even know, what quantum mechanics will be like. Again, I want to separate imagining the future from its probability. The two are not always connected, and this is why thinking about this future, as uncomfortable and alienating as it may be, is probably an important exercise.

The Internet Rewards the Corporation

One of the big assumptions I often find among people who write or talk about the internet is that the individual is the fundamental unit of analysis. There are good reasons for this – using social media, an individual’s capacity to be disruptive has generally increased. And, as Clay Shirky has outlined, the need for coordinating institutions and managers has greatly diminished. Indeed, Shirky’s blog post on the collapse of complex business models is (in addition to being a wonderful piece) a fantastic description of how a disruptive technology can undermine the capacity of larger complex players in a system and benefit smaller, simpler stakeholders. Of course, the smaller stakeholder in our system may not be the individual – it may be an actor that is smaller and nimbler than the state, that can foster an imagined community, and that can adopt various forms of marshaling resources, from self-organization to hierarchical management. Maybe it is the corporation.

During the conversation at FooCamp, Tim O’Reilly pressed this point with great effect. It could be that the corporation is actually the entity best positioned to adapt to the internet age. Small enough to leverage networks, big enough to generate a community that is actually loyal and engaged.

Indeed, it is easy to imagine a feedback loop that accelerates the ascendance of the corporation. If our imagined communities of nation states cannot withstand a world of multiple narratives and so become weaker, corporations would benefit not just from a greater capacity to adapt; the great counterbalance to their power – state regulation and borders – might simultaneously erode. A world where more and more power – through information, money and human capital – gets concentrated in corporations is not hard to imagine. Indeed, there are many who believe this is already our world. Of course, if the places (generally government bodies) where corporate conflicts – particularly those across sectors – can be mediated peacefully erode, then corporations may turn much more aggressive. The need to be bigger, to marshal more resources, to have a security division to defend corporate interests, could lead to corporations growing into entities we can barely imagine today. It’s a scary future, but not one that hasn’t been imagined several times in sci-fi novels, and not one I would put beyond the realm of imagination.

The End of the World

The larger point of all this is that new technologies do change the way we imagine our communities. A second and third order impact of the printing press was its critical role in creating the modern nation-state. The bigger question is, what will be the second and third order impacts of the internet – on our communities (real and imagined), our identity and where power gets concentrated?

As different as the outcomes above are, they share one important thing in common. None represent the status quo. In each case, the nature of the state, and its relationship with citizens, shifts. Consequently, I find it hard to imagine a future where the internet does not continue to put a real strain on how we organize ourselves, and in turn the systems we have built to manage this organization. Consequently, it is not hard to imagine that as more and more of those institutions – including potentially the state itself – come under strain, it could very likely push systems – like the international state system – that are presently stable into a place of instability. It is worth noting that after the printing press, one of the first real nation states – France – wreaked havoc on Europe for almost a half century, using its enhanced resources to conquer pretty much everyone in its path.

While I am fascinated by technology and believe it can be harnessed to do good, I like to think that I am not – as Evgeny labels them – a techno-utopian. We need to remember that, looking back on our history, the second and third order effects of some technologies can be highly destabilizing, which carries with it real risks of generating significant bloodshed and conflict. Hence the title of this blog post and the FooCamp session: The End of the World.

This is not a call for a renewed Luddite manifesto. Quite the opposite – we are on a treadmill we cannot get off. Our technologies have improved our lives, but they also create new problems that, very often, social innovations and other technologies will be needed to solve. Rather, I raise this because I believe it is important that still more people – particularly those in the Valley and other technology hubs (and not just military strategists) – think critically about what the potential second and third order effects of the internet, the web and the tools they are creating might be, so that they can contribute to the thinking around potential technological, social and institutional responses that could mitigate the worst outcomes.

I hope this helps prompt further thinking and discussion.

 

Open Postal Codes: A Public Response to Canada Post on how they undermine the public good

Earlier this week the Ottawa Citizen ran a story, in which I’m quoted, about a fight between Treasury Board and Canada Post officials over making postal code data open. Treasury Board officials would love to add it to data.gc.ca, while Canada Post officials are, to put it mildly, deeply opposed.

This is, of course, unsurprising, since Canada Post recently launched a frivolous lawsuit against a software developer who is – quite legally – recreating the postal code data set. For those new to this issue, I blogged about it – why postal codes matter, and the weakness (and incompetence) of Canada Post’s legal case – here.
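It is worth pausing on how mundane the “recreation” being sued over actually is: postal codes follow a fixed, publicly known letter-digit pattern (A1A 1A1), so a crowdsourced effort can apply a simple format check to submissions before aggregating them. The sketch below is a hypothetical illustration of such a check – it validates shape only, not whether a given code is actually assigned by Canada Post.

```python
import re

# Canadian postal codes follow the "A1A 1A1" letter-digit alternation.
# A crowdsourced recreation effort could use a format check like this to
# reject obviously malformed submissions before aggregating them.
# (Shape only: this does not verify a code is actually in use.)
POSTAL_RE = re.compile(r"^[A-Z]\d[A-Z] ?\d[A-Z]\d$")

def looks_like_postal_code(code):
    """Return True if the string has the A1A 1A1 shape (space optional)."""
    return bool(POSTAL_RE.match(code.strip().upper()))

print(looks_like_postal_code("V6B 1A1"))  # True
print(looks_like_postal_code("12345"))    # False
```

Pairing checks like this with many independent submissions per code is precisely how a community can converge on accurate data without ever copying Canada Post’s database.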

But this new Ottawa Citizen story had me rolling my eyes anew – especially after reading the quotes and text from the Canada Post spokesperson. This is in no way an attack on the spokesperson, who I’m sure is a nice person. It is an attack on their employer, whose position, sadly, is in opposition to the public interest not just because of the outcome it generates but because of the way it treats citizens. Let me break down Canada Post’s public statement line by line, in order to spell out how it undermines the public interest, public debate and accountability.

Keeping the information up-to-date is one of the main reasons why Canada Post needs to charge for it, said Anick Losier, a spokeswoman for the crown corporation, in an interview earlier this year. There are more than 250,000 new addresses and more than a million address changes every year and they need the revenue generated from selling the data to help keep the information up-to-date.

So what is interesting about this is that – as far as I understand – it is not Canada Post that actually generates most of this data. It is local governments that are responsible for creating address data and, ironically, they are required to share it for free with Canada Post. So Canada Post’s data set is itself built on data that it receives for free. It would be interesting for cities to suddenly claim that they needed to engage in “cost-recovery” as well and start charging Canada Post. At some point you recognize that a public asset is a public asset and that it is best leveraged when widely adopted – something Canada Post’s “cost-recovery” prevents. Indeed, what Canada Post is essentially saying is that it is okay for it to leverage the work of other governments for free, but it isn’t okay for the public to leverage its works for free. Ah, the irony.

“We need to ensure accuracy of the data just because if the data’s inaccurate it comes into the system and it adds more costs,” she said.

“We all want to make sure these addresses are maintained.”

So, of course, do I. That said, the statement makes it sound like there is a gap between Canada Post – which is interested in the accuracy of the data – and everyone else, who isn’t. I can tell you, as someone who has engaged with non-profits and companies that make use of public data, no one is more concerned about the accuracy of data than those who reuse it. That’s because when you make use of public data and share the results with the public or customers, they blame you, not the government source from which you got the data, for any problems or mistakes. So, invariably, one thing that happens when you make data open is that you have more stakeholders with strong interests in ensuring the data is accurate.

But there is also something subtly misleading about Canada Post’s statement. At the moment, the only reason there is inaccurate data out there is that people are trying to find cheaper ways of creating the postal code data set, and so are willing to tolerate less accurate data in order to not have to pay Canada Post. If (and that is a big if) Canada Post’s main concern were accuracy, then making the data open would be the best protection, as it would eliminate less accurate versions of postal code data. Indeed, this suggests a failure to understand the economics. Canada Post states that other parts of its business become more expensive when postal code data is inaccurate. That would suggest that providing free data might help reduce those costs – incenting people to create inaccurate postal code data by charging for it may be hurting Canada Post more than anyone else. But we can’t assess that, for reasons I outline below. And ultimately, I suspect Canada Post’s main interest is not accuracy but cost recovery – but that doesn’t sound nearly as good as talking about accuracy or quality, so they try to shoehorn those ideas into their argument.

She said the data are sold on a “cost-recovery” basis but declined to make available the amount of revenue it brings in or the amount of money it costs the Crown corporation to maintain the data.

This is my favourite part. Basically, a crown corporation, whose assets belong to the public, won’t reveal the cost of a process over which it has a monopoly. Let’s be really clear: this is not like other parts of their business where there are competitive risks in releasing information – Canada Post is a monopoly provider. Instead, we are being patronized and essentially asked to buzz off. There is no accountability, and there is no reason why they couldn’t give us these numbers. Indeed, the total disdain for the public is so appalling it reminds me of why I opted out of junk mail and moved my bills to email and auto-pay ages ago.

This matters because the “cost-recovery” issue goes to the heart of the debate. As I noted above, Canada Post gets the underlying address data for free. That said, there is no doubt that it then adds some value to the data by adding postal codes. The question is whether that value is best recouped through cost recovery at this point in the value chain, or at later stages through additional economic activity (and thus greater tax revenue). This debate would be easier to have if we knew the scope of the costs. Does creating postal code data cost Canada Post $100,000 a year? A million? Ten million? We don’t know, and they won’t tell us. There are real economic benefits to be had in a digital economy where postal code data is open, but Canada Post prevents us from having a meaningful debate since we can’t find out the tradeoffs.

In addition, it also means that we can't assess whether there are disruptive ways in which postal code data could be generated vastly more efficiently. Canada Post has no incentive (quite the opposite, actually) to generate this data more efficiently and thereby make the “cost recovery” much, much lower. It may be that creating postal code data really is a $100,000-a-year problem, with the right person and software working on it.

So in the end, a government-owned Crown corporation not only refuses to do something that might help spur Canada's digital economy – make postal code data open – it refuses to even engage in a legitimate public policy debate. For an organization that is fighting to find its way in the 21st century, it is a pretty ominous sign.

* As an aside, in the Citizen article it says that I’m an open government activist who is working with the federal government on the website’s development. The first part – on activism – is true. The latter half, that I work on the open government website’s development, is not. The confusion may arise from the fact that I sit on the Treasury Board’s Open Government Advisory Panel, for which I’m not paid, but am asked for feedback, criticism and suggestions – like making postal code data open – about the government’s open government and open data initiatives.

The US Government's Digital Strategy: The New Benchmark and Some Lessons

Last week the White House launched its new roadmap for digital government. This included the publication of Digital Government: Building a 21st Century Platform to Better Serve the American People (PDF version), the issuing of a Presidential directive and the announcement of White House Innovation Fellows.

In other words, it was a big week for those interested in digital and open government. Having had some time to digest these documents and reflect upon them, below are some thoughts on these announcements and lessons I hope governments and other stakeholders take from them.

First off, the core document – Digital Government: Building a 21st Century Platform to Better Serve the American People – is a must read if you are a public servant thinking about technology or even about program delivery in general. In other words, if your email has a .gov in it or ends in something like .gc.ca you should probably read it. Indeed, I’d put this document right up there with another classic must read, The Power of Information Taskforce Report commissioned by the Cabinet Office in the UK (which if you have not read, you should).

Perhaps most exciting to me is that this is the first time I’ve seen a government document clearly declare something I’ve long advised governments I’ve worked with: data should be a layer in your IT architecture. The problem is nicely summarized on page 9:

Traditionally, the government has architected systems (e.g. databases or applications) for specific uses at specific points in time. The tight coupling of presentation and information has made it difficult to extract the underlying information and adapt to changing internal and external needs.

Oy. Isn’t that the case. Most government data is captured in an application and designed for a single use. For example, say you run the license renewal system. You update your database every time someone wants to renew their license. That makes sense, because that is what the system was designed to do. But maybe you’d like to track, in real time, how frequently the database changes, and by whom. Whoops. The system wasn’t designed for that, because that wasn’t needed in the original application. Of course, being able to present the data in that second way might be a great way to assess how busy different branches are, so you could warn prospective customers about wait times. Now imagine this lost opportunity… and multiply it by a million. Welcome to government IT.
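To make the decoupling idea concrete, here is a toy sketch (not any government's actual system – the branch names and functions are invented for illustration) of the license renewal example above: the renewals live in a shared data layer, so a second "presentation" – say, a branch-activity view for a wait-time dashboard – can be built without touching the renewal application at all.

```python
import datetime
import json

# The shared data layer: every renewal is recorded as structured data,
# independent of any one application's presentation needs.
RENEWALS = []

def record_renewal(branch, licence_id, when=None):
    """Write to the data layer; the renewal app is just one client of it."""
    RENEWALS.append({
        "branch": branch,
        "licence_id": licence_id,
        "timestamp": (when or datetime.datetime.now()).isoformat(),
    })

def renewals_by_branch():
    """A second presentation of the same data: activity counts per branch,
    the kind of view a wait-time dashboard would need."""
    counts = {}
    for record in RENEWALS:
        counts[record["branch"]] = counts.get(record["branch"], 0) + 1
    return counts

# The renewal "application" writes...
record_renewal("Main St", "A123")
record_renewal("Main St", "B456")
record_renewal("5th Ave", "C789")

# ...and a completely separate consumer reads the same layer.
print(json.dumps(renewals_by_branch()))
```

The point is not the ten lines of Python; it is that when the data sits in its own layer, the second use case is a new function, not a new system.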

Decoupling data from application is pretty much the first thing in the report. Here’s my favourite chunk from it (italics mine, to note the extra favourite part).

The Federal Government must fundamentally shift how it thinks about digital information. Rather than thinking primarily about the final presentation—publishing web pages, mobile applications or brochures—an information-centric approach focuses on ensuring our data and content are accurate, available, and secure. We need to treat all content as data—turning any unstructured content into structured data—then ensure all structured data are associated with valid metadata. Providing this information through web APIs helps us architect for interoperability and openness, and makes data assets freely available for use within agencies, between agencies, in the private sector, or by citizens. This approach also supports device-agnostic security and privacy controls, as attributes can be applied directly to the data and monitored through metadata, enabling agencies to focus on securing the data and not the device.

To help, the White House provides a visual guide for this roadmap. I’ve pasted it below. However, I’ve taken the liberty to highlight how most governments try to tackle open data on the right – just so people can see how different the White House’s approach is, and why this is not just an issue of throwing up some new data but a total rethink of how government architects itself online.

There are, of course, a bunch of things that flow out of the White House’s approach that are not spelled out in the document. The first and most obvious is that once you make data an information layer, you have to manage it directly. This means that data starts to be seen and treated as an asset – which means understanding who the custodian is and establishing a governance structure around it. This is something that, previously, really only libraries and statistical bureaus have understood (and sometimes not even them!).

This is the dirty secret about open data: to do it effectively you actually have to start treating data as an asset. For the White House, the benefit of taking that view of data is that it saves money. Creating a separate information layer means you don’t have to duplicate it for all the different platforms you have. In addition, it gives you more flexibility in how you present it, meaning the cost of showing information on different devices (say, computers vs. mobile phones) should also drop. Cost savings and increased flexibility are the real drivers. Open data becomes an additional benefit. This is something I explore in deeper detail in a blog post from July 2011: It’s the icing, not the cake: key lesson on open data for governments.

Of course, having a cool model is nice and all, but, like the previous directive on open government, this document has hard requirements designed to force departments to begin shifting their IT architecture quickly. Check out this interesting tidbit from the doc:

While the open data and web API policy will apply to all new systems and underlying data and content developed going forward, OMB will ask agencies to bring existing high-value systems and information into compliance over a period of time—a “look forward, look back” approach. To jump-start the transition, agencies will be required to:

  • Identify at least two major customer-facing systems that contain high-value data and content;
  • Expose this information through web APIs to the appropriate audiences;
  • Apply metadata tags in compliance with the new federal guidelines; and
  • Publish a plan to transition additional systems as practical

Note the language here. This is again not a “let’s throw some data up there and see what happens” approach. I endorse doing that as well, but here the White House is demanding that departments be strategic about the data sets/APIs they create. Locate a data set that you know people want access to. This is easy to assess. Just look at pageviews, or go over FOIA/ATIP requests and see what is demanded the most. This isn’t rocket science – do what is in most demand first. But you’d be surprised how few governments want to serve up data that is in demand.
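As a rough sketch of what the requirements above amount to in practice – expose a high-value data set through a web API, with metadata attached – here is what such an API response might look like. Everything here (the publisher, field names and licence string) is invented for illustration; the actual federal metadata guidelines define their own fields.

```python
import json

def dataset_response(records):
    """Wrap raw records with metadata so consumers (other agencies, the
    private sector, citizens) can discover provenance, licensing and scope
    machine-readably, rather than scraping a web page."""
    return {
        "metadata": {
            "publisher": "Example Agency",        # illustrative value
            "updated": "2012-05-23",              # illustrative value
            "license": "open-government-licence", # illustrative value
            "record_count": len(records),
        },
        "data": records,
    }

# A hypothetical high-value, customer-facing data set: office wait times.
payload = dataset_response([
    {"office": "Springfield", "wait_minutes": 12},
    {"office": "Shelbyville", "wait_minutes": 34},
])
print(json.dumps(payload, indent=2))
```

The design choice worth noticing is that the metadata travels with the data in one structured payload, which is what lets security and privacy attributes be "applied directly to the data" as the report describes, rather than to whatever device or page happens to display it.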

Another interesting inference one can draw from the report is that its recommendations embrace the possibility that participants outside of government – both for-profit and non-profit – can build services on top of government information and data. Referring back to the chart above, see how the Presentation Layer includes both private and public examples? Consequently, a non-profit’s website dedicated to, say, job info for veterans could pull live data and information from various Federal Government websites, weave it together and present it in the way that is most helpful to the veterans it serves. In other words, the opportunity for innovation is fairly significant. This also has two additional repercussions. It means that services the government does not currently offer – at least in a coherent way – could be woven together by others. It also means there may be information and services the government simply never chooses to develop a presentation layer for – it may simply rely on private or non-profit sector actors (or other levels of government) to do that for it. This has interesting political ramifications, in that it could allow the government to “retreat” from presenting these services and rely on others. There are definitely circumstances where this would make me uncomfortable, but the solution is not to avoid architecting the system this way; it is to ensure that such programs are funded in a way that ensures government involvement in all aspects – information, platform and presentation.

At this point I want to interject two tangential thoughts.

First, if you are wondering why your government is not doing this – be it at the local, state or national level – here’s a big hint: this is what happens when you make the CIO an executive who reports at the highest level. You’re just never going to get innovation out of your government’s IT department if the CIO reports into the fricking CFO. All that tells me is that IT is a cost centre that should be focused on sustaining itself (e.g. keeping computers on) and that you see IT as having no strategic relevance to government. In the private sector, in the 21st century, this is pretty much the equivalent of committing suicide for most businesses. For governments… making CIOs report into CFOs is considered a best practice. I’ve more to say on this, but I’m taking a deep breath and am going to move on.

Second, I love how clear the document is on milestones – and how nicely they are visualized. It may be my poor memory, but I feel it is rare to read a government roadmap on any issue where the milestones are so clearly laid out.

It’s particularly nice when a government treats its citizens as though they can understand something like this, and isn’t afraid to be held accountable for a plan. I’m not saying that other governments don’t set out milestones (some do; many, however, do not). But often these deadlines are buried in reams of text. Here is a simple scorecard any citizen can look at. Of course, last time around, after the open government directive was issued immediately after Obama took office, they updated these scorecards for each department, highlighting whether milestones were green, yellow or red, depending on how the department was performing. All in front of the public. Not something I’ve ever seen in my country, that’s for sure.

Of course, the document isn’t perfect. I was initially intrigued to see the report advocates that the government “Shift to an Enterprise-Wide Asset Management and Procurement Model.” Most citizens remain blissfully unaware of just how broken government procurement is. Indeed, I say this, dear reader, with no idea where you live and who your government is, but I am enormously confident your government’s procurement process is totally screwed. And I’m not just talking about when they try to buy fighter planes. I’m talking pretty much all procurement.

Today’s procurement is perfectly designed to serve one group: big (IT) vendors. The process is so convoluted and so complicated that they are really the only ones with the resources to navigate it. The White House document essentially centralizes procurement further. On the one hand this is good: it means the requirements around platforms and data noted in the document can be more readily enforced. Basically, the centre is asserting more control at the expense of the departments. And yes, there may be some economies of scale that benefit the government. But the truth is that whenever procurement decisions get bigger, so too do the stakes, and so too does the process surrounding them. Thus there is a tiny handful of players that can respond to any RFP, and a real risk that the government ends up in a duopoly (kind of like with defense contractors). There is some wording around open source solutions that helps address some of this, but ultimately it is hard to see how the recommendations are going to really alter the quagmire that is government procurement.

Of course, these are just some thoughts and comments that struck me and that I hope, those of you still reading, will find helpful. I’ve got thoughts on the White House Innovation Fellows especially given it appears to have been at least in part inspired by the Code for America fellowship program which I have been lucky enough to have been involved with. But I’ll save those for another post.

My LRC Review of "When the Gods Changed" and other recommended weekend readings

This week, the Literary Review of Canada published my and Taylor Owen’s review of When the Gods Changed: The Death of Liberal Canada by Peter C. Newman. For non-Canadians: Peter Newman is pretty much a legend when it comes to covering Canadian history and politics; he was editor of the country’s largest newspaper and main news magazine and has published over 35 books. I also think the review will be of interest to non-Canadians, since the dynamics behind the decline of Liberal Canada hold true for a number of other countries experiencing more polarized politics.

Some other articles I’ve been digesting that I recommend for some Friday or weekend reading:

Why China’s Political Model Is Superior

This one is a couple of months old, but it doesn’t matter. Fascinating read. For one it shows the type of timelines that the Chinese look at the world with. Hint. It is waaayyyy longer than ours. Take a whiff:

In Athens, ever-increasing popular participation in politics led to rule by demagogy. And in today’s America, money is now the great enabler of demagogy. As the Nobel-winning economist A. Michael Spence has put it, America has gone from “one propertied man, one vote; to one man, one vote; to one person, one vote; trending to one dollar, one vote.” By any measure, the United States is a constitutional republic in name only.

Unattractive Real Estate Agents Achieve Quicker Sales

Before getting serious on you again, here’s a lighter, more interesting note. I often comment in talks I give that real estate agents rarely use data to attract clients – mostly just pictures of themselves. Turns out… there might be more data in that than I thought! Apparently less attractive agents sell homes faster and work harder. More attractive agents take longer, but get more money. Food for thought here.

Andrew Coyne: Question isn’t where conservatism is going, but where has it gone

Another oldie but a goody. Liberal Canada may be dead, but it appears that Conservative Canada isn’t in much better shape. I’ve always enjoyed Coyne and feel like he’s been sharper than usual of late (since moving back to the National Post). For Americans, there may be some interesting lessons in here for the Tea Party movement. Canada experienced a much, much lighter form of conservative rebellion with the creation of the Reform Party in the late 80s/early 90s, which split off from establishment conservatives. Today, that group is now in power (rebranded), but Coyne assesses that much of what they do has been watered down. But not everything… on to the next two articles!

Environmental charities ‘laundering’ foreign funds, Kent says

Sadly, Canada’s “Environment” Minister is spending most of his time attacking environmental groups. The charge is that they use US money to engage in advocacy against a pipeline to be built in Canada. Of course, “laundering” is a serious charge (it implies illegal activity), and given how quick the Conservatives have been to sue opponents for libel, Kent had better be careful that these stakeholders don’t adopt the same tactic. This is probably why he doesn’t name any groups in particular (clever!). My advice is that all the groups named by the Senate committee should sue him; then, to avoid the lawsuits, he’d have to either a) back down from the claim altogether, or b) be specific about which group he is referring to, to have the other suits thrown out. Next headline… to the double standard!

Fraser Institute co-founder confirms ‘years and years’ of U.S. oil billionaires’ funding

Some nifty investigative work here by a local Vancouver reporter finds that while the Canadian government believes it is bad for environmental groups to receive US funds for advocacy, it is, apparently, completely okay for conservative groups to receive sums of up to $1.7M from US oil billionaires. Ethical Oil – another astroturf pro-pipeline group – does something similar. It receives money from Canadian law firms that represent the American and Chinese oil interests that stand to benefit. But that money is labelled “Canadian” because it is washed through Canadian law firms. Confused? You should be.

What retail is hired to do: Apple vs. IKEA

I love that Clay Christensen is on Twitter. The Innovator’s Dilemma is a top-5 book of all time for me. Here is a great breakdown of how IKEA and Apple stores work. Most intriguing is the unique value proposition/framing their stores offer consumers, which explains their phenomenal success as well as why they are so rarely imitated.

Open Data Movement is a Joke?

Yesterday, Tom Slee wrote a blog post called “Why the ‘Open Data Movement’ is a Joke,” which – and I say this as a Canadian who understands the context in which Slee is writing – is filled with valid complaints about our government, but which I feel paints a flawed picture of the open data movement.

Evgeny Morozov tweeted about the post yesterday, thereby boosting its profile. I’m a fan of Evgeny. He is an exceedingly smart and critical thinker on the intersection of technology and politics. He is exactly what our conversation needs (unlike, say, Andrew Keen). I broadly felt his comments (posted via his Twitter stream) were both on target: we need to think critically about open data; and lacking nuance: it is possible for governments to simultaneously become more open and more closed on different axes. I write all this confident that Evgeny may turn his ample firepower on me, but such is life.

So, a few comments on Slee’s post:

First, the insinuation that the open data movement is irretrievably tainted by corporate interests is so over the top it is hard to know where to begin to respond. I’ve been advocating for open data for several years in Canada. Frankly, it would have been interesting and probably helpful if a large Canadian corporation (or even a medium sized one) took notice. Only now, maybe 4-5 years in, are they even beginning to pay attention. Most companies don’t even know what open data is.

Indeed, the examples of corporate open data “sponsors” that Slee cites are U.S. corporations, sponsoring U.S. events (the Strata conference) and nonprofits (Code for America – with which I have been engaged). Since Slee is concerned primarily with the Canadian context, I’d be interested to hear his thoughts on how these examples compare to Canadian corporate involvement in open data initiatives – or even foreign corporations’ involvement in Canadian open data.

And not to travel too far down the garden path on this, but it’s worth noting that the corporations that have jumped on the open data bandwagon in the US often have two things in common. First, their founders are bona fide geeks, who in my experience are both interested in hard data as an end unto itself (they’re all about numbers and algorithms), and want to see government-citizen interactions – and internal governmental interactions, too – made better and more efficient. Second, of course they are looking after their corporate interests, but they know they are not at the forefront of the open data movement itself. Their sponsorship of various open data projects may well have profit as one motive, but they are also deeply interested in keeping abreast of developments in what looks to be a genuine Next Big Thing. For a post that Evgeny sees as being critical of open data, I find all this deeply uncritical. Slee’s post reads as if anything that is touched by a corporation is tainted. I believe there are both opportunities and risks. Let’s discuss them.

So, who has been advocating for open data in Canada? Who, in other words, comprises the “open data movement” that Slee argues doesn’t really exist – and that “is a phrase dragged out by media-oriented personalities to cloak a private-sector initiative in the mantle of progressive politics”? If you attend one of the hundreds of hackathons that have taken place across Canada over the past couple of years – like those that have happened in Vancouver, Regina, Victoria, Montreal and elsewhere – you’ll find they are generally organized in hackspaces and by techies interested in ways to improve their community. In Ottawa, which I think does the best job, they can attract hundreds of people, many of whom bring spouses and kids as they work on projects they think will be helpful to their community. While some of these developers hope to start businesses, many others try to tackle issues of public good, and/or try to engage non-profits to see if there is a way they can channel their talent and the data. I don’t for a second pretend that these participants are a representative cross-section of Canadians, but by and large the profile has been geek, technically inclined, leaning left, and socially minded. There are many who don’t fit that profile, but that is probably the average.

Second, I completely agree that this government has been one of the most – if not the most – closed and controlling in Canada’s history. I, like many Canadians, echo Slee’s frustration. What’s worse is that I don’t see things getting better. Canadian governments have been getting more centralized and controlling since at least Trudeau, and possibly earlier (indeed, I believe polling and television have played a critical role in driving this trend). Yes, the government is co-opting the language of open data in an effort to appear more open. All governments co-opt language to appear virtuous. Be it on the environment, social issues or… openness, no government is perfect and indeed, most are driven by multiple, contradictory goals.

As a member of the Federal Government’s Open Government Advisory Panel, I wrestle with this challenge constantly. I try hard to embed some openness into the DNA of government. I may fail. I know that I won’t succeed in all ways, but hopefully I can move the rock in the right direction a little bit. It’s not perfect, but then it’s pretty rare that anything involving government is. In my (unpaid, advisory, non-binding) role I’ve voiced that the government should provide the Access to Information Commissioner with a larger budget (they cut it) and that they should enable government scientists to speak freely (they have not, so far). I’ve also advocated that they provide more open data. There they have delivered some, including data sets that I think are important – such as aid data (which is always at risk of being badly spent). For some, it isn’t enough. I’d like there to be more open data sets available, and I appreciate those (like Slee – who I believe is writing from a place of genuine care and concern) who are critical of these efforts.

But, to be clear, I would never equate open government data as being tantamount to solving the problems of a restrictive or closed government (and have argued as much here). Just as an authoritarian regime can run on open-source software, so too might it engage in open data. Open data is not the solution for Open Government (I don’t believe there is a single solution, or that Open Government is an achievable state of being – just a goal to pursue consistently), and I don’t believe anyone has made the case that it is. I know I haven’t. But I do believe open data can help. Like many others, I believe access to government information can lead to better informed public policy debates and hopefully some improved services for citizens (such as access to transit information). I’m not deluded into thinking that open data is going to provide a steady stream of obvious “gotcha moments” where government malfeasance is discovered, but I am hopeful that government data can arm citizens with information that the government is using to inform its decisions so that they can better challenge, and ultimately help hold accountable, said government.

Here is where I think Evgeny’s comments on the problem with the discourse around “open” are valid. Open Government and Open Data should not be used interchangeably. And this is an issue Open Government and Open Data advocates wrestle with. Indeed, I’ve seen a great deal of discussion and reflection come as a result of papers such as this one.

Third, the arguments around StatsCan all feel deeply problematic. I say this as the person who wrote the first article (that I’m aware of) about the long-form census debacle in a major media publication, and who has been consistently and continuously critical of it. This government disliked Statistics Canada (and evidence) long before open data was in its vocabulary, to say nothing of being a policy interest. StatsCan was going to be a victim of dramatic cuts regardless of Canada’s open data policy – so it is misleading to claim that one would “much rather have a fully-staffed StatsCan charging for data than a half-staffed StatsCan providing it for free.” (That quote comes from Slee’s follow-up post, here.) That was never the choice on offer. Indeed, even if it had been, it wouldn’t have mattered. The total cost of making StatsCan data open is said to have been $2 million; this is a tiny fraction of the payroll costs of the 2,500 people they are looking to lay off.

I’d actually go further than Slee here, and repeat something I say all the time: data is political. There are those who, naively, believed that making data open would depoliticize policy development. I hope there are situations where this might be true, but I’ve never taken that for granted or assumed as much: quite the opposite. In a world where data increasingly matters, it is increasingly going to become political. Very political. I’ve been saying this to the open data community for several years, and indeed it was a warning I made in the closing part of my keynote at the Open Government Data Camp in 2010. All this has, in my mind, little to do with open data. If anything, having data made open might increase the number of people who are aware of what is, and is not, being collected and used to inform public policy debates. Indeed, if StatsCan had made its data open years ago, it might have had a larger constituency to fight on its behalf.

Finally, I agree with the Nat Torkington quote in the blog post:

Obama and his staff, coming from the investment mindset, are building a Gov 2.0 infrastructure that creates a space for economic opportunity, informed citizens, and wider involvement in decision making so the government better reflects the community’s will. Cameron and his staff, coming from a cost mindset, are building a Gov 2.0 infrastructure that suggests it will be more about turning government-provided services over to the private sector.

Moreover, it is possible for a policy to have two different possible drivers. It can even have multiple contradictory drivers simultaneously. In Canada, my assessment is that the government doesn’t have this level of sophistication around its thinking on this file, a conclusion I more or less wrote when assessing their Open Government Partnership commitments. I have no doubt that the Conservatives would like to turn government-provided services over to the private sector, but open data has so far not been part of that strategy. In either case, there is, in my mind, a policy infrastructure that needs to be in place to pursue either of these goals (such as having a data governance structure in place). But from a more narrow open data perspective, my own feeling is that making the data open has benefits for public policy discourse, public engagement, and the economy. Indeed, making more government data available may enable citizens to fight back against policies they feel are unacceptable. You may not agree with all the goals of the Canadian government – as someone who has written at least 30 op-eds in various papers outlining problems with various government policies, neither do I – but I see the benefits of open data as real and worth pursuing, so I advocate for it as best I can.

So in response to the opening arguments about the open data movement…

It’s not a movement, at least in any reasonable political or cultural sense of the word.

We will have to agree to disagree. My experience is quite the opposite. It is a movement. One filled with naive people, with skeptics, with idealists focused on accountability, developers hoping to create apps, conservatives who want to make government smaller and progressives who want to make it more responsive and smarter. There was little in the post that persuaded me there wasn’t a movement. What I did hear is that the author didn’t like some parts of the movement and its goals. Great! Please come join the discussion; we’d love to have you.

It’s doing nothing for transparency and accountability in government,

To say it is doing nothing for transparency seems problematic. I need only cite one data set now open to show that isn’t true. And certainly the publication of aid data, procurement data, voting records and the Hansard are examples of places where it may be making government more transparent and accountable. What I think Slee is claiming is that open data isn’t transforming the government into a model of transparency and accountability, and he’s right. It isn’t. I don’t think anyone claimed it would. Nor do I think the public has been persuaded that because it does open data, the government is somehow open and transparent. These are not words the Canadian public associates with this government no matter what it does on this file.

It’s co-opting the language of progressive change in pursuit of what turns out to be a small-government-focused subsidy for industry.

There are a number of sensible, critical questions in Slee’s blog post. But this is a ridiculous charge. Prior to the data being open, you had an asset that was paid for by taxpayer dollars, then charged for at a premium that created a barrier to access. Of course, this barrier was easiest to surmount for large companies and wealthy individuals. If there was a subsidy for industry, it was under the previous model, as it effectively had the most regressive tax for access of any government service.

Indeed, probably the biggest beneficiaries of open data so far have been Canada’s municipalities, which have been able to gain access to much more data than they previously could, and have saved a significant amount of money (Canadian municipalities are chronically underfunded). And of course, when looking at the most downloaded data sets from the site, it would appear that non-profits and citizens are making good use of them. For example, the 6th most downloaded was the Anthropogenic disturbance footprint within boreal caribou ranges across Canada, used by many environmental groups; number 8 was weather data; 9th was Sales of fuel used for road motor vehicles, by province and territory, used most frequently to calculate greenhouse gas emissions; and 10th the Government of Canada Core Subject Thesaurus – used, I suspect, to decode the machinery of government. Most of the other top downloaded data sets related to immigration, used, it appears, to help applicants. Hard to see the hand of big business in all this, although if open data helped Canada’s private sector become more efficient and productive, I would hardly complain.

If you’re still with me, thank you – I know that was a long slog.

Public Policy: The Big Opportunity For Health Record Data

A few weeks ago Colin Hansen – a politician in the governing party in British Columbia (BC) – penned an op-ed in the Vancouver Sun entitled Unlocking our data to save lives. It’s a paper both the current government and opposition should read, as it is filled with some very promising ideas.

In it, he notes that BC has one of the best collections of health data anywhere in the world and that data mining these records could yield patterns – like longitudinal adverse effects when drugs are combined, or correlations between diseases – that could save billions as well as improve health care outcomes.

He recommends that the province find ways to share this data with researchers and academics in ways that ensure the privacy of individuals is preserved. While I agree with the idea, one thing we’ve learned in the last 5 years is that, as good as academics are, the wider public is often much better at identifying patterns in large data sets. So I think we should think bolder. Much, much bolder.

Two years ago California-based Heritage Provider Network, a company that runs hospitals, launched a $3-million predictive health contest that will reward the team that, within three years, creates the algorithm that best predicts how many days a patient will spend in a hospital in the next year. Heritage believes that, armed with such an algorithm, it can create strategies to reach patients before emergencies occur and thus reduce the number of hospital stays. As the company puts it: “This will result in increasing the health of patients while decreasing the cost of care.”
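For what it’s worth, contests like this live and die by their scoring rule. The Heritage prize scored entries, as I recall, with a root-mean-squared-logarithmic-error style metric; here is a minimal sketch of how such a scorer works (the metric details and the data below are my own illustration, not the contest’s official rules):

```python
import math

def rmsle(predicted, actual):
    """Root mean squared logarithmic error: penalizes relative error,
    so missing by 1 day matters more for a 1-day stay than a 20-day one."""
    assert len(predicted) == len(actual)
    total = sum(
        (math.log1p(p) - math.log1p(a)) ** 2
        for p, a in zip(predicted, actual)
    )
    return math.sqrt(total / len(predicted))

# Two hypothetical entries scored against actual days spent in hospital
actual = [0, 2, 5, 10]
entry_a = [0, 1, 6, 12]   # close on every patient
entry_b = [3, 3, 3, 3]    # a constant guess
assert rmsle(entry_a, actual) < rmsle(entry_b, actual)
```

A public-sector version of the contest would need exactly this kind of objective, automatable scorer so that thousands of entries could be ranked without manual review.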

Of course, the algorithm that Heritage acquires through this contest will be proprietary. The company will own it and can choose whom to share it with. But a similar contest run by BC (or, say, the VA in the United States) could create a public asset. Why would we care if others made their healthcare systems more efficient, as long as we got to as well? We could create a public good, as opposed to Heritage’s private asset. More importantly, we need not offer a prize of $3 million. Several contests with prizes of $10,000 would likely yield a number of exciting results. Thus, for very little money, we might help revolutionize BC’s healthcare system, and possibly Canada’s and even the world’s. It is an exciting opportunity.

Of course, the big concern in all of this is privacy. The Globe and Mail featured an article in response to Hansen’s op-ed (shockingly but unsurprisingly, it failed to link back to it – why do newspapers behave that way?) that focused heavily on the privacy concerns but was pretty vague about the details. At no point was a specific concern from the privacy commissioner raised or cited. For example, the article could have talked about the real concern in this space: what is called de-anonymization. This is when an analyst takes records – like health records – that have been anonymized to protect individuals’ identities, and uses alternative data sources to figure out whose records belong to whom. Where this occurs it is usually only a handful of people whose records are identified, but even such limited de-anonymization is unacceptable. You can read more on this here.
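To make the de-anonymization risk concrete, here is a toy sketch of the classic linkage attack (every name, record and field below is invented for illustration): an analyst joins “anonymized” health records to a public list, such as a voters roll, on shared quasi-identifiers like postal code, birth year and sex.

```python
# "Anonymized" health records: names stripped, but quasi-identifiers kept
health_records = [
    {"postal": "V6K", "birth_year": 1975, "sex": "F", "diagnosis": "asthma"},
    {"postal": "V6K", "birth_year": 1975, "sex": "M", "diagnosis": "diabetes"},
    {"postal": "V5T", "birth_year": 1982, "sex": "F", "diagnosis": "flu"},
]

# Public auxiliary data sharing the same quasi-identifiers
voter_list = [
    {"name": "Alice", "postal": "V6K", "birth_year": 1975, "sex": "F"},
    {"name": "Bob", "postal": "V5T", "birth_year": 1982, "sex": "M"},
]

def reidentify(anon, aux, keys=("postal", "birth_year", "sex")):
    """Link each anonymized record to auxiliary records that share all
    quasi-identifiers. A unique match re-identifies that person."""
    hits = {}
    for rec in anon:
        matches = [p for p in aux if all(p[k] == rec[k] for k in keys)]
        if len(matches) == 1:  # unique match: identity recovered
            hits[matches[0]["name"]] = rec["diagnosis"]
    return hits

print(reidentify(health_records, voter_list))  # prints {'Alice': 'asthma'}
```

Note that only one of the three records is re-identified here – which matches the point above that these attacks typically expose a handful of people, not everyone, and that even that is unacceptable.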

As far as I can tell, no one has de-anonymized the Heritage Health Prize data. But we can take even more precautions. I recently connected with Rob James – a local epidemiologist who is excited about how opening up anonymized health care records could save lives and money. He shared with me an approach taken by the US Census Bureau that goes even further than anonymization. As outlined in this (highly technical) research paper by Jennifer C. Huckett and Michael D. Larsen, the approach involves creating a parallel data set that contains none of the original values but maintains the relationships between the data points. Since it is the relationships, not the raw values, that are often what matters, a great deal of research can take place with much lower risk. As Rob points out, there is a reasonably mature academic literature on these types of privacy-protecting strategies.
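The Huckett and Larsen paper describes a far more sophisticated method, but the core idea can be illustrated with a toy sketch (my own simplification, not their technique): fit a simple statistical model to two correlated variables, then release synthetic draws that preserve the correlation without reproducing any real record.

```python
import math
import random

def corr(xs, ys):
    """Pearson correlation of two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs) / n)
    sy = math.sqrt(sum((y - my) ** 2 for y in ys) / n)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
    return cov / (sx * sy)

def synthesize(xs, ys, n_out, rng):
    """Draw synthetic (x, y) pairs from a Gaussian fitted to the originals:
    same means, variances and correlation, but no real records released."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    vx = sum((x - mx) ** 2 for x in xs) / n
    vy = sum((y - my) ** 2 for y in ys) / n
    cxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
    # 2x2 Cholesky factor of the fitted covariance matrix
    l11 = math.sqrt(vx)
    l21 = cxy / l11
    l22 = math.sqrt(max(vy - l21 ** 2, 0.0))
    out = []
    for _ in range(n_out):
        z1, z2 = rng.gauss(0, 1), rng.gauss(0, 1)
        out.append((mx + l11 * z1, my + l21 * z1 + l22 * z2))
    return out

rng = random.Random(42)
# Invented "real" data: age and annual clinic visits, positively related
ages = [rng.uniform(20, 80) for _ in range(5000)]
visits = [0.1 * a + rng.gauss(0, 2) for a in ages]

synth = synthesize(ages, visits, 5000, rng)
sx, sy = [p[0] for p in synth], [p[1] for p in synth]
# The relationship survives even though no real record is in the release
assert abs(corr(ages, visits) - corr(sx, sy)) < 0.1
```

A researcher studying the age-to-visits relationship could work entirely on the synthetic release; the real approaches in the literature extend this to many variables and more complex dependencies.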

The simple fact is, healthcare spending in Canada is on the rise. In many provinces it will eclipse 50% of all spending in the next few years. This path is unsustainable. Spending in the US is even worse. We need to get smarter and more efficient. Data mining is perhaps the most straightforward and accessible strategy at our disposal.

So the question is this: does BC want to be a leader in healthcare research and outcomes in an area the whole world is going to be interested in? The foundation – a high-value data set – is already in place. The unknown is whether we can foster a policy infrastructure and public mandate that allow us to think and act in big ways. It would be great if government officials, the privacy commissioner and some civil liberties representatives started a dialogue to find some common ground. The benefits to British Columbians – and potentially to a much wider population – could be enormous, both in money and, more importantly, in lives saved.

Canada's Action Plan on Open Government: A Review

The other day the Canadian Government published its Action Plan on Open Government, a high-level document that both lays out the Government’s goals on this file and fulfills its pledge to create tangible goals as part of its participation in next week’s Open Government Partnership 2012 annual meeting in Brazil.

So what does the document say and what does it mean? Here is my take.

Take Away #1: Not a breakthrough document

There is much that is good in the government’s action plan – some of which I will highlight later. But for those hoping that Canada was going to get the Gov 2.0 bug and try to leapfrog leaders like the United States or the United Kingdom, this document will disappoint. By and large this document is not about transforming government – even at its most ambitious it appears to be much more about engaging in some medium sized experiments.

As a result the document emphasizes a number of things that the UK and US started doing several years ago, such as adopting a license that adheres to international norms, posting government resource allocation and performance management information online in machine-readable forms, and refining the open data portal.

What you don’t see are explicit references to efforts to re-think how government leverages citizens’ experience and knowledge with a site like Challenge.gov, engage experts in innovative ways such as with Peer to Patent, or work with industries or provinces to generate personal open data as the US has done with the Blue Button (for healthcare) or the Green Button (for utilities).

Take Away #2: A Solid Foundation

This said, there is much in the document that is good. Specifically, in many areas, it does lay a solid foundation for some future successes. Probably the most important statements are the “foundational commitments” that appear on this page. Here are some key points:

Open Government Directive

In Year 1 of our Action Plan, we will confirm our policy direction for Open Government by issuing a new Directive on Open Government. The Directive will provide guidance to 106 federal departments and agencies on what they must do to maximize the availability of online information and data, identify the nature of information to be published, as well as the timing, formats, and standards that departments will be required to adopt… The clear goal of this Directive is to make Open Government and open information the ‘default’ approach.

This last sentence is nice to read. Of course the devil will be in the details (and in the execution), but establishing a directive around open information could end up being as important (although admittedly not as powerful – an important point) as the establishment of Access to Information. Done right, such a directive could vastly expand the range of documents made available to the public, something that should be very doable as more and more government documentation moves into digital formats.

For those complaining about the lack of ATI reform in the document, this directive and its creation will be worth watching closely. There is an enormous opportunity here to reset how government discloses information – and the “default to open” line creates a public standard that we can try to hold the government to account on.

And of course the real test for all this will come in years 2-3, when it comes time to disclose documents around something sensitive to the government… like, say, the Northern Gateway Pipeline (or something akin to the Afghan prisoner issue). In theory this directive should make all government research and assessments open; when that moment comes, we’ll have a real test of the robustness of any new directive.

Open Government License:

To support the Directive and reduce the administrative burden of managing multiple licensing regimes across the Government of Canada, we will issue a new universal Open Government License in Year 1 of our Action Plan with the goal of removing restrictions on the reuse of published Government of Canada information (data, info, websites, publications) and aligning with international best practices… The purpose of the new Open Government License will be to promote the re-use of federal information as widely as possible...

Full Disclosure: I have been pushing (in an unpaid capacity) for the government to reform its license and helping out in its discussions with other jurisdictions around how it can incorporate the best practices and most permissive language possible.

This is another important foundational piece. To be clear, this is not about an “open data” license. This is about creating a license for all government information and media. I suspect this appeals to the government in part because it ends the craziness of having lawyers across government constantly re-inventing new licenses and creating a complex set of licenses to manage. Let me be clear about what I think this means: this is functionally about neutering crown copyright. It’s about creating a licensing regime that makes very clear what the user’s rights are (which crown copyright does not do) and that is as permissive as possible about re-use (which crown copyright, because of its lack of clarity, is not). Achieving such a license is a critical step toward many of the more ambitious open government and gov 2.0 activities that many of us would like to see happen.

Take Away #3: The Good and Bad Around Access to Information

For many, I think the biggest disappointment may be that the government has chosen not to try to update the Access to Information Act. It is true that this is what the Access to Information Commissioners from across the country recommended it do in an open letter (recommendation #2 in their letter). Opening up the act likely carries a number of political risks – particularly for a government that has not always been forthcoming with documents (the Afghan detainee issue and the F-35 contract both come to mind) – however, I again propose that it may be possible to achieve some of the objectives around improved access through the Open Government Directive.

What I think shouldn’t be overlooked, however, is the government’s “experiment” around modernizing the administration of Access to Information:

To improve service quality and ease of access for citizens, and to reduce processing costs for institutions, we will begin modernizing and centralizing the platforms supporting the administration of Access to Information (ATI). In Year 1, we will pilot online request and payment services for a number of departments allowing Canadians for the first time to submit and pay for ATI requests online with the goal of having this capability available to all departments as soon as feasible. In Years 2 and 3, we will make completed ATI request summaries searchable online, and we will focus on the design and implementation of a standardized, modern, ATI solution to be used by all federal departments and

These are welcome improvements. As one colleague – James McKinney – noted, the fact that you currently have to pay by cheque means that only people with Canadian bank accounts can make ATIP requests. This largely means just Canadian citizens. This is ridiculous. Moreover, the process is slow and painful (who uses cheques! the Brits are phasing them out by 2018 – good on ’em!). The use of cheques creates a real barrier – particularly, I think, for young people.

Also, being able to search summaries of previous requests is a no-brainer.

Take Away #4: This is a document of experiments

As I mentioned earlier, outside the foundational commitments, the document reads less like a grand experiment and more like a series of small experiments.

Here the Virtual Library is another interesting commitment – certainly during the consultations the number one complaint was that people have a hard time finding what they are looking for on government websites. Sadly, even if you know the name of the document you want, it is still often hard to find. A virtual library is meant to address this concern – obviously it is all going to be in the implementation – but it is a response to a genuine expressed need.

Meanwhile, the Advancing Recordkeeping in the Government of Canada and User-Centric Web Services commitments feel like projects that were maybe already in the pipeline before Open Government came on the scene. They certainly conform to the shared services and IT centralization announced by Treasury Board last year. They could be helpful, but honestly, it will all be about execution: these types of projects can harmonize processes and save money, or they can become enormous boondoggles that everyone tries to work around because they don’t meet anyone’s requirements. If they go the right way, I can definitely imagine how they might help the management of ATI requests (I have to imagine they would make it easier to track down a document).

I am deeply excited about the implementation of the International Aid Transparency Initiative (IATI). This is something I’ve campaigned for and urged the government to adopt, so it is great to see. I think these types of cross-jurisdictional standards have a huge role to play in the open government movement. Joining one, figuring out what about the implementation works and doesn’t work, and assessing its impact is important both for open government in general and for Canada, as it will let us learn lessons that, I hope, will become applicable in other areas as more of these standards emerge.

Conclusion:

I think it was always going to be a stretch to imagine Canada taking a leadership role in the Open Government space, at least at this point. Frankly, we have a lot of catching up to do just to draw even with places like the US and the UK, which have been working hard to keep experimenting with new ideas in the space. What is promising about the document is that it does present an opportunity for some foundational pieces to be put into play. The bad news is that real efforts to rethink government’s relationship with citizens, or even the role of the public servant within a digital government, have not been taken very far.

So… a C+?

 

Additional disclaimer: As many of my readers know, I sit on the Federal Government’s Open Government Advisory Panel. My role on this panel is to serve as a challenge function to the ideas that are presented to us. In this capacity I share with them the same information I share with you – I try to be candid about what I think works and doesn’t work around ideas they put forward. Interestingly, I did not see even a draft version of the Action Plan until it was posted to the website and was (obviously by inference) not involved in its creation. Just want to share all that to be, well, transparent, about where I’m coming from – which remains as a citizen who cares about these issues and wants to push governments to do more around gov 2.0 and open gov.

Also, sorry for the typos, but I’m sick and it is 1am. So I’m checking out. Will proofread again when I awake.

Here's a prediction: A Canadian F-35 will be shot down by a drone in 2035

One of the problems with living in a country like Canada is that certain people become the default person on certain issues. It’s a small place and the opportunity for specialization (and brand building) is small, so you can expect people to go back to the same well a fair bit on certain issues. I know, when it comes to Open Data, I can often be that well.

Yesterday’s article by Jack Granatstein – one of the country’s favourite commentators on (and cheerleaders of) all things military – is a great case in point. It’s also a wonderful example of an article that is not designed to answer deep questions, but merely to reassure readers not to question anything.

For those not in the know, Canada is in the midst of a scandal around the procurement of new fighter jets which, it turns out, the government not only chose to single-source, but has been caught misleading the public about the cost of, despite repeated attempts by both the opposition and the media to ask for the full figure. It turns out the planes will cost twice as much as previously revealed, maybe more. For those interested in reading a case study in how not to do government procurement, Andrew Coyne offers a good review in his two latest columns here and here. (Granatstein, in the past, has followed the government script, using the radically low-balled figure of $16 billion; it is now accepted to be $26 billion.)

Here is why Jack Granatstein’s piece is so puzzling. The fact is, there really aren’t that many articles about whether the F-35 is the right plane or not. People are incensed about being radically misled about the cost and about the sole-source process – not that we chose the F-35. But Granatstein’s piece is all about assuring us that a) a lot of thought went into this choice and b) we shouldn’t really blame the military planners (nor, apparently, the politicians). It is the public servants’ fault. So, some thoughts.

These are some disturbing and confusing conclusions. I have to say, it is very, very depressing to read someone as seasoned and knowledgeable as Granatstein write:

But the estimates of costs, and the spin that has so exercised the Auditor-General, the media and the Opposition, are shaped and massaged by the deputy minister, in effect DND’s chief financial officer, who advises the minister of national defence.

Errr… really? I think they are shaped by them at the direction, or with the approval, of the Minister of National Defence. I agree that the Minister and Cabinet are probably not up to speed on the latest in airframe technology and so probably aren’t hand-picking the fighter plane. But you know what they are up to speed on? Spinning budgets and political messages to sell to the public. To somehow try to deflect the blame onto the public servants feels, well, like yet another death knell for the notion of ministerial accountability.

But even Granatstein’s love of the F-35 is hard to grasp. Apparently:

“we cannot see into the future, and we do not know what challenges we might face. Who foresaw Canadian fighters participating in Kosovo a dozen years ago? Who anticipated the Libyan campaign?”

I’m not sure I want to live and die on those examples. I mean, in Libya alone our CF-18s were joined by F-16s, Rafales, Mirage 2000s and Mirage 2000Ds, Tornados, Eurofighter Typhoons, and JAS 39C Gripens (are you bored yet?). Apparently there were at least seven other choices that would have worked out okay for the mission. The Kosovo mission had an even wider assortment of planes. So this isn’t a matter of getting the choice “just right”; it’s more that there are a lot of options that will work.

But looking into the future there are some solid and strong predictions we can make:

1) Granatstein himself argued in 2010 that performing sovereignty patrols in the Arctic is one of the reasons we need to buy new planes. Here is a known future scenario. So frankly I’m surprised he’s bullish on the F-35, since the F-35 will not be able to operate in the Arctic for at least 5 years, and maybe longer. Given that, in that same article, Granatstein swallowed the now-revealed-to-be-bogus total cost of ownership figures provided by the Department of National Defence hook, line and sinker, you’d think he might be more skeptical about other facts. Apparently not.

2) We can’t predict the future. I agree. But I’m going to make a prediction anyway. If Canada fights an enemy with any of the sophistication that would require us to have the F-35 (say, a China in 25 years) I predict that an F-35 will get shot down by a pilotless drone in that conflict.

What makes drones so interesting is that, because they don’t have to carry pilots, they can be smaller, faster and more maneuverable. Indeed, in the 1970s UAVs were able to outmaneuver the best US pilots of the day. Moreover, the world of aviation may change very quickly in the coming years. Everyone will tell you a drone can’t beat a piloted plane. That is almost certainly true today (although a pilotless drone almost shot down a MiG in Iraq in 2002).

But drones may have two things going for them. First, if drones become cheaper to build and operate, and you don’t have to worry about losing an expensive pilot, you may be able to make up for competency with numbers. Imagining an F-35 defeating a single drone – such as the US Navy’s experimental X-47B – is easy. What about defeating a swarm of five of them working seamlessly together?

Second, much as in nature, survival often favours those who can reproduce frequently. The F-35 is expected to last Canada 30-35 years. Yes, there will be upgrades and changes, but that is a slow evolutionary pace. In that time, I suspect we’ll see at least 5 (and likely a lot more) generations of drones. And why not? There are no pilots to retrain, just new lessons from the previous generation of drones to draw from, and new technological and geo-political realities to adapt to.

I’m not arguing that air-to-air-combat-capable drones are available today, but it isn’t unlikely that they could be within 5-10 years. Of course, many air forces hate talking about this because, well, drones mean no more pilots, and air forces are composed of… well… pilots. But it does suggest that Canada could buy a fighter that is much cheaper and would still enable us to participate in missions like Kosovo and Libya, without locking us into a 30-35 year commitment at the very moment the military aerospace industry is entering what is possibly the most disruptive period in its history.

It would seem that, at the very least, since we’ve been misled about pretty much everything involved in this project, asking these questions now is fair game.

(Oh, and as an aside: as we decide to pay somewhere between $26 and $44 billion for fighter planes, our government cut the entire $5-million-a-year budget of the National Aboriginal Health Organization, which oversaw research and programs in areas like suicide prevention, tobacco cessation, housing and midwifery. While today Canada ranks 6th in the world on the UN’s quality of life index, it was calculated that in 2007 Canada’s First Nations population, had they been ranked as a separate group, would have ranked 63rd – right above healthy countries like Belarus, Russia and Libya. Well, at least now we’ll have less data about the problem, which means we won’t know to worry about it.)