Monthly Archives: May 2012

The US Government's Digital Strategy: The New Benchmark and Some Lessons

Last week the White House launched its new roadmap for digital government. This included the publication of Digital Government: Building a 21st Century Platform to Better Serve the American People (PDF version), the issuing of a Presidential directive and the announcement of White House Innovation Fellows.

In other words, it was a big week for those interested in digital and open government. Having had some time to digest these documents and reflect upon them, below are some thoughts on these announcements and the lessons I hope governments and other stakeholders take from them.

First off, the core document – Digital Government: Building a 21st Century Platform to Better Serve the American People – is a must read if you are a public servant thinking about technology or even about program delivery in general. In other words, if your email has a .gov in it or ends in something like .gc.ca you should probably read it. Indeed, I’d put this document right up there with another classic must read, The Power of Information Taskforce Report commissioned by the Cabinet Office in the UK (which if you have not read, you should).

Perhaps most exciting to me is that this is the first time I’ve seen a government document clearly declare something I’ve long advised governments I’ve worked with: data should be a layer in your IT architecture. The problem is nicely summarized on page 9:

Traditionally, the government has architected systems (e.g. databases or applications) for specific uses at specific points in time. The tight coupling of presentation and information has made it difficult to extract the underlying information and adapt to changing internal and external needs.

Oy. Isn’t that the case. Most government data is captured in an application and designed for a single use. For example, say you run the license renewal system. You update your database every time someone wants to renew their license. That makes sense because that is what the system was designed to do. But maybe you’d like to track, in real time, how frequently the database changes, and by whom. Whoops. The system wasn’t designed for that because that wasn’t needed in the original application. Of course, being able to present the data in that second way might be a great way to assess how busy different branches are, so you could warn prospective customers about wait times. Now imagine this lost opportunity… and multiply it by a million. Welcome to government IT.
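To make the lost opportunity concrete, here is a minimal sketch (the table schema and branch names are invented for illustration): once renewals are recorded as data rather than locked inside the application, the second, unanticipated use — measuring how busy each branch is — comes almost for free.

```python
import sqlite3

# A hypothetical license-renewal table. The application only ever
# needed "insert a renewal", but because each change is stored as data
# (with a branch and a timestamp), other uses become possible later.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE renewals (
        licence_id TEXT,
        branch     TEXT,
        renewed_at TEXT
    )
""")

# The original application's single intended use: record renewals.
rows = [
    ("A-100", "Downtown", "2012-05-01T09:15"),
    ("A-101", "Downtown", "2012-05-01T09:40"),
    ("A-102", "Eastside", "2012-05-01T10:05"),
]
conn.executemany("INSERT INTO renewals VALUES (?, ?, ?)", rows)

# The use the system was never designed for: renewal volume per
# branch, which could feed a wait-time estimate for customers.
per_branch = dict(conn.execute(
    "SELECT branch, COUNT(*) FROM renewals GROUP BY branch"
).fetchall())
print(per_branch)  # {'Downtown': 2, 'Eastside': 1}
```

The point is not the SQL; it is that the query against the data layer needed no change to the application that writes the data.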

Decoupling data from application is pretty much the first thing in the report. Here’s my favourite chunk from the report (italics mine, to note the extra favourite part).

The Federal Government must fundamentally shift how it thinks about digital information. Rather than thinking primarily about the final presentation—publishing web pages, mobile applications or brochures—an information-centric approach focuses on ensuring our data and content are accurate, available, and secure. We need to treat all content as data—turning any unstructured content into structured data—then ensure all structured data are associated with valid metadata. Providing this information through web APIs helps us architect for interoperability and openness, and makes data assets freely available for use within agencies, between agencies, in the private sector, or by citizens. This approach also supports device-agnostic security and privacy controls, as attributes can be applied directly to the data and monitored through metadata, enabling agencies to focus on securing the data and not the device.
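A minimal sketch of the information-centric approach the report describes — unstructured content turned into structured data, paired with metadata, and rendered as the payload a web API would serve to any device. The field names and agency here are invented for illustration, not drawn from any federal schema:

```python
import json

# Unstructured content...
press_release_text = "Office hours extended to 7pm at all branches."

# ...turned into structured data with associated metadata. Security
# and privacy attributes travel with the data itself, not the device.
record = {
    "data": {
        "announcement": press_release_text,
        "effective": "2012-06-01",
    },
    "metadata": {
        "agency": "Example Licensing Agency",  # hypothetical agency
        "classification": "public",
        "last_updated": "2012-05-28",
    },
}

# Any presentation layer (web page, mobile app, a third party's site)
# consumes the same JSON payload a web API would return.
payload = json.dumps(record)
assert json.loads(payload)["metadata"]["classification"] == "public"
```

One payload, many presentations — which is exactly why the report treats APIs as the interoperability layer.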

To help, the White House provides a visual guide for this roadmap. I’ve pasted it below. However, I’ve taken the liberty to highlight how most governments try to tackle open data on the right – just so people can see how different the White House’s approach is, and why this is not just an issue of throwing up some new data but a total rethink of how government architects itself online.

There are, of course, a bunch of things that flow out of the White House’s approach that are not spelled out in the document. The first and most obvious is that once you make data an information layer you have to manage it directly. This means that data starts to be seen and treated as an asset – which means understanding who the custodian is and establishing a governance structure around it. This is something that, previously, only libraries and statistical bureaus have really understood (and sometimes not even them!).

This is the dirty secret about open data: to do it effectively you actually have to start treating data as an asset. For the White House, the benefit of taking that view of data is that it saves money. Creating a separate information layer means you don’t have to duplicate it for all the different platforms you have. In addition, it gives you more flexibility in how you present it, meaning the costs of showing information on different devices (say computers vs. mobile phones) should also drop. Cost savings and increased flexibility are the real drivers. Open data becomes an additional benefit. This is something I dive into in deeper detail in a blog post from July 2011: It’s the icing, not the cake: key lesson on open data for governments.

Of course, having a cool model is nice and all, but, like the previous directive on open government, this document has hard requirements designed to force departments to begin shifting their IT architecture quickly. So check out this interesting tidbit from the doc:

While the open data and web API policy will apply to all new systems and underlying data and content developed going forward, OMB will ask agencies to bring existing high-value systems and information into compliance over a period of time—a “look forward, look back” approach. To jump-start the transition, agencies will be required to:

  • Identify at least two major customer-facing systems that contain high-value data and content;
  • Expose this information through web APIs to the appropriate audiences;
  • Apply metadata tags in compliance with the new federal guidelines; and
  • Publish a plan to transition additional systems as practical

Note the language here. This is again not a “let’s throw some data up there and see what happens” approach. I endorse doing that as well, but here the White House is demanding that departments be strategic about the data sets/APIs they create. Locate a data set that you know people want access to. This is easy to assess. Just look at pageviews, or go over FOIA/ATIP requests and see what is demanded the most. This isn’t rocket science – do what is in most demand first. But you’d be surprised how few governments want to serve up data that is in demand.
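"Do what is in most demand first" can be sketched in a few lines — here a hypothetical log of FOIA/ATIP request topics (the topics are invented) is tallied to decide which datasets to expose first:

```python
from collections import Counter

# A hypothetical log of FOIA/ATIP request topics. In practice this
# could equally be pageview counts for existing pages.
requests = [
    "inspection-results", "contracts", "inspection-results",
    "salaries", "contracts", "inspection-results",
]

# Tally demand and pick the most-requested topics to open up first.
demand = Counter(requests)
top_two = [topic for topic, _ in demand.most_common(2)]
print(top_two)  # ['inspection-results', 'contracts']
```

As the post says, this isn’t rocket science — the data on what people want is usually already sitting in the request logs.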

Another interesting inference one can make from the report is that its recommendations embrace the possibility that participants outside of government – both for-profit and non-profit – can build services on top of government information and data. Referring back to the chart above, see how the Presentation Layer includes both private and public examples? Consequently, a non-profit’s website dedicated to, say, job info for veterans could pull live data and information from various Federal Government websites, weave it together and present it in a way that is most helpful to the veterans it serves. In other words, the opportunity for innovation is fairly significant. This also has two additional repercussions. It means that services the government does not currently offer – at least in a coherent way – could be woven together by others. It also means there may be information and services the government simply never chooses to develop a presentation layer for – it may simply rely on private or non-profit sector actors (or other levels of government) to do that for it. This has interesting political ramifications in that it could allow the government to “retreat” from presenting these services and rely on others. There are definitely circumstances where this would make me uncomfortable, but the solution is not to avoid architecting the system this way; it is to ensure that such programs are funded in a way that ensures government involvement in all aspects – information, platform and presentation.

At this point I want to interject two tangential thoughts.

First, if you are wondering why your government is not doing this – be it at the local, state or national level – here’s a big hint: this is what happens when you make the CIO an executive who reports at the highest level. You’re just never going to get innovation out of your government’s IT department if the CIO reports into the fricking CFO. All that tells me is that IT is a cost centre that should be focused on sustaining itself (e.g. keeping computers on) and that you see IT as having no strategic relevance to government. In the private sector, in the 21st century, this is pretty much the equivalent of committing suicide for most businesses. For governments… making CIOs report into CFOs is considered a best practice. I’ve more to say on this. But I’m taking a deep breath and am going to move on.

Second, I love how the document is also so clear on milestones – and nicely visualized as well. It may be my poor memory, but I feel it is rare to read a government road map on any issue where the milestones are so clearly laid out.

It’s particularly nice when a government treats its citizens as though they can understand something like this, and isn’t afraid to be held accountable for a plan. I’m not saying that other governments don’t set out milestones (some do, many however do not). But often these deadlines are buried in reams of text. Here is a simple scorecard any citizen can look at. Of course, last time around, after the open government directive was issued immediately after Obama took office, they updated these scorecards for each department, highlighting whether milestones were green, yellow or red, depending on how the department was performing. All in front of the public. Not something I’ve ever seen in my country, that’s for sure.

Of course, the document isn’t perfect. I was initially intrigued to see the report advocates that the government “Shift to an Enterprise-Wide Asset Management and Procurement Model.” Most citizens remain blissfully unaware of just how broken government procurement is. Indeed, I say this, dear reader, with no idea where you live and who your government is, but I am enormously confident your government’s procurement process is totally screwed. And I’m not just talking about when they try to buy fighter planes. I’m talking pretty much all procurement.

Today’s procurement is perfectly designed to serve one group: big (IT) vendors. The process is so convoluted and so complicated that they are really the only ones with the resources to navigate it. The White House document essentially centralizes procurement further. On the one hand this is good: it means the requirements around platforms and data noted in the document can be more readily enforced. Basically the centre is asserting more control at the expense of the departments. And yes, there may be some economies of scale that benefit the government. But the truth is that whenever procurement decisions get bigger, so too do the stakes, and so too does the process surrounding them. Thus there are a tiny handful of players that can respond to any RFP, and real risks that the government ends up in a duopoly (kind of like with defense contractors). There is some wording around open source solutions that helps address some of this, but ultimately, it is hard to see how the recommendations are going to really alter the quagmire that is government procurement.

Of course, these are just some thoughts and comments that struck me and that I hope, those of you still reading, will find helpful. I’ve got thoughts on the White House Innovation Fellows especially given it appears to have been at least in part inspired by the Code for America fellowship program which I have been lucky enough to have been involved with. But I’ll save those for another post.

The "I Lost My Wallet" Service – Doing Government Service Delivery Right

A couple of years ago I was in Portugal to give a talk on Gov 2.0 at a conference the government was organizing. After the talk I went for dinner with the country’s CIO and remember hearing about a fantastic program they were running that – for me – epitomized the notion of a citizen centric approach. It was a help desk called: I Lost My Wallet.

Genius.

Essentially, it was a place you went when… you lost your wallet. What the government had done was bring together all the agencies that controlled a document or card that was likely to have been in your wallet. As a result, rather than running around from agency to agency filling out your name and address over and over again on dozens of different forms, you went to a single desk, filled out one set of forms to get new copies of say, your social insurance card, your drivers license, healthcare card and library card.

But get this. From the briefing I had, my understanding was that this service was not limited to government cards, they’d also partnered with several private entities. For example, I notice that the service also works for replacing Portugal’s Automotive Club Card. In addition – if I remember correctly – I was told the government was negotiating with the banks so that you could also cancel and replace your ATM/bank card and visa cards at this counter as well.

Now this is citizen centric service. Here the government has literally molded itself – pulling together dozens of agencies and private sector actors around a single service – so that a citizen can simply and quickly deal with a high stress moment. Yes, I’d love to live in a world where all these cards disappeared altogether and were simply managed by a single card of your choosing (like, say, your Oyster card in the UK – so that your subway fare card was also your healthcare card, government ID, and credit card). But we are a few years away from that still, and so this is a nice interim service.

But more importantly it shows a real ability to shed silos and build a service around a citizen/customer need. I believe they had a similar service for “I bought a house,” since this is a moment when a number of different government services become relevant simultaneously. I, of course, can imagine several others – most notably a “my partner just died” service, which could be invaluable in helping people manage a truly terrible moment when dealing with government bureaucracy is the last thing they want to be doing.

You can find the website for I lost my Wallet here (it is, naturally, in Portuguese). You can also read more about it, as documented by the European Union here. Lots of food for thought here for those of you designing programs to serve citizens, be it in the public or private sector.

Control Your Content: Why SurveyMonkey Should Add a "Download Your Answers" Button

Let me start by saying, I really like SurveyMonkey.

By this I mean, I like SurveyMonkey specifically, but I also like online surveys in general. They are easy to ignore if I’m uninterested in the topic but – when the topic is relevant – it is a great, simple service that allows me to share feedback, comments and opinions with whomever wants to solicit them.

Increasingly however, I find people and organizations are putting up more demanding surveys – surveys that necessitate thoughtful, and even occasionally long form, responses.

Take, for example, the Canadian Government. It used an online survey tool during its consultation on open government and open data. The experience was pretty good. Rather than a clunky government website, there was a relatively easy form to fill out. Better still, since the form was long, it was wonderful that you could save your answers and come back to it later! This mattered since some of the form’s questions prompted me to write lengthy (and hopefully insightful) responses.

But therein lies the rub. In the jargon of the social media world I was “creating content.” This wasn’t just about clicking boxes. I was writing. And surprisingly many of my answers were causing me to develop new ideas. I was excited! I wanted to take the content I had created and turn it into a blog post.

Sadly, most survey tools make it very, very hard for you to capture the content you’ve created. It feels like it would be relatively easy to have a “download my answers” button at the end of a survey. I mean, if I’ve taken 10-120 minutes to complete a survey or public consultation shouldn’t we make it easy for me to keep a record of my responses? Instead, I’ve got to copy and paste the questions, and my answers, into a text document as I go. And of course, I’d better decide that I want to do that before I start since some survey tools don’t allow you to go back and see previous answers.
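As a rough sketch of how little a “download my answers” button would need to do — serialize the question/answer pairs into a plain-text record the respondent can keep. The questions and answers here are invented for illustration:

```python
# Hypothetical question/answer pairs a survey tool already holds for
# a respondent by the time they reach the final page.
answers = [
    ("What datasets should be opened first?", "High-demand ones."),
    ("How should consultations be run?", "Online, with saved drafts."),
]

# Serialize to a simple plain-text export, one Q/A pair per entry.
lines = []
for question, answer in answers:
    lines.append(f"Q: {question}")
    lines.append(f"A: {answer}")
    lines.append("")  # blank line between entries

export = "\n".join(lines)
print(export)
```

The tool already has all of this data; the export is just a formatting pass — which is what makes the feature’s absence so frustrating.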

I ultimately did convert my answers into a blog post (you can see it here), but there was about 20 minutes of cutting, pasting, figuring things out, and reformatting. And there was some content (like semantic differential questions – where you rate statements) that was simply too hard to replicate.

There are, of course, other uses too. I had a similar experience last week after being invited to complete a survey posted by the Open Government Partnership Steering Committee on its Independent Reporting Mechanism. About halfway through filling it out, some colleagues suggested we compare answers to better understand one another’s advice. A download-your-answers tool would have converted a 15-minute task into a 10-second one. All to access content I created.

I’m not claiming this is the be-all, end-all of online survey features, but it is the kind of simple thing a survey company can do that will cause some users to really fall in love with the service. To its credit, SurveyMonkey was at least willing to acknowledge the feedback – just what you’d hope for from a company that specializes in soliciting opinion online! With luck, maybe the idea will go somewhere.


Lessons from Michigan's "Innovation Fund" for Government Software

So it was with great interest that, several weeks ago, a reader emailed me this news article coming out of Michigan. Turns out the state recently approved a $2.5 million innovation fund that will be disbursed in $100,000 to $300,000 chunks to fund about 10 projects. As Government Technology reports:

The $2.5 million innovation fund was approved by the state Legislature in Michigan’s 2012 budget. The fund was made formal this week in a directive from Gov. Rick Snyder. The fund will be overseen by a five-person board that includes Michigan Department of Technology, Management and Budget (DTMB) Director John Nixon and state CIO David Behen.

There are lessons in this for other governments thinking about how to spur greater innovation in government while also reducing the cost of software.

First up: the idea of an innovation fund – particularly one that is designed to support software that works for multiple governments – is a laudable one. As I’ve written before, many governments overpay for software. I shudder to think of how many towns and counties in Michigan alone are paying to have the exact same software developed for them independently. Rather than writing the same piece of software over and over again for each town, getting a single version that is usable by 80% (or heck, even just 25%) of cities and counties would be a big win. We have to find a way to get governments innovating faster, and getting them back in the driver’s seat on the software they need (as opposed to adapting stuff made for private companies) would be a fantastic start.

Going from this vision – of getting something that works in multiple cities – to reality, is not easy. Read the Executive Directive more closely. What’s particularly interesting (from my reading) is the flexibility of the program:

In addition to the Innovation Fund and Investment Board, the plan may include a full range of public, private, and non-profit collaborative innovation strategies, including resource sharing…

There is good news and bad news here.

The bad news is that all this money could end up as loans to mom-and-pop software shops that serve a single city or jurisdiction, because their products were never designed from the beginning to be usable across multiple jurisdictions. In other words, the innovation fund could go to fund a bunch of vendors who already exist and who, at best, do okay or, at worst, do mediocre work and, in either case, will never be disruptive and blow up the marketplace with something that is both radically helpful and radically low cost.

What makes me particularly nervous about the directive is that there is no reference to an open source license. If a government is going to directly fund the development of software, I think it should be open source; otherwise, taxpayers are acting as venture capitalists to develop software that they are also going to pay licenses to use. In other words, they’re absorbing the risk of a VC in order to have the limited rights of being a client; that doesn’t seem right. An open source requirement would be the surest way to ensure an ROI on the program’s money. It assures that Michigan governments that want access to what gets developed can use it at the lowest possible cost. (To be clear, I’ve no problem with private vendors – I am one – but their software can be closed because they are (or should be) absorbing the risk of developing it themselves. If the government is giving out grants to develop software for government use, the resulting software should be licensed open.)

Which brings us to the good. My interest in the line of the executive directive cited above was piqued by the reference to public and non-profit “collaborative innovation strategies.” I read that and I immediately think of one of my favourite organizations: Kuali.

Many readers have heard me talk about Kuali, an organization in which a group of universities collectively set the specs for a piece of software they all need and then share in the costs of developing it. I’m a big believer that this model could work for local and even state level governments. This is particularly true for the enterprise management software packages (like financial management), for which cities usually buy over-engineered, feature rich bloatware from organizations like SAP. The savings in all this could be significant, particularly for the middle-sized cities for whom this type of software is overkill.

My real hope is that this is the goal of this fund – to help provide some seed capital to start 10 Kuali-like projects. Indeed, I have no idea if the governor and his CIO’s staff have heard of or talked to the Kuali team before signing this directive, but if they haven’t, they should now. (Note: It’s only a 5 hour drive from the capital, Lansing, Michigan to the home of Kuali in Bloomington, Indiana).

So, if you are a state, provincial or national government and you are thinking about replicating Michigan’s directive – what should you do? Here’s my advice:

  • Require that all the code created by any projects you fund be open source. This doesn’t mean anyone can control the specs – that can still reside in the hands of a small group of players, but it does mean that a variety of companies can get involved in implementation so that there is still competition and innovation. This was the genius of Kuali – in the space of a few months, 10 different companies emerged that serviced Kuali software – in other words, the universities created an entire industry niche that served them and their specific needs exclusively. Genius.
  • Only fund projects that have at least 3 jurisdictions signed up. Very few enterprise open source projects start off with a single entity. Normally they are spec’ed out with several players involved. This is because if just one player is driving the development, they will rationally always choose to take shortcuts that will work for them, but cut down on the likelihood the software will work for others. If, from the beginning, you have to balance lots of different needs, you end up architecting your solution to be flexible enough to work in a diverse range of environments. You need that if your software is going to work for several different governments.
  • Don’t provide the funds, provide matching funds. One way to ensure governments have skin in the game and will actually help develop software is to make them help pay for the development. If a city or government agency is devoting $100,000 towards helping develop a software solution, you’d better believe they are going to try to make it work. If the State of Michigan is paying for something that may work, maybe they’ll contribute and be helpful, or maybe they’ll sit back and see what happens. Ensure they do the former and not the latter – make sure the other parties have skin in the game.
  • Don’t just provide funds for development – provide funds to set up the organization that will coordinate the various participating governments and companies, set out the specs, and project manage the development. Again, to understand what that is like – just fork Kuali’s governance and institutional structure.
  • Ignore government agencies or jurisdictions that believe they are a special unique flower. One of the geniuses of Kuali is that they abstracted the process/workflow layer. That way universities could quickly and easily customize the software so that it worked for how their university does its thing. This was possible not because the universities recognized they were each a unique and special flower but because they recognized that for many areas (like library or financial management) their needs are virtually identical. Find partners that look for similarities, not those who are busy trying to argue they are different.

There is of course more, but I’ll stop there. I’m excited for Michigan. This innovation fund has real promise. I just hope that it gets used to be disruptive, and not to simply fund a few slow and steady (and stodgy) software incumbents that aren’t going to shake up the market and help change the way we do government procurement. We don’t need to spend $2.5 million to get software that is marginally better (or not even). Governments already spend billions every year for that. If we are going to spend a few million to innovate, let’s do it to be truly disruptive.

Real Estate as Platform: Canadian Real Estate Industry looking for developers

As some readers know, I’ve been asked from time to time by members of the real estate industry to comment on the future of their industry, how technology might impact it and how open data (both the government variety, and the trend by regulators to make the industry’s data more open) may alter it.

It is with some interest that I point readers, as well as software vendors and others, to this Request for Information (RFI) issued by the Canadian Real Estate Association yesterday. The RFI is titled: A National Shared Development Platform and App Marketplace and this line from the RFI is particularly instructive:

This Request for Information (RFI) is issued by The Canadian Real Estate Association (CREA) to qualified service providers who may be interest in creating a National Shared Development Platform where certified, independent software vendors (ISVs) and data providers could develop, extract and combine data to generate new tools and services for REALTORS® and their clients.

In other words, from my reading it looks like the industry is seeking a data sharing portal that can serve as a platform for internally and externally driven innovation. It is very aligned with what I’ve been suggesting, so will be interesting to see how it evolves.

Intrigue.

 

The Transparent Hypocrisy of Ethical Oil – who is really laundering money

The other week the Canadian Minister of the Environment, Peter Kent, accused Canadian charities of “laundering money” because they accept some funds from outside the country. This has all been part of a larger effort – championed by Ethical Oil – to discredit Canada’s environmental organizations.

As an open government and transparency in politics advocate, I find the whole conversation deeply interesting. On the one side, environmental groups have to disclose who funds them and where the money comes from. This is what actually allows Ethical Oil to make their complaint in the first place. Ethical Oil, however, feels no need to share this information. Apparently what is good for the goose is not good for the gander.

The media really only touches on this fact occasionally. In the Globe and Mail this hypocrisy was buried in the last few lines of a recent article:

Ethical Oil launched a radio ad Tuesday that will run throughout Ontario flaunting the proposal as a way to lower oil prices and create jobs.

Mr. Ellerton said he couldn’t immediately provide an estimate for how much the group is spending on the campaign. He also refused to reveal who funds the lobby group, other than to say: “Ethical Oil accepts donations from Canadians and Canadian businesses.”

The group has supported the Conservatives move to end foreign funding of environmental groups, including those that oppose the Northern Gateway and Keystone XL pipeline projects. Mr. Ellerton has campaigned to expose the funding behind those groups but said he could not shed more light on his own organization.

“We have an organizational policy not to disclose who are donors because we’ve faced lawsuits in the Kingdom of Saudi Arabia,” he said, “and we don’t want to expose our donors to that kind of litigation.”

Of course, the notion of what a “Canadian Business” means is never challenged. It turns out that many of the large “Canadian” players in the oil sands – those with corporate headquarters in Canada – are barely Canadian. Indeed, a recent analysis using Bloomberg data showed that 71 per cent of all tar sands production is owned by non-Canadian shareholders.

Consider the following ownership stakes of “Canadian” businesses:

  • Petrobank Energy Resources: 94.8% foreign owned
  • Husky Energy: 90.9% foreign (this one really surprised me!)
  • MEG Energy: 89.1% foreign
  • Imperial Oil: 88.9% foreign
  • Nexen: 69.9% foreign
  • Canadian Natural Resources Limited: 58.8% foreign
  • Suncor Energy: 56.8% foreign
  • Canadian Oil Sands: 56.8% foreign
  • Cenovus: 54.7% foreign

I think it is great that Ethical Oil wants greater transparency around who is funding who in the Oil Sands debate. But shouldn’t they be held to the same standard so that we can understand who is funding them?

If Ethical Oil and the government want to call it money laundering when a foreign citizen funds a Canadian environmental group, should we also use the term if a foreign (often Chinese or American) entity plows money into Ethical Oil?

Mainstreaming The Gov 2.0 Message in the Canadian Public Service

A couple of years ago I wrote a Globe op-ed, “A Click Heard Across the Public Service,” that outlined the significance of the clerk using GCPEDIA to communicate with public servants. It was a message – or even more importantly, an action – to affirm his commitment to change how government works. For those unfamiliar, the Clerk of the Privy Council is the head of the public service for the federal government; a crude analogy would be that he is the CEO and the Prime Minister is the Chairman (yes, I know that analogy is going to get me in trouble with people…)

Well, the clerk continues to broadcast that message, this time in his Nineteenth Annual Report to the Prime Minister on the Public Service of Canada. As an observer in this space what is particularly exciting for me is that:

  • The Clerk continues to broadcast this message. Leadership and support at the top is essential on these issues. It isn’t sufficient, but it is necessary.
  • The role of open data and social media is acknowledged on several occasions

And as a policy entrepreneur, what is doubly exciting is that:

  • Projects I’ve been personally involved in get called out; and
  • Language I’ve been using in briefs, blog posts and talks to public servants is in this text

You can, of course, read the whole report here. There is much more in it than just talk of social media and rethinking the public service – there is obviously talk about the budget and other policy areas as well. But both the continued prominence given to renewal and technology, and the explicit statements about the failure to move fast enough to keep up with the speed of change in society at large, suggest that the clerk continues to be worried about this issue.

For those less keen to read the whole thing, here are some juicy bits that mattered to me:

In the section “The World in Which We Serve” which is basically providing context…

At the same time, the traditional relationship between government and citizens continues to evolve. Enabled by instantaneous communication and collaboration technologies, citizens are demanding a greater role in public policy development and in the design and delivery of services. They want greater access to government data and more openness and transparency from their institutions.

Under “Our Evolving Institution” which lays out some of the current challenges and priorities we find this as one of the four areas of focus mentioned:

  • The Government expanded its commitment to Open Government through three main streams: Open Data (making greater amounts of government data available to citizens), Open Information (proactively releasing information about Government activities) and Open Dialogue (expanding citizen engagement with Government through Web 2.0 technologies).

This is indeed interesting. The more this government talks about openness in general, the more interesting it will be to see how the public reacts, particularly in regards to its treatment of certain sectors (e.g. environmental groups). Still more interesting is what appears to be a growing recognition of the importance of data (from a government that cut the long form census). Just yesterday the Health Minister, while talking about a controversial multiple sclerosis vein procedure, stated that:

“Before our government will give the green light to a limited clinical trial here in Canada, the proposed trial would need to receive all necessary ethical and medical approvals. As Minister of Health, when it comes to clinical issues, I rely on advice from doctors and scientists who are continually monitoring the latest research, and make recommendations in the best interests of patient health and safety.”

This is, interestingly, quite a statement from a government that called doctors "unethical" because of their support for the Insite supervised injection site which, the evidence shows, is the best way to save lives and get drug users into detox programs.

For evidence based policy advocates – such as myself – the adoption of the language of data is one that I think could help refocus debates onto a more productive terrain.

Then towards the bottom of the report there is a call out that mentions the Open Policy conference at DFAIT I had the real joy of helping convene and that I served as the host and facilitator for.

Policy Built on Shared Knowledge
The Department of Foreign Affairs and International Trade (DFAIT) has been experimenting with an Open Policy Development Model that uses social networking and technology to leverage ideas and expertise from both inside and outside the department. A recent full-day event convened 400 public and private sector participants and produced a number of open policy pilots, e.g., an emergency response simulation involving consular officials and a volunteer community of digital crisis-mappers.

DFAIT is also using GCConnex, the Public Service’s social networking site, to open up policy research and development to public servants across departments.

This is a great, much deserved win for the team at DFAIT that went out on a limb to run this conference and was rewarded with participation from across the public service.

Finally, anyone who has seen me speak will recognize a lot of this text as well:

As author William Gibson observed, “The future is already here, it’s just unevenly distributed.” Across our vast enterprise, public servants are already devising creative ways to do a better job and get better results. We need to shine a light on these trailblazers so that we can all learn from their experiments and build on them. Managers and senior leaders can foster innovation—large and small—by encouraging their teams to ask how their work can be done better, test out new approaches and learn from mistakes.

So much innovation in the 21st century is being made possible by well-developed communication technologies. Yet many public servants are frustrated by a lack of access to the Web 2.0 and social media tools that have such potential for helping us transform the way we work and serve Canadians. Public servants should enjoy consistent access to these new tools wherever possible. We will find a way to achieve this while at the same time safeguarding the data and information in our care.

I also encourage departments to continue expanding the use of Web 2.0 technologies and social media to engage with Canadians, share knowledge, facilitate collaboration, and devise new and efficient services.

To fully attribute the William Gibson quote, which I use a great deal: it was something I first saw used by my friend Tim O'Reilly who is, needless to say, a man with a real ability to understand a trend and explain an idea to people. I hope his approach to thinking is reflected in much of what I do.

What, in sum, all these call outs really tell us is that the Gov 2.0 message in the federal public service is being mainstreamed, at the very least among the most senior public servants. This does not mean that our government is going to magically transform, it simply means that the message is getting through and people are looking for ways to push this type of thinking into the organization. As I said before, this is not sufficient to change the way government works, but it is necessary.

Going to keep trying to see what I can do to help.

The Oil Sands in Alberta is like Language Laws in Quebec… It's a domestic issue

This post isn’t based on a poll I’ve conducted or some rigorous methodology, rather it has evolved out of conversations I’ve had with friends, thought leaders I’ve run into, articles I’ve read and polls I’ve seen in passing.

As most people know, the development of the oil sands is a thorny issue in Canada. The federal government is sweeping aside environmental regulations, labeling environmentalist groups terrorists and money launderers and overriding processes developed to enable people to express their concerns.

What does the public make of all this?

Polls generally seem to have the country split. Canadians are not opposed to natural resource development (how could we be?) but they are also worried about the environment (I'm not claiming enough to act). In this regard the Nik Nanos poll from March is instructive. Here most Canadians place environmental concerns (4.24 out of 5) ahead of economic prosperity (3.71 out of 5). There are of course polls that say the opposite. The normally reliable Ipsos Reid has a Canadian Chamber of Commerce poll which asks whether "it is possible to increase oil and gas production while protecting the environment at the same time." As if Canadians know! I certainly don't know the answer to that question. What I do know is that I'd like it to be possible to increase oil and gas production while protecting the environment at the same time. This, I suspect, is what people are really saying: "Yes! I'd like to have my cake and eat it too. Go figure out the details." This does not mean it is possible. Just desirable.

So on a superficial level, I suspect that most Canadians think the oil sands are dirty. There are of course the outlying camps – the die-hard supporters and the die-hard opponents – but I'm not talking about them. Most Canadians are, at their core, uncomfortable with the oil sands. They know it is bad for the environment and may be good for the economy. That doesn't mean they are opposed; it just means they aren't happy either.

But here’s the rub.

I think most Canadians feel like the oil sands is an Alberta issue. Ultimately, many don’t care if it is dirty, many don’t care if it doesn’t benefit them. So long as the issue is confined within Alberta’s borders and it’s an Alberta problem/opportunity then they are happy to give them a free hand. I’ve been comparing it to French Language laws in Quebec. You’d be hard pressed to find many Canadians who strongly agree with them. They understand them. They get why they matter to Quebec. And frankly, they’ve given up caring. As long as the issue is confined within Quebec it’s “domestic politics” and they accept it now as fact.

Of course, the moment the issue stretches outside of Alberta, the gloves are off. Take the pipeline proposal for example. I suspect that support for the Gateway pipeline, especially after the Keystone pipeline is approved, will likely disintegrate. Barbara Yaffe beat me to the punch with her column "What's in Pipeline Expansion for BC?" which articulated exactly where I think BC is going. Already opinion polls show opposition to the pipeline is growing. Everyone knows that the moment a tanker runs aground off the coast of BC you have a $1B problem on your hands. Most BCers are beginning to ask why they should put their tourism and fishing industries at risk and be left footing the bill for environmental damage on oil that Alberta is making money off of. A couple hundred jobs a year in benefits isn't going to cut it.

What's worse is that it is almost impossible to imagine that Keystone won't get approved this time around. As a result, Alberta will have its pipeline out and so the major source of concern – a way to get the oil out – will have been satisfied. Building a pipeline through BC is no longer essential. It is a bonus. It is all about getting an extra $30-a-barrel premium and, of course, satisfying all those Chinese investors. BCers will be even more confused about why they have to absorb the environmental risks so that their neighbor can get rich.

This is also why the provincial NDP's formal opposition to the pipeline is clever. While it cites environmental concerns and does use tough language, it does not draw a hard line in the sand. Rather, it concludes "that the risks of this project (the Northern Gateway Pipeline) far outweigh its benefits." Implicit in this statement is that if the benefits were to increase – if, say, Alberta were to pay a percentage of the royalties to BC – then their position could change as well. In other words – you want us to accommodate your pipeline politics in our province? We may say "no" anyway, but if we say yes… it is going to cost you.

All of this is further complicated by the fact that Alberta's history of playing well with other provinces on issues of national interest is not… spectacular (remember the Alberta firewall?). Alberta has often wanted to go it alone – that is, indeed, part of its brand. I suspect most of the rest of the country has neither the inclination nor the desire to stop them, just like they didn't with Quebec. But that doesn't mean they are going to get a helping hand either.


My LRC Review of "When the Gods Changed" and other recommended weekend readings

This week, the Literary Review of Canada published the review Taylor Owen and I wrote of When the Gods Changed: The Death of Liberal Canada by Peter C. Newman. For non-Canadians, Peter Newman is pretty much a legend when it comes to covering Canadian history and politics: he was editor of the country's largest newspaper and main news magazine and has published over 35 books. I also think the review will be of interest to non-Canadians, since the themes around the decline of Liberal Canada hold true for a number of other countries experiencing more polarized politics.

Some other articles I’ve been digesting that I recommend for some Friday or weekend reading:

Why China’s Political Model Is Superior

This one is a couple of months old, but it doesn’t matter. Fascinating read. For one it shows the type of timelines that the Chinese look at the world with. Hint. It is waaayyyy longer than ours. Take a whiff:

In Athens, ever-increasing popular participation in politics led to rule by demagogy. And in today’s America, money is now the great enabler of demagogy. As the Nobel-winning economist A. Michael Spence has put it, America has gone from “one propertied man, one vote; to one man, one vote; to one person, one vote; trending to one dollar, one vote.” By any measure, the United States is a constitutional republic in name only.

Unattractive Real Estate Agents Achieve Quicker Sales

Before getting serious on you again, here's a lighter, more interesting note. I often comment in talks I give that real estate agents rarely use data to attract clients – mostly just pictures of themselves. Turns out… there might be more data in that than I thought! Apparently less attractive agents sell homes faster and work harder. More attractive agents take longer, but get more money. Food for thought here.

Andrew Coyne: Question isn’t where conservatism is going, but where has it gone

Another oldie but a goody. Liberal Canada may be dead, but it appears that Conservative Canada isn't in much better shape. I've always enjoyed Coyne and feel like he's been sharper than usual of late (since moving back to the National Post). For Americans, there may be some interesting lessons in here for the Tea Party movement. Canada experienced a much, much lighter form of conservative rebellion with the creation of the Reform Party in the late 80s/early 90s, which split off from establishment conservatives. Today, that group is now in power (rebranded), but Coyne assesses that much of what they do has been watered down. But not everything… to the next two articles!

Environmental charities ‘laundering’ foreign funds, Kent says

Sadly, Canada's "Environment" Minister is spending most of his time attacking environmental groups. The charge is that they use US money to engage in advocacy against a pipeline to be built in Canada. Of course "laundering" is a serious charge (it implies illegal activity) and given how quick the Conservatives have been to sue opponents for libel, Kent had better be careful that stakeholders don't adopt this tactic themselves. This is probably why he doesn't name any groups in particular (clever!). My advice is that all the groups named by the Senate committee should sue him; then, to avoid the lawsuits, he'd have to either a) back down from the claim altogether, or b) be specific about which group he is referring to so as to have the other suits thrown out. Next headline… to the double standard!

Fraser Institute co-founder confirms ‘years and years’ of U.S. oil billionaires’ funding

Some nifty investigative work here by a local Vancouver reporter finds that while the Canadian government believes it is bad for environmental groups to receive US funds for advocacy, it is, apparently, completely okay for Conservative groups to receive sums of up to $1.7M from US oil billionaires. Ethical Oil – another astroturf pro-pipeline group – does something similar. It receives money from Canadian law firms that represent benefiting American and Chinese oil interests. But that money is labelled "Canadian" because it is washed through Canadian law firms. Confused? You should be.

What retail is hired to do: Apple vs. IKEA

I love that Clay Christensen is on Twitter. The Innovator's Dilemma is a top-5 book of all time for me. Here is a great breakdown of how IKEA and Apple stores work. Most intriguing is the unique value proposition/framing their stores make to consumers, which explains both their phenomenal success and why they are often not imitated.

Open Data Movement is a Joke?

Yesterday, Tom Slee wrote a blog post called “Why the ‘Open Data Movement’ is a Joke,” which – and I say this as a Canadian who understands the context in which Slee is writing – is filled with valid complaints about our government, but which I feel paints a flawed picture of the open data movement.

Evgeny Morozov tweeted about the post yesterday, thereby boosting its profile. I'm a fan of Evgeny. He is an exceedingly smart and critical thinker on the intersection of technology and politics. He is exactly what our conversation needs (unlike, say, Andrew Keen). I broadly felt his comments (posted via his Twitter stream) were both on target: we need to think critically about open data; and lacking in nuance: it is possible for governments to simultaneously become more open and more closed on different axes. I write all this confident that Evgeny may turn his ample firepower on me, but such is life.

So, a few comments on Slee’s post:

First, the insinuation that the open data movement is irretrievably tainted by corporate interests is so over the top it is hard to know where to begin to respond. I’ve been advocating for open data for several years in Canada. Frankly, it would have been interesting and probably helpful if a large Canadian corporation (or even a medium sized one) took notice. Only now, maybe 4-5 years in, are they even beginning to pay attention. Most companies don’t even know what open data is.

Indeed, the examples of corporate open data "sponsors" that Slee cites are U.S. corporations, sponsoring U.S. events (the Strata conference) and nonprofits (Code for America – with which I have been engaged). Since Slee is concerned primarily with the Canadian context, I'd be interested to hear his thoughts on how these examples compare to Canadian corporate involvement in open data initiatives – or even foreign corporations' involvement in Canadian open data.

And not to travel too far down the garden path on this, but it's worth noting that the corporations that have jumped on the open data bandwagon in the US often have two things in common. First, their founders are bona fide geeks, who in my experience are both interested in hard data as an end unto itself (they're all about numbers and algorithms), and want to see government-citizen interactions – and internal governmental interactions, too – made better and more efficient. Second, of course they are looking after their corporate interests, but they know they are not at the forefront of the open data movement itself. Their sponsorship of various open data projects may well have profit as one motive, but they are also deeply interested in keeping abreast of developments in what looks to be a genuine Next Big Thing. For a post that Evgeny sees as being critical of open data, I find all this deeply uncritical. Slee's post reads as if anything that is touched by a corporation is tainted. I believe there are both opportunities and risks. Let's discuss them.

So, who has been advocating for open data in Canada? Who, in other words, comprises the "open data movement" that Slee argues doesn't really exist – and that "is a phrase dragged out by media-oriented personalities to cloak a private-sector initiative in the mantle of progressive politics"? If you attend one of the hundreds of hackathons that have taken place across Canada over the past couple of years – like those that have happened in Vancouver, Regina, Victoria, Montreal and elsewhere – you'll find they are generally organized in hackspaces and by techies interested in ways to improve their community. In Ottawa, which I think does the best job, they can attract hundreds of people, many of whom bring spouses and kids as they work on projects they think will be helpful to their community. While some of these developers hope to start businesses, many others try to tackle issues of public good, and/or try to engage non-profits to see if there is a way they can channel their talent and the data. I don't for a second pretend that these participants are a representative cross-section of Canadians, but by and large the profile has been geek, technically inclined, leaning left, and socially minded. There are many who don't fit that profile, but that is probably the average.

Second, I completely agree that this government has been one of the most – if not the most – closed and controlling in Canada’s history. I, like many Canadians, echo Slee’s frustration. What’s worse, is I don’t see things getting better. Canadian governments have been getting more centralized and controlling since at least Trudeau, and possibly earlier (Indeed, I believe polling and television have played a critical role in driving this trend). Yes, the government is co-opting the language of open data in an effort to appear more open. All governments co-opt language to appear virtuous. Be it on the environment, social issues or… openness, no government is perfect and indeed, most are driven by multiple, contradictory goals.

As a member of the Federal Government's Open Government Advisory Panel I wrestle with this challenge constantly. I try hard to embed some openness into the DNA of government. I may fail. I know that I won't succeed in all ways, but hopefully I can move the rock in the right direction a little bit. It's not perfect, but then it's pretty rare that anything involving government is. In my (unpaid, advisory, non-binding) role I've voiced that the government should provide the Access to Information Commissioner with a larger budget (they cut it) and that they enable government scientists to speak freely (they have not so far). I've also advocated that they should provide more open data. There they have, including some data sets that I think are important – such as aid data (which is always at risk of being badly spent). For some, it isn't enough. I'd like for there to be more open data sets available, and I appreciate those (like Slee – who I believe is writing from a place of genuine care and concern) who are critical of these efforts.

But, to be clear, I would never equate open government data as being tantamount to solving the problems of a restrictive or closed government (and have argued as much here). Just as an authoritarian regime can run on open-source software, so too might it engage in open data. Open data is not the solution for Open Government (I don’t believe there is a single solution, or that Open Government is an achievable state of being – just a goal to pursue consistently), and I don’t believe anyone has made the case that it is. I know I haven’t. But I do believe open data can help. Like many others, I believe access to government information can lead to better informed public policy debates and hopefully some improved services for citizens (such as access to transit information). I’m not deluded into thinking that open data is going to provide a steady stream of obvious “gotcha moments” where government malfeasance is discovered, but I am hopeful that government data can arm citizens with information that the government is using to inform its decisions so that they can better challenge, and ultimately help hold accountable, said government.

Here is where I think Evgeny’s comments on the problem with the discourse around “open” are valid. Open Government and Open Data should not be used interchangeably. And this is an issue Open Government and Open Data advocates wrestle with. Indeed, I’ve seen a great deal of discussion and reflection come as a result of papers such as this one.

Third, the arguments around StatsCan all feel deeply problematic. I say this as the person who wrote the first article (that I’m aware of) about the long form census debacle in a major media publication and who has been consistently and continuously critical of it. This government has had a dislike for Statistics Canada (and evidence) long before open data was in their vocabulary, to say nothing of a policy interest. StatsCan was going to be a victim of dramatic cuts regardless of Canada’s open data policy – so it is misleading to claim that one would “much rather have a fully-staffed StatsCan charging for data than a half-staffed StatsCan providing it for free.” (That quote comes from Slee’s follow-up post, here.) That was never the choice on offer. Indeed, even if it had been, it wouldn’t have mattered. The total cost of making StatsCan data open is said to have been $2 million; this is a tiny fraction of the payroll costs of the 2,500 people they are looking to lay off.

I'd actually go further than Slee here, and repeat something I say all the time: data is political. There are those who, naively, believed that making data open would depoliticize policy development. I hope there are situations where this might be true, but I've never taken that for granted or assumed as much: quite the opposite. In a world where data increasingly matters, it is increasingly going to become political. Very political. I've been saying this to the open data community for several years, and indeed it was a warning I made in the closing part of my keynote at the Open Government Data Camp in 2010. All this has, in my mind, little to do with open data. If anything, having data made open might increase the number of people who are aware of what is, and is not, being collected and used to inform public policy debates. Indeed, if StatsCan had made its data open years ago it might have had a larger constituency to fight on its behalf.

Finally, I agree with the Nat Torkington quote in the blog post:

Obama and his staff, coming from the investment mindset, are building a Gov 2.0 infrastructure that creates a space for economic opportunity, informed citizens, and wider involvement in decision making so the government better reflects the community’s will. Cameron and his staff, coming from a cost mindset, are building a Gov 2.0 infrastructure that suggests it will be more about turning government-provided services over to the private sector.

Moreover, it is possible for a policy to have two different possible drivers. It can even have multiple contradictory drivers simultaneously. In Canada, my assessment is that the government doesn't have this level of sophistication around its thinking on this file, a conclusion I more or less wrote when assessing their Open Government Partnership commitments. I have no doubt that the Conservatives would like to turn government-provided services over to the private sector, and open data has so far not been part of that strategy. In either case, there is, in my mind, a policy infrastructure that needs to be in place to pursue either of these goals (such as having a data governance structure in place). But from a more narrow open data perspective, my own feeling is that making the data open has benefits for public policy discourse, public engagement and the economy. Indeed, making more government data available may enable citizens to fight back against policies they feel are unacceptable. You may not agree with all the goals of the Canadian government – as someone who has written at least 30 op-eds in various papers outlining problems with various government policies, neither do I – but I see the benefits of open data as real and worth pursuing, so I advocate for it as best I can.

So in response to the opening arguments about the open data movement…

It’s not a movement, at least in any reasonable political or cultural sense of the word.

We will have to agree to disagree. My experience is quite the opposite. It is a movement. One filled with naive people, with skeptics, with idealists focused on accountability, developers hoping to create apps, conservatives who want to make government smaller and progressives who want to make it more responsive and smarter. There was little in the post that persuaded me there wasn’t a movement. What I did hear is that the author didn’t like some parts of the movement and its goals. Great! Please come join the discussion; we’d love to have you.

It’s doing nothing for transparency and accountability in government,

To say it is doing nothing for transparency seems problematic. I need only cite one data set now open to say that isn't true. And certainly the publication of aid data, procurement data, voting records and the Hansard are examples of places where it may be making government more transparent and accountable. What I think Slee is claiming is that open data isn't transforming the government into a model of transparency and accountability, and he's right. It isn't. I don't think anyone claimed it would. Nor do I think the public has been persuaded that because it does open data, the government is somehow open and transparent. These are not words the Canadian public associates with this government no matter what it does on this file.

It’s co-opting the language of progressive change in pursuit of what turns out to be a small-government-focused subsidy for industry.

There are a number of sensible, critical questions in Slee’s blog post. But this is a ridiculous charge. Prior to the data being open, you had an asset that was paid for by taxpayer dollars, then charged for at a premium that created a barrier to access. Of course, this barrier was easiest to surmount for large companies and wealthy individuals. If there was a subsidy for industry, it was under the previous model, as it effectively had the most regressive tax for access of any government service.

Indeed, probably the biggest beneficiaries of open data so far have been Canada's municipalities, which have been able to gain access to much more data than they previously could, and have saved a significant amount of money (Canadian municipalities are chronically underfunded). And of course, when looking at the most downloaded data sets from the site, it would appear that non-profits and citizens are making good use of them. For example, the 6th most downloaded was the Anthropogenic disturbance footprint within boreal caribou ranges across Canada, used by many environmental groups; number 8 was weather data; 9th was Sales of fuel used for road motor vehicles, by province and territory, used most frequently to calculate greenhouse gas emissions; and 10th the Government of Canada Core Subject Thesaurus – used, I suspect, to decode the machinery of government. Most of the other top downloaded data sets related to immigration, used, it appears, to help applicants. Hard to see the hand of big business in all this, although if open data helped Canada's private sector become more efficient and productive, I would hardly complain.

If you're still with me, thank you – I know that was a long slog.