
Toronto Star Op-Ed: Muzzled Scientists, Open Government and the Limits of Rules

I’ve a piece in today’s Toronto Star, “Rules are no substitute for cultivating a culture of open government,” about the Information Commissioner’s decision to investigate the muzzling of Canadian scientists.

Some choice paragraphs:

The actions of the information commissioner are to be applauded; what is less encouraging are the limits of her ability to resolve the problem. The truth is that openness, transparency and accountability cannot be created by the adoption of new codes or rules alone.

This is because even more than programs and regulations, an open government is the result of culture, norms and leadership. And here the message — felt as strongly by government scientists as any other public servants — is clear. Public servants are allowed less and less to have a perspective, to say nothing of the ability to share that perspective.

and on ways I think this has consequences that impact the government’s agenda directly:

This breakdown in culture has consequences — some of which may impact the government’s most important priorities. Take, for example, the United States’ preoccupation with Canada’s environmental record in general and its specific concerns about the oilsands in regards to the proposed Keystone XL pipeline. The government has spent the last month trying to burnish its environmental record in anticipation of the decision. And yet, it is amazing how few in Ottawa recognize the direct link between the openness around which government scientists can speak about their work and the degree of trust that Canadians — as well as our allies — have in our capacity to protect the environment.

I hope you’ll give it a read.

Ontario's Open Data Policy: The Good, The Bad, The Ugly and the (Missed?) Opportunity

Yesterday the province of Ontario launched its Open Data portal. This is great news and is the culmination of a lot of work by a number of good people. The real work behind getting an open data program launched is, by and large, invisible to the public, but it is essential – and so congratulations are in order for those who helped out.

Clearly this open data portal is in its early stages – something the province is upfront about. As a result, I’m less concerned with the number of data sets on the site (which, however, needs to, and should, grow over time). Hopefully the good people in the government of Ontario have some surprises for us around interesting data sets.

Nor am I concerned about the layout of the site (which needs to, and should, improve over time – for example, once you start browsing the data you end up on this URL with no obvious path back to the open data landing page, which makes navigating the site hard).

In fact, unlike some, I find any shortcomings in the website downright encouraging. Hopefully it means that speed, iteration and an attitude of shipping early have won out over the media-obsessed, rigid, risk-averse approach governments all too often take. Time will tell if my optimism is warranted.

What I do want to focus on is the license, since this is a core piece of infrastructure for an open data initiative. Indeed, it is the license that determines whether the data is actually open or closed. And I think we should be less forgiving of errors in this regard than in the past. It was one thing if you launched in the early days of open data, two or four years ago. But we aren’t in the early days anymore. There are over 200 government open data portals around the world. We’ve crossed the chasm, people. Not getting the license right is not a “beta” mistake any more. It’s just a mistake.

So what can we say about the Ontario Open Data license?

First, the Good

There are lots of good things to be said about it. It clearly keys off the UK’s Open Government License, much like BC’s license did, as does the proposed Canadian Open Government License. This means that, above all, it is written in plain English and is easily understood. In addition, the general format is familiar to many people interested in open data.

The other good thing about the license (pointed out to me by the always sharp Jason Birch) is that its attribution clause is softer than the UK, BC or even the proposed Federal Government license. Ontario uses the term “should” whereas the others use the term “must.”

Sadly, this one improvement pales in comparison to some of the problems and, most importantly, the potentially lost opportunity I urgently highlight at the bottom of this post.

The Bad

While this license does have many good qualities inherited from the UK model, it does suffer from some major flaws. The most notable comes in this line:

Ontario does not guarantee the continued supply of the Datasets, updates or corrections or that they are or will be accurate, useful, complete, current or free and clear of any possible third party copyright, moral right, other intellectual property right or other claim.

Basically this line kills the possibility that any business, non-profit or charity will ever use this data in any real sense. Hobbyists, geeks and academics will of course use it, but this provision is deeply flawed.

Why?

Well, let me explain what it means. This says that the government cannot be held accountable for only releasing data it has the right to release. For example: say the government has software that tracks road repair data and it starts to release it and, happily, all sorts of companies and app developers use it to help predict traffic and do other useful things. But then, one day, the vendor that provided that road repair tracking software suddenly discovers in the fine print of the contract that they, not the government, own that data! Well! All those companies, non-profits and app developers are suddenly using proprietary data, not (open) government data. And the vendor would be entirely within its rights to either sue them or demand a license fee in exchange for letting them continue to use the data.

Now, I understand why the government is doing this. It doesn’t want to be liable if such a mistake is made. But if the government doesn’t want to absorb the risk, that risk doesn’t magically disappear – it transfers to the data user. And users have no way of managing that risk! They don’t know what the contracts say and what the obligations are; the party best positioned to figure that out is the government. Essentially this line transfers a risk to the party (in this case the user) that is least able to manage it. You are left asking yourself: what business, charity or non-profit is going to invest hundreds of thousands of dollars (or more), and people’s time, to build a product, service or analysis around an asset (government data) that it might suddenly discover it doesn’t have the right to use?

The government is the only organization that can clear the rights. If it is unwilling to do so, then I think we need to question whether this is actually open data.

The Ugly

But of course the really ugly part of the license (which caused me to go on a bit of a Twitter rant) comes early. Here it is:

If you do any of the above you must ensure that the following conditions are met:

  • your use of the Datasets causes no harm to others.

Wowzers.

This clause is so deeply problematic it is hard to know where to begin.

First, what is the definition of harm? If I use open data from the Ontario government to rate hospitals, and some hospitals are sub-standard, am I “harming” the hospital? Its workers? The community? The Ministry of Health?

So then who decides what the definition is? Well, since the Government of Ontario is the licensor of the data… it would seem to suggest that they do. Whatever the government of the day wants to decree a “harm” suddenly becomes legit. Basically this clause could be used to strip many users – particularly those interested in using the data as a tool for accountability – of their right to use the data, simply because it makes the licensor (e.g. the government) uncomfortable.

A brief history lesson here for the lawyers who inserted this clause. Back in March of 2011, when the Federal Government launched data.gc.ca, they had a similar clause in their license. It read as follows:

“You shall not use the data made available through the GC Open Data Portal in any way which, in the opinion of Canada, may bring disrepute to or prejudice the reputation of Canada.”

While the language is a little more blunt, its effect was the same. After the press conference launching the site I sat down with Stockwell Day (who was the Minister responsible at the time) for 45 minutes and walked him through the various problems with their license.

After our conversations, guess how long it took for that clause to be removed from the license? 3 hours.

If this license is going to be taken seriously, that clause is going to have to go. Otherwise, it risks becoming a laughing stock and a case study of what not to do in Open Government workshops around the world.

(An aside: What was particularly nice was that Minister Day personally called my cell phone to let me know that he’d removed that clause a few hours after our conversation. I’ve disagreed with Day on many, many, many things, but was deeply impressed by his knowledge of the open data file and his commitment to its ideals. Certainly his ability to change the license represents one of the fastest changes to policy I’ve ever witnessed.)

The (Missed?) Opportunity

What is ultimately disappointing about the Ontario license, however, is that it was never needed. Why every jurisdiction feels the need to invent its own license is always beyond me. What, beyond the softening of the attribution clause, has the Ontario license added to the open data world? Not much that I can see. And, as I’ve noted above, in many ways it is a step back.

You know what data users would really like? A common license. That would make it MUCH easier to use data from the federal government, the government of Ontario and the Toronto City government all at the same time and not worry about compatibility issues and whether you are telling the end user the right thing or not. In this regard the addition of another license is a major step backwards. Yes, let me repeat that for other jurisdictions thinking about doing open data: the addition of another new license is a major step backwards.

Given that the Federal Government has proposed a new Open Government License that is virtually identical to this license but has less problematic language, why not simply use it? It would make the lives of the people this license is supposed to enable – the policy wonks, the innovators, the app developers, the data geeks – so much easier.

That opportunity still exists. The Government of Ontario could still elect to work with the Feds on a common license. Indeed, given that the Ontario Open Data portal says they are asking for advice on how to improve this program, I implore – indeed beg – that you consider doing that. It would be wonderful if we could move to a single license in this country, and a partnership between the Federal Government and Ontario might give such an initiative real momentum and weight. If not, into the balkanized abyss of a thousand licenses we will stumble.

 

The UK's Digital Government Strategy – Worth a Peek

I’ve got a piece up on TechPresident about the UK Government’s Digital Strategy which was released today.

The strategy (and my piece!) are worth checking out. They are saying a lot of the right things – useful stuff for anyone in an industry or sector that has been conservative vis-à-vis online services (I’m looking at you, governments and banking).

As I note in the piece… there is reason to expect better:

The second is that the report is relatively frank, as far as government reports go. The website that introduces the three reports is emblazoned with an enormous title: “Digital services so good that people prefer to use them.” It is a refreshing title that amounts to a confession I’d like to see from more governments: “sorry, we’ve been doing it wrong.” And the report isn’t shy about backing that statement up with facts. It notes that while the proportion of Internet users who shop online grew from 74 percent in 2005 to 86 percent in 2011, only 54 percent of UK adults have used a government service online. Many of those have only used one.

Of course the real test will come with execution. The BC Government, the White House and others have written good reports on digital government, but it is rolling it out that is the tricky part. The UK Government has pretty good cred as far as I’m concerned, but I’ll be watching.

You can read the piece here – hope you enjoy!

Doing Government Websites Right

Today, I have a piece over on Tech President about how the new UK government website – Gov.uk – does a lot of things right.

I’d love to see more governments invest in two of the key ingredients that made the website work – good design and better analytics.

Sadly, on the design front, many politicians see design as a luxury and fail to understand that good design doesn’t just make things look better; it makes websites (and other things) easier to use and so reduces other costs – like help desk costs. I can personally attest to this. Despite being adept at using the web, I almost always call the help desk for federal government services because I find federal government websites virtually unnavigable. Often these websites transform my personality from happy, affable guy into someone who teeters between grumpy/annoyed on the mild side and raving, crazy lunatic on the other as I fail to grasp what I’m supposed to do next.

If I have to choose between wasting 45 minutes on a website getting nowhere versus calling a help line, waiting for 45 minutes on hold while I do other work and then getting my problem resolved… I go with the latter. It’s not a good use of anyone’s time, but it is often the best option at the moment.

On the analytics front, many governments simply lack the expertise to use something as simple as Google Analytics, or, worse, are hamstrung by privacy and procurement rules that keep them from using services that would enable them to know how their users are (or are not) using their websites.

Aside from gov.uk, another great example of where these two ingredients came together is over at Honolulu Answers. Here a Code for America team worked with the city to see what pages (e.g. services) residents were actually visiting and then prioritized those. In addition, they worked with staff and citizens to construct answers to commonly asked questions. I suspect a simple website like this could generate real savings on the city’s help desk costs – to say nothing of happier residents and tourists.

At some risk of pressing this point too heavily, I hope that my TechPresident piece (and other articles about gov.uk) gets widely read by public servants, managers and, of course, politicians (hint: the public wants easier access to services, not easier access to photos and press releases about you). I’m especially hoping the good people at the Treasury Board Secretariat in the Canadian Federal government read it, since the old Common Look and Feel standard sadly ensured that Canadian government websites are particularly terrible when it comes to usability.

The UK has shown how national governments can do better. Let’s hope others follow their lead.

 

How Government should interact with Developers, Data Geeks and Analysts

Below is a screen shot from the OpenDataBC Google group from about two months ago. I meant to blog about this earlier but life got in the way. For me, this is a perfect example of how many people in the data/developer/policy world would probably like to interact with their local, regional or national government.

A few notes on this interaction:

  • I occasionally hear people try to claim that governments are not responsive to requests for data sets. Some aren’t. Some are. To be fair, this was not a request for the most controversial data set in the province. But it was a request. And it was responded to. So clearly there are some governments that are responsive. The question is figuring out which ones are, why they are, and seeing if we can export that capacity to other jurisdictions.
  • This interaction took place in a Google group – so the whole context is social and norm-driven. I love that public officials in British Columbia, as well as with the City of Vancouver, are checking in every once in a while on Google groups about open data, contributing to conversations and answering questions that citizens have about government, policies and open data. It’s a pretty responsive approach. Moreover, when people are not constructive, it is the group that tends to moderate the behaviour, rather than some leviathan.
  • Yes, I’ve blacked out the email/name of the public servant. This is not because I think they’d mind being known, or because they shouldn’t be known, but because I just didn’t have a chance to ask for permission. What’s interesting is that this whole interaction is public and the official was both doing what their government wanted and compliant with all social media rules. And yet I’m blacking it out, which is a sign of how messed up current rules and norms make citizens’ relationships with the public officials they interact with online – I’m worried about doing something wrong by telling others about a completely public action. (And to be clear, the province of BC has really good and progressive rules around these types of things.)
  • Yes, this is not the be-all and end-all. But it’s a great example of a small thing being done right. It’s nice to be able to show that to other government officials.

 

Containers, Facebook, Baseball & the Dark Matter around Open Data (#IOGDC keynote)

Below is an extended blog post that summarizes the keynote address I gave at the World Bank/Data.gov International Open Government Data Conference in Washington DC on Wednesday, July 11th. This piece is cross-posted over at the WeGov blog on TechPresident, where I also write on transparency, technology and politics.

Yesterday, after spending the day at the International Open Government Data Conference at the World Bank (co-hosted by data.gov), I left both upbeat and concerned. Upbeat because of the breadth of countries participating and the progress being made.

I was worried, however, because of the type of conversation we are having and how it might limit both the growth of our community and the impact open data could have. Indeed, as we talk about technology and how to do open data, we risk missing the real point of the whole exercise – which is about use and impacts.

To drive this point home I want to share three stories that highlight the challenges I believe we should be talking about.

Challenge 1: Scale Open Data

In 1956 Ideal-X, the ship pictured to the left, sailed from Newark to Houston and changed the world.

Confused? Let me explain.

As Marc Levinson chronicles in his excellent book The Box, the world in 1956 was very different from our world today. Global trade was relatively low. China was a long way off from becoming the world’s factory floor. And it was relatively unusual for people to buy goods made elsewhere. Indeed, as Levinson puts it, the cost of shipping goods was “so expensive that it did not pay to ship many things halfway across the country, much less halfway around the world.” I’m a child of the second era of globalization. I grew up in a world of global transport and shipping. The world before all of that, which Levinson is referring to, is actually foreign to me. What is amazing is how much of it has just become a basic assumption of life.

And this is why Ideal-X, the aforementioned ship, is so important. It was the first cargo container ship (as we understand containers today). Its trip from Newark to Houston marked the beginning of a revolution because containers slashed the cost of shipping goods. Before Ideal-X, the cost of loading cargo onto a medium-sized cargo ship was $5.83 per ton; with containers, the cost dropped to 15.8 cents. Yes, the word you are looking for is: “wow.”

You have to understand that before containers, loading a ship was a lot more like packing a mini-van for a family vacation to the beach than the orderly process of stacking very large Lego blocks on a boat. Before containers, literally everything had to be hand packed, stored and tied down in the hull (see picture to the right).

This is a little bit what our open data world looks like today. The people who are consuming open data are like digital longshoremen. They have to look at each open data set differently, unpack it accordingly and figure out where to put it, how to treat it and what to do with it. Worse, when looking at data from across multiple jurisdictions, it is often much like cargo going around the world before 1956: a very slow and painful process (see man on the right).

Of course, the real revolution in container shipping happened in 1966, when the size of containers was standardized. Within a few years containers could move pretty much anywhere in the world, from truck to train to boat and back again. In the following decades global shipping trade increased by 2.5 times the rate of economic output. In other words… it exploded.

Geek sidebar: For techies, think of shipping containers as the TCP/IP packet of globalization. TCP/IP standardized the packet of information that flowed over the network so that data could move from anywhere to anywhere. Interestingly, like containers, what was in the packet was actually not relevant and didn’t need to be known by the person transporting it. But the fact that it could move anywhere created scale and allowed for exponential growth.

What I’m trying to drive at is that, when it comes to open data, the number of open data sets that get published is no longer the critical metric. Nor is the number of open data portals. We’ve won. There are more and more. The marginal political and/or persuasive benefit of adding another open data portal or data set won’t change the context anymore. I want to be clear – this is not to say that more open data sets and more open data portals are not important or valuable – from a policy and programmatic perspective, more is much, much better. What I am saying is that having more isn’t going to shift the conversation about open data any more. This is especially true if data continues to require large amounts of work and time for people to unpack and understand, over and over again, across every portal.

In other words, what IS going to count is how many standardized open data sets get created. This is what we SHOULD be measuring. The General Transit Feed Specification revolutionized how people engaged with public transit because the standard made it so easy to build applications and do analysis around it. What we need to do is create similar standards for dozens, hundreds, thousands of other data sets so that we can drive new forms of use and engagement. More importantly, we need to figure out how to do this without relying on a standards process that takes 8 to 15 to infinite years to decide on said standard. That model is too slow to serve us, and so re-imagining/reinventing that process is where the innovation is going to shift next.
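To make that concrete, here is a minimal sketch of what a shared standard buys you. Because every agency that publishes GTFS uses the same file names and columns, the same few lines of code work on any city’s feed – the feed URL below is a placeholder, not a real endpoint:

```python
# A minimal sketch of why standards matter: routes.txt and trips.txt
# are required by the GTFS spec, so this analysis is portable across
# every agency that publishes a feed. FEED_URL is hypothetical.
import csv
import io
import urllib.request
import zipfile

FEED_URL = "https://example.com/transit/gtfs.zip"  # placeholder feed

def busiest_routes(feed_url, top_n=5):
    """Rank routes by number of scheduled trips in a GTFS feed."""
    with urllib.request.urlopen(feed_url) as resp:
        archive = zipfile.ZipFile(io.BytesIO(resp.read()))

    def rows(name):
        # Every GTFS file is a CSV with spec-defined column names.
        return csv.DictReader(io.TextIOWrapper(archive.open(name), "utf-8-sig"))

    names = {r["route_id"]: r.get("route_short_name") or r.get("route_long_name") or r["route_id"]
             for r in rows("routes.txt")}
    counts = {}
    for trip in rows("trips.txt"):
        counts[trip["route_id"]] = counts.get(trip["route_id"], 0) + 1
    ranked = sorted(counts.items(), key=lambda kv: kv[1], reverse=True)
    return [(names[rid], n) for rid, n in ranked[:top_n]]

for route, trips in busiest_routes(FEED_URL):
    print(f"{route}: {trips} scheduled trips")
```

Swap in another agency’s feed and nothing else changes. That portability is exactly what containerizing every other kind of data set would look like.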

So let’s stop counting the number of open data portals and data sets, and start counting the number of common standards – because that number is really low. More critically, if we want to experience the kind of explosive growth in use that global trade and shipping experienced after the rise of the standardized container, then our biggest challenge is clear: We need to containerize open data.

Challenge 2: Learn from Facebook

One of the things I find most interesting about Facebook is that everyone I’ve talked to about it notes how the core technology that made it possible was not particularly new. It wasn’t that Zuckerberg leveraged some new code or invented a new, better coding language. Rather, it was that he accomplished a brilliant social hack.

Part of this was luck: the public had come a long way and was much more willing to do social things online in 2004 than it had been even two years earlier with sites like Friendster. Or, more specifically, young people who’d grown up with internet access were willing to do things and imagine using online tools in ways those who had not grown up with those tools wouldn’t or couldn’t. Zuckerberg, and his users, had grown up digital and so could take the same tools everyone else had and do something others hadn’t imagined, because their assumptions were just totally different.

My point here is that, while it is still early, I’m hoping we’ll soon have the beginnings of a cohort of public servants who’ve “grown up data” – who, despite their short careers in the public service, have matured in a period where open data has been an assumption, not a novelty. My hope and suspicion is that this generation of public servants is going to think about open data very differently than many of us do. Most importantly, I’m hoping they’ll spur a discussion about how to use open data – not just to share information with the public – but to drive policy objectives. The canonical opportunity for me around this remains restaurant inspection data, but I know there are many, many more.

What I’m trying to say is that the conferences we organize have got to talk less and less about how to get data open and start talking more about how we use data to drive public policy objectives. I’m hoping the next International Open Government Data Conference will have an increasing number of presentations on how citizens, non-profits and other outsiders are using open data to drive their agendas, and how public servants are using open data strategically to drive to an outcome.

I think we have to start fostering that conversation by next year at the latest, and that this conversation, about use, has to become core to everything we talk about within two years, or we will risk losing steam. This is why I think the containerization of open data is so important, and why the White House’s digital government strategy is so important, since it makes internal use core to the government’s open data strategy.

Challenge 3: The Culture and Innovation Challenge.

In May 2010 I gave this talk on Open Data, Baseball and Government at the Gov 2.0 Summit in Washington DC. It centered around the story outlined in the fantastic book Moneyball by Michael Lewis, which traces how a baseball team – the Oakland A’s – used a new analysis of player stats to ferret out undervalued players. This enabled them to win a large number of games on a relatively small payroll. Consider the numbers to the right.

I mean, if you are the owner of the Texas Rangers, you should be pissed! You are paying 250% of Oakland’s payroll for 25% fewer wins. If this were a government chart, where “wins” were potholes found and repaired, and “payroll” was costs… everyone in the World Bank would be freaking out right now.

For those curious, the analytical “hack” was recognizing that the most valuable thing a player can do on offense is get on base. This is because it gives them an opportunity to score (+), but it also means you don’t burn one of your three “outs” that would end the inning and the chance for other players to score. The problem was that, to measure the offensive power of a player, most teams were looking at batting averages (along with a lot of other weird, totally non-quantitative stuff), which ignore the possibility of getting walked – which lets you get on base without hitting the ball!
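A toy example, with invented stat lines for two hypothetical players, shows how the two metrics diverge:

```python
# Invented numbers only, to show how batting average hides walks
# while on-base percentage counts them.
def batting_average(hits, at_bats):
    return hits / at_bats

def on_base_pct(hits, walks, hit_by_pitch, at_bats, sac_flies):
    # OBP = (H + BB + HBP) / (AB + BB + HBP + SF)
    return (hits + walks + hit_by_pitch) / (at_bats + walks + hit_by_pitch + sac_flies)

# Player A hits a lot but rarely walks; Player B hits less but walks often.
players = {
    "A": dict(hits=150, walks=20, hit_by_pitch=2, at_bats=500, sac_flies=5),
    "B": dict(hits=130, walks=90, hit_by_pitch=5, at_bats=480, sac_flies=5),
}

for name, p in players.items():
    ba = batting_average(p["hits"], p["at_bats"])
    obp = on_base_pct(**p)
    print(f"Player {name}: batting average {ba:.3f}, on-base pct {obp:.3f}")

# Player A: batting average 0.300, on-base pct 0.326
# Player B: batting average 0.271, on-base pct 0.388
# By batting average A looks better; by on-base percentage B is the
# more valuable (and, in Oakland's case, cheaper) hitter.
```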

What’s interesting, however, is that the original insight – that people were using the wrong metrics to assess baseball players – first happened decades before the Oakland A’s started to use it. Indeed, it was a nighttime security guard with a strong mathematics background and an obsession with baseball who first began pointing this stuff out.

The point I’m making is that it took 20 years for a manager in baseball to recognize that there was better evidence and data they could be using to make decisions. TWENTY YEARS. And that manager was hated by all the other managers, who believed he was ruining the game. Today this approach to assessing baseball is commonplace – everyone is doing it – but notice how the problem of using baseball’s “open data” to create better outcomes was never just an accessibility issue. Once that was resolved, the bigger challenge centered around culture and power. Those with the power had created a culture in which new ideas – ideas grounded in evidence but that were disruptive – couldn’t find an audience. Of course, there were structural issues as well; many people had jobs that depended on not using the data and on relying instead on their “instincts.” But I think the cultural issue is a significant one.

So we can’t expect that we are going to go from open data portal today to better decisions tomorrow. There is a good chance that some of the ideas the data causes us to think will be so radical and challenging that either the ideas, the people who champion them, or both, could get marginalized. On the upside, I feel like I’ve seen some evidence to the contrary in cities like New York and Chicago, but the risk is still there.

So what are we going to do to ensure that the culture of government is one that embraces the challenges to our thinking and assumptions, so that it doesn’t require 20 years for us to make progress? This is a critical challenge for us – and it is much, much bigger than open data.

Conclusion: Focus on the Dark Matter

I’m deeply indebted to my friend – the brilliant Gordon Ross – who put me on to this idea the other day over tea.


Do you remember the briefcase in Pulp Fiction? The one that glowed when opened? That the characters were all excited about, but you never knew what was inside it? It’s called a MacGuffin. I’m not talking about the briefcase per se. Rather, I mean the object in a story that all the characters are obsessed with, but that you – the audience – never find out what it is, and frankly, it really isn’t that important to you. In Pulp Fiction, I remember reading that the briefcase is allegedly Marsellus Wallace’s soul. But ultimately, it doesn’t matter. What matters is that Vincent Vega, Jules Winnfield and a ton of other characters think it is important, and that drives the action and the plot forward.

Again – let me be clear – open data portals are our MacGuffin device. We seem to care A LOT about them. But trust me, what really matters is everything that happens around them. What makes open data important is not a data portal. A portal is a necessary prerequisite, but it’s not the end, just the means. We’re here because we believe that the things open data can let us and others do matter. The open data portal was only ever a MacGuffin device – something that focused our attention and helped drive action so that we could do the other things – the dark matter that lies all around the MacGuffin device.

And that is what brings me back to our three challenges. Right now, the debate around open data risks becoming too much like a Pulp Fiction conference in which all the panels talk about the briefcase. Instead we should be talking more and more about all the action – the dark matter – taking place around the briefcase. Because that is what really matters. For me, the three things that matter most are what I’ve mentioned in this talk:

  • standards – which will let us scale; I believe strongly that the conversation is going to shift from portals to standards;
  • strategic use – starting us down the path of learning how open data can drive policy outcomes; and
  • culture and power – recognizing that open data is going to surface a lot of reasons why governments don’t want to engage in data-driven decision making.

In other words, I want to be talking about how open data can make the world a better place, not about how we do open data. That conversation still matters, open data portals still matter, but the path forward around them feels straightforward, and if they remain the focus we’ll be obsessing about the wrong thing.

So here’s what I’d like to see in the future from our open data conferences: we’ve got to stop talking about how to do open data. This is because all of our efforts here, everything we are trying to accomplish… it has nothing to do with the data. What I think we want to be talking about is how open data can be a tool to make the world a better place. So let’s make sure that is the conversation we have.

Should we Start a Government as Platform Business Association?

I have an idea.

I want to suggest starting a community of disruptive software companies that are trying to sell products to local and regional governments. I know we can make cities better, more participatory, more accessible, to say nothing of saving them money. But to be effective I think we need a common message – an association that conveys why this disruption is in government’s interest, and how it will help them.

Here’s why.

Last year I, along with some friends, incorporated a small company with an innovative approach to messaging that helps cities be smarter, communicate better and serve their citizens more effectively. Where we deploy, citizens love us. It’s a blast to do. (You can read more about our company here – if you work for a local or regional government, I’d love to talk to you.)

We also don’t think we are alone. Indeed, we like to think we are part of a new breed of start-ups – companies we respect like SeeClickFix, Azavea and Citizenvestor – whose DNA is influenced by the likes of Tim O’Reilly and others who talk about government as a platform. Companies that believe there are hyper-low-cost ways to make local government and services more transparent, enable citizen participation and facilitate still more innovation.

Happily, we’ve had a number of early successes; several cities have signed on with us as paying customers. This is no small feat in the municipal space – governments, especially local governments, tend to be risk averse. Selling software as a service (SaaS) in a product category that previously didn’t exist can be a challenge. But it pales in comparison to the two real challenges we confront:

1)   Too Cheap to Buy

For many cities we are, weirdly, too cheap to buy. Our solution tends to cost a couple of thousand dollars a year. I’m talking four digits. In many municipalities, this price breaks the procurement model, which is designed for large purchases like heavy equipment or a large IT implementation. We’re too expensive for petty cash, too cheap for a formal process. I’ve even had a couple of experiences where a city has spent significantly more money in staff time talking to and evaluating us than it would have cost to simply deploy us for a year and try us out. We need a smarter context for talking to procurement specifically, and local government in general. That might be easier in a herd.

2)   Marketing

As a company that tries to keep its product as cheap as possible, we have a limited budget to invest in marketing and sales. We could charge more to pay for that overhead, but we’d prefer to be cheaper, mostly because we don’t believe taxpayers should pay for the parts of our business that don’t really give them value. In short, we need a better way of letting cities know we exist – one that is cheap and allows our product’s price to reflect its value, not an advertising budget.

As I look at our peer group of companies, I have to believe they share similar challenges. So why don’t we band together? A group of small companies could potentially do a virtual trade show that not only could attract more clients than any of us could on our own, but would attract the right clients: local governments that are hungry for next generation online services.

So who would I imagine being part of this association? I don’t think the criteria are complex, so here are some basic ideas that come to mind:

  • Software focused
  • Disruptively low or hyperlow-cost: here I imagine the cost is under 35 cents per citizen per year.
  • Following a SaaS or open source model
  • You keep the barriers to entry low – any operational data your system creates is open and available to staff and, if requested by the city, to citizens as well
  • Citizen-centric: In addition to open data, your service should, whenever relevant or possible, make it as easy as possible for citizens to use, engage or participate.

Is it the Government as Platform Business Association? Or the Gov 2.0 Software Association? Or maybe the League of Awesomely Lean Gov Start-Ups? I don’t know. But I can imagine a shared branded site; maybe we pool money to do some joint marketing. I love the idea of a package deal – get Recollect, SeeClickFix and OpenTreeMap bundled at a discount! Maybe there is even a little logo that companies who meet the criteria and participate in the group could paste on their websites (no worries, I’m not a designer and am not attached to the mock-up below).


The larger point here is that the next generation of civic start-ups – companies that can do software much, much cheaper while enhancing the experience for city staff and residents – have an education challenge on our hands. Cities need to learn that there are emerging radically small, lean solutions to some problems. I’m not sure this is the right answer. I know this proposal creates a lot of unanswered questions, but it is an idea I wanted to throw out there.

If you have a company that fits the bill I’d love to hear from you. And if you work for a local or regional government and think this would be helpful, I’d love to hear about that as well.

The End of the World: The State vs. the Internet

Last weekend at FooCamp, I co-hosted a session titled “The End of the World: Will the Internet Destroy the State, or Will the State Destroy the Internet?” What follows are the ideas I opened with during my intro to the session and some additional thoughts I’ve had and that others shared during the conversation. To avoid some confusion, I’d also like to clarify: a) I don’t claim that these questions have never been raised before; I mostly hope that this framing can generate useful thought and debate; and b) I don’t believe these are the only two or three possible outcomes; it was just an interesting way of framing some poles so as to generate good conversation.

Introduction

A while back, I thought I saw a tweet from Evgeny Morozov that said something to the effect of: “You don’t just go from printing press to Renaissance to iPad; there are revolutions and wars in between you can’t ignore.” Since I can’t find the tweet, maybe he didn’t say it or I imagined it… but it sparked a line of thinking.

Technology and Change

Most often, when people think of the printing press, they think of its impact on the Catholic Church – about how it enabled Martin Luther’s complaints to go viral and how the localization of the Bible cut out the need for a middleman – the priest – to connect and engage with God. But if the printing press undermined the Catholic Church, it had the opposite impact on the state. To be fair, heads of state took a beating (see the French Revolution et al.), but the state itself was nimbler and made good use of the technology. Indeed, it is worth noting that the modern notion of the nation state was not conceivable without the printing press. The press transformed the state – scaling up its capacity to demand loyalty from citizens and to mobilize resources, which, in turn, had an impact on how states related (and fought) with one another.

In his seminal book Imagined Communities, Benedict Anderson outlined how the printing press allowed the state to standardize language and history. In other words, someone growing up in Marseilles 100 years before the printing press probably had a very different sense of history and spoke a markedly different dialect of French than someone living in Paris during the same period. But the printing press (and more specifically, those who controlled it) allowed a dominant discourse to emerge (in this case, likely the Parisian one). Think standardized dictionaries, school textbooks and curricula, to say nothing of history and entertainment. This caused people who might never have met to share a common imagined history, language and discourse. Do not underestimate the impact this had on people’s identity. As this wonderful quote from the book states: “Ultimately it is this fraternity that makes it possible, over the past two centuries, for so many millions of people, not so much to kill, as willingly to die for such limited imaginings.” In other words, states could now fully dispense with feudal middle managers and harness the power of larger swaths of population directly – a population that might never actually meet, but could nonetheless feel connected to one another. The printing press thus helped create the modern nation state by providing a form of tribalism at scale: what we now call nationalism. This was, in turn, an important ingredient for the wars that dominated the late 19th and early 20th century – think World War I and World War II. This isn’t to say without the printing press, you don’t get war – we know that isn’t true – but the type of total war between 20th century nation states does have a direct line to the printing press.

So yes, the techno-utopian world of: printing press -> Renaissance -> iPad is not particularly accurate.

What you do get is: printing press -> Renaissance -> state evolution -> destabilization of international order -> significant bloodshed -> re-stabilization of international system -> iPad.

I raise all this because, if this is the impact the printing press had on the state, it prompts a new question: What will be the impact of the internet on the state? Will the internet be a technology the state can harness to extract more loyalty from its citizens… or will the internet destroy the imagined communities that make the state possible, to be replaced by a more nimble, disruptive organization better able to survive the internet era?

Some Scenarios

Note: again, these scenarios aren’t absolutes or the only possibilities, they are designed to raise questions and provoke thinking.

The State Destroys the Internet

One possibility is that the state is as adaptive as capitalism. I’m always amazed at how capitalism has evolved over the centuries. From mercantilism to free market to social market to state capitalism, as a meme it readily adapts to new environments. It may be that the state is the same – sufficiently flexible to adapt to new conditions. Consequently, one can imagine that the state grabs sufficient control of the internet to turn it into a tool that at best enhances – and at worst doesn’t threaten – citizens’ connection to it. Iran, with its attempt to build a state-managed internal network that will allow it to closely monitor its citizens’ every move, is a scary example of the former. China – with its great firewall – may be an example of the latter. But one need not pick on non-western states.

And a networked world will provide states – especially democratic ones – with lots of reasons to seize greater control of their citizens’ lives. From organized crime to terrorism to identity theft, governments find lots of reasons to monitor their citizens. This is to say nothing of advanced persistent threats, which create a state of continual online warfare – a sort of modern-day phoney (or phishy) war – between China, the United States, Iran and others. This may be the ultimate justification.

Indeed, as a result of these threats, the United States already has an extensive system for using the internet to monitor its own citizens and even my own country – Canada – tried to pass a law last year to significantly ramp up the monitoring of citizens online. The UK, of course, has just proposed a law whose monitoring provisions would make any authoritarian government squeal with glee. And just last week we found out that the UK government is preparing to cut a blank check for internet service providers to pay for installing the monitoring systems to record what its citizens do online.

Have no doubts, this is about the state trying to ensure the internet serves – or at least doesn’t threaten – its interests.

This is, sadly, the easiest future to imagine, since it conforms with the world we already know – one where states are ascendant. However, this future represents, in many ways, a linear projection from the present – and our world, especially our networked world, rarely behaves in a linear fashion. So we should be careful about confusing familiarity with probability.

The Internet Destroys the State

Another possibility is that the internet undermines our connection with the state. Online we become increasingly engaged with epistemic communities – be it social, like someone’s World of Warcraft guild, or professional, such as an association with a scientific community. Meanwhile, in the physical world, local communities – possibly at the regional level – become ascendant. In both cases, regulations and rules created by the state feel increasingly like an impediment to conducting our day to day lives, commerce and broader goals. Frustration flares, and increasingly someone in Florida feels less and less connection with someone in Washington state – and the common sense of identity, the imagined community, created by the state begins to erode.

This is, of course, hard for many people to imagine – especially Americans. But for many people in the world – including Canadians – the unity of the state is not a carefree assumption. There have been three referenda on breaking up Canada in my lifetime. More to the point, this process probably wouldn’t start in places where the state is strongest (such as in North America); rather, it would start in places where it is weakest. Think Somalia, Egypt (at the moment) or Belgium (which has basically functioned for two years without a government, and no one seemed to really notice). Maybe this isn’t a world with no state, but one with lots of little states (which I think breaks, to a certain degree, with our mold of what we imagine the state to be), or maybe some new organizing mechanism – one which leverages local community identities but can co-exist with a network of diffuse but important transnational identities. Or maybe the organizing unit gets bigger, so that greater resources can be called upon to manage new, network-based threats.

I, like most people, find this world harder to imagine. This is because so many of our assumptions suddenly disappear. If not the state, then what? Who or what protects and manages the internet infrastructure? What about other types of threats – corporate interests, organized and cyber-crime, etc.? This is true paradigm-shifting stuff (apologies for the use of the word), and frankly, I still find myself too stuck in my Newtonian world and its rules to imagine or even know what quantum mechanics will be like. Again, I want to separate imagining the future from its probability. The two are not always connected, and this is why thinking about this future, as uncomfortable and alienating as it may be, is probably an important exercise.

The Internet Rewards the Corporation

One of the big assumptions I often find among people who write and talk about the internet is that the individual is the fundamental unit of analysis. There are good reasons for this – using social media, an individual’s capacity to be disruptive has generally increased. And, as Clay Shirky has outlined, the need for coordinating institutions and managers has greatly diminished. Indeed, Shirky’s blog post on the collapse of complex business models is (in addition to being a wonderful piece) a fantastic description of how a disruptive technology can undermine the capacity of larger, complex players in a system and benefit smaller, simpler stakeholders. Of course, the smaller stakeholder in our system may not be the individual – it may be an actor that is smaller and nimbler than the state, that can foster an imagined community, and that can adopt various forms of marshaling resources, from self-organization to hierarchical management. Maybe it is the corporation.

During the conversation at FooCamp, Tim O’Reilly pressed this point with great effect. It could be that the corporation is actually the entity best positioned to adapt to the internet age. Small enough to leverage networks, big enough to generate a community that is actually loyal and engaged.

Indeed, it is easy to imagine a feedback loop that accelerates the ascendance of the corporation. If the imagined communities of nation states cannot withstand a world of multiple narratives and so become weaker, corporations would benefit not just from a greater capacity to adapt; the great counterbalance to their power – state regulation and borders – might simultaneously erode. A world where more and more power – through information, money and human capital – gets concentrated in corporations is not hard to imagine. Indeed, there are many who believe this is already our world. Of course, if the places (generally government bodies) where corporate conflicts – particularly those across sectors – get mediated peacefully erode, then corporations may turn much more aggressive. The need to be bigger, to marshal more resources, to have a security division to defend corporate interests, could lead to corporations growing into entities we can barely imagine today. It’s a scary future, but not one that hasn’t been imagined several times in sci-fi novels, and not one I would put beyond the realm of imagination.

The End of the World

The larger point of all this is that new technologies do change the way we imagine our communities. A second and third order impact of the printing press was its critical role in creating the modern nation-state. The bigger question is, what will be the second and third order impacts of the internet – on our communities (real and imagined), our identity and where power gets concentrated?

As different as the outcomes above are, they share one important thing in common: none represents the status quo. In each case, the nature of the state, and its relationship with citizens, shifts. Consequently, I find it hard to imagine a future where the internet does not continue to put real strain on how we organize ourselves and, in turn, on the systems we have built to manage this organization. As more and more of those institutions – including potentially the state itself – come under strain, it could very likely push systems that are presently stable – like the international state system – into a place of instability. It is worth noting that after the printing press, one of the first real nation states – France – wreaked havoc on Europe for almost half a century, using its enhanced resources to conquer pretty much everyone in its path.

While I am fascinated by technology and believe it can be harnessed to do good, I like to think that I am not – as Evgeny labels them – a techno-utopian. We need to remember that, looking back on our history, the second and third order effects of some technologies can be highly destabilizing, which carries with it real risks of generating significant bloodshed and conflict. Hence the title of this blog post and the FooCamp session: The End of the World.

This is not a call for a renewed Luddite manifesto. Quite the opposite – we are on a treadmill we cannot get off. Our technologies have improved our lives, but they also create new problems that, very often, social innovations and other technologies will be needed to solve. Rather, I raise this because I believe it is important that still more people – particularly those in the valley and other technology hubs (and not just military strategists) – think critically about the potential second and third order effects of the internet, the web and the tools they are creating, so that they can contribute to the thinking around potential technological, social and institutional responses that could hopefully mitigate the worst outcomes.

I hope this helps prompt further thinking and discussion.

 

The US Government's Digital Strategy: The New Benchmark and Some Lessons

Last week the White House launched its new roadmap for digital government. This included the publication of Digital Government: Building a 21st Century Platform to Better Serve the American People (PDF version), the issuing of a Presidential directive and the announcement of White House Innovation Fellows.

In other words, it was a big week for those interested in digital and open government. Having had some time to digest these docs and reflect upon them, below are some thoughts on these announcements and the lessons I hope governments and other stakeholders take from them.

First off, the core document – Digital Government: Building a 21st Century Platform to Better Serve the American People – is a must read if you are a public servant thinking about technology or even about program delivery in general. In other words, if your email has a .gov in it or ends in something like .gc.ca you should probably read it. Indeed, I’d put this document right up there with another classic must read, The Power of Information Taskforce Report commissioned by the Cabinet Office in the UK (which if you have not read, you should).

Perhaps most exciting to me is that this is the first time I’ve seen a government document clearly declare something I’ve long advised governments I’ve worked with: data should be a layer in your IT architecture. The problem is nicely summarized on page 9:

Traditionally, the government has architected systems (e.g. databases or applications) for specific uses at specific points in time. The tight coupling of presentation and information has made it difficult to extract the underlying information and adapt to changing internal and external needs.

Oy. Isn’t that the case. Most government data is captured in an application and designed for a single use. For example, say you run the license renewal system. You update your database every time someone wants to renew their license. That makes sense, because that is what the system was designed to do. But maybe you’d like to track, in real time, how frequently the database changes, and by whom. Whoops. The system wasn’t designed for that, because that wasn’t needed in the original application. Of course, being able to present the data in that second way might be a great way to assess how busy different branches are, so you could warn prospective customers about wait times. Now imagine this lost opportunity… and multiply it by a million. Welcome to government IT.
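To picture the alternative, here is a toy sketch – the schema and names are hypothetical, not from any real system – of what decoupling the data from the application looks like. Each renewal is appended as an event rather than overwritten in place, so the question nobody anticipated becomes a one-line query instead of a new IT project:

```python
# A toy sketch (hypothetical schema) of data decoupled from its first
# application: renewals are appended as events, so the data outlives
# the single use it was originally captured for.
import sqlite3
from datetime import datetime, timezone

db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE renewal_events (
                  licence_id TEXT, branch TEXT, clerk TEXT, occurred_at TEXT)""")

def record_renewal(licence_id, branch, clerk):
    """What the licence renewal application already does today."""
    db.execute("INSERT INTO renewal_events VALUES (?, ?, ?, ?)",
               (licence_id, branch, clerk,
                datetime.now(timezone.utc).isoformat()))

record_renewal("L-1001", "Downtown", "clerk-7")
record_renewal("L-1002", "Downtown", "clerk-3")
record_renewal("L-1003", "Eastside", "clerk-9")

# The second, unanticipated use -- which branches are busiest right
# now? -- needs no change to the application at all.
for branch, renewals in db.execute(
        "SELECT branch, COUNT(*) FROM renewal_events GROUP BY branch "
        "ORDER BY COUNT(*) DESC"):
    print(branch, renewals)
```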

Decoupling data from application is pretty much the first thing in the report. Here’s my favourite chunk (italics mine, to note the extra favourite part).

The Federal Government must fundamentally shift how it thinks about digital information. Rather than thinking primarily about the final presentation—publishing web pages, mobile applications or brochures—an information-centric approach focuses on ensuring our data and content are accurate, available, and secure. We need to treat all content as data—turning any unstructured content into structured data—then ensure all structured data are associated with valid metadata. Providing this information through web APIs helps us architect for interoperability and openness, and makes data assets freely available for use within agencies, between agencies, in the private sector, or by citizens. This approach also supports device-agnostic security and privacy controls, as attributes can be applied directly to the data and monitored through metadata, enabling agencies to focus on securing the data and not the device.
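Here is a minimal sketch of what that information-centric approach looks like in practice – a structured record, carrying its own metadata, served through a web API so that any presentation layer (an agency site, a mobile app, a non-profit’s mashup) can render it. The endpoint, fields and numbers are my invention, not the report’s:

```python
# A minimal sketch of "treat content as data": one structured record
# plus metadata, exposed through a web API rather than baked into a
# web page. Endpoint and field names are hypothetical.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

WAIT_TIMES = {
    "data": [
        {"branch": "Downtown", "avg_wait_minutes": 12},
        {"branch": "Eastside", "avg_wait_minutes": 34},
    ],
    "metadata": {
        "source": "licence renewal system",
        "updated": "2012-06-01T17:00:00Z",
        "license": "open",  # the terms travel with the data, not the page
    },
}

class ApiHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/api/wait-times":
            body = json.dumps(WAIT_TIMES).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)  # agency site, app or mashup all hit this
        else:
            self.send_error(404)

if __name__ == "__main__":
    # e.g. curl http://localhost:8000/api/wait-times
    HTTPServer(("localhost", 8000), ApiHandler).serve_forever()
```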

To help, the White House provides a visual guide for this roadmap. I’ve pasted it below. However, I’ve taken the liberty of highlighting, on the right, how most governments try to tackle open data – just so people can see how different the White House’s approach is, and why this is not just an issue of throwing up some new data but a total rethink of how government architects itself online.

There are, of course, a bunch of things that flow out of the White House’s approach that are not spelled out in the document. The first and most obvious is that once you make data an information layer, you have to manage it directly. This means that data starts to be seen and treated as an asset – which means understanding who the custodian is and establishing a governance structure around it. This is something that, previously, only libraries and statistical bureaus have really understood (and sometimes not even then!).

This is the dirty secret about open data: to do it effectively you actually have to start treating data as an asset. For the White House, the benefit of taking that view of data is that it saves money. Creating a separate information layer means you don't have to duplicate it for all the different platforms you have. In addition, it gives you more flexibility in how you present it, meaning the cost of showing information on different devices (say, computers vs. mobile phones) should also drop. Cost savings and increased flexibility are the real drivers; open data becomes an additional benefit. This is something I dive into in deeper detail in a blog post from July 2011: It's the icing, not the cake: key lesson on open data for governments.

Of course, having a cool model is nice and all, but, like the previous directive on open government, this document has hard requirements designed to force departments to begin shifting their IT architecture quickly. Check out this interesting tidbit from the doc:

While the open data and web API policy will apply to all new systems and underlying data and content developed going forward, OMB will ask agencies to bring existing high-value systems and information into compliance over a period of time—a “look forward, look back” approach. To jump-start the transition, agencies will be required to:

  • Identify at least two major customer-facing systems that contain high-value data and content;
  • Expose this information through web APIs to the appropriate audiences;
  • Apply metadata tags in compliance with the new federal guidelines; and
  • Publish a plan to transition additional systems as practical

Note the language here. This is again not a “let’s throw some data up there and see what happens” approach. I endorse doing that as well, but here the White House is demanding that departments be strategic about the data sets/APIs they create. Locate a data set that you know people want access to. This is easy to assess. Just look at pageviews, or go over FOIA/ATIP requests, and see what is demanded most. This isn't rocket science – do what is in most demand first. But you'd be surprised how few governments want to serve up data that is in demand.
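What would "exposing a high-value system through a web API" look like at its most minimal? Here is a sketch using Python's standard library. The endpoint and dataset are hypothetical, and a real agency deployment would obviously involve far more (authentication, caching, compliance with the actual federal metadata guidelines) – but the core pattern really is this small:

```python
# Minimal sketch of exposing a dataset through a web API.
# The endpoint and data are hypothetical, for illustration only.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

# In practice this would be read from the system of record.
WAIT_TIMES = {"Downtown": 12, "Eastside": 4}  # minutes

class ApiHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/api/wait-times":
            body = json.dumps({
                "metadata": {"publisher": "Example Agency",
                             "access_level": "public"},
                "data": WAIT_TIMES,
            }).encode("utf-8")
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404)

if __name__ == "__main__":
    HTTPServer(("", 8000), ApiHandler).serve_forever()
```

Once the data sits behind an endpoint like this, the agency's own website becomes just one more client of it – which is exactly the architecture the document is describing.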

Another interesting inference one can make from the report is that its recommendations embrace the possibility that participants outside of government – both for-profit and non-profit – can build services on top of government information and data. Referring back to the chart above, see how the Presentation Layer includes both private and public examples? Consequently, a non-profit's website dedicated to, say, job info for veterans could pull live data and information from various Federal Government websites, weave it together and present it in the way that is most helpful to the veterans it serves. In other words, the opportunity for innovation is fairly significant. This also has two additional repercussions. It means that services the government does not currently offer – at least in a coherent way – could be woven together by others. It also means there may be information and services the government simply never chooses to develop a presentation layer for – it may simply rely on private or non-profit sector actors (or other levels of government) to do that for it. This has interesting political ramifications, in that it could allow the government to “retreat” from presenting these services and rely on others. There are definitely circumstances where this would make me uncomfortable, but the solution is not to avoid architecting the system this way; it is to ensure that such programs are funded in a way that keeps the government involved in all aspects – information, platform and presentation.
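To see how low the technical barrier is for that hypothetical non-profit, here is a hedged sketch. Both endpoints are entirely invented – I am not citing any real federal API – but the pattern of weaving two government feeds into one veteran-focused view would look roughly like this:

```python
# Hypothetical sketch of a third-party presentation layer.
# Both endpoints are invented for illustration; they are not real APIs.
import requests

JOBS_API = "https://api.example.gov/veterans/jobs"
BENEFITS_API = "https://api.example.gov/veterans/benefits"

def veteran_dashboard(zip_code: str) -> dict:
    """Weave two government data feeds into one view for veterans."""
    jobs = requests.get(JOBS_API, params={"zip": zip_code}, timeout=10).json()
    benefits = requests.get(BENEFITS_API, timeout=10).json()
    return {
        "nearby_jobs": jobs.get("results", []),
        "benefits_deadlines": benefits.get("deadlines", []),
    }

# The non-profit controls the presentation; the government only has to
# keep the information and platform layers accurate and available.
```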

At this point I want to interject two tangential thoughts.

First, if you are wondering why your government is not doing this – be it at the local, state or national level – here's a big hint: this is what happens when you make the CIO an executive who reports at the highest level. You're just never going to get innovation out of your government's IT department if the CIO reports into the fricking CFO. All that tells me is that IT is a cost centre meant to sustain itself (e.g. keeping the computers on) and that you see IT as having no strategic relevance to government. In the private sector, in the 21st century, this is pretty much the equivalent of committing suicide for most businesses. For governments… making CIOs report into CFOs is considered a best practice. I've more to say on this. But I'm taking a deep breath and am going to move on.

Second, I love how clear the document is on milestones – and how nicely they are visualized. It may be my poor memory, but I feel it is rare to read a government road map on any issue where the milestones are so clearly laid out.

It's particularly nice when a government treats its citizens as though they can understand something like this, and isn't afraid to be held accountable for a plan. I'm not saying that other governments don't set out milestones (some do; many, however, do not). But often these deadlines are buried in reams of text. Here is a simple scorecard any citizen can look at. Of course, last time around, after the open government directive was issued immediately after Obama took office, they updated these scorecards for each department, highlighting whether milestones were green, yellow or red depending on how the department was performing. All in front of the public. Not something I've ever seen in my country, that's for sure.

Of course, the document isn't perfect. I was initially intrigued to see the report advocate that the government “Shift to an Enterprise-Wide Asset Management and Procurement Model.” Most citizens remain blissfully unaware of just how broken government procurement is. Indeed, I say this, dear reader, with no idea where you live or who your government is, but I am enormously confident your government's procurement process is totally screwed. And I'm not just talking about when they try to buy fighter planes. I'm talking about pretty much all procurement.

Today's procurement is perfectly designed to serve one group: big (IT) vendors. The process is so convoluted and so complicated that they are really the only ones with the resources to navigate it. The White House document essentially centralizes procurement further. On the one hand this is good: it means the requirements around platforms and data noted in the document can be more readily enforced. Basically, the centre is asserting more control at the expense of the departments. And yes, there may be some economies of scale that benefit the government. But the truth is that whenever procurement decisions get bigger, so too do the stakes, and so too does the process surrounding them. Thus there is a tiny handful of players that can respond to any RFP, and a real risk that the government ends up with a duopoly (kind of like with defense contractors). There is some wording around open source solutions that helps address some of this but, ultimately, it is hard to see how the recommendations are going to really alter the quagmire that is government procurement.

Of course, these are just some thoughts and comments that struck me and that I hope those of you still reading will find helpful. I've got thoughts on the White House Innovation Fellows too, especially given that it appears to have been at least in part inspired by the Code for America fellowship program, which I have been lucky enough to be involved with. But I'll save those for another post.

The "I Lost My Wallet" Service – Doing Government Service Delivery Right

A couple of years ago I was in Portugal to give a talk on Gov 2.0 at a conference the government was organizing. After the talk I went for dinner with the country's CIO and remember hearing about a fantastic program they were running that – for me – epitomized the notion of a citizen-centric approach. It was a help desk called: I Lost My Wallet.

Genius.

Essentially, it was a place you went when… you lost your wallet. What the government had done was bring together all the agencies that controlled a document or card likely to have been in your wallet. As a result, rather than running around from agency to agency filling out your name and address over and over again on dozens of different forms, you went to a single desk and filled out one set of forms to get new copies of, say, your social insurance card, your driver's license, your healthcare card and your library card.
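As a back-of-the-envelope sketch of the pattern – the agency names and fields below are invented, and I have no knowledge of how Portugal actually built it – the core idea is that one intake record is captured once and then fanned out to every issuer that needs it:

```python
# Hypothetical sketch of the "one form, many agencies" fan-out.
# Agency names and fields are invented; this shows the pattern,
# not Portugal's actual implementation.
from dataclasses import dataclass

@dataclass
class Citizen:
    name: str
    address: str
    national_id: str

ISSUERS = ["Social Insurance", "Driver Licensing", "Health", "Library"]

def reissue(agency: str, citizen: Citizen) -> str:
    # In reality this would call each agency's own system.
    return f"{agency}: replacement card requested for {citizen.name}"

def lost_my_wallet(citizen: Citizen) -> list[str]:
    """One intake record, fanned out to every issuing agency."""
    return [reissue(agency, citizen) for agency in ISSUERS]

for receipt in lost_my_wallet(Citizen("A. Silva", "Lisbon", "123456")):
    print(receipt)
```

The citizen's details are entered once; the silos are crossed by the service, not by the citizen.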

But get this: from the briefing I had, my understanding was that this service was not limited to government cards – they'd also partnered with several private entities. For example, I notice that the service also works for replacing Portugal's Automotive Club Card. In addition – if I remember correctly – I was told the government was negotiating with the banks so that you could also cancel and replace your ATM/bank cards and Visa cards at this counter as well.

Now this is citizen-centric service. Here the government has literally molded itself – pulling together dozens of agencies and private sector actors around a single service – so that a citizen can simply and quickly deal with a high-stress moment. Yes, I'd love to live in a world where all these cards disappeared altogether and were managed by a single card of your choosing (like, say, your Oyster card in the UK – so that your subway fare card was also your healthcare card, government ID and credit card). But we are a few years away from that still, and so this is a nice interim service.

But more importantly, it shows a real ability to shed silos and build a service around a citizen/customer need. I believe they had a similar service for “I bought a house,” since this is a moment when a number of different government services become relevant simultaneously. I, of course, can imagine several others – most notably a “my partner just died” service, which could be invaluable in helping people manage a truly terrible moment when dealing with government bureaucracy is the last thing they want to be doing.

You can find the website for I Lost My Wallet here (it is, naturally, in Portuguese). You can also read more about it, as documented by the European Union, here. Lots of food for thought for those of you designing programs to serve citizens, be it in the public or private sector.