Category Archives: commentary

Transparency Case Study: There are Good and Bad Ways Your Organization can be made "Open"

If you have not had the chance, I strongly encourage you to check out a fantastic piece of journalism in this week’s Economist on the state of the Catholic Church in America. It’s a wonderful example of investigative and data-driven journalism made possible (sadly) by the recent spate of sexual-abuse and bankruptcy cases. As a result, some of the normally secret financial records of the Church have been made public, enabling the Economist to reconstruct the opaque general finances of the Catholic Church in America. It is a fascinating, and at times disturbing, read.

The article also suggests – I believe – broader lessons for non-profits, governments and companies. Disclosure and transparency are essential to the effective running of an organization. As you read the piece it is clear that more disclosure would probably have compelled the Church to manage its finances in a more fiscally sound manner. It probably would also have made acts that are, at best, negligent and, at worst, corrupt, difficult or impossible. This is, indeed, why many charities, organizations and public companies must conduct audits and publish the results.

But conforming to legal requirements will not shield you from an angry public. My sense is that many member-contribution-based organizations – from public companies to clubs to governments – are going to feel enormous pressure from their “contributors” to disclose more about how funds are collected, managed, disbursed and used. In a post-financial-collapse, post-Enron era it’s unclear to me that people trust auditors the way they once did. In addition, as technology makes it easier to track money in real time, contributors are going to want more than just an annual audit. Even if they look at it rarely, they are going to want to know there is a dashboard or system they can look at and understand that shows them where the money goes.

I’m open to being wrong about this – and I’m not suggesting this is a panacea that solves all problems, but I nonetheless suspect that many organizations are going to feel pressure to become more transparent. There will be good ways in which that takes place… and bad ways. The Catholic Church story in the Economist is probably an example of the worst possible way: transparency forced upon an organization through the release of documents in a court case.

For anyone running a non-profit, community group, public agency or government department, this makes the article doubly worth reading. It is a case study in the worst possible scenario for your organization – the kind of disaster you never want to have to deal with.

The problem – and I’m going to go out on a limb here – is that, at some point in the next 10-20 years, there is a non-trivial risk that any organization (including yours, reader) will face a publicity or legitimacy crisis because of a real or imagined problem. Trust me when I tell you: that moment will not be the moment when it is easiest or most desirable, from a cost, political and cultural perspective, to make your organization more transparent. So better for you to think about how you’d like to shift policies, culture and norms to make it more transparent and accountable today, when things are okay, than in the crisis.

Consider again the Catholic Church. There are some fascinating and disturbing facts shared in the story that provide some interesting context. On the fascinating side, I had no idea of the scope and size of the Catholic Church. Consider that, according to the article:

“Almost 100m Americans, a third of the nation, have been baptised into the faith and 74m identify themselves as Catholic.”

and

“there are now over 6,800 Catholic schools (5% of the national total); 630 hospitals (11%) plus a similar number of smaller health facilities; and 244 colleges and universities.”

We are talking about a major non-profit that is providing significant services into numerous communities. It also means that the Catholic church does a lot of things that many other non-profits do. Whatever you are doing, they are probably doing it too.

Now consider some of the terrible financial practices the Church tried/tries to get away with because it thinks no one will be able to see them:

Lying about assets: “In a particularly striking example, the diocese of San Diego listed the value of a whole city block in downtown San Diego at $40,000, the price at which it had been acquired in the 1940s, rather than trying to estimate the current market value, as required. Worse, it altered the forms in which assets had to be listed. The judge in the case, Louise Adler, was so vexed by this and other shenanigans on the part of the diocese that she ordered a special investigation into church finances which was led by Todd Neilson, a former FBI agent and renowned forensic accountant. The diocese ended up settling its sexual-abuse cases for almost $200m.”

Playing fast and loose with finances: “Some dioceses have, in effect, raided priests’ pension funds to cover settlements and other losses. The church regularly collects money in the name of priests’ retirement. But in the dioceses that have gone bust lawyers and judges confirm that those funds are commingled with other investments, which makes them easily diverted to other uses.”

Misleading contributors about the destination of funds: “Under Cardinal Bernard Law, the archdiocese of Boston contributed nothing to its clergy retirement fund between 1986 and 2002, despite receiving an estimated $70m-90m in Easter and Christmas offerings that many parishioners believed would benefit retired priests.”

Using Public Subsidies to Indirectly Fund Unpopular Activities: “Muni bonds are generally tax-free for investors, so the cost of borrowing is lower than it would be for a taxable investment. In other words, the church enjoys a subsidy more commonly associated with local governments and public-sector projects. If the church has issued more debt in part to meet the financial strains caused by the scandals, then the American taxpayer has indirectly helped mitigate the church’s losses from its settlements. Taxpayers may end up on the hook for other costs, too. For example, settlement of the hundreds of possible abuse cases in New York might cause the closure of Catholic schools across the city.”

Of course all of this pales in comparison to the most disturbing part of the article: in several jurisdictions, the church is spending money to lobby governments not to extend the statute of limitations around sexual-abuse cases. This is so that it, and its priests, cannot be charged by authorities, diminishing the likelihood that they get sued. The prospect that an organization that is supposed both to model the highest ideals of behaviour and to protect the most marginalized is trying to limit the statutes on such a heinous crime is so profoundly disgusting it is hard to put words to it. The Economist gives it a shot:

Various sources say that Cardinal Dolan and other New York bishops are spending a substantial amount—estimates range from $100,000 a year to well over $1m—on lobbying the state assembly to keep the current statute of limitations in place. His office will not comment on these estimates. This is in addition to the soft lobbying of lawmakers by those with pulpits at their disposal. The USCCB, the highest Catholic body in America, also lobbies the federal government on the issue. In April the California Catholic Conference, an organisation that brings the state’s bishops together, sent a letter to California’s Assembly opposing a bill that would extend the statute and require more rigorous background checks on church workers.

This disgusting discovery aside, most organizations are probably not going to have the same profound problems found in the Catholic Church. But in almost every organization, no matter the controls, some form of mismanagement is probably taking place. The question is: will you already have in place policies and a culture that support transparency and disclosure before the problem is discovered – or will the crisis become the moment where you have to try to implement them, probably under less than ideal circumstances?

As they said after Watergate, “It’s not the crime that kills you, but the cover-up.” Good transparency and disclosure can’t stop the crime, but they might help prevent it. Moreover, they can also make the cover-up harder and, potentially, make it easier to ease the concerns of contributors and rebuild trust. One could imagine that if the Church had been more transparent about its finances it might have better protected itself against bankruptcy from some of these cases. More importantly, its transparency might have made it easier to rebuild trust, whereas any change now will just seem like a reaction to the crisis, not a genuine desire to operate differently.

Again, I think the pressure on many organizations to be more transparent is going to grow. And managers should recognize there are good and bad conditions under which such transparency can take place. Read this Economist story. In addition to being fascinating, it is a great case study in the worst-case scenario for opaque institutions.

China, Twitter and the 0.1%

Earlier this month I had the good fortune of visiting China – a place I’m deeply curious about and – aside from some second year university courses, the reporting from the Economist, and the occasional trip over to Tea Leaf Nation – remains too foreign to me for comfort given its enormous importance.

As always China – or to be specific – Beijing is overwhelming. The pollution, the people, the energy, the scale. It can be hard to grasp or describe. But I want to talk about my conversations which were overwhelming in other, equally fantastic ways.

As an indirect result of the trip I just posted a piece up on the WeGov section of TechPresident asking “Is Sina Weibo a Means of Free Speech or a Means of Social Control?” The post comes out of a dinner conversation I had in Beijing with Michael Anti, an exceedingly smart and engaging Chinese journalist, political blogger and former Nieman Journalism Fellow at Harvard, who has a very interesting analysis of Weibo (Chinese Twitter) as a means to enhance the power of the central government. The piece is worth reading largely because of his thoughts embedded in it.

There was another interesting thread that came out of our dinner conversation – one that was equally sobering. Michael described himself as part of “Generation 70” – those who, if it isn’t obvious, were born in the 70s. For him, the defining trait, or esprit, of that generation was a sense of possibility. As he noted, “I’m from a humble family and fairly unimportant city in China, and I got to end up at Harvard.” For him and his peers, as China grew, so did many of their opportunities. He argues that they were able to dream big in ways that almost no previous generation of Chinese could.

He contrasts that with “Generation 90” – those now in their mid-teens to mid-twenties – and he sees a generation with small dreams and growing frustration. Forget about access to the top international universities in the world. Indeed, forget about access to top Chinese universities. Such opportunities are now reserved for the super rich and the super connected. What many felt was a relatively meritocratic system is now flagrantly not. According to Anti, the result is that Generation 90 does not have big dreams. Forget about becoming a world-class scientist, founding a leading company, or leading an interesting organization. Many do not even dream of owning an apartment. This evolution (devolution?) in mood was summed up succinctly in a poster Anti saw at a demonstration a few months earlier, which nihilistically read “We are Generation 90: Sacrifice Us.”

A few days later I had the good fortune of being in Soho (a very upscale neighborhood in Beijing) with a group of Generation 90ers. Eating at a very nice Chinese restaurant above a Nike and an Apple store, I discreetly asked a few of them if they agreed with Anti’s assessment of Generation 90.

Were they optimistic about their future? What were their dreams? Did they feel like opportunities for them were shrinking?

I was stunned.

They agreed with Anti. Here I was, in the nation’s capital, sitting in an upper-middle-class restaurant, with a vibrant, intelligent, bilingual group of young Chinese. This is a group that would easily fit in the top 5% in terms of education, opportunity and income, and most probably in the 1%. And they felt that opportunities for their generation were limited. Their dreams were more limited than those of the generation before them.

Maybe this is a sort of Gen X syndrome, Chinese style, but if I were the Chinese government, such sentiment would have me worried. There is an acceptance – from what I observed – in China that the system is rigged, but there was clearly a sense that, before, the benefits were more widely distributed, or at least available on a limited meritocratic basis.

I’ve never been as bullish on China as some of my peers. Between the demographic time bomb that is about to explode, the fact that the state spends more on internal security than on the military, and the weak sense of the rule of law (essential for any effective market), I’ve felt that China’s problems are deeper and ultimately harder to address than, say, those in India or Brazil. But this conversation has given me new pause. If the 5%, or worse, the 1%, don’t feel like China can make room for them, where does that leave everyone else not in the 0.1%? These are, of course, problems many societies must face. But public discourse in China on a subject like this is almost certainly much harder to engage in than in, say, Brazil, India or America.

China is a wonderfully complex and nuanced place. So maybe this all means nothing. Maybe it is a sort of Gen X funk. But it has given me a whole lot more to think about.

How Government should interact with Developers, Data Geeks and Analysts

Below is a screenshot from the OpenDataBC Google group from about two months ago. I meant to blog about this earlier but life got in the way. For me, this is a perfect example of how many people in the data/developer/policy world would probably like to interact with their local, regional or national government.

A few notes on this interaction:

  • I occasionally hear people try to claim that governments are not responsive to requests for data sets. Some aren’t. Some are. To be fair, this was not a request for the most controversial data set in the province. But it was a request. And it was responded to. So clearly there are some governments that are responsive. The question is figuring out which ones are, and why, and seeing if we can export that capacity to other jurisdictions.
  • This interaction took place in a Google group – so the whole context is social and norm-driven. I love that public officials in British Columbia, as well as with the City of Vancouver, check in every once in a while on Google groups about open data, contributing to conversations and answering questions that citizens have about government, policies and open data. It’s a pretty responsive approach. Moreover, when people are not constructive it is the group that tends to moderate the behaviour, rather than some leviathan.
  • Yes, I’ve blacked out the email/name of the public servant. This is not because I think they’d mind being known, or because they shouldn’t be known, but because I just didn’t have a chance to ask for permission. What’s interesting is that this whole interaction is public, and the official was both doing what that government wanted and compliant with all social media rules. And yet, I’m blacking it out, which is a sign of how messed up current rules and norms make citizens’ relationships with the public officials they interact with online – I’m worried about doing something wrong by telling others about a completely public action. (And to be clear, the province of BC has really good and progressive rules around these types of things.)
  • Yes, this is not the be-all and end-all of the world. But it’s a great example of a small thing being done right. It’s nice to be able to show that to other government officials.


Containers, Facebook, Baseball & the Dark Matter around Open Data (#IOGDC keynote)

Below is an extended blog post that summarizes the keynote address I gave at the World Bank/Data.gov International Open Government Data Conference in Washington DC on Wednesday, July 11th. This piece is cross-posted over at the WeGov blog on TechPresident, where I also write on transparency, technology and politics.

Yesterday, after spending the day at the International Open Government Data Conference at the World Bank (and co-hosted by data.gov) I left both upbeat and concerned. Upbeat because of the breadth of countries participating and the progress being made.

I was worried, however, because of how the type of conversation we are having might limit the growth of both our community and the impact open data could have. Indeed, as we talk about technology and how to do open data, we risk missing the real point of the whole exercise – which is about use and impact.

To drive this point home I want to share three stories that highlight the challenges I believe we should be talking about.

Challenge 1: Scale Open Data

In 1956 the Ideal-X, the ship pictured to the left, sailed from Newark to Houston and changed the world.

Confused? Let me explain.

As Marc Levinson chronicles in his excellent book The Box, the world in 1956 was very different from our world today. Global trade was relatively low. China was a long way off from becoming the world’s factory floor. And it was relatively unusual for people to buy goods made elsewhere. Indeed, as Levinson puts it, the cost of shipping goods was “so expensive that it did not pay to ship many things halfway across the country, much less halfway around the world.” I’m a child of the second era of globalization. I grew up in a world of global transport and shipping. The world before all of that, which Levinson is referring to, is actually foreign to me. What is amazing is how much of it has just become a basic assumption of life.

And this is why the Ideal-X, the aforementioned ship, is so important. It was the first cargo container ship (in how we understand containers). Its trip from Newark to Houston marked the beginning of a revolution, because containers slashed the cost of shipping goods. Before the Ideal-X, the cost of loading cargo onto a medium-sized cargo ship was $5.83 per ton; with containers, the cost dropped to 15.8 cents. Yes, the word you are looking for is: “wow.”

You have to understand that before containers, loading a ship was a lot more like packing a mini-van for a family vacation to the beach than the orderly process of stacking very large Lego blocks on a boat. Before containers, literally everything had to be hand-packed, stored and tied down in the hull. (See picture to the right.)

This is a little bit what our open data world looks like right now. The people who are consuming open data are like digital longshoremen. They have to look at each open data set differently, unpack it accordingly and figure out where to put it, how to treat it and what to do with it. Worse, when looking at data from across multiple jurisdictions it is often much like cargo going around the world before 1956: a very slow and painful process. (See man on the right.)

Of course, the real revolution in container shipping happened in 1966, when the size of containers was standardized. Within a few years containers could move pretty much anywhere in the world, from truck to train to boat and back again. In the following decades global shipping trade increased at 2.5 times the rate of economic output. In other words… it exploded.

Geek sidebar: for techies, think of shipping containers as the TCP/IP packet of globalization. TCP/IP standardized the packet of information that flowed over the network so that data could move from anywhere to anywhere. Interestingly, like containers, what was in the packet was actually not relevant and didn’t need to be known by the person transporting it. But the fact that it could move anywhere created scale and allowed for exponential growth.

What I’m trying to drive at is that, when it comes to open data, the number of open data sets that gets published is no longer the critical metric. Nor is the number of open data portals. We’ve won. There are more and more. The marginal political and/or persuasive benefit of another open data portal or data set won’t change the context anymore. I want to be clear – this is not to say that more open data sets and more open data portals are not important or valuable – from a policy and programmatic perspective more is much, much better. What I am saying is that having more isn’t going to shift the conversation about open data any more. This is especially true if data continues to require large amounts of work and time for people to unpack and understand, over and over again, across every portal.

In other words, what IS going to count is how many standardized open data sets get created. This is what we SHOULD be measuring. The General Transit Feed Specification revolutionized how people engage with public transit because the standard made it so easy to build applications and do analysis around it. What we need to do is create similar standards for dozens, hundreds, thousands of other data sets so that we can drive new forms of use and engagement. More importantly, we need to figure out how to do this without relying on a standards process that takes 8 to 15 to infinite years to decide on said standard. That model is too slow to serve us, and so re-imagining/reinventing that process is where the innovation is going to shift next.

So let’s stop counting the number of open data portals and data sets, and start counting the number of common standards – because that number is really low. More critically, if we want to experience the kind of explosive growth in use like that experienced by global trade and shipping after the rise of the standardized container then our biggest challenge is clear: We need to containerize open data.
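To make the "containerization" idea concrete, here is a minimal sketch in Python. The city feeds, field names and schema are entirely invented for illustration: the point is only that once each publisher maps its data into a common standard, every downstream application can consume any jurisdiction's data with the same code, instead of re-doing the longshoreman's unpacking work per portal.

```python
import csv
import io

# Two hypothetical cities publish the same information -- park locations --
# but with different column names. Without a shared standard, every
# consumer writes this mapping code from scratch, for every city.
city_a = "park_name,lat,lng\nOak Park,49.28,-123.12\n"
city_b = "NAME,LATITUDE,LONGITUDE\nElm Commons,43.65,-79.38\n"

def normalize(raw_csv, mapping):
    """Map a city-specific CSV into a common (name, lat, lon) schema."""
    rows = []
    for rec in csv.DictReader(io.StringIO(raw_csv)):
        rows.append({common: rec[local] for common, local in mapping.items()})
    return rows

# Each feed is mapped once; after that, any application can treat
# all cities' data identically -- the "standardized container."
parks = (
    normalize(city_a, {"name": "park_name", "lat": "lat", "lon": "lng"})
    + normalize(city_b, {"name": "NAME", "lat": "LATITUDE", "lon": "LONGITUDE"})
)

for p in parks:
    print(p["name"], p["lat"], p["lon"])
```

The per-city `mapping` dictionaries are exactly the hand-packing a shared specification (like GTFS for transit) makes unnecessary: with a standard, publishers emit the common schema directly and the `normalize` step disappears.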

Challenge 2: Learn from Facebook

One of the things I find most interesting about Facebook is that everyone I’ve talked to about it notes how the core technology that made it possible was not particularly new. It wasn’t that Zuckerberg leveraged some new code or invented a new, better coding language. Rather it was that he accomplished a brilliant social hack.

Part of this was luck: the public had come a long way and was much more willing to do social things online in 2004 than it had been even two years earlier with sites like Friendster. Or, more specifically, young people who’d grown up with internet access were willing to do things and imagine using online tools in ways those who had not grown up with those tools wouldn’t or couldn’t. Zuckerberg, and his users, had grown up digital and so could take the same tools everyone else had and do something others hadn’t imagined, because their assumptions were just totally different.

My point here is that, while it is still early, I’m hoping we’ll soon have the beginnings of a cohort of public servants who’ve “grown up data” – who, despite their short careers in the public service, have matured in a period where open data has been an assumption, not a novelty. My hope and suspicion is that this generation of public servants is going to think about open data very differently than many of us do. Most importantly, I’m hoping they’ll spur a discussion about how to use open data – not just to share information with the public – but to drive policy objectives. The canonical opportunity for me around this remains restaurant inspection data, but I know there are many, many more.

What I’m trying to say is that the conferences we organize have to talk less about how to get data open and start talking more about how we use data to drive public policy objectives. I’m hoping the next International Open Government Data Conference will have an increasing number of presentations on how citizens, non-profits and other outsiders are using open data to drive their agendas, and how public servants are using open data strategically to drive toward an outcome.

I think we have to start fostering that conversation by next year at the latest, and that this conversation, about use, has to become core to everything we talk about within two years, or we will risk losing steam. This is why I think the containerization of open data is so important, and why I think the White House’s digital government strategy is so important, since it makes internal use core to the government’s open data strategy.

Challenge 3: The Culture and Innovation Challenge.

In May 2010 I gave this talk on open data, baseball and government at the Gov 2.0 Summit in Washington DC. It centered around the story outlined in Michael Lewis’s fantastic book Moneyball, which traces how a baseball team – the Oakland A’s – used a new analysis of player stats to ferret out undervalued players. This enabled them to win a large number of games on a relatively small payroll. Consider the numbers to the right.

I mean, if you are the owner of the Texas Rangers, you should be pissed! You are paying 250% in salary for 25% fewer wins than Oakland. If this were a government chart, where “wins” were potholes found and repaired, and “payroll” was costs… everyone in the World Bank would be freaking out right now.

For those curious, the analytical “hack” was recognizing that the most valuable thing a player can do on offense is get on base. This gives them an opportunity to score (+), but it also means you don’t burn one of your three “outs,” which would end the inning and the chance for other players to score. The problem was that, to measure the offensive power of a player, most teams were looking at batting averages (along with a lot of other weird, totally non-quantitative stuff), which ignore the possibility of getting walked – which lets you get on base without hitting the ball!

What’s interesting, however, is that the original thinking about the fact that people were using the wrong metrics to assess baseball players first happened decades before the Oakland A’s started to use it. Indeed, it was a night-time security guard with a strong mathematics background and an obsession with baseball who first began pointing this stuff out.

The point I’m making is that it took 20 years for a manager in baseball to recognize that there was better evidence and data they could be using to make decisions. TWENTY YEARS. And that manager was hated by all the other managers, who believed he was ruining the game. Today, this approach to assessing baseball is commonplace – everyone is doing it – but it shows how the problem of using baseball’s “open data” to create better outcomes was never just an accessibility issue. Once that was resolved, the bigger challenge centered around culture and power. Those with the power had created a culture in which new ideas – ideas grounded in evidence but that were disruptive – couldn’t find an audience. Of course, there were structural issues as well – many people had jobs that depended on not using the data, on instead relying on their “instincts” – but I think the cultural issue is a significant one.

So we can’t expect that we are going to go from open portal today to better decisions tomorrow. There is a good chance that some of the ideas the data causes us to think will be so radical and challenging that either the ideas, the people who champion them, or both, could get marginalized. On the upside, I feel like I’ve seen some evidence to the contrary in cities like New York and Chicago, but the risk is still there.

So what are we going to do to ensure that the culture of government is one that embraces the challenges to our thinking and assumptions, so that it doesn’t require 20 years for us to make progress? This is a critical challenge for us – and it is much, much bigger than open data.

Conclusion: Focus on the Dark Matter

I’m deeply indebted to my friend – the brilliant Gordon Ross – who put me on to this idea the other day over tea.


Do you remember the briefcase in Pulp Fiction? The one that glowed when opened? That the characters were all excited about, but whose contents you never learned? It’s called a MacGuffin. I’m not talking about the briefcase per se. Rather, I mean the object in a story that all the characters are obsessed about, but that you – the audience – never find out what it is, and that frankly really isn’t that important to you. In Pulp Fiction I remember reading that the briefcase is allegedly Marsellus Wallace’s soul. But ultimately, it doesn’t matter. What matters is that Vincent Vega, Jules Winnfield and a ton of other characters think it is important, and that drives the action and the plot forward.

Again – let me be clear – open data portals are our MacGuffin device. We seem to care A LOT about them. But trust me, what really matters is everything that happens around them. What makes open data important is not a data portal. A portal is a necessary prerequisite, but it’s not the end, just the means. We’re here because we believe that the things open data can let us and others do matter. The open data portal was only ever a MacGuffin device – something that focused our attention and helped drive action so that we could do the other things – that dark matter that lies all around the MacGuffin device.

And that is what brings me back to our three challenges. Right now, the debate around open data risks becoming too much like a Pulp Fiction conference in which all the panels talk about the briefcase. Instead we should be talking more and more about all the action – the dark matter – taking place around the briefcase. Because that is what really matters. For me, the three things that matter most are what I’ve mentioned in this talk:

  • standards – which will let us scale; I strongly believe the conversation is going to shift from portals to standards;
  • strategic use – starting us down the path of learning how open data can drive policy outcomes; and
  • culture and power – recognizing that open data is going to surface a lot of reasons why governments don’t want to engage in data-driven decision-making.

In other words, I want to be talking about how open data can make the world a better place, not about how we do open data. That conversation still matters, open data portals still matter, but the path forward around them feels straightforward, and if they remain the focus we’ll be obsessing about the wrong thing.

So here’s what I’d like to see in the future from our open data conferences. We have got to stop talking about how to do open data. This is because all of our efforts here, everything we are trying to accomplish… it has nothing to do with the data. What I think we want to be talking about is how open data can be a tool to make the world a better place. So let’s make sure that is the conversation we have.

The US Government's Digital Strategy: The New Benchmark and Some Lessons

Last week the White House launched its new roadmap for digital government. This included the publication of Digital Government: Building a 21st Century Platform to Better Serve the American People (PDF version), the issuing of a Presidential directive and the announcement of White House Innovation Fellows.

In other words, it was a big week for those interested in digital and open government. Having had some time to digest these documents and reflect upon them, below are some thoughts on these announcements and the lessons I hope governments and other stakeholders take from them.

First off, the core document – Digital Government: Building a 21st Century Platform to Better Serve the American People – is a must read if you are a public servant thinking about technology or even about program delivery in general. In other words, if your email has a .gov in it or ends in something like .gc.ca you should probably read it. Indeed, I’d put this document right up there with another classic must read, The Power of Information Taskforce Report commissioned by the Cabinet Office in the UK (which if you have not read, you should).

Perhaps most exciting to me is that this is the first time I’ve seen a government document clearly declare something I’ve long advised governments I’ve worked with: data should be a layer in your IT architecture. The problem is nicely summarized on page 9:

Traditionally, the government has architected systems (e.g. databases or applications) for specific uses at specific points in time. The tight coupling of presentation and information has made it difficult to extract the underlying information and adapt to changing internal and external needs.

Oy. Isn’t that the case. Most government data is captured in an application and designed for a single use. For example, say you run the license renewal system. You update your database every time someone wants to renew their license. That makes sense because that is what the system was designed to do. But maybe you’d like to track, in real time, how frequently the database changes, and by whom. Whoops. The system wasn’t designed for that because that wasn’t needed in the original application. Of course, being able to present the data in that second way might be a great way to assess how busy different branches are so you could warn prospective customers about wait times. Now imagine this lost opportunity… and multiply it by a million. Welcome to government IT.

Decoupling data from application is pretty much the first thing in the report. Here’s my favourite chunk (italics mine, to note the extra favourite part).

The Federal Government must fundamentally shift how it thinks about digital information. Rather than thinking primarily about the final presentation—publishing web pages, mobile applications or brochures—an information-centric approach focuses on ensuring our data and content are accurate, available, and secure. We need to treat all content as data—turning any unstructured content into structured data—then ensure all structured data are associated with valid metadata. Providing this information through web APIs helps us architect for interoperability and openness, and makes data assets freely available for use within agencies, between agencies, in the private sector, or by citizens. This approach also supports device-agnostic security and privacy controls, as attributes can be applied directly to the data and monitored through metadata, enabling agencies to focus on securing the data and not the device.

To help, the White House provides a visual guide for this roadmap. I’ve pasted it below. However, I’ve taken the liberty to highlight how most governments try to tackle open data on the right – just so people can see how different the White House’s approach is, and why this is not just an issue of throwing up some new data but a total rethink of how government architects itself online.

There are, of course, a bunch of things that flow out of the White House’s approach that are not spelled out in the document. The first and most obvious is that once you make data an information layer you have to manage it directly. This means that data starts to be seen and treated as an asset – which means understanding who the custodian is and establishing a governance structure around it. This is something that, previously, really only libraries and statistical bureaus have understood (and sometimes not even they have!).

This is the dirty secret about open data: to do it effectively you actually have to start treating data as an asset. For the White House the benefit of taking that view of data is that it saves money. Creating a separate information layer means you don’t have to duplicate it for all the different platforms you have. In addition, it gives you more flexibility in how you present it, meaning the costs of showing information on different devices (say computers vs. mobile phones) should also drop. Cost savings and increased flexibility are the real drivers. Open data becomes an additional benefit. This is something I dive into in deeper detail in a blog post from July 2011: It’s the icing, not the cake: key lesson on open data for governments.
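To make the point concrete, here is a minimal sketch (all names and numbers are hypothetical, not from the report) of what treating data as its own layer looks like: the structured records – with their metadata – exist once, and any number of presentations (a web API response, a wait-time warning, a mobile screen) are derived from them rather than baked into a single application.

```python
import json

# Hypothetical "information layer": structured records plus metadata,
# stored independently of any one application or web page.
DATASET = {
    "metadata": {
        "title": "License renewals by branch",
        "updated": "2012-05-30",
    },
    "records": [
        {"branch": "Downtown", "renewals_today": 142},
        {"branch": "Westside", "renewals_today": 87},
    ],
}

def as_json_api(dataset):
    """One presentation: the raw structured data a web API would serve."""
    return json.dumps(dataset)

def as_wait_time_warning(dataset, threshold=100):
    """A second presentation of the same data: flag busy branches."""
    busy = [r["branch"] for r in dataset["records"]
            if r["renewals_today"] >= threshold]
    if not busy:
        return "No delays expected."
    return "Expect waits at: " + ", ".join(busy)
```

In the license-renewal example above, the tightly coupled system could not answer the “how busy is each branch?” question at all; once the records are decoupled from any one screen, that second view costs a few lines rather than a new system.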

Of course, having a cool model is nice and all, but, like the previous directive on open government, this document has hard requirements designed to force departments to begin shifting their IT architecture quickly. So check out this interesting tidbit from the doc:

While the open data and web API policy will apply to all new systems and underlying data and content developed going forward, OMB will ask agencies to bring existing high-value systems and information into compliance over a period of time—a “look forward, look back” approach. To jump-start the transition, agencies will be required to:

  • Identify at least two major customer-facing systems that contain high-value data and content;
  • Expose this information through web APIs to the appropriate audiences;
  • Apply metadata tags in compliance with the new federal guidelines; and
  • Publish a plan to transition additional systems as practical

Note the language here. This is again not a “let’s throw some data up there and see what happens” approach. I endorse doing that as well, but here the White House is demanding that departments be strategic about the data sets/APIs they create. Locate a data set that you know people want access to. This is easy to assess. Just look at pageviews, or go over FOIA/ATIP requests and see what is demanded the most. This isn’t rocket science – do what is in most demand first. But you’d be surprised how few governments want to serve up data that is in demand.

Another interesting inference one can make from the report is that its recommendations embrace the possibility that participants outside of government – both for-profit and non-profit – can build services on top of government information and data. Referring back to the chart above, see how the Presentation Layer includes both private and public examples? Consequently, a non-profit’s website dedicated to, say, job info for veterans could pull live data and information from various Federal Government websites, weave it together and present it in the way that is most helpful to the veterans it serves. In other words, the opportunity for innovation is fairly significant. This also has two additional repercussions. It means that services the government does not currently offer – at least in a coherent way – could be woven together by others. It also means there may be information and services the government simply never chooses to develop a presentation layer for – it may simply rely on private or non-profit sector actors (or other levels of government) to do that for it. This has interesting political ramifications in that it could allow the government to “retreat” from presenting these services and rely on others. There are definitely circumstances where this would make me uncomfortable, but the solution is not to avoid architecting the system this way; it is to ensure that such programs are funded in a way that guarantees government involvement in all aspects – information, platform and presentation.

At this point I want to interject two tangential thoughts.

First, if you are wondering why your government is not doing this – be it at the local, state or national level – here’s a big hint: this is what happens when you make the CIO an executive who reports at the highest level. You’re just never going to get innovation out of your government’s IT department if the CIO reports into the fricking CFO. All that tells me is that IT is a cost centre that should be focused on sustaining itself (e.g. keeping computers on) and that you see IT as having no strategic relevance to government. In the private sector, in the 21st century, this is pretty much the equivalent of committing suicide for most businesses. For governments… making CIOs report into CFOs is considered a best practice. I’ve more to say on this. But I’m taking a deep breath and am going to move on.

Second, I love how clear the document is on milestones – and how nicely they are visualized as well. It may be my poor memory, but I feel like it is rare to read a government road map on any issue where the milestones are so clearly laid out.

It’s particularly nice when a government treats its citizens as though they can understand something like this, and isn’t afraid to be held accountable for a plan. I’m not saying that other governments don’t set out milestones (some do, many however do not). But often these deadlines are buried in reams of text. Here is a simple scorecard any citizen can look at. Of course, last time around, after the open government directive was issued immediately after Obama took office, they updated these scorecards for each department, highlighting whether milestones were green, yellow or red, depending on how the department was performing. All in front of the public. Not something I’ve ever seen in my country, that’s for sure.

Of course, the document isn’t perfect. I was initially intrigued to see the report advocates that the government “Shift to an Enterprise-Wide Asset Management and Procurement Model.” Most citizens remain blissfully unaware of just how broken government procurement is. Indeed, I say this, dear reader, with no idea where you live and who your government is, but I am enormously confident your government’s procurement process is totally screwed. And I’m not just talking about when they try to buy fighter planes. I’m talking pretty much all procurement.

Today’s procurement is perfectly designed to serve one group: big (IT) vendors. The process is so convoluted and so complicated that they are really the only ones with the resources to navigate it. The White House document essentially centralizes procurement further. On the one hand this is good: it means the requirements around platforms and data noted in the document can be more readily enforced. Basically the centre is asserting more control at the expense of the departments. And yes, there may be some economies of scale that benefit the government. But the truth is, whenever procurement decisions get bigger, so too do the stakes, and so too does the process surrounding them. Thus there is a tiny handful of players that can respond to any RFP, and a real risk that the government ends up in a duopoly (kind of like with defense contractors). There is some wording around open source solutions that helps address some of this, but ultimately it is hard to see how the recommendations are going to really alter the quagmire that is government procurement.

Of course, these are just some thoughts and comments that struck me and that I hope those of you still reading will find helpful. I’ve got thoughts on the White House Innovation Fellows, especially given it appears to have been at least in part inspired by the Code for America fellowship program, which I have been lucky enough to be involved with. But I’ll save those for another post.

Control Your Content: Why SurveyMonkey Should Add a "Download Your Answers" Button

Let me start by saying, I really like SurveyMonkey.

By this I mean, I like SurveyMonkey specifically, but I also like online surveys in general. They are easy to ignore if I’m uninterested in the topic but – when the topic is relevant –  it is a great, simple service that allows me to share feedback, comments and opinions with whomever wants to solicit them.

Increasingly however, I find people and organizations are putting up more demanding surveys – surveys that necessitate thoughtful, and even occasionally long form, responses.

Take for example the Canadian Government. It used an online survey tool during its consultation on open government and open data.  The experience was pretty good. Rather than a clunky government website, there was a relatively easy form to fill out. Better still, since the form was long, it was wonderful that you could save your answers and come back to it later! This mattered since some of the form’s questions prompted me to write lengthy (and hopefully) insightful responses.

But therein lies the rub. In the jargon of the social media world I was “creating content.” This wasn’t just about clicking boxes. I was writing. And surprisingly many of my answers were causing me to develop new ideas. I was excited! I wanted to take the content I had created and turn it into a blog post.

Sadly, most survey tools make it very, very hard for you to capture the content you’ve created. It feels like it would be relatively easy to have a “download my answers” button at the end of a survey. I mean, if I’ve taken 10-120 minutes to complete a survey or public consultation shouldn’t we make it easy for me to keep a record of my responses? Instead, I’ve got to copy and paste the questions, and my answers, into a text document as I go. And of course, I’d better decide that I want to do that before I start since some survey tools don’t allow you to go back and see previous answers.

I ultimately did convert my answers into a blog post (you can see it here), but it took about 20 minutes of cutting, pasting, figuring things out, and reformatting. And there was some content (like semantic differential questions – where you rate statements) that was simply too hard to replicate.

There are, of course, other uses too. I had a similar experience last week after being invited to complete a survey posted by the Open Government Partnership Steering Committee on its Independent Reporting Mechanism. About halfway through filling it out some colleagues suggested we compare answers to better understand one another’s advice. A download-your-answers tool would have converted a 15-minute task into a 10-second one. All to access content I created.

I’m not claiming this is the be-all, end-all of online survey features, but it is the kind of simple thing that a survey company can do that will cause some users to really fall in love with the service. To its credit, SurveyMonkey was at least willing to acknowledge the feedback – just what you’d hope for from a company that specializes in soliciting opinion online! With luck, maybe the idea will go somewhere.


The Transparent Hypocrisy of Ethical Oil – who is really laundering money

The other week the Canadian Minister of the Environment, Peter Kent accused Canadian Charities of “laundering money” because they accept some funds from outside the country. This has all been part of a larger effort – championed by Ethical Oil – to discredit Canada’s environmental organizations.

As an open government and transparency in politics advocate I find the whole conversation deeply interesting. On the one side, environmental groups have to disclose who funds them and where the money comes from. This is what actually allows Ethical Oil to make their complaint in the first place. Ethical Oil however, feels no need to share this information. Apparently what is good for the goose, is not good for the gander.

The media really only touches on this fact occasionally. In the Globe and Mail this hypocrisy was buried in the last few lines of a recent article:

Ethical Oil launched a radio ad Tuesday that will run throughout Ontario flaunting the proposal as a way to lower oil prices and create jobs.

Mr. Ellerton said he couldn’t immediately provide an estimate for how much the group is spending on the campaign. He also refused to reveal who funds the lobby group, other than to say: “Ethical Oil accepts donations from Canadians and Canadian businesses.”

The group has supported the Conservatives move to end foreign funding of environmental groups, including those that oppose the Northern Gateway and Keystone XL pipeline projects. Mr. Ellerton has campaigned to expose the funding behind those groups but said he could not shed more light on his own organization.

“We have an organizational policy not to disclose who our donors are because we’ve faced lawsuits in the Kingdom of Saudi Arabia,” he said, “and we don’t want to expose our donors to that kind of litigation.”

Of course, the notion of what a “Canadian Business” means is never challenged. It turns out that many of the large “Canadian” players in the oil sands – those with corporate headquarters in Canada – are barely Canadian. Indeed, a recent analysis using Bloomberg data showed that 71 per cent of all tar sands production is owned by non-Canadian shareholders.

Consider the following ownership stakes of “Canadian” businesses:

  • Petrobank Energy Resources: 94.8% foreign owned
  • Husky Energy: 90.9% foreign (this one really surprised me!)
  • MEG Energy: 89.1% foreign
  • Imperial Oil: 88.9% foreign
  • Nexen: 69.9% foreign
  • Canadian Natural Resources Limited: 58.8% foreign
  • Suncor Energy: 56.8% foreign
  • Canadian Oil Sands: 56.8% foreign
  • Cenovus: 54.7% foreign

I think it is great that Ethical Oil wants greater transparency around who is funding who in the Oil Sands debate. But shouldn’t they be held to the same standard so that we can understand who is funding them?

If Ethical Oil and the government want to call it money laundering when a foreign citizen funds a Canadian environmental group, should we also use the term if a foreign (often Chinese or American) entity plows money into Ethical Oil?

Mainstreaming The Gov 2.0 Message in the Canadian Public Service

A couple of years ago I wrote a Globe Op-Ed, “A Click Heard Across the Public Service,” that outlined the significance of the clerk using GCPEDIA to communicate with public servants. It was a message – or even more importantly – an action to affirm his commitment to change how government works. For those unfamiliar, the Clerk of the Privy Council is the head of the public service for the federal government; a crude analogy would be that he is the CEO and the Prime Minister is the Chairman (yes, I know that analogy is going to get me in trouble with people…)

Well, the clerk continues to broadcast that message, this time in his Nineteenth Annual Report to the Prime Minister on the Public Service of Canada. As an observer in this space what is particularly exciting for me is that:

  • The Clerk continues to broadcast this message. Leadership and support at the top is essential on these issues. It isn’t sufficient, but it is necessary.
  • The role of open data and social media is acknowledged on several occasions

And as a policy entrepreneur, what is doubly exciting is that:

  • Projects I’ve been personally involved in get called out; and
  • Language I’ve been using in briefs, blog posts and talks to public servants is in this text

You can, of course, read the whole report here. There is much more in it than just talk of social media and rethinking the public service; there is obviously talk about the budget and other policy areas as well. But both the continued prominence given to renewal and technology, and the explicit statements about the failure to move fast enough to keep up with the speed of change in society at large, suggest that the clerk continues to be worried about this issue.

For those less keen to read the whole thing, here are some juicy bits that mattered to me:

In the section “The World in Which We Serve,” which basically provides context…

At the same time, the traditional relationship between government and citizens continues to evolve. Enabled by instantaneous communication and collaboration technologies, citizens are demanding a greater role in public policy development and in the design and delivery of services. They want greater access to government data and more openness and transparency from their institutions.

Under “Our Evolving Institution,” which lays out some of the current challenges and priorities, we find this as one of the four areas of focus mentioned:

  • The Government expanded its commitment to Open Government through three main streams: Open Data (making greater amounts of government data available to citizens), Open Information (proactively releasing information about Government activities) and Open Dialogue (expanding citizen engagement with Government through Web 2.0 technologies).

This is indeed interesting. The more this government talks about openness in general, the more interesting it will be to see how the public reacts, particularly in regards to its treatment of certain sectors (e.g. environmental groups). Still more interesting is what appears to be a growing recognition of the importance of data (from a government that cut the long-form census). Just yesterday the Health Minister, while talking about a controversial multiple sclerosis vein procedure, stated that:

“Before our government will give the green light to a limited clinical trial here in Canada, the proposed trial would need to receive all necessary ethical and medical approvals. As Minister of Health, when it comes to clinical issues, I rely on advice from doctors and scientists who are continually monitoring the latest research, and make recommendations in the best interests of patient health and safety.”

This is an interesting statement from a government that called doctors “unethical” because of their support for the Insite injection site which, the evidence shows, is the best way to save lives and get drug users into detox programs.

For evidence based policy advocates – such as myself – the adoption of the language of data is one that I think could help refocus debates onto a more productive terrain.

Then towards the bottom of the report there is a call-out that mentions the Open Policy conference at DFAIT that I had the real joy of helping convene and that I served as the host and facilitator for.

Policy Built on Shared Knowledge
The Department of Foreign Affairs and International Trade (DFAIT) has been experimenting with an Open Policy Development Model that uses social networking and technology to leverage ideas and expertise from both inside and outside the department. A recent full-day event convened 400 public and private sector participants and produced a number of open policy pilots, e.g., an emergency response simulation involving consular officials and a volunteer community of digital crisis-mappers.

DFAIT is also using GCConnex, the Public Service’s social networking site, to open up policy research and development to public servants across departments.

This is a great, much-deserved win for the team at DFAIT that went out on a limb to run this conference and was rewarded with participation from across the public service.

Finally, anyone who has seen me speak will recognize a lot of this text as well:

As author William Gibson observed, “The future is already here, it’s just unevenly distributed.” Across our vast enterprise, public servants are already devising creative ways to do a better job and get better results. We need to shine a light on these trailblazers so that we can all learn from their experiments and build on them. Managers and senior leaders can foster innovation—large and small—by encouraging their teams to ask how their work can be done better, test out new approaches and learn from mistakes.

So much innovation in the 21st century is being made possible by well-developed communication technologies. Yet many public servants are frustrated by a lack of access to the Web 2.0 and social media tools that have such potential for helping us transform the way we work and serve Canadians. Public servants should enjoy consistent access to these new tools wherever possible. We will find a way to achieve this while at the same time safeguarding the data and information in our care.

I also encourage departments to continue expanding the use of Web 2.0 technologies and social media to engage with Canadians, share knowledge, facilitate collaboration, and devise new and efficient services.

To fully attribute: the William Gibson quote, which I use a great deal, was something I first saw used by my friend Tim O’Reilly, who is, needless to say, a man with a real ability to understand a trend and explain an idea to people. I hope his approach to thinking is reflected in much of what I do.

What, in sum, all these call outs really tell us is that the Gov 2.0 message in the federal public service is being mainstreamed, at the very least among the most senior public servants. This does not mean that our government is going to magically transform, it simply means that the message is getting through and people are looking for ways to push this type of thinking into the organization. As I said before, this is not sufficient to change the way government works, but it is necessary.

Going to keep trying to see what I can do to help.

The Oil Sands in Alberta is like Language Laws in Quebec… It's a domestic issue

This post isn’t based on a poll I’ve conducted or some rigorous methodology, rather it has evolved out of conversations I’ve had with friends, thought leaders I’ve run into, articles I’ve read and polls I’ve seen in passing.

As most people know, the development of the oil sands is a thorny issue in Canada. The federal government is sweeping aside environmental regulations, labeling environmentalist groups terrorists and money launderers, and overriding processes developed to enable people to express their concerns.

What does the public make of all this?

Polls generally seem to have the country split. Canadians are not opposed to natural resource development (how could we be?) but they are also worried about the environment (I’m not claiming enough to act). In this regard the Nik Nanos poll from March is instructive. Here most Canadians place environmental concerns (4.24 out of 5) ahead of economic prosperity (3.71 out of 5). There are of course polls that say the opposite. The normally reliable Ipsos Reid has a Canadian Chamber of Commerce poll which asks whether “it is possible to increase oil and gas production while protecting the environment at the same time.” As if Canadians know! I certainly don’t know the answer to that question. What I do know is that I’d like it to be possible to increase oil and gas production while protecting the environment at the same time. This, I suspect, is what people are really saying: “Yes! I’d like to have my cake and eat it too. Go figure out the details.” This does not mean it is possible. Just desirable.

So on a superficial level, I suspect that most Canadians think the oil sands are dirty. There are of course the outlying camps – the die-hard supporters and die-hard opponents – but I’m not talking about them. Most Canadians are, at their core, uncomfortable with the oil sands. They know it is bad for the environment and may be good for the economy. That doesn’t mean they are opposed; it just doesn’t mean they are happy either.

But here’s the rub.

I think most Canadians feel like the oil sands is an Alberta issue. Ultimately, many don’t care if it is dirty, many don’t care if it doesn’t benefit them. So long as the issue is confined within Alberta’s borders and it’s an Alberta problem/opportunity then they are happy to give them a free hand. I’ve been comparing it to French Language laws in Quebec. You’d be hard pressed to find many Canadians who strongly agree with them. They understand them. They get why they matter to Quebec. And frankly, they’ve given up caring. As long as the issue is confined within Quebec it’s “domestic politics” and they accept it now as fact.

Of course, the moment the issue stretches outside of Alberta, the gloves are off. Take the pipeline proposal for example. I suspect that support for the Gateway pipeline, especially after the Keystone pipeline is approved, will likely disintegrate. Barbara Yaffe beat me to the punch with her column “What’s in Pipeline Expansion for BC?,” which articulated exactly where I think BC is going. Already opinion polls show opposition to the pipeline is growing. Anyone knows that the moment a tanker strikes ground off the coast of BC you have a $1B problem on your hands. Most BCers are beginning to ask why they should put their tourism and fishing industries at risk and be left footing the bill for environmental damage on oil that Alberta is making money off of. A couple hundred jobs a year in benefits isn’t going to cut it.

What’s worse is that it is almost impossible to imagine that Keystone won’t get approved this time around. As a result, Alberta will have its pipeline out, and so the major source of concern – a way to get the oil out – will have been satisfied. Building a pipeline through BC is no longer essential. It is a bonus. It is all about getting an extra $30 premium a barrel and, of course, satisfying all those Chinese investors. BCers will be even more confused about why they have to absorb the environmental risks so that their neighbour can get rich.

This is also why the provincial NDP’s formal opposition to the pipeline is clever. While it cites environmental concerns and does use tough language it does not draw a hard line in the sand. Rather, it concludes “that the risks of this project (the Northern Gateway Pipeline) far outweigh its benefits.” Implicit in this statement is that if the benefits were to increase – if say, Alberta were to pay a percentage of the royalties to BC – then their position could change as well. In other words – you want us to accommodate your language politics in our province? We may say “no” anyway, but if we say yes… it is going to cost you.

All of this is further complicated by the fact that Alberta’s history of playing well with other provinces on issues of national interest is not… spectacular (remember the Alberta firewall?). Alberta has often wanted to go it alone – that is, indeed, part of its brand. I suspect most of the rest of the country has neither the inclination nor the desire to stop them, just as they didn’t with Quebec. But that doesn’t mean they are going to get a helping hand either.


My LRC Review of "When the Gods Changed" and other recommended weekend readings

This week, the Literary Review of Canada published my and Taylor Owen’s review of When the Gods Changed: The Death of Liberal Canada by Peter C. Newman. For non-Canadians: Peter Newman is pretty much a legend when it comes to covering Canadian history and politics; he was editor of the country’s largest newspaper and main news magazine and has published over 35 books. I also think the review will be of interest to non-Canadians, since the decline of Liberal Canada is a story playing out in a number of other countries experiencing more polarized politics.

Some other articles I’ve been digesting that I recommend for some Friday or weekend reading:

Why China’s Political Model Is Superior

This one is a couple of months old, but it doesn’t matter. Fascinating read. For one, it shows the type of timelines the Chinese look at the world with. Hint: it is waaayyyy longer than ours. Take a whiff:

In Athens, ever-increasing popular participation in politics led to rule by demagogy. And in today’s America, money is now the great enabler of demagogy. As the Nobel-winning economist A. Michael Spence has put it, America has gone from “one propertied man, one vote; to one man, one vote; to one person, one vote; trending to one dollar, one vote.” By any measure, the United States is a constitutional republic in name only.

Unattractive Real Estate Agents Achieve Quicker Sales

Before getting serious on you again, here’s a lighter, more interesting note. I often comment in talks I give that real estate agents rarely use data to attract clients – mostly just pictures of themselves. Turns out… there might be more data in that than I thought! Apparently less attractive agents sell homes faster and work harder. More attractive agents take longer, but get more money. Food for thought here.

Andrew Coyne: Question isn’t where conservatism is going, but where has it gone

Another oldie but a goody. Liberal Canada may be dead, but it appears that Conservative Canada isn’t in much better shape. I’ve always enjoyed Coyne and feel like he’s been sharper than usual of late (since moving back to the National Post). For Americans, there may be some interesting lessons in here for the Tea Party movement. Canada experienced a much, much lighter form of conservative rebellion with the creation of the Reform Party in the late 80s/early 90s, which split off from establishment conservatives. Today, that group is in power (rebranded), but Coyne assesses that much of what it set out to do has been watered down. But not everything… on to the next two articles!

Environmental charities ‘laundering’ foreign funds, Kent says

Sadly, Canada’s “Environment” Minister is spending most of his time attacking environmental groups. The charge is that they use US money to engage in advocacy against a pipeline to be built in Canada. Of course, “laundering” is a serious charge (it implies illegal activity), and given how quick the Conservatives have been to sue opponents for libel, Kent had better be careful that these stakeholders don’t adopt the same tactic. This is probably why he doesn’t name any groups in particular (clever!). My advice is that all the groups named by the Senate committee should sue him; then, to avoid the lawsuits, he’d have to either a) back down from the claim altogether, or b) be specific about which group he is referring to in order to have the other suits thrown out. Next headline… on to the double standard!

Fraser Institute co-founder confirms ‘years and years’ of U.S. oil billionaires’ funding

Some nifty investigative work here by a local Vancouver reporter finds that while the Canadian government believes it is bad for environmental groups to receive US funds for advocacy, it is, apparently, completely okay for conservative groups to receive sums of up to $1.7M from US oil billionaires. Ethical Oil – another astroturf pro-pipeline group – does something similar. It receives money from Canadian law firms that represent the American and Chinese oil interests that stand to benefit. But that money is labelled “Canadian” because it is washed through Canadian law firms. Confused? You should be.

What retail is hired to do: Apple vs. IKEA

I love that Clay Christensen is on Twitter. The Innovator’s Dilemma is a top 5 book of all time for me. Here is a great breakdown of how IKEA and Apple stores work. Most intriguing is the unique value proposition/framing their stores offer consumers, which explains both their phenomenal success and why they are so rarely imitated.