Category Archives: open source

Transparency Case Study: There are Good and Bad Ways Your Organization can be made "Open"

If you have not had the chance, I strongly encourage you to check out a fantastic piece of journalism in this week’s Economist on the state of the Catholic Church in America. It’s a wonderful example of investigative and data-driven journalism made possible (sadly) by the recent spate of sexual-abuse and bankruptcy cases. As a result, some of the normally secret financial records of the Church have been made public, enabling the Economist to reconstruct the opaque general finances of the Catholic Church in America. It is a fascinating, and at times disturbing, read.

The article also suggests – I believe – a broader lesson for non-profits, governments and companies. Disclosure and transparency are essential to the effective running of an organization. As you read the piece it is clear that more disclosure would probably have compelled the Church to manage its finances in a more fiscally sound manner. It probably would also have made acts that are, at best, negligent and, at worst, corrupt, difficult or impossible. This is, indeed, why many charities, organizations and public companies must conduct audits and publish the results.

But conforming to legal requirements will not shield you from an angry public. My sense is that many member-contribution-based organizations – from public companies to clubs to governments – are going to feel enormous pressure from their “contributors” to disclose more about how funds are collected, managed, disbursed and used. In a post-financial-collapse, post-Enron era it’s unclear to me that people trust auditors the way they once did. In addition, as technology makes it easier to track money in real time, contributors are going to want more than just an annual audit. Even if they look at it rarely, they are going to want to know there is a dashboard or system they can look at and understand that shows them where the money goes.

I’m open to being wrong about this – and I’m not suggesting this is a panacea that solves all problems, but I nonetheless suspect that many organizations are going to feel pressure to become more transparent. There will be good ways in which that takes place… and bad ways. The Catholic Church story in the Economist is probably an example of the worst possible way: transparency forced upon an organization through the release of documents in a court case.

For anyone running a non-profit, community group, public agency or government department – this makes the article doubly worth reading. It is a case study in the worst possible scenario for your organization. The kind of disaster you never want to have to deal with.

The problem – and I’m going to go out on a limb here – is that, at some point in the next 10-20 years, there is a non-trivial risk that any organization (including yours, reader) will face a publicity or legitimacy crisis because of a real or imagined problem. Trust me when I tell you: that moment will not be the moment when it is easiest or most desirable, from a cost, political and cultural perspective, to make your organization more transparent. So better for you to think about how you’d like to shift policies, culture and norms to make it more transparent and accountable today, when things are okay, than during the crisis.

Consider again the Catholic Church. There are some fascinating and disturbing facts shared in the story that provide some interesting context. On the fascinating side, I had no idea of the scope and size of the Catholic Church. Consider that, according to the article:

“Almost 100m Americans, a third of the nation, have been baptised into the faith and 74m identify themselves as Catholic.”

and

“there are now over 6,800 Catholic schools (5% of the national total); 630 hospitals (11%) plus a similar number of smaller health facilities; and 244 colleges and universities.”

We are talking about a major non-profit that is providing significant services to numerous communities. It also means that the Catholic Church does a lot of things that many other non-profits do. Whatever you are doing, they are probably doing it too.

Now consider some of the terrible financial practices the Church tried/tries to get away with because it thinks no one will be able to see them:

Lying about assets: “In a particularly striking example, the diocese of San Diego listed the value of a whole city block in downtown San Diego at $40,000, the price at which it had been acquired in the 1940s, rather than trying to estimate the current market value, as required. Worse, it altered the forms in which assets had to be listed. The judge in the case, Louise Adler, was so vexed by this and other shenanigans on the part of the diocese that she ordered a special investigation into church finances which was led by Todd Neilson, a former FBI agent and renowned forensic accountant. The diocese ended up settling its sexual-abuse cases for almost $200m.”

Playing fast and loose with finances: “Some dioceses have, in effect, raided priests’ pension funds to cover settlements and other losses. The church regularly collects money in the name of priests’ retirement. But in the dioceses that have gone bust lawyers and judges confirm that those funds are commingled with other investments, which makes them easily diverted to other uses.”

Misleading contributors about the destination of funds: “Under Cardinal Bernard Law, the archdiocese of Boston contributed nothing to its clergy retirement fund between 1986 and 2002, despite receiving an estimated $70m-90m in Easter and Christmas offerings that many parishioners believed would benefit retired priests.”

Using Public Subsidies to Indirectly Fund Unpopular Activities: “Muni bonds are generally tax-free for investors, so the cost of borrowing is lower than it would be for a taxable investment. In other words, the church enjoys a subsidy more commonly associated with local governments and public-sector projects. If the church has issued more debt in part to meet the financial strains caused by the scandals, then the American taxpayer has indirectly helped mitigate the church’s losses from its settlements. Taxpayers may end up on the hook for other costs, too. For example, settlement of the hundreds of possible abuse cases in New York might cause the closure of Catholic schools across the city.”

Of course all of this pales in comparison to the most disturbing part of the article: in several jurisdictions, the church is spending money to lobby governments not to extend the statute of limitations around sexual-abuse cases. This is so that it, and its priests, cannot be charged, diminishing the likelihood that they get sued. The prospect that an organization that is supposed to both model the highest ideals of behaviour and protect the most marginalized is trying to limit the statutes on such a heinous crime is so profoundly disgusting it is hard to put words to it. The Economist gives it a shot:

Various sources say that Cardinal Dolan and other New York bishops are spending a substantial amount—estimates range from $100,000 a year to well over $1m—on lobbying the state assembly to keep the current statute of limitations in place. His office will not comment on these estimates. This is in addition to the soft lobbying of lawmakers by those with pulpits at their disposal. The USCCB, the highest Catholic body in America, also lobbies the federal government on the issue. In April the California Catholic Conference, an organisation that brings the state’s bishops together, sent a letter to California’s Assembly opposing a bill that would extend the statute and require more rigorous background checks on church workers.

This disgusting discovery aside, most organizations are probably not going to have the same profound problems found in the Catholic Church. But in almost every organization, no matter the controls, some form of mismanagement is probably taking place. The question is, will you already have in place policies and a culture that support transparency and disclosure before the problem is discovered – or will the crisis become the moment where you have to try to implement them, probably under less than ideal circumstances?

As they said after Watergate, “It’s not the crime that kills you, but the cover-up.” Good transparency and disclosure can’t stop every misdeed, but they might help prevent some. Moreover, they can make the cover-up harder and, potentially, make it easier to ease the concerns of contributors and rebuild trust. One could imagine that if the Church had been more transparent about its finances it might have better protected itself against bankruptcy from some of these cases. More importantly, its transparency might have made it easier to rebuild trust, whereas any change now will just seem like a reaction to the crisis, not a genuine desire to operate differently.

Again, I think the pressure on many organizations to be more transparent is going to grow. And managers should recognize there are good and bad conditions under which such transparency can take place. Read this Economist story. In addition to being fascinating, it is a great case study in the worst case scenario for opaque institutions.

Using Metrics to Measure Interest in an Open Source Project

David Boswell has a couple of interesting posts (here and here) about how he is using metrics to measure how effective Mozilla is at attracting and engaging people who express an interest in helping contribute to the Mozilla mission.

Some of the metrics being used can be seen at Mozilla’s Are We Growing Yet website. What I think is important is how the team is trying to look at metrics from the site to see if tweaks have an impact on attracting more people. Obviously, this is something commercial websites have been doing for a long time and it is about time these lessons were being applied in open source community management.

What I’m particularly excited about (and have helped with) is the creation of contribution paths for different parts of Mozilla. This way we can begin to try to measure how many people who express an interest in contributing to Mozilla start to contribute code (or engage in another activity, like marketing) and how far along the code contribution path they travel. Obviously we’d love for contributors to travel as far along that path as possible and so, with these metrics, we can ideally begin to see how changes in policy, tools or websites might encourage or discourage people’s decision to contribute.
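To give a flavour of what this kind of funnel measurement can look like, here is a minimal sketch in Python. To be clear: the stage names, the data and the code are hypothetical illustrations of the general idea, not Mozilla’s actual contribution paths or metrics pipeline.

```python
# Hypothetical sketch of measuring a contribution path.
# Stage names and data are invented for illustration only.
from collections import Counter

# Ordered stages of a (hypothetical) code-contribution path.
STAGES = ["expressed_interest", "set_up_build", "first_patch_submitted",
          "first_patch_landed", "five_patches_landed"]

# Example data: the furthest stage each prospective contributor reached.
furthest_stage = {
    "alice": "five_patches_landed",
    "bob": "set_up_build",
    "carol": "first_patch_submitted",
    "dave": "expressed_interest",
    "erin": "first_patch_landed",
}

def funnel(furthest_stage):
    """Count how many people reached each stage, plus stage-to-stage conversion."""
    reached = Counter()
    for stage in furthest_stage.values():
        # Reaching a later stage implies having passed the earlier ones.
        for s in STAGES[: STAGES.index(stage) + 1]:
            reached[s] += 1
    steps = []
    for prev, cur in zip(STAGES, STAGES[1:]):
        rate = reached[cur] / reached[prev] if reached[prev] else 0.0
        steps.append((cur, reached[cur], rate))
    return reached[STAGES[0]], steps

total, steps = funnel(furthest_stage)
print(f"{STAGES[0]}: {total}")
for stage, count, rate in steps:
    print(f"{stage}: {count} ({rate:.0%} of previous stage)")
```

The interesting conversations start once you can compare these stage-to-stage conversion rates before and after a change to a policy, a tool or a website.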

Interesting stuff.

Is Civic Hacking Becoming 'Our Pieces, Loosely Joined?'

I’ve got a piece up over on the WeGov blog at TechPresident – Is Civic Hacking Becoming ‘Our Pieces, Loosely Joined?’

Juicy bit:

There is however, a larger issue that this press release raises. So far, it appears that the spirit of re-use among the big players, like MySociety and the Sunlight Foundation*, only goes so deep. Indeed often it seems they are limited to believing others should re-use their code. There are few examples where the bigger players dedicate resources to support other people’s components. Again, it is fine if this is all about creating competing platforms and competing to get players in smaller jurisdictions who cannot finance creating whole websites on their own to adopt it. But if this is about reducing duplication then I’ll expect to see some of the big players throw resources behind components they see built elsewhere. So far it isn’t clear to me that we are truly moving to a world of “small pieces loosely joined” instead of a world of “our pieces, loosely joined.”

You can read the rest over there.

OSCON Community Management Keynote Video, Slides and some Bonus Material

Want to thank everyone who came to my session and who sent me wonderful feedback from both the keynote and the session. I was thrilled to see ZDNet wrote a piece about the keynote, as well as to have practitioners, such as Sonya Barry, the Community Manager for Java, write things like this about the longer session:

Wednesday at OSCON we kicked off the morning with the opening plenaries. David Eaves’ talk inspired me to attend his longer session later in the day – Open Source 2.0 – The Science of Community Management. It was packed – in fact the most crowded session I’ve ever seen here. People sharing chairs, sitting on every available spot on the floor, leaning up against the back wall and the doors. Tori did a great writeup of the session, so I won’t rehash, but if you haven’t, you should read it – What does this have to do with the Java Community? Everything. Java’s strength is the community just as much as the technology, and individual project communities are so important to making a project successful and robust.

That post pretty much made my day. It’s why we come to OSCON, to hopefully pass on something helpful, so this conference really felt meaningful to me.

So, to be helpful I wanted to lay out a bunch of the content for those who were and were not there in a single place, plus a fun photo of my little guy – Alec – hanging out at #OSCON.

A Youtube video of the keynote is now up – and I’ve posted my slides here.

In addition, I did an interview in the O’Reilly booth – if it goes up on YouTube, I’ll post it.

There is no video of my longer session, formally titled Open Source 2.0 – The Science of Community Management, but informally titled Three Myths of Open Source Communities. However, Jeff Longland helpfully took these notes and I’ll try to rewrite them as a series of blog posts in the near future.

Finally, I have earlier linked to some blog posts I’ve written about open source communities and open source community management, as these offer a deeper dive into some of the ideas I shared.

Some other notes about OSCON…

If you didn’t catch Robert “r0ml” Lefkowitz’s talk, How The App Store Killed Free Software, And Why We’re OK With That, do try to see if an audio copy can be tracked down. Contrary to some predictions it was neither trolling nor link bait, but a very thoughtful talk – one I did not entirely agree with, but which has left me with many, many things to think about (a sign of a great talk).

Jono Bacon, Brian Fitzpatrick and Ben Collins-Sussman are all mensches of the finest type – I’m grateful for their engagement and support given I’m late arriving at a party they all started. While you are reading this, consider buying Brian and Ben’s new book – Team Geek: A Software Developer’s Guide to Working Well with Others.

Also, if you haven’t watched Tim O’Reilly’s opening keynote, The Clothesline Paradox and the Sharing Economy, take a look. My favourite part is him discussing how we break down the energy sector and claim “solar” only provides us with a tiny fraction of our energy mix (around the 9 minute mark). Of course, pretty much all energy is solar, from the stuff we count (oil, hydroelectric, etc. – it’s all made possible by solar) to the stuff we don’t count, like growing our food. Loved that.

Oh, and this ignite talk on Cryptic Crosswords by Dan Bentley from OSCON last year remains one of my favourites. I didn’t get to catch his talk this year on why the metric system sucks – but am looking forward to seeing it once it is up on YouTube.

Finally, because I’m a sucker dad, here are some early attempts to teach my 7-month-old chess while hitting the OSCON booth hall. As his tweet says, “Today I may be a mere pawn, but tomorrow I will be the grandmaster.”

[Photo: Alec-Chess]

Containers, Facebook, Baseball & the Dark Matter around Open Data (#IOGDC keynote)

Below is an extended blog post that summarizes the keynote address I gave at the World Bank/Data.gov International Open Government Data Conference in Washington DC on Wednesday, July 11th. This piece is cross-posted over at the WeGov blog on TechPresident, where I also write on transparency, technology and politics.

Yesterday, after spending the day at the International Open Government Data Conference at the World Bank (co-hosted by data.gov), I left both upbeat and concerned. Upbeat because of the breadth of countries participating and the progress being made.

I was worried, however, that the type of conversation we are having might limit both the growth of our community and the impact open data could have. Indeed, as we talk about technology and how to do open data, we risk missing the real point of the whole exercise – which is about use and impacts.

To drive this point home I want to share three stories that highlight the challenges I believe we should be talking about.

Challenge 1: Scale Open Data

In 1956 Ideal-X, the ship pictured to the left, sailed from Newark to Houston and changed the world.

Confused? Let me explain.

As Marc Levinson chronicles in his excellent book The Box, the world in 1956 was very different to our world today. Global trade was relatively low. China was a long way off from becoming the world’s factory floor. And it was relatively unusual for people to buy goods made elsewhere. Indeed, as Levinson puts it, the cost of shipping goods was “so expensive that it did not pay to ship many things halfway across the country, much less halfway around the world.” I’m a child of the second era of globalization. I grew up in a world of global transport and shipping. The world before all of that, which Levinson is describing, is actually foreign to me. What is amazing is how much of that has just become a basic assumption of life.

And this is why Ideal-X, the aforementioned ship, is so important. It is the first cargo container ship (in how we understand containers). Its trip from Newark to Houston marked the beginning of a revolution because containers slashed the cost of shipping goods. Before Ideal-X the cost of loading cargo onto a medium-sized cargo ship was $5.83 per ton; with containers, the cost dropped to 15.8 cents. Yes, the word you are looking for is: “wow.”

You have to understand that before containers loading a ship was a lot more like packing a mini-van for a family vacation to the beach than the orderly process of what could be described as stacking very large lego blocks on a boat. Before containers literally everything had to be hand packed, stored and tied down in the hull. (see picture to the right)

This is a little bit what our open data world looks like today. The people who are consuming open data are like digital longshoremen. They have to look at each open data set differently, unpack it accordingly and figure out where to put it, how to treat it and what to do with it. Worse, when looking at data from across multiple jurisdictions it is often much like cargo going around the world before 1956: a very slow and painful process. (see man on the right)

Of course, the real revolution in container shipping happened in 1966 when the size of containers was standardized. Within a few years containers could move from pretty much anywhere in the world from truck to train to boat and back again. In the following decades global shipping trade increased by 2.5 times the rate of economic output. In other words… it exploded.

Geek sidebar: For techies, think of shipping containers as the TCP/IP packet of globalization. TCP/IP standardized the packet of information that flowed over the network so that data could move from anywhere to anywhere. Interestingly, like containers, what was in the packet was actually not relevant and didn’t need to be known by the person transporting it. But the fact that it could move anywhere created scale and allowed for exponential growth.

What I’m trying to drive at is that, when it comes to open data, the number of open data sets that get published is no longer the critical metric. Nor is the number of open data portals. We’ve won. There are more and more. The marginal political and/or persuasive benefit of adding another open data portal or data set won’t change the context anymore. I want to be clear – this is not to say that more open data sets and more open data portals are not important or valuable – from a policy and programmatic perspective more is much, much better. What I am saying is that having more isn’t going to shift the conversation about open data any more. This is especially true if data continues to require large amounts of work and time for people to unpack and understand, over and over again, across every portal.

In other words, what IS going to count is how many standardized open data sets get created. This is what we SHOULD be measuring. The General Transit Feed Specification revolutionized how people engaged with public transit because the standard made it so easy to build applications and do analysis around it. What we need to do is create similar standards for dozens, hundreds, thousands of other data sets so that we can drive new forms of use and engagement. More importantly, we need to figure out how to do this without relying on a standards process that takes 8 to 15 to infinite years to decide on said standard. That model is too slow to serve us, and so re-imagining/reinventing that process is where the innovation is going to shift next.
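To make the point concrete, here is a small sketch of why a standard like GTFS changes the economics of use: because every GTFS feed is a zip file containing the same files (stops.txt, routes.txt and so on) with the same column names, one tiny script can answer the same question about any transit agency on earth. The feed file names below are placeholders for feeds you have downloaded yourself; everything else follows the published GTFS spec.

```python
# A minimal sketch of why standards matter: any GTFS feed, from any agency,
# can be read with the same few lines of code. The .zip paths are placeholders.
import csv
import io
import zipfile

def summarize_gtfs(feed_path):
    """Return (number of stops, number of routes) for a GTFS feed (.zip)."""
    with zipfile.ZipFile(feed_path) as feed:
        with feed.open("stops.txt") as f:
            stops = list(csv.DictReader(io.TextIOWrapper(f, "utf-8-sig")))
        with feed.open("routes.txt") as f:
            routes = list(csv.DictReader(io.TextIOWrapper(f, "utf-8-sig")))
    return len(stops), len(routes)

# The same function works unchanged for every agency that publishes GTFS.
for city, path in [("vancouver", "translink_gtfs.zip"),
                   ("portland", "trimet_gtfs.zip")]:
    stops, routes = summarize_gtfs(path)
    print(f"{city}: {stops} stops, {routes} routes")
```

No digital longshoreman required: the unpacking logic is written once and reused everywhere, which is exactly the property most open data sets still lack.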

So let’s stop counting the number of open data portals and data sets, and start counting the number of common standards – because that number is really low. More critically, if we want to experience the kind of explosive growth in use that global trade and shipping experienced after the rise of the standardized container, then our biggest challenge is clear: we need to containerize open data.

Challenge 2: Learn from Facebook

One of the things I find most interesting about Facebook is that everyone I’ve talked to about it notes how the core technology that made it possible was not particularly new. It wasn’t that Zuckerberg leveraged some new code or invented a new, better coding language. Rather it was that he accomplished a brilliant social hack.

Part of this was luck: the public had come a long way and was much more willing to do social things online in 2004 than it had been even two years earlier with sites like Friendster. Or, more specifically, young people who’d grown up with internet access were willing to do things and imagine using online tools in ways those who had not grown up with those tools wouldn’t or couldn’t. Zuckerberg, and his users, had grown up digital and so could take the same tools everyone else had and do something others hadn’t imagined because their assumptions were just totally different.

My point here is that, while it is still early, I’m hoping we’ll soon have the beginnings of a cohort of public servants who’ve “grown up data” – who, despite their short careers in the public service, have matured in a period where open data has been an assumption, not a novelty. My hope and suspicion is that this generation of public servants is going to think about open data very differently than many of us do. Most importantly, I’m hoping they’ll spur a discussion about how to use open data – not just to share information with the public – but to drive policy objectives. The canonical opportunity for me around this remains restaurant inspection data, but I know there are many, many more.

What I’m trying to say is that the conferences we organize have to talk less and less about how to get data open and start talking more about how we use data to drive public policy objectives. I’m hoping the next International Open Government Data Conference will have an increasing number of presentations on how citizens, non-profits and other outsiders are using open data to drive their agendas, and how public servants are using open data strategically to drive toward an outcome.

I think we have to start fostering that conversation by next year at the latest, and that this conversation, about use, has to become core to everything we talk about within two years, or we will risk losing steam. This is why I think the containerization of open data is so important, and why I think the White House’s digital government strategy is so important, since it makes internal use core to the government’s open data strategy.

Challenge 3: The Culture and Innovation Challenge.

In May 2010 I gave this talk on Open Data, Baseball and Government at the Gov 2.0 Summit in Washington DC. It centered around the story outlined in the fantastic book Moneyball by Michael Lewis, which traces how a baseball team – the Oakland A’s – used a new analysis of player stats to ferret out undervalued players. This enabled them to win a large number of games on a relatively small payroll. Consider the numbers to the right.

I mean, if you are the owner of the Texas Rangers, you should be pissed! You are paying 250% in salary for 25% fewer wins than Oakland. If this were a government chart, where “wins” were potholes found and repaired, and “payroll” was costs… everyone in the World Bank would be freaking out right now.

For those curious, the analytical “hack” was recognizing that the most valuable thing a player can do on offense is get on base. This is because it gives them an opportunity to score (+) but it also means you don’t burn one of your three “outs” that would end the inning and the chance for other players to score. The problem was that, to measure the offensive power of a player, most teams were looking at hitting percentages (along with a lot of other weird, totally non-quantitative stuff) – a measure that ignores the possibility of getting walked, which allows you to get on base without hitting the ball!
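For readers who want to see the difference the metric makes, here is a toy calculation (the numbers are invented, not real players) comparing batting average with on-base percentage, which counts walks and hit-by-pitches as times on base.

```python
# Toy comparison of batting average vs. on-base percentage (invented numbers).
def batting_average(hits, at_bats):
    return hits / at_bats

def on_base_percentage(hits, walks, hit_by_pitch, at_bats, sacrifice_flies):
    # Standard OBP formula: times on base / plate appearances counted.
    return (hits + walks + hit_by_pitch) / (at_bats + walks + hit_by_pitch + sacrifice_flies)

# Player A: hits a lot, rarely walks.
a_avg = batting_average(hits=150, at_bats=500)   # .300
a_obp = on_base_percentage(150, walks=20, hit_by_pitch=2, at_bats=500, sacrifice_flies=3)

# Player B: lower batting average, but walks constantly.
b_avg = batting_average(hits=130, at_bats=500)   # .260
b_obp = on_base_percentage(130, walks=90, hit_by_pitch=5, at_bats=500, sacrifice_flies=3)

print(f"Player A: AVG {a_avg:.3f}, OBP {a_obp:.3f}")
print(f"Player B: AVG {b_avg:.3f}, OBP {b_obp:.3f}")
# Batting average says A is better; OBP shows B actually reaches base more often.
```

Batting average says Player A is clearly better; on-base percentage shows Player B reaches base more often. Same underlying data, better metric, different decision.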

What’s interesting, however, is that the original insight – that people were using the wrong metrics to assess baseball players – first emerged decades before the Oakland A’s started to use it. Indeed, it was a nighttime security guard with a strong mathematics background and an obsession for baseball who first began pointing this stuff out.

The point I’m making is that it took 20 years for a manager in baseball to recognize that there was better evidence and data they could be using to make decisions. TWENTY YEARS. And that manager was hated by all the other managers, who believed he was ruining the game. Today, this approach to assessing baseball is commonplace – everyone is doing it – but notice how the problem of using baseball’s “open data” to create better outcomes was never just an accessibility issue. Once that was resolved, the bigger challenge centered around culture and power. Those with the power had created a culture in which new ideas – ideas grounded in evidence but that were disruptive – couldn’t find an audience. Of course, there were structural issues as well – many people had jobs that depended on not using the data and on relying instead on their “instincts” – but I think the cultural issue is a significant one.

So we can’t expect that we are going to go from an open data portal today to better decisions tomorrow. There is a good chance that some of the ideas the data causes us to entertain will be so radical and challenging that either the ideas, the people who champion them, or both, could get marginalized. On the upside, I feel like I’ve seen some evidence to the contrary in cities like New York and Chicago, but the risk is still there.

So what are we going to do to ensure that the culture of government is one that embraces challenges to our thinking and assumptions, so that it doesn’t require 20 years to pass before we make progress? This is a critical challenge for us – and it is much, much bigger than open data.

Conclusion: Focus on the Dark Matter

I’m deeply indebted to my friend – the brilliant Gordon Ross – who put me on to this idea the other day over tea.


Do you remember the briefcase in Pulp Fiction? The one that glowed when opened? That the characters were all excited about, but whose contents you never learned? It’s called a MacGuffin. I’m not talking about the briefcase per se. Rather I mean the object in a story that all the characters are obsessed about, but that you – the audience – never find out what it is, and frankly, it really isn’t that important to you. In Pulp Fiction I remember reading that the briefcase is allegedly Marsellus Wallace’s soul. But ultimately, it doesn’t matter. What matters is that Vincent Vega, Jules Winnfield and a ton of other characters think it is important, and that drives the action and the plot forward.

Again – let me be clear – open data portals are our MacGuffin device. We seem to care A LOT about them. But trust me, what really matters is everything that happens around them. What makes open data important is not a data portal. A portal is a necessary prerequisite, but it’s not the end; it’s just the means. We’re here because we believe that the things open data can let us and others do matter. The open data portal was only ever a MacGuffin device – something that focused our attention and helped drive action so that we could do the other things – the dark matter that lies all around the MacGuffin device.

And that is what brings me back to our three challenges. Right now, the debate around open data risks becoming too much like a Pulp Fiction conference in which all the panels talk about the briefcase. Instead we should be talking more and more about all the action – the dark matter – taking place around the briefcase. Because that is what really matters. For me, the three things that matter most are the ones I’ve mentioned in this talk:

  • standards – which will let us scale; I believe strongly that the conversation is going to shift from portals to standards;
  • strategic use – starting us down the path of learning how open data can drive policy outcomes; and
  • culture and power – recognizing that open data is going to surface a lot of reasons why governments don’t want to engage in data-driven decision making.

In other words, I want to be talking about how open data can make the world a better place, not about how we do open data. That conversation still matters, open data portals still matter, but the path forward around them feels straightforward, and if they remain the focus we’ll be obsessing about the wrong thing.

So here’s what I’d like to see in the future from our open data conferences. We’ve got to stop talking about how to do open data. This is because all of our efforts here, everything we are trying to accomplish… it has nothing to do with the data. What I think we want to be talking about is how open data can be a tool to make the world a better place. So let’s make sure that is the conversation we are having.

Lessons from Michigan's "Innovation Fund" for Government Software

So it was with great interest that I read this news article coming out of Michigan, which a reader emailed me several weeks ago. It turns out the state recently approved a $2.5 million innovation fund that will be disbursed in $100,000 to $300,000 chunks to fund about 10 projects. As Government Technology reports:

The $2.5 million innovation fund was approved by the state Legislature in Michigan’s 2012 budget. The fund was made formal this week in a directive from Gov. Rick Snyder. The fund will be overseen by a five-person board that includes Michigan Department of Technology, Management and Budget (DTMB) Director John Nixon and state CIO David Behen.

There are lessons in this for other governments thinking about how to spur greater innovation in government while also reducing the cost of software.

First up: the idea of an innovation fund – particularly one that is designed to support software that works for multiple governments – is a laudable one. As I’ve written before, many governments overpay for software. I shudder to think of how many towns and counties in Michigan alone are paying to have the exact same software developed for them independently. Rather than writing the same piece of software over and over again for each town, getting a single version that is usable by 80% (or heck, even just 25%) of cities and counties would be a big win. We have to find a way to get governments innovating faster, and getting them back in the driver’s seat on  the software they need (as opposed to adapting stuff made for private companies) would be a fantastic start.

Going from this vision – of getting something that works in multiple cities – to reality, is not easy. Read the Executive Directive more closely. What’s particularly interesting (from my reading) is the flexibility of the program:

In addition to the Innovation Fund and Investment Board, the plan may include a full range of public, private, and non-profit collaborative innovation strategies, including resource sharing…

There is good news and bad news here.

The bad news is that all this money could end up as loans to mom-and-pop software shops that serve a single city or jurisdiction, because their products were never designed from the beginning to be usable across multiple jurisdictions. In other words, the innovation fund could go to fund a bunch of vendors who already exist and who, at best, do okay or, at worst, do mediocre work and, in either case, will never be disruptive and blow up the marketplace with something that is both radically helpful and radically low cost.

What makes me particularly nervous about the directive is that there is no reference to open source licensing. If a government is going to directly fund the development of software, I think it should be open source; otherwise, taxpayers are acting as venture capitalists to develop software that they are also going to pay licenses to use. In other words, they’re absorbing the risk of a VC in order to have the limited rights of being a client; that doesn’t seem right. An open source requirement would be the surest way to ensure an ROI on the program’s money. It assures that Michigan governments that want access to what gets developed can use it at the lowest possible cost. (To be clear, I’ve no problem with private vendors – I am one – but their software can be closed because they should be absorbing the risk of developing it themselves. If the government is giving out grants to develop software for government use, the resulting software should be licensed open.)

Which brings us to the good. My interest in the line of the executive directive cited above was piqued by the reference to public and non-profit “collaborative innovation strategies.” I read that and I immediately think of one of my favourite organizations: Kuali.

Many readers have heard me talk about Kuali, an organization in which a group of universities collectively set the specs for a piece of software they all need and then share in the costs of developing it. I’m a big believer that this model could work for local and even state level governments. This is particularly true for the enterprise management software packages (like financial management), for which cities usually buy over-engineered, feature rich bloatware from organizations like SAP. The savings in all this could be significant, particularly for the middle-sized cities for whom this type of software is overkill.

My real hope is that this is the goal of this fund – to help provide some seed capital to start 10 Kuali-like projects. Indeed, I have no idea if the governor and his CIO’s staff have heard of or talked to the Kuali team before signing this directive, but if they haven’t, they should now. (Note: It’s only a 5 hour drive from the capital, Lansing, Michigan to the home of Kuali in Bloomington, Indiana).

So, if you are a state, provincial or national government and you are thinking about replicating Michigan’s directive – what should you do? Here’s my advice:

  • Require that all the code created by any projects you fund be open source. This doesn’t mean anyone can control the specs – that can still reside in the hands of a small group of players, but it does mean that a variety of companies can get involved in implementation so that there is still competition and innovation. This was the genius of Kuali – in the space of a few months, 10 different companies emerged that serviced Kuali software – in other words, the universities created an entire industry niche that served them and their specific needs exclusively. Genius.
  • Only fund projects that have at least 3 jurisdictions signed up. Very few enterprise open source projects start off with a single entity. Normally they are spec’ed out with several players involved. This is because if just one player is driving the development, they will rationally always choose to take shortcuts that will work for them, but cut down on the likelihood the software will work for others. If, from the beginning, you have to balance lots of different needs, you end up architecting your solution to be flexible enough to work in a diverse range of environments. You need that if your software is going to work for several different governments.
  • Don’t provide the funds, provide matching funds. One way to ensure governments have skin in the game and will actually help develop software is to make them help pay for the development. If a city or government agency is devoting $100,000 towards helping develop a software solution, you’d better believe they are going to try to make it work. If the State of Michigan is paying for something that may work, maybe they’ll contribute and be helpful, or maybe they’ll sit back and see what happens. Ensure they do the former and not the latter – make sure the other parties have skin in the game.
  • Don’t just provide funds for development – provide funds to set up the organization that will coordinate the various participating governments and companies, set out the specs, and project manage the development. Again, to understand what that is like – just fork Kuali’s governance and institutional structure.
  • Ignore government agencies or jurisdictions that believe they are a special unique flower. One of the geniuses of Kuali is that they abstracted the process/workflow layer. That way universities could quickly and easily customize the software so that it worked for how their university does its thing. This was possible not because the universities recognized they were each a unique and special flower but because they recognized that for many areas (like library or financial management) their needs are virtually identical. Find partners that look for similarities, not those who are busy trying to argue they are different.

There is of course more, but I’ll stop there. I’m excited for Michigan. This innovation fund has real promise. I just hope that it gets used to be disruptive, and not to simply fund a few slow and steady (and stodgy) software incumbents that aren’t going to shake up the market and help change the way we do government procurement. We don’t need to spend $2.5 million to get software that is marginally better (or not even). Governments already spend billions every year for that. If we are going to spend a few million to innovate, let’s do it to be truly disruptive.

Want to Find Government Innovation? US Military is often leading the way.

When it comes to seeing what trends will impact government in 20-30 years, I’m a big fan of watching the US military. They may do a lot of things wrong but, when it comes to government, they are on the bleeding edge of being a “learning organization.” It often feels like they are less risk averse, more likely to experiment and, (as noted) more likely to learn than almost any government agency I can think of (hint: those things may be interconnected). Few people realize that to rise above Colonel in many military organizations, you must have at least a master’s degree. Many complete PhDs. And these schools often turn into places where people challenge authority and the institution’s conventional thinking.

Part of it, I suspect, has to do with the whole “people die when you make mistakes” aspect of their work. It may also have to do with the seriousness with which they take their mandate. And part of it has to do with the resources they have at their disposal.

But regardless of the cause, I find they are often at the cutting edge of ideas in the public sector. For example, I can’t think of a government organization that empowers the lowest echelons of its employee base more than the US military. Their network-centric vision of the world means those on the front lines (both literally and figuratively) are often empowered, trusted and strongly supported with tools, data and technology to make decisions on the fly. In an odd way, the very hierarchical system that the rest of government has been modeled on has transformed into something different: still very hierarchical but, at the same time, networked.

Frankly, if a 1-800 call operator at Service Canada or the guy working at the DMV in Pasadena, CA (or even just their managers) had 20% of the autonomy of a US sergeant, I suspect government would be far more responsive and innovative. Of course, Service Canada or the California DMV would have to have a network-centric approach to their work… and that’s a ways off, since it demands serious cultural change, the hardest thing to shift in an organization.

Anyways… long rant. Today I’m interested in another smart call the US military is making that government procurement departments around the world should be paying attention to (I’m especially looking at you Public Works – pushers of Beehive over GCPEDIA). This article, Open source helicopters trivialize Europe’s ODF troubles, on Computer World’s Public Sector IT blog outlines how the next generation of US helicopters will be built on an open platform. No more proprietary software that binds hardware to a particular vendor.

Money quote from the piece:

Weapons manufacturers and US forces made an unequivocal declaration for royalty-free standards in January through the FACE (Future Airborne Capabilities Environment) Consortium they formed in response to US Defence Secretary Leon Panetta’s call for a “common aircraft architecture and subsystems”.

“The FACE Standard is an open, nonproprietary technical specification that is publicly available without restrictive contracts, licensing terms, or royalties,” the Consortium announced from its base at The Open Group, the industry association responsible for the POSIX open Unix specification.

“In business terms, the open standards specified for FACE mean that programmers are freely able to use them without monetary remuneration or other obligation to the standards owner,” it said.

While business software producers have opposed governments that have tried to implement identical open standards policies with the claim it will handicap innovation and dampen competition, the US military is embracing open standards for precisely the opposite reasons.

So suddenly we are going to have an open source approach to innovation and program delivery (helicopter manufacturing, operation and maintenance) at major scale. Trust me, if the US military is trying to do this with helicopters, you can convert your proprietary intranet to an open source wiki platform. I can’t believe the complexity is as great. But the larger point here is that this approach could be used to think about any system a government wants to develop, from earthquake monitoring equipment to healthcare systems to transit passes. From a “government as a platform” perspective this could be a project to watch. Lots of potential lessons here.

Transparency isn't a cost – it's a cost saver (a note for Governments and Drummond)

Yesterday Don Drummond – a leading economist hired by the Ontario government to review how the province delivers services in the face of declining economic growth and rising deficits – published his report.

There is much to commend: it lays out stark truths that, frankly, many citizens already know, but that government was too afraid to say aloud. It is a report that I think many provincial and state governments may look at with great interest, since the challenges faced by Ontario are faced by governments across North America (and Europe).

From an IT perspective – particularly one where I believe open innovation could play a powerfully transformative role – I found the report lacking. I say this with enormous trepidation, as I believe Drummond to be a man of supreme intellect, but my sense is he (and/or his team) have profoundly misunderstood government transparency and why it should be relevant. In Chapter 16 (no, I have not yet read all 700 pages) a few pieces come together to create what I believe are problematic conditions. The first relates to the framing around “accountability”:

Accountability is an essential aspect of government operations, but we often treat that goal as an absolute good. Taxpayers expect excellent public-sector management as well as open and transparent procurement practices. However, an exclusive focus on rigorous financial reporting and compliance as the measure of successful management requires significant investments of time, energy and resources. At some point, this investment is subject to diminishing returns.

Remember the context. This section largely deals with how government services – and in particular the IT aspects of these services – could be consolidated (a process that rarely yields the breadth of savings people believe it will). Through this lens, the interesting thing about the word “accountability” in the section above is that I could replace it with searchability – the capacity to locate pieces of information. I agree with Drummond that there is a granularity around recording items – say, tracking every receipt versus offering per diems – that creates unnecessary costs. Nor do I believe we should pay unlimited costs for transparency – just for the sake of transparency. But I do believe that government needs a much, much stronger capacity to search and locate pieces of information. Indeed, I think that capacity, the ability for government to mine its own data intelligently, will be critical. Transparency thus becomes one of the few windows citizens have into not only how effective a government’s inputs are, but how effective its systems are.

Case in point. If you required every Canadian under the age of 30 to conduct an ATIP request tomorrow, I predict you’d see a massive collapse in Canadians’ confidence in government. The time it takes to fulfill ATIP requests (and the fact that in many places they aren’t even online) probably says less to these Canadians about government secrecy than it does about the government’s capacity to locate, identify and process its own data and information. When you can’t get information to me in a timely manner, it strongly suggests that managers may not be able to get timely information either.

If Ontario’s public service is going to be transformed – especially if it is going to fulfill other Drummond report recommendations, such as:

Further steps should be taken to advance partnering with municipal and federal services —efficiencies can be found by working collaboratively with other levels of government. For example, ServiceOntario in Ottawa co-locates with the City of Ottawa and Service Canada to provide services from one location, therefore improving the client experience. Additionally, the new BizPal account (which allows Ontario businesses to manage multiple government requirements from a single account) allows 127 Ontario municipalities (such as Kingston, Timmins, Brampton and Sudbury) to partner with ServiceOntario and become more efficient in issuing business permits and licensing. The creation of more such hubs, with their critical mass, would make it easier to provide services in both official languages. Such synergies in service delivery will improve customer experience and capitalize on economies of scale.

Then it is going to require systems that can be easily queried as well as interface with other systems quickly. Architecting systems in open standards, so they can be easily searched and recoded, will be essential. This is particularly true if the recommendation that private sector partners (who love proprietary data models, standards and systems that regularly lock governments into expensive traps) be used more frequently is adopted. All this is to say, we shouldn’t do transparency for transparency’s sake. We should do transparency because it will make Ontario more interoperable, will lower costs, and will enable more accountability.

Accountability doesn’t have to be a cost driver. Quite the opposite: transparency should and can be the by-product of good procurement strategies, interoperable architecture choices and effective processes.

Let’s not pit transparency against cost savings. Very often, it’s a false dichotomy.

Two Reasons I love blogging: Helping out great communities

Non profits and governments… this is how open source works: If someone is doing something that is of value to you, help make it better.

There have been two great examples of this type of behaviour on this blog over the past week.

On Monday, I blogged about Represent, a project by OpenNorth that seeks to track all the boundary data in Canada so that citizens can locate what ridings, jurisdictions, regions, etc. they are located in. Yesterday, Elijah van der Giessen, the Creative Services Lead at the David Suzuki Foundation, commented:

The David Suzuki Foundation is really jazzed by this project. This is going to solve some big infrastructure gaps for us as our campaigners refocus on the cities and provinces (let’s just say the Feds are gonna be a tough nut to crack for the next while!).

I’ve assigned some programming interns and staff time to supporting OpenNorth.ca, and I encourage other NGOs to contribute to the project.

This is exactly how open source projects grow and become more valuable. I definitely applaud the David Suzuki Foundation for taking this step and hope other non-profits and for profits that see value in Represent will also step forward.
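For developers wondering what building on Represent actually looks like, here is an illustrative sketch of a point-in-boundary lookup against its public API. A caveat: the endpoint, the contains parameter and the response fields reflect my reading of the project’s API documentation and may change, so treat this as a sketch rather than a definitive reference – check represent.opennorth.ca for the current interface.

```python
# Illustrative sketch: find which boundaries (ridings, wards, etc.) contain a
# point, using OpenNorth's Represent API. The endpoint, "contains" parameter
# and response fields are my assumptions based on the API docs at the time of
# writing; verify against represent.opennorth.ca before relying on this.
import requests

def boundaries_containing(lat, lon):
    resp = requests.get(
        "https://represent.opennorth.ca/boundaries/",
        params={"contains": f"{lat},{lon}"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json().get("objects", [])

# Example: a point in downtown Vancouver.
for boundary in boundaries_containing(49.2827, -123.1207):
    print(boundary.get("boundary_set_name"), "-", boundary.get("name"))
```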

Another example involves cities that are trying to rethink their software stack. Sometimes projects, especially small projects, just need to find the right people. The other week David Hill, the CIO of Marin County, posted on my blog that he was looking for partners to adapt Kuali Financials (an open source financial software solution developed for and by universities) to the local government context. I posted his comment as a blog post and readers have started to send me contact information for other government CIOs who might find this project interesting. Most notably, it turns out that the City of Bloomington, Indiana is a “supporter” of the Kuali project, already has Kuali Rice middleware up and running, and is currently evaluating the KPME time & attendance module. Here are two cities that are moving down the same path and may find real benefits to working together.

I’d love nothing more than to see a Kuali for cities emerge. It might radically reshape the software procurement world in government. I hope that Marin County and Bloomington are able to connect.

Fun times in the blogosphere!


Adapting KUALI financials for cities: Marin County is looking for Partners

Readers of my blog will be familiar with Kuali – the coalition of universities that co-creates a suite of software core to their operations – as I’ve blogged about it several times and argued that it is a powerful model for local governments interested in rethinking how they procure (or really, co-create) their software.

For some time now I’ve heard rumors that some local governments have been playing with Kuali’s software to see if they can adapt it to work for their needs. Yesterday, David Hill of Marin County posted the comment below to a blog post I’d written about Kuali in which he openly states that he is looking for other municipalities to partner with as they try to fork Kuali financials and adapt it to local government.

<dhill@marincounty.org> (unregistered) wrote:

I completely agree.  It is a radical change for government in at least four ways:

1)  Government developers (are there any?) have little experience with open source
2)  CIOs have no inherent motivation to leave the commercial market model
3)  Governments have little experience is sharing
4)  CIOs are losing their staff due to budget cuts, and have no excess resources to take on a project that appears risky

But, let’s not waste a crisis.  Now is the best time to get KUALI financials certified for government finance and accounting and into production.

Please contact me if you are  planning to upgrade or replace your financial system and would like to look at KFS.
Randy Ozden,  VivanTech CEO is a great commercial partner
David Hill,
CIO
County of Marin

David’s offer is an exciting opportunity and I definitely encourage any municipal and county government officials interested in finding a cheap alternative to their financial management software to reach out to David Hill and at least explore this option. (Or if you know any local government officials, please forward this to them.) I would love nothing more than to see some Kuali-style projects start to emerge at the local level.