Monthly Archives: March 2012

Next Generation Open Data: Personal Data Access

Background

This Monday I had the pleasure of being in Mexico City for the OECD's High Level Meeting on e-Government. CIOs from a number of countries were present – including Australia, Canada, the UK and Mexico (among others). But what really got me going was a presentation by Chris Vein, the Deputy United States Chief Technology Officer for Government Innovation.

In his presentation he referenced work around the Blue Button and the Green Button – both efforts I was previously familiar with. But my conversation with Chris sparked several new ideas and reminded me of just how revolutionary these initiatives are.

For those unacquainted with them, here’s a brief summary:

The Blue Button Initiative emerged out of the US Department of Veterans Affairs (VA) with a simple goal – create a big blue button on their website that would enable a logged-in user to download their health records. That way they can share those records with whomever they wish – a new doctor, a hospital, an application – or even just look at them themselves. The idea has been deemed so good, so important and so popular that it is now being championed as an industry standard, something that not just the VA but all US health providers should do.

The Green Button Initiative is similar. I first read about it on ReadWriteWeb under the catchy and insightful title "Green Button" Open Data Just Created an App Market for 27M US Homes. Essentially the Green Button enables users to download their energy consumption data from their utility. In the United States nine utilities have already launched Green Buttons, and an app ecosystem – applications that enable people to monitor their energy use – is starting to emerge. Indeed Chris Vein talked about one app that enabled a user to see their thermostat in real time and then assess the financial and environmental implications of raising and/or lowering it. I personally see the Green Button evolving into an API that you can give others access to… but that is a detail.

Why it Matters

Colleagues like Nigel Shadbolt in the UK have talked a lot about enabling citizens to get their data out of websites like Facebook. And Google has its own very laudable Data Liberation Front, run by great guy and werewolf expert Brian Fitzpatrick. But what makes the Green Button and Blue Button initiatives unique and important is that they create a common industry standard for sharing consumer data. This creates incentives for third parties to develop applications and websites that can analyze this data, because these applications will scale across jurisdictions. Hence the ReadWriteWeb article's focus on a new market. It also makes the data easy to share. Healthcare records downloaded using the Blue Button are easily passed on to a new doctor or a new hospital, since now people can design systems to consume these healthcare records. Most importantly, it gives individuals the option of sharing these records themselves, so they don't have to wait for lumbering bureaucracies.

This is a whole new type of open data. Open not to the public but to the individual to whom the data really belongs.

A Proposal

I would love to see the Blue Button and Green Button initiatives spread to companies and jurisdictions outside the United States. There is no reason why, for example, there cannot be Blue Buttons on provincial health care websites in Canada, or in the UK. Nor is there any reason why provincial energy corporations like BC Hydro or Bullfrog Power (there's a progressive company that would get this) couldn't implement the Green Button. Doing so would enable Canadian software developers to create applications that use this data, helping citizens here while also tapping into the US market. Conversely, Canadian citizens could tap into applications created in the US.

The opportunity here is huge. Not only could this revolutionize citizens' access to their own health and energy consumption data, it would reduce the costs of sharing health care records, which in turn could create savings for the industry at large.

Action

If you are a consumer, tell your local health agency, insurer and energy utility about this.

If you are an energy utility or Ministry of Health and are interested in this – please contact me.

Either way, I hope this is interesting. I believe there is huge potential in Personal Open Data, particularly around data currently held by Crown corporations and in critical industries like healthcare.

When Industries Get Disrupted: Toronto Real Estate Board's Sad Campaign

As some of my readers know I’ve been engaged by the real estate industry at various points over the last year to share thoughts about how they might be impacted in a world where listings data might be more open.

So I was saddened to read the other day about the misleading campaign the Toronto Real Estate Board (TREB) has launched against the Competition Bureau. It's got all the makings of a political attack ad: ominous warnings, supportive polling and a selective use of facts. You can check it out at the Protectyourprivacy.ca website. (As an aside, those of us concerned with online issues should be beating ourselves up for letting TREB snag that URL. There are literally dozens of more compelling uses for that domain, from Bill C-30 to advocacy around privacy settings in Facebook or Google.)

The campaign does, however, make a wonderful mini-case study in how some industries react when confronted with disruptive change. They don't try to innovate out of the problem; they go to the lawyers (and the pollsters and marketers). To be fair, not everyone in the real estate industry is behaving this way. Over the past several months I've had the real pleasure of meeting many, many real estate agents across the country who have been finding ways to innovate.

Which is why I suspect this campaign is actually quite divisive. Indeed, since the public doesn't really know or care who does what in the real estate industry, they're just going to lump everyone in together. Consequently, if this campaign backfires (and there is a risk that if anyone pays attention to it, it could) then the entire industry could be tarred, not just those at TREB.

So what is the big scary story? Well according to TREB the Competition Bureau has gone rogue and is going to force Canadians to disclose their every personal detail to the world! Specifically, in the words of the Protectyourprivacy.ca website:

The Competition Bureau is trying to dismantle the safeguards for consumers’ personal and private information.

If they get their way, your sensitive personal home information could be made publicly available to anyone on the internet.

Are your alarm bells going off yet? If you're like me, the answer is probably yes. But if you're like me, it's not for any of the reasons TREB wants.

To begin with, Canada has a fairly aggressive Privacy Commissioner who is very willing to speak her mind. I suspect she (and possibly her provincial counterparts) was consulted before the Competition Commissioner issued her request. And like most Canadians, I trust the Privacy Commissioner more than TREB. Tellingly, she's been fairly quiet.

But of course, why speculate! Let's go straight to the source. What did the Competition Bureau actually ask for? You can find all the relevant documents here (funny how TREB's campaign website does not link to any of these) – check it out yourself. Here is my breakdown of the issue:

1. This is actually about enabling new services – TREB essentially uses MLS (the online listing service where you look for homes) as a mechanism to prevent new ways of looking for homes online from emerging. I suspect that consumers are not well served by this outcome. That is certainly how the Competition Bureau feels.

2. The Competition Bureau is not asking for information like your name and street address to be posted online for all to see (although I actually think consumers should be given that choice). Indeed, you can tell a lawyer was involved in drafting the protectyourprivacy.ca website. There are all these strategically inserted "coulds", as in "your sensitive personal home information could be made publicly available." Err… that's a fair degree less alarming.

What the Competition Bureau appears to want is to enable brokers' clients to browse homes on a password-protected site (called a "virtual office website"). Here they could get more details than what is currently available to the public at large on MLS. However, even these password-protected sites might not include things like the current occupant's name. They would, however (or at least hopefully), include previous sale prices, since knowing the history of the market is quite helpful. I think most consumers would agree that a little more transparency around pricing in the real estate industry would be good for them.

3. Of course, anything that happens on such a website would still have to comply with Privacy Laws and would, ultimately, still require the sellers consent.

According to TREB, however, implementing these recommendations will lead to mayhem and death. Literally. Here is a quote from their privacy officer:

“There is a real possibility of break-ins and assaults; you only have to read the headlines to imagine what might happen. You hear stories about realtors getting attacked and killed. Can you imagine if we put that information out there about consumers? You can only imagine the headlines.”

Happily, the Globe confirmed that the Toronto Police Department is not aware of realtors being targeted for attack.

But here is the real punchline. Everything the Competition Commissioner is asking for already exists in places like Nova Scotia or across the entire United States.

Here's what these lucky jurisdictions have not experienced: a rash of violence resulting from burglars and others browsing homes online (mostly because if they were going to do that… they could JUST USE GOOGLE STREET VIEW).

And here's what they have experienced: an explosion in new and innovative ways to browse, buy and sell homes. From Trulia to Zillow to Viewpoint, consumers can get a radically better online experience than what is available in Toronto.

I suspect that if consumers actually hear about this campaign many – including most under the age of 40 – are going to see it as an effort by an industry to protect itself from new competition, not as an effort to protect them. If the story does break that way, it will be evidence to many consumers that the gap between them and the Real Estate industry is growing, not shrinking.


Some upcoming talks

Sorry for the lack of posts this week – just the calm before a gathering storm. April and May are going to be intense.

For those interested in these things, I've a number of upcoming talks I'll be giving and conferences I'll be attending. Many of these are open to the public, in case you are in the neighborhood.

Conference: OECD High Level Meeting on E-Government: New ICT Solutions for Public Sector Agility
Topic: I’ll be on a panel discussing how governments can leverage technology to advance public policy objectives
Location: Mexico City
Date: Monday, March 26th

Conference: Creative Commons Salon on Open Data (hosted by CIPPIC and the Creative Law Society)
Topic: Open Data in Canada
Location: Ottawa
Date: Friday, March 30th

Conference: Democracy on Demand – hosted by the UBC International Relations Student Association
Topic: The Geopolitics of Open
Location: Vancouver
Date: Saturday, March 31st

Conference: Fostering a Critical Development Perspective on Open Government Data
Topic: Seminars on the implications of Open Data, particularly in southern economies
Location: Brasilia
Date: April 16-17

Conference: Open Government Partnership
Topic: (TBD) Panel on Access to Information and Open Data
Location: Brasilia
Date: April 18

Conference: Impacto del Derecho de Acceso a la Información en la participación ciudadana
Topic: Open Data, Transparency and the Future of Government
Location: Santiago, Chile
Date: April 19

Conference: Saskatchewan 3.0 Summit
Topic: The Possibilities and Challenges of Technology in Government
Location: Regina
Date: April 24

Citizen Surveillance and the Coming Challenge for Public Institutions

The other day I stumbled over an intriguing article which describes how a group of residents in Vancouver have started to surveil the police as they do their work in the Downtown Eastside, one of the poorest and toughest neighborhoods in Canada. The reason is simple: many people – particularly those who are marginalized and most vulnerable – simply do not trust the police. The interview with the founder of Vancouver Cop Watch probably sums it up best:

“One of the complaints we have about District 2 is about how the Vancouver police were arresting people and taking them off to other areas and beating them up instead of taking them to a jail,” Allan told the Georgia Straight in a phone interview. “So what we do is that, when in the Downtown Eastside, whenever we see the police arresting someone, we follow behind them to make sure that the person makes it to the jail.”

In a world where many feel it is hard to hold government in general – and police forces specifically – accountable, finding alternative means of creating such accountability will be deeply alluring. And people no longer need the funding and coordination of organizations like Witness (which initially focused on getting video cameras into people's hands in an effort to prevent human rights abuses). Digital video cameras and smartphones, coupled with services like YouTube, now provide this infrastructure for virtually nothing.

This is the surveillance society – predicted and written about by authors like David Brin – and it is driven as much by us, the citizens, as it is by government.

Vancouver Cop Watch is not the first example of this type of activity – I've read about people doing this across the United States. What is fascinating is watching the state try to resist, and fail miserably. In the US the police have lost key battles in the courts. This after the police arrested people for filming them, even while on their own property. And despite the ruling, people continue to be arrested for filming the police – a choice I suspect diminishes public confidence in the police and the state.

And it is not just the police getting filmed. Transit workers in Toronto have taken a beating of late as they are filmed asleep on the job. Similarly, a scared passenger filmed an Ottawa bus driver who was aggressively swearing at an apologetic mentally ill passenger. A few years ago the public in Montreal was outraged when city crews were filmed repairing few potholes and taking long breaks.

The simple fact is, if you are a front line worker – in either the private, but especially, the public sector – there is a good chance that at some point in your career you’re going to be filmed. And even when you are not being filmed, more data is going to be collected about what you do and how you do it.

Part of this new reality is that it is going to require a new level of training for front line workers. This will be particularly hard on the police, who should expect more stories like this one.

I also suspect there will be two reactions to it. Some government services will clam up and try to become more opaque, fearing all public inquiry. Their citizens – armed with cameras – all become potential threats. Over time, it is hard not to imagine their legitimacy becoming further and further eroded (I'm thinking of you, RCMP) as a video here and an audio clip there shapes the public's image of the organization. Others will realize that anecdotal and chance views of their operations represent a real risk to their image. Consequently they may strive to be more transparent – sharing more data about their operations and their processes – in an effort to provide the public with greater context. The goal here will be to provide a counterpoint to any unfortunate incidents, making a single negative anecdote that happened to be filmed just one point within a larger, more complex set of data points.

Obviously, I have strong suspicions regarding which strategy will work, and which one won’t, in a democratic society but am confident many will disagree.

Either way, these challenges are going to require adaptation strategies, and it won't be easy for public institutions averse to both negative publicity and transparency.

Data.gc.ca – Data Sets I Found Interesting, and Some Suggestions

Yesterday was the one year anniversary of the Canadian federal government's open data portal. Over the past year government officials have been continuously adding to the portal, but as it isn't particularly easy to browse data sets on the website, I've noticed a lot of people aren't aware of what data is now available (myself included!). Consequently, I want to encourage people to scan the available data sets and blog about ones that they think might be interesting to them personally, to others, or to communities of interest they may know.

Such an undertaking has been rendered MUCH easier thanks to the data.gc.ca administrators' decision to publish a list of all the data sets available on the site. Turns out, there are 11,680 data sets listed in this file. Of course, reviewing all this data took me much longer than I thought it would (and to be clear, I didn't explore each one in detail), but the process has been deeply interesting. Below are some thoughts, ideas and data sets that have come out of this exploration – I hope you'll keep reading, and that it will be of interest to ordinary citizens, prospective data users and managers of open government data portals.

A tag cloud of the data sets on data.gc.ca

Some Brief Thoughts on the Portal (and for others thinking about exploring the data)

Trying to review all the data sets on the portal is an enormous task, and attempting it has taught me some lessons about what works and what doesn't. The first is that, while the search function on the website is probably good if you have a keyword or a specific data set you are looking for, it is much easier to browse the data in Excel than on the website. What was particularly nice about this is that, in Excel, the data was often clustered by type. This made it easy to spot related data sets – a great example: when I found the data on "Building permits, residential values and number of units, by type of dwelling" I could immediately see there were about 12 other data sets on building permits available.

Another issue that became clear to me is the problem of how a data set is classified. For example, because of the way the data is structured (really as a report), the Canadian Dairy Exports data has a unique data file for every month and year (you can look at May 1988 as an example). That means each month is counted as a unique "data set" in the catalog. Of course, French and English versions are also counted as unique. This means that what I would consider to be a single data set – "Canadian Dairy Exports Month Dairy Year from 1988 to present" – actually counts as 398 data sets. This has two outcomes. First, it is hard to imagine anyone wants the data for just one month, so a user looking for longitudinal data on this subject has to download 199 distinct data sets (very annoying). Why not just group them into one? Second, given that governments like to keep score about how many data sets they share, counting each month as a unique data set feels… unsportsmanlike. To be clear, this outcome is an artifact of how Agriculture Canada gathers and exports this data, but it is an example of the types of problems an open data catalog needs to come to grips with.
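For the developer stuck with the status quo, here's a minimal sketch of the workaround. The column names (title, language, download_url) are my own assumptions for illustration – the actual catalog export may name them differently.

```python
import pandas as pd

# Load the catalog's master list of data sets (column names assumed
# for illustration; adjust them to match the actual data.gc.ca export).
catalog = pd.read_csv("datasets.csv")

# Find every monthly English-language "Dairy Exports" entry.
dairy = catalog[
    catalog["title"].str.contains("Dairy Exports", case=False, na=False)
    & (catalog["language"] == "eng")
]

# Download each monthly file and stack them into one longitudinal table.
frames = [pd.read_csv(url) for url in dairy["download_url"]]
combined = pd.concat(frames, ignore_index=True)
combined.to_csv("dairy_exports_1988_to_present.csv", index=False)
```

Nothing hard – but it's 199 downloads that the catalog could have spared everyone by grouping the files in the first place.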

Finally, many users – particularly, but not exclusively, developers – are looking for data that is up to date. Indeed, real-time data is particularly sexy since its dynamic nature means you can do interesting things with it. Thus it was frustrating to occasionally find data sets that were no longer being collected. A great example of this was the Provincial allocation of corporate taxable income, by industry. This data set jumped out at me as I thought it could be quite interesting. Sadly, StatsCan stopped collecting this data in 1987, so any visualization will have limited use today. This is not to say data like this should be pulled from the catalog, but it might be nice to distinguish between data sets that are being collected on an ongoing basis and those that are no longer being updated.

Data Sets I Found Interesting

Before I begin, some quick thoughts on my very unscientific methodology for identifying interesting data sets.

  • First, browsing the data sets really brought home to me how many will be interesting to different groups – we really are in the world of the long tail of public policy. As a result, there is lots of data that I think will be interesting to many, many people that is not on this list.
  • Second, I tried not to include too much of StatsCan's data. StatsCan data already has a fairly well developed user base. And while I'm confident that base is going to get bigger still now that its data is free, I figure there are already a number of people who will be sharing and talking about it.
  • Finally, I've tried to identify some data sets that I think would make for good mashups or apps. This isn't easy with federal government data sets since they tend to be more aggregate and high-level than, say, municipal data sets… but I've tried to tease out what I can. That said, I'm sure there is much, much more.

New GeoSpatial API!

So the first data set is a little bit of a cheat since it is not on the open data portal, but I was emailed about it yesterday and it is so damn exciting I've got to share it. It is a recently released public BETA of a new RESTful API from the very cool people at GeoGratis that provides a consolidated access point to several repositories of geospatial data and information products, including GeoGratis, GeoPub and Mirage. (Huge thank you to the GeoGratis team for sending this to me.)

Documentation can be found here (and in French here), and a sample search client that demonstrates some of its functionality and how to interact with the API can be found here. Formats include ATOM, HTML Fragment, CSV, RSS, JSON, and KML – so you can see results, for example, in Google Earth by using the KML format (example here).

I'm also told that these fine folks have been working on a geolocation service, so you can do sexy things like search by place name, by NTS map or by the first three characters of a postal code. Documentation will be posted here in English and French. Super geeks may notice that there is a field in the JSON called CGNDBkey. I'm also told you can use this key to select an individual place name according to the Canadian Geographic Names Board. Finally, you can also search all their metadata through search engines like Google (here is a sample search for gold they sent me).

All data is currently licensed under the GeoGratis licence.
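For the curious, here's a rough sketch of what querying an API like this might look like. To be clear: the base URL, the parameter names and the shape of the response below are placeholders of my own invention – check the documentation linked above for the real ones.

```python
import requests

# Hypothetical endpoint -- consult the GeoGratis API docs for the real one.
BASE_URL = "http://geogratis.gc.ca/api/en"

# Ask the search service for results as JSON (the API also offers
# ATOM, CSV, RSS, KML and HTML-fragment formats).
resp = requests.get(BASE_URL, params={"q": "gold", "alt": "json"})
resp.raise_for_status()

# The "products" key and its fields are assumptions for illustration.
for product in resp.json().get("products", []):
    print(product.get("title"), product.get("id"))
```

The point is less the specifics than the shape of the thing: one consolidated, RESTful access point with multiple output formats is exactly what makes third-party tools cheap to build.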

The National Pollutant Release Inventory

Description: The National Pollutant Release Inventory (NPRI) is Canada’s public inventory of pollutant releases (to air, water and land), disposals and transfers for recycling.

Notes: This is the same data set (but updated) that we used to create emitter.ca. I frankly feel the opportunities around this data set – for environmentalists, investors (concerned about regulatory and lawsuit risks), the real estate industry, and others – are enormous. The public could be very interested in this.

Greenhouse Gas Emissions Reporting Program

Description: The Greenhouse Gas Emissions Reporting Program (GHGRP) is Canada’s legislated, publicly-accessible inventory of facility-reported greenhouse gas (GHG) data and information.

Notes: What's interesting here is that while it doesn't have lat/longs, it does have facility names and addresses. That means you should be able to cross-reference it with the NPRI (which does have lat/longs) to plot where the big greenhouse gas emitters are on a map. I think the same people interested in the NPRI might be interested in this data.
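A quick sketch of what that cross-referencing could look like. The file and column names here are my own guesses for illustration – check the actual GHGRP and NPRI layouts before relying on them.

```python
import pandas as pd

# Column and file names below are assumptions for illustration.
ghgrp = pd.read_csv("ghgrp_facilities.csv")   # facility names + addresses
npri = pd.read_csv("npri_facilities.csv")     # facility names + lat/longs

# Normalize the names so near-identical strings match on the join.
for df in (ghgrp, npri):
    df["name_key"] = df["facility_name"].str.strip().str.upper()

# Join GHG emissions onto NPRI coordinates so emitters can be mapped.
mapped = ghgrp.merge(
    npri[["name_key", "latitude", "longitude"]], on="name_key", how="inner"
)
# "co2e_tonnes" is a hypothetical emissions column for this sketch.
print(mapped[["facility_name", "latitude", "longitude", "co2e_tonnes"]].head())
```

In practice the name matching would need to be fuzzier than this, but the basic join is all it takes to put the big emitters on a map.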

The Canadian Ice Thickness Program

Description: The Ice Thickness program dataset documents the thickness of ice on the ocean. Measurements begin when the ice is safe to walk on and continue until it is no longer safe to do so. This data can help gauge the impact of global warming and is relevant to shipping data in the north of Canada.

Notes: Students interested in global warming… this could make for some fun visualization.

Argo: Canadian Tracked Data

Description: The Argo data documents some of the approximately 3,000 profiling floats that have been deployed around the world. Once at sea, a float sinks to a preprogrammed target depth of 2000 meters for a preprogrammed period of time. It then floats to the surface, taking temperature and salinity readings at set depths during its ascent. The Canadian Tracked Argo Data describes the Argo programme in Canada and provides data and information about Canadian floats.

Notes: Okay, so I can think of no use for this data, but I just thought it was so awesome that people are doing this that I totally geeked out.

Civil Aircraft Register Database

Description: Civil Aircraft Register Database – this file contains the current mark, aircraft and owner information of all Canadian civil registered aircraft.

Notes: Here I really think there could be a geeky app – a simple one into which you can type an aircraft's registration number and it will tell you the owner and details about the plane. I actually think the government could do a lot more with this data: if regulatory and maintenance data were made available as well, then you'd have a powerful app that would tell you a lot about the planes you fly in. At a minimum it would be of interest to flight enthusiasts.
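To show how little code such an app would need, here's a toy sketch. The field names are my assumptions – the real register file documents its own column layout.

```python
import csv

# Field names ("mark", "manufacturer", "model", "owner_name") are
# assumptions for illustration; check the actual register file.
with open("civil_aircraft_register.csv", newline="") as f:
    register = {row["mark"]: row for row in csv.DictReader(f)}

def lookup(mark: str) -> None:
    """Print owner and aircraft details for a mark like 'C-GABC'."""
    row = register.get(mark.upper())
    if row is None:
        print(f"No aircraft registered as {mark}")
    else:
        print(f"{mark}: {row['manufacturer']} {row['model']}, "
              f"owned by {row['owner_name']}")

lookup("C-GABC")  # hypothetical registration mark
```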

Real Time Hydrometric Data Tool

Description: Real Time Hydrometric Data Tool – this site provides public access to real-time hydrometric (water level and streamflow) data collected at over 1700 locations in Canada. These data are collected under a national program jointly administered under federal-provincial and federal-territorial cost-sharing agreements. It is through partnerships that the Water Survey of Canada program has built a standardized and credible environmental information base for Canada. This dataset contains both current and historical datasets. The current month can be viewed in an HTML table, and historical data can be downloaded in CSV format.

Notes: So ripe for an API! What is cool is that the people at Environment Canada have integrated it into Google Maps. I could imagine fly fishermen and communities at risk of flooding being interested in this data set.
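As a taste of what a flood-watch tool might do with the downloadable CSVs, a small sketch – the file name, column names and threshold are all invented for illustration:

```python
import pandas as pd

# Hypothetical station file and column names, for illustration only;
# the real CSV layout is described on the Water Survey of Canada site.
levels = pd.read_csv("station_current_month.csv", parse_dates=["datetime"])

FLOOD_WATCH_M = 6.0  # made-up threshold for this sketch

# Grab the most recent water-level reading.
latest = levels.sort_values("datetime").iloc[-1]
print(f"Latest reading: {latest['water_level_m']:.2f} m "
      f"at {latest['datetime']}")
if latest["water_level_m"] > FLOOD_WATCH_M:
    print("Above flood-watch threshold -- alert subscribers.")
```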

Access to information data sets

Description: 2006-2010 Access to Information and Privacy Statistics (with previous years here, here and here) is a compilation of statistical information about access to information and privacy submitted by government institutions subject to the Access to Information Act and the Privacy Act for 2006-2010.

Notes: I'd love to crunch this stuff again and see who's naughty and nice in the ATIP world…

Poultry and Forestry data

No links, BECAUSE THERE IS SO MUCH OF IT. Anyone interested in the poultry or forestry industries will find lots of data… obviously this stuff is useful to people who analyze these industries, but I suspect there are a couple of "A" university-level papers hidden in that data as well.

Building Permits

There are tons of data sets on building permits and construction. Indeed, this is one of the benefits of looking at the data in a spreadsheet: it is easy to see other related data sets.

StatsCan

It really is amazing how much Statistics Canada data there is. Even reviewing something like the supply and demand of natural gas liquids got me thinking about the wealth of information trapped in there. One thing I do hope StatsCan starts to do is geolocate its data whenever possible.

Crime Data

As this has been in the news I couldn't help but include it. It's nice that any citizen can look at the crime data direct from StatsCan – Crime statistics, by detailed offences – to see how our crime rate is falling (which, apparently, is why we should build more expensive prisons). Of course unreported crime, which we all know is climbing at 3000% a year, is not included in these stats.

Legal Aid Applications

Legal aid applications, by status and type of matter. This was interesting to me since here in BC there is much talk about funding for the justice system, and yet the number of legal aid applications has remained more or less flat over the past 5 years.

National Broadband Coverage data

Description: The National Broadband Coverage Data represents broadband coverage information, by technology, for existing broadband service providers as of January 2012. Coverage information for Broadband Canada Program projects is included for all completed projects. Coverage information is aggregated over a grid of hexagons, each 6 km across. The estimated range of unserved/underserved population within each hexagon is included.

Notes: What's nice is that there is lat/long data attached to all this, so mapping it, and potentially creating a heat map, is possible. I suspect the people at OpenMedia would appreciate such a map.
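A rough sketch of how one might produce such a heat map from the hexagon data – again, the column names are assumptions for illustration:

```python
import pandas as pd
import matplotlib.pyplot as plt

# Column names are assumptions for illustration; check the actual file.
hexes = pd.read_csv("broadband_coverage.csv")

# Shade each 6 km hexagon's centroid by its estimated unserved population
# to get a quick national heat map.
plt.scatter(
    hexes["longitude"], hexes["latitude"],
    c=hexes["unserved_population"], s=4, cmap="Reds"
)
plt.colorbar(label="Estimated unserved population")
plt.title("Unserved / underserved broadband coverage")
plt.show()
```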

Census Consolidated Subdivision

Description: The Census Consolidated Subdivision Cartographic Boundary Files portray the geographic limits used for the 2006 census dissemination. They contain the boundaries of all 2,341 census consolidated subdivisions.

Notes: Obviously this one is on every data geek's radar, but just in case you've been asleep for the past 5 months, I wanted to highlight it.

Non-Emergency Surgeries, distribution of waiting times

Description: Non-emergency surgeries, distribution of waiting times, household population aged 15 and over, Canada, provinces and territories

Notes: Would love to see this at the hospital and clinic level!

Border Wait Times

Description: Estimates Border Wait Times (commercial and travellers flow) for the top 22 Canada Border Services Agency land border crossings.

Notes: Here I really think there is an app that could be made. At the very least there is something that could tell you historical averages, and ideally it could be integrated into Google and Bing maps when calculating trip times… I can also imagine that a lot of companies exporting goods to the US are concerned about this issue and would be interested in better data to predict the costs and times of shipping goods. Big potential here.
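Here's a sketch of the historical-averages piece – the kind of profile a trip-planning app could bake into route times. Column names are assumed for illustration.

```python
import pandas as pd

# Column names ("timestamp", "crossing", "traveller_wait_min") are
# assumptions for illustration; check the actual data layout.
waits = pd.read_csv("border_wait_times.csv", parse_dates=["timestamp"])

# Average traveller wait by crossing and hour of day.
waits["hour"] = waits["timestamp"].dt.hour
profile = (
    waits.groupby(["crossing", "hour"])["traveller_wait_min"]
    .mean()
    .unstack("hour")
)
print(profile.round(1))
```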

Okay, that’s my list. Hope it inspires you to take a look yourself, or play with some of the data listed above!

Sharing ideas about data.gc.ca

As some of you may remember, the other week I suggested that on its one year anniversary we hack data.gc.ca – specifically, that people share which data sets they find most interesting on the website, especially as it is hard to search.

Initially I'd uploaded a list of all the data sets in the catalog to BuzzData. However, the other day the data.gc.ca administrators added a data set that is a list of all the data sets available on the site (meta, I know). This new list is, apparently, even more robust and up to date than the one I shared earlier, and is available in both official languages.

If you do end up finding data you think is particularly interesting, creating a list of your favourite data sets, doing a mashup or visualization, or (most ambitiously) creating a better way to search data.gc.ca, please send me your results, a link, or at least an email. I'll be posting what I find interesting tonight or tomorrow morning and would love to link to anything anyone else has done too!


Want to Find Government Innovation? The US Military Is Often Leading the Way

When it comes to seeing what trends will impact government in 20-30 years, I'm a big fan of watching the US military. They may do a lot of things wrong but, when it comes to government, they are on the bleeding edge of being a "learning organization." It often feels like they are less risk averse, more likely to experiment and, as noted, more likely to learn than almost any government agency I can think of (hint: those things may be interconnected). Few people realize that to rise above colonel in many military organizations, you must have at least a master's degree. Many complete PhDs. And these schools often turn into places where people challenge authority and the institution's conventional thinking.

Part of it, I suspect, has to do with the whole “people die when you make mistakes” aspect of their work. It may also have to do with the seriousness with which they take their mandate. And part of it has to do with the resources they have at their disposal.

But regardless of the cause, I find they are often at the cutting edge of ideas in the public sector. For example, I can't think of a government organization that empowers the lowest echelons of its employee base more than the US military. Their network-centric vision of the world means those on the front lines (both literally and figuratively) are often empowered, trusted and strongly supported with tools, data and technology to make decisions on the fly. In an odd way, the very hierarchical system that the rest of government has been modeled on has transformed into something different: still very hierarchical but, at the same time, networked.

Frankly, if a 1-800 call operator at Service Canada or the guy working at the DMV in Pasadena, CA (or even just their managers) had 20% of the autonomy of a US sergeant, I suspect government would be far more responsive and innovative. Of course, Service Canada or the California DMV would have to take a network-centric approach to their work… and that's a ways off, since it demands serious cultural change – the hardest thing to shift in an organization.

Anyways… long rant. Today I'm interested in another smart call the US military is making, one that government procurement departments around the world should be paying attention to (I'm especially looking at you, Public Works – pushers of Beehive over GCPEDIA). This article, Open source helicopters trivialize Europe's ODF troubles, on Computer World's Public Sector IT blog outlines how the next generation of US helicopters will be built on an open platform: no more proprietary software that binds hardware to a particular vendor.

Money quote from the piece:

Weapons manufacturers and US forces made an unequivocal declaration for royalty-free standards in January through the FACE (Future Airborne Capabilities Environment) Consortium they formed in response to US Defence Secretary Leon Panetta’s call for a “common aircraft architecture and subsystems”.

“The FACE Standard is an open, nonproprietary technical specification that is publicly available without restrictive contracts, licensing terms, or royalties,” the Consortium announced from its base at The Open Group, the industry association responsible for the POSIX open Unix specification.

“In business terms, the open standards specified for FACE mean that programmers are freely able to use them without monetary remuneration or other obligation to the standards owner,” it said.

While business software producers have opposed governments that have tried to implement identical open standards policies with the claim it will handicap innovation and dampen competition, the US military is embracing open standards for precisely the opposite reasons.

So suddenly we are going to have an open source approach to innovation and program delivery (helicopter manufacturing, operation and maintenance) at major scale. Trust me, if the US military is trying to do this with helicopters, you can convert your proprietary intranet to an open source wiki platform – I can't believe the complexity is as great. But the larger point here is that this approach could be used to think about any system a government wants to develop, from earthquake monitoring equipment to healthcare systems to transit passes. From a "government as a platform" perspective, this could be a project to watch. Lots of potential lessons here.