Category Archives: open data

Weaving Foreign Ministries into the Digital Era: Three ideas

Last week I was in Ottawa giving a talk at the Department of Foreign Affairs about how technology, new media and open innovation will impact the department's work internally, across Ottawa and around the world.

While there is lots to share, here are three ideas I’ve been stewing on:

Keep more citizens safe when abroad – better danger zone notification

Some people believe that open data isn't relevant to departments like Foreign Affairs or the State Department. Nothing could be further from the truth.

One challenge the department has is getting Canadians to register with it when they visit or live in a country its travel reports (sample here) label as problematic for travel. As you might suspect, few Canadians register with the embassy: they are likely unaware of the program, or they travel a lot and simply don't get around to it.

There are other ways of tackling this problem that might yield broader participation.

Why not turn the Travel Report system into open data with an API? I'd tackle this by approaching a company like TripIt. Every time I book an airplane ticket or a hotel I simply forward TripIt the reservation, which they scan and turn into events that then automatically appear in my calendar. Since they scan my travel plans they also know which country, city and hotel I'm staying in; they also know where I live and could easily ask me for my citizenship. Working with companies like TripIt (or Travelocity, Expedia, etc.), DFAIT could co-design an API into the department's travel report data that would be useful to them. Specifically, I could imagine that if TripIt could query all my trips against those reports, then any time it noticed I was traveling somewhere the Foreign Ministry has labelled "exercise a high degree of caution" or worse, TripIt could ask me if I'd be willing to let it forward my itinerary to the department. That way I could register my travel automatically, making the service more convenient for me and getting the department more of the information it believes to be critical.

Of course, it might be wise to work with the State Department so that their travel advisories used a similarly structured API (since I assume TripIt will be more interested in the larger US market than the Canadian one). But facilitating that conversation would be nothing but wins for the department.
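To make the mechanics concrete, here is a minimal sketch of the check a service like TripIt could run. Everything here is an assumption for illustration: no such DFAIT API exists today, and the advisory strings, country codes and field names are invented.

```python
# Hypothetical: a stand-in for a pull from an imagined DFAIT travel-report
# API, keyed by ISO country code with an advisory level per country.
ADVISORY_FEED = {
    "SY": "avoid all travel",
    "EG": "exercise a high degree of caution",
    "FR": "normal security precautions",
}

# Levels at which a booking service might prompt the traveller to forward
# their itinerary to the embassy registration program.
PROMPT_LEVELS = {"exercise a high degree of caution", "avoid all travel"}

def trips_to_flag(itinerary):
    """Return the trips whose destination advisory warrants a registration prompt."""
    return [
        trip for trip in itinerary
        if ADVISORY_FEED.get(trip["country_code"]) in PROMPT_LEVELS
    ]

itinerary = [
    {"city": "Paris", "country_code": "FR"},
    {"city": "Cairo", "country_code": "EG"},
]
flagged = trips_to_flag(itinerary)
```

The whole value of the idea lies in that one lookup: once the advisory data is structured and machine-readable, any travel service can run it against every itinerary it already holds.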

More bang for buck in election monitoring

One question that arose during my talk came from an official interested in election monitoring. In my mind, one thing the department should be considering is a fund to help local democracy groups spin up installations of Ushahidi in countries with fragile democracies that are gearing up for elections. For those unfamiliar with Ushahidi, it is a platform developed after the disputed 2007 presidential election in Kenya that plotted eyewitness reports of violence, sent in by email and text message, on a Google map.

Today it is used to track a number of issues, but problems with elections remain one of its core purposes. The department should think about grants that would help spin up a Ushahidi install to enable citizens of the country to register concerns and allegations around fraud, violence, intimidation, etc. It could then verify and inspect issues flagged by the country's citizens. This would allow the department to deploy its resources more effectively and ensure that its work was speaking to concerns raised by citizens.

A Developer version of DART?

One of the most popular programs the Canadian government has around international issues is the Disaster Assistance Response Team (DART). In particular, Canadians have often been big fans of DART's work purifying water after the Boxing Day tsunami in Asia, as well as its work in Haiti. Maybe the department could have a digital DART team: a group of developers that, in an emergency, could help spin up Ushahidi, FixMyStreet, or OpenMRS installations to provide some quick but critical shared infrastructure for Canadians, other countries' response teams and non-profits. During periods of non-crisis the team could work on these projects or support groups like CrisisCommons or OpenStreetMap, helping contribute to open source projects that can be instrumental in a humanitarian crisis.

 

Using Open Data to drive good policy outcomes – Vancouver’s Rental Database

One of the best signs for open data is when governments are starting to grasp its potential to achieve policy objectives. Rather than just being about compliance, it is seen as a tool that can support the growth and management of a jurisdiction.

This is why I was excited to see Vision Vancouver (which I'm involved in generally, though I was not involved in developing this policy proposal) announce the other day that, if elected, it intends to create a comprehensive online registry that will track work orders and property violations in Vancouver apartments, highlighting negligent landlords and giving renters a new tool.

As the press release goes on to state, the database is “Modeled after a successful on-line watchlist created by New York City’s Public Advocate, the database will allow Vancouver residents to search out landlords and identify any building or safety violations issued by the City of Vancouver to specific rental buildings.”

Much like the pieces I've written around restaurant inspection and product recall data, this is a great example of a data set that, when shared the right way, can empower citizens to make better choices and foster better behaviour from landlords.

My main hope is that in implementing this proposal the city does the right thing and doesn't just create a searchable database on its own website, but actually creates an API that software developers and others can tap into. If it does, someone may develop a mobile app for renters that shows you the repair record of the building you are standing in front of, or in. One could even imagine an app where you SMS the postal code of a rental building and it sends you back some basic information. Also exciting to me is the possibility that a university student might look for trends in the data over time; maybe there is an analysis that would yield an insight to help landlords mitigate problems and reduce the number of repairs they have to make (and so help reduce their costs).
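To illustrate the SMS idea, here is a rough sketch of the lookup an SMS gateway could run against such an API. The data shape is entirely invented: the city has not published this database, let alone settled on field names.

```python
# Hypothetical stand-in for the city's rental violations database,
# keyed by postal code. Records and fields are invented for illustration.
VIOLATIONS_BY_POSTAL_CODE = {
    "V5K 0A1": [
        {"building": "123 Example St", "issue": "no heat", "status": "open"},
        {"building": "123 Example St", "issue": "mould", "status": "closed"},
    ],
}

def sms_reply(postal_code):
    """Build the short reply an SMS gateway might send back for a postal code."""
    records = VIOLATIONS_BY_POSTAL_CODE.get(postal_code.upper(), [])
    open_count = sum(1 for r in records if r["status"] == "open")
    return f"{len(records)} violation(s) on file, {open_count} still open."
```

For example, `sms_reply("v5k 0a1")` would return "2 violation(s) on file, 1 still open." The point is that once the city exposes the data, this entire service is a few dozen lines of code that anyone can build.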

But if Vancouver and New York actually structured the data in the same way, it might create an incentive for other cities to do the same. That might entice some of the better-known services to use the data to augment their offerings as well. Imagine if PadMapper, in addition to allowing a prospective renter to search for apartments based on rent and number of rooms, also allowed searching based on number of infractions.

[Screenshot: PadMapper rental search]

That might have a salutary effect on some (but sadly not all) landlords. All in all, an exciting step forward from my friends at Vision, who brought open data to Canada.

And Now… Another Message on Open Innovation for Realtors

Over the past few months I've given a number of talks on open data and open innovation to groups of realtors around the country. During these talks I have cautioned that the more the real estate industry tries to protect (i.e. not share) its data, the more it risks making control of access to data the axis of competition, as opposed to accessibility of the data (allowing others to create value-added services).

Consequently, if working with the real estate industry's data to innovate new services is not possible or is prohibitively expensive, recreating already existing data sets will become the goal of competitors. Competition on this axis has, I believe, two possible outcomes. One is a winner-take-all world where the player with the biggest data set wins; put another way, either the current monopoly maintains its grip or a new monopoly takes over. The other is that there ceases to be, in any meaningful way, a single data repository on real estate, and marketplace data gets broken into lots of different silos. The implications of this outcome are less clear, but there is a risk it could be bad for consumers as market information would, essentially, be fragmented. Either way, both outcomes carry significant risks for organized real estate.

Despite this, many realtors don’t believe it is likely to happen, because they don’t believe their data can be duplicated, no matter how much I try to tell them otherwise.

But…. whoops! Look what happened!

Today, as if to hammer home what I believe is the inevitable, the Globe and Mail has an article on a new service that lets one get an estimate of the assessed value of one's home by accessing an alternative data set (one essentially created by the banks). It is a data set outside anything owned by organized real estate (i.e. that found in MLS). Look! Someone has recreated what was previously seen as a data set that could not be replicated!

Of course the counter is: "It isn't as good." Two things here. First, it may not have to be: if it offers 80% of the accuracy at 20% of the cost then it will probably be good enough for at least part of the market. Second, once it is established, I'm confident the owners of the website will find ways to make their service better and the data more accurate.

The real estate industry has an opportunity to shape its future or be shaped by it. The market (and the Competition Bureau) isn't going to give it forever to make up its mind.

 

As Canada Searches for its Open Government Partnership Commitments: A Proposal

Just before its launch in New York on September 20th, the Canadian Government agreed to become a signatory of the Open Government Partnership (OGP). Composed of over 40 countries, the OGP requires signatories to create a list of commitments they promise to implement. Because Canada signed on just before the deadline it has not, to date, submitted its commitments. As a result, there is a fantastic window for the government to do something interesting with this opportunity.

So what should we do? Here are the top 10 suggestions I propose for Canada's OGP commitments:

Brief Background on Criteria:

Before diving in, it is worth letting readers know that there are some criteria for making commitments. Specifically, any commitment must tackle at least one of the five “core” challenges: improve public services, increase public integrity, more effectively manage public resources, create safer communities, and increase corporate accountability.

In addition, each recommendation should reflect at least one of the core OGP principles, which are: transparency, citizen participation, accountability, and technology and innovation.

The Top Ten

Having reviewed several other countries' commitments, and being familiar with both what Canada has already done and what it could do, here are 10 commitments I would like to see our government make to the OGP.

1. Be open about developing the commitments

Obviously there are a number of commitments the government is going to make simply because they are actions or programs it was going to launch anyway. In addition, there will be some new ideas that public servants or politicians have been looking for an opportunity to champion and now have an excuse to. This is all fine and part of the traditional way government works.

But wouldn’t it be nice if – as part of the open government partnership – we asked citizens what they thought the commitments should be? That would make the process nicely consistent with the principles and goals of the OGP.

Thus the government should launch a two-week crowdsourced idea generator, much like it did during the Digital Economy consultations. This is not a suggestion that the ideas submitted must become part of the commitments, but they should inform the choices. This would be a wonderful opportunity to hear what Canadians have to say. In addition, the government could add some of its own proposals into the mix and see what type of response they get from Canadians.

2. Redefine Public as Digital: Pass an Online Information Act

At this year's Open Government Data Camp in Warsaw, the always excellent Tom Steinberg noted that creating a transparent government and putting in place the information foundations of a digital economy will be impossible so long as access to government data is a gift from government (that can be taken away) rather than a right every citizen has. At the same time, Andrew Rasiej of techPresident advocated that we must redefine public as digital. A paper printout in a small office in the middle of nowhere does not make for "public disclosure" in the 21st century. It's bad for democracy, it's bad for transparency, and it is grossly inefficient for government.

Thus, the government should agree to pass an Online Information Act, perhaps modeled on the one proposed in the US Senate, such that:

a) Any document it produces should be available digitally, in a machine-readable format. The sham whereby the government produces 3,000-10,000 printed pages about Afghan detainees or the F-35 and claims it is publicly disclosing information must end.

b) Any data collected for legislative reasons must be made available – in machine readable formats – via a government open data portal.

c) Any information that is ATIPable must be made available in a digital format, and any excess costs of generating that information can be borne by the requester until a certain date (say 2015), at which point the excess costs will be borne by the ministry responsible. There is no reason why, in a digital world, there should be any cost to extracting information – indeed, I fear a world where the government can't cheaply locate and copy its own information for an ATIP request, as it would suggest it can't get that information for its own operations.

3. Sign the Extractive Industries Transparency Initiative

As a leader in the field of resource extraction, it is critical that Canada push for the highest standards in a sector that all too often sees money that should be destined for the public good diverted into the hands of a few well-connected individuals. Canada's reputation internationally has suffered as our extractive resource sector is seen as engaging in a number of problematic practices, such as bribing public officials – this runs counter to the Prime Minister's efforts to promote democracy.

As a result, Canada should sign, without delay, the Extractive Industries Transparency Initiative, much as the United States did in September. This would help signal our desire for a transparent extractive industry, one in which we play a significant role.

4. Sign on to the International Aid Transparency Initiative

Canada has already taken significant steps toward publishing its aid data online in machine-readable formats. This should be applauded. The next step is to do so in a way that conforms with international standards, so that this data can be assessed against the work of other donors.

The International Aid Transparency Initiative (IATI) offers an opportunity to increase transparency in foreign aid, better enable the public to understand its aid budget, compare the country’s effectiveness against others and identify duplication (and thus poorly used resources) among donors. Canada should agree to implement IATI immediately. In addition, it should request that the organizations it funds also disclose their work in ways that are compliant with IATI.

5. Use Open Data to drive efficiency in Government Services: Require the provinces to share health data – particularly hospital performance – as part of the next funding agreement under the Canada Health Act.

Comparing hospitals to one another is always a difficult task, and open data is not a panacea. However, more data about hospitals is rarely harmful, and there are a number of issues on which it would be downright beneficial. The most obvious of these is deaths caused by infection. The number of deaths that occur due to infections in Canadian hospitals is a growing problem (sigh, if only open data could help ban the antibacterial wipes that are helping propagate them). Open data that allows league tables to show the scope and location of the problem would likely cause many hospitals to rethink processes and, I suspect, save lives.

Open data can supply some of the competitive pressure that is often lacking in a public healthcare system. It could also better educate Canadians about their options within that system, as well as make them more aware of its benefits.

6. Reduce Fraud by Creating a Death List

In an era where online identity is a problem it is surprising to me that I'm unable to locate a database of expired social insurance numbers. Being able to query a list of social insurance numbers that belong to dead people might be a simple way to prevent fraud. Interestingly, the United States has just such a list available for free online. (Side fact: known as the Social Security Death Index, this database is also beloved by genealogists, who use it to trace ancestry.)
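The fraud check such a list would enable is almost trivially simple, which is rather the point. Below is a sketch; the SIN values are made up and the record format is an assumption, not the format of any real published list.

```python
# Hypothetical: a published list of social insurance numbers belonging to
# deceased people, loaded into a set for fast membership tests.
# These SIN values are invented for illustration.
DECEASED_SINS = {"046-454-286", "123-456-782"}

def flag_possible_fraud(application_sin):
    """Return True if the SIN on an application belongs to a deceased person."""
    return application_sin in DECEASED_SINS
```

A bank or credit issuer could run this one-line check on every application before approval; the hard part is not the code but publishing the list.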

7. Save lives by publishing an API of recall data

Too often, the public only finds out about a product recall after someone has died. This is a terribly tragic, not to mention grossly inefficient, outcome. Indeed, the current approach is a classic example of using 21st century technology to deliver a service in a 19th century manner. If the government is interested in using the OGP to improve government services it should stop just issuing recall press releases and also create an open data feed of recalled products. I expand on this idea here.

If the government were doubly smart it would work with major retailers – particularly in the food industry – to ensure they regularly tap into this data. In an ideal world, any time Save-On-Foods, Walmart, Safeway, or any other retailer scanned a product in its inventory, it would immediately be checked against the recall database, allowing bad food to be pulled before it hits the shelves. In addition, customers who use loyalty cards could be called or emailed to be informed that they had bought a product that had since been recalled. This would likely be much more effective than hoping the media picks the story up.
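Both halves of that idea – the inventory scan and the loyalty-card notification – reduce to simple lookups once a recall feed exists. Here is a sketch; no such government feed exists today, so the UPC-keyed record format and all the data below are assumptions for illustration.

```python
# Hypothetical stand-in for a periodic pull from an imagined recall
# open-data feed, keyed by UPC. Records below are invented.
RECALL_FEED = {
    "0628915000017": {"product": "Brand X ground beef", "reason": "E. coli"},
}

def check_scanned_item(upc):
    """Called on every inventory scan; returns the recall record, or None."""
    return RECALL_FEED.get(upc)

def loyalty_customers_to_notify(purchases, upc):
    """From loyalty-card purchase logs, find customers who bought a recalled UPC."""
    return sorted({p["customer"] for p in purchases if p["upc"] == upc})

# Invented loyalty-card purchase log for the example.
purchases = [
    {"customer": "a@example.com", "upc": "0628915000017"},
    {"customer": "b@example.com", "upc": "0111111111117"},
]
to_notify = loyalty_customers_to_notify(purchases, "0628915000017")
```

The design choice worth noting: the government only has to publish the feed; the matching against inventory and loyalty data happens entirely on the retailer's side, against data it already holds.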

8. Open Budget and Actual Spending Data

For almost a year the UK government has published all spending data, month by month, for each government ministry (down to £500 in some cases, £25,000 in others). Moreover, as an increasing number of local governments are required to share their spending data, it has led to savings, as governments begin to learn what other ministries and governments are paying for similar services.
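The comparison that drives those savings is straightforward once the line-item data is published in a common format. A sketch, with invented rows (the real UK releases are CSV files per ministry, with more columns than this):

```python
# Invented stand-in rows for monthly, line-item transparency releases.
SPEND = [
    {"ministry": "Health", "supplier": "Acme IT", "amount": 120_000},
    {"ministry": "Justice", "supplier": "Acme IT", "amount": 95_000},
    {"ministry": "Health", "supplier": "Acme IT", "amount": 80_000},
]

def total_by_ministry(rows, supplier):
    """Total spend with one supplier, per ministry, for side-by-side comparison."""
    totals = {}
    for r in rows:
        if r["supplier"] == supplier:
            totals[r["ministry"]] = totals.get(r["ministry"], 0) + r["amount"]
    return totals

totals = total_by_ministry(SPEND, "Acme IT")
```

With every ministry publishing to the same shape, a procurement officer (or a journalist) can see in one pass that Health is paying more than Justice for the same supplier, and ask why.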

Another bonus is that it becomes possible to talk about the budget in new and interesting ways. This BEAUTIFUL graphic was published in the Guardian; while still complicated, it is much easier to understand than any government document about the budget I have ever seen.

[Graphic: Guardian UK public spending visualization]

9. Allow Government Scientists to speak directly to the media about their research.

It has become a recurring embarrassment. Scientists who work for Canada publish an internationally recognized, groundbreaking paper that provides some insight about the environment or geography of Canada, and journalists must talk to government scientists from other countries to get the details. Why? Because the Canadian government blocks access. Canadians have a right to hear the perspectives of the scientists their tax dollars paid for – and to enjoy the opportunity to be as well informed as the government on these issues.

Thus, lift the ban that blocks government scientists from speaking with the media.

10. Create a steering group of leading Provincial and Municipal CIOs to create common schema for core data about the country.

While open data is good, open data organized the same way across different departments and provinces is even better. When data is organized the same way it is easier for citizens to compare one jurisdiction against another, and for software solutions and online services to emerge that use that data to enhance the lives of Canadians. The Federal Government should use its convening authority to bring together some of the country's leading government CIOs to establish common data schemas for things like crime, healthcare, procurement, and budget data. The list of what could be worked on is virtually endless, but those four areas all represent data sets that are frequently requested, so they might make for a good starting point.
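A toy illustration of why a common schema matters: if every jurisdiction publishes, say, crime incidents with the same fields, a single function can validate and compare them all. The field names below are invented, not a real standard; agreeing on the real ones is exactly the steering group's job.

```python
# Hypothetical shared schema for crime incident records: the minimum set of
# fields every jurisdiction would agree to publish. Field names are invented.
COMMON_CRIME_FIELDS = {"city", "date", "offence", "latitude", "longitude"}

def conforms(record):
    """Check whether a published record carries all the shared schema's fields."""
    return COMMON_CRIME_FIELDS.issubset(record)

# A record from one jurisdiction (values invented).
vancouver = {"city": "Vancouver", "date": "2011-10-01", "offence": "theft",
             "latitude": 49.28, "longitude": -123.12}
```

Any app, map or comparison tool written against the shared fields then works unchanged for every city that conforms; that is the whole payoff of convening the CIOs.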

International Open Data Hackathon 2011: Better Tools, More Data, Bigger Fun

Last year, with only a month of notice, a small group of passionate people announced we'd like to do an international open data hackathon and invited the world to participate.

We were thinking small but fun. Maybe 5 or 6 cities.

We got it wrong.

In the end, people from over 75 cities around the world offered to host an event. Better still, we definitely heard back from people in over 40. It was an exciting day.

Last week, after locating a few of the city organizers' email addresses, I asked them if we should do it again. Every one of them came back and said: yes.

So it is official. This time we have 2 months notice. December 3rd will be Open Data Day.

I want to be clear, our goal isn’t to be bigger this year. That might be nice if it happens. But maybe we’ll only have 6-7 cities. I don’t know. What I do want is for people to have fun, to learn, and to engage those who are still wrestling with the opportunities around open data. There is a world of possibilities out there. Can we seize on some of them?

Why.

Great question.

First off, we've got more data. Thanks to more and more enlightened governments in more and more places, there's a greater amount of data to play with. Whether it is Switzerland, Kenya, or Chicago, there's never been more data available to use.

Second, we've got better tools. With a number of governments using Socrata, there are more APIs out there for us to leverage. ScraperWiki has gotten better, and new tools like BuzzData, TheDataHub and Google's Fusion Tables are emerging every day.

And finally, there is growing interest in making "openness" a core part of how we measure governments. Open data has a role to play in driving this debate. Done right, we could make the first Saturday in December "Open Data Day": a chance to explain, demo, and invite to play the policy makers, citizens, businesses and non-profits who don't yet understand the potential. Let's raise the world's data literacy and have some fun. I can't think of a better way than with another global open data hackathon – a maker's-fair-like opportunity for people to celebrate open data by creating visualizations, writing up analyses, building apps or doing whatever they want with data.

Of course, like last time, hopefully we can make the world a little better as well. (more on that coming soon)

How.

The basic premise for the event is simple, relying on 5 basic principles.

1. Together. It can be as big or as small, as long or as short, as you’d like it, but we’ll be doing it together on Saturday, December 3rd, 2011.

2. It should be open. Around the world I've seen hackathons filled with different types of people, exchanging ideas, trying out new technologies and starting new projects. Let's be open to new ideas and new people. Chris Thorpe in the UK has done amazing work getting a young and diverse group hacking. I love Nat Torkington's words on the subject. Our movement is stronger when it is broader.

3. Anyone can organize a local event. If you are keen, help organize one in your city and/or just participate: add your name to the relevant city on this wiki page. Wherever possible, try to keep it to one event per city; let's build some community and get new people together. Which city or cities you share with is up to you, as is how you do it. But let's share.

4. You can work on anything that involves open data. That could be a local or global app, a visualization, proposing a standard for common data sets, or scraping data from a government website to make it available to others on BuzzData.

It would be great to have a few projects people can work on around the world – building stuff that is core infrastructure for future projects. That's why I'm hoping someone in each country will create a local version of MySociety's MapIt web service for their country. It would give us one common project, and raise the profile of a great organization and a great project.

We also hope to be working with Random Hacks of Kindness, who’ve always been so supportive, ideally supplying data that they will need to run their applications.

5. Let's share ideas across cities on the day. Each city's hackathon should do at least one demo, brainstorm, or proposal that it shares in an interactive way with members of a hackathon in at least one other city. This could be via video stream, Skype, or chat – anything. Let's get to know one another and share the cool projects and ideas we are hacking on. There are some significant challenges to making this work (timezones, languages, culture, technology), but who cares: we are problem solvers, let's figure out a way to make it work.
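For anyone weighing the MapIt suggestion in principle 4: MapIt's core job is answering "which administrative areas cover this point?". The sketch below builds the point-lookup URL in the style of mySociety's UK instance; a local deployment for another country would expose the same pattern against its own boundaries. Treat the coordinates and the exact response contents as illustrative.

```python
# MapIt-style point lookup: given a longitude/latitude pair, the service
# returns (as JSON) every administrative area covering that point.
BASE_URL = "https://mapit.mysociety.org"

def point_url(lon, lat):
    """Build the lookup URL for a point, in SRID 4326 (lon,lat order)."""
    return f"{BASE_URL}/point/4326/{lon},{lat}"

# e.g. fetching point_url(-0.1276, 51.5072) against the live UK instance
# would return the council, constituency, ward, etc. covering central London.
```

That one endpoint is the "core infrastructure" piece: apps like FixMyStreet need it to route a citizen's report to the right local authority, which is why a per-country MapIt makes such a good common hackathon project.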

Like last year, let's not try to boil the ocean. Let's have a bunch of events where people care enough to organize them, and try to link them together with a simple short connection/presentation. Above all, let's raise some awareness, build something and have some fun.

What next?

1. If you are interested, sign up on the wiki. We’ll move to something more substantive once we have the numbers.

2. Reach out and connect with others in your city on the wiki. Start thinking about the logistics. And be inclusive: if someone new shows up, let them help too.

3. Share with me your thoughts. What’s got you excited about it? If you love this idea, let me know, and blog/tweet/status update about it. Conversely, tell me what’s wrong with any or all of the above. What’s got you worried? I want to feel positive about this, but I also want to know how we can make it better.

4. Localization. If there is bandwidth locally, I'd love for people to translate this blog post and repost it locally. (Let me know, as I'll try cross-posting it here, or at least linking to it.) It is important that this not be an English-language-only event.

5. If people want a place to chat with others about this, feel free to post comments below. Also, the Open Knowledge Foundation's Open Data Day mailing list will be the place where people can share news and help one another out.

Once again, I hope this will sound like fun to a few committed people. Let me know what you think.

The Science of Community Management: DjangoCon Keynote

At OSCON this year, Jono Bacon argued that we are entering an era of renaissance in open source community management – that increasingly we don't just have to share stories, but that repeatable, scientific approaches are available to us. In short, the art of community management is shifting to a science.

With an enormous debt to Jono, I contend we are already there. Indeed, the tools to enable a science of community management have existed for at least 5 years. All that is needed is an effort to implement them.

A few weeks ago the organizers of DjangoCon were kind enough to invite me to give the keynote at their conference in Portland and I made these ideas the centerpiece of my talk.

Embedded below is the result: a talk that starts slowly, but that grew with passion and engagement as it progressed. I really want to thank the audience for the excellent Q&A and for engaging with me and the ideas as much as they did. As someone from outside their community, I'm grateful.

My hope in the next few weeks is to write this talk up in a series of blog posts or something more significant, and, hopefully, to redo this video on SlideShare (although I'm going to have to get my hands on the audio of this). I'll also be giving a version of this talk at the Drupal Pacific Northwest Summit in a few weeks. Feedback, as always, is not only welcome, but gratefully received. None of this happens in a vacuum; it is always your insights that help me get better, smarter and more on target.

Big thanks to Diederik van Liere and Lauren Bacon for inspiration and help, as well as Mike Beltzner, Daniel Einspanjer, David Ascher and Dan Mosedale (among many others) at Mozilla, who've been supportive and a big help.

In the meantime, I hope this is enjoyable, challenging and spurs good thoughts.

The Geopolitics of the Open Government Partnership: the beginning of Open vs. Closed

Aside from one or two notable exceptions, there hasn't been a ton of press about the Open Government Partnership (OGP). This is hardly surprising. The press likes to talk about corruption and bad government; people getting together to actually address these things is far less sexy.

But even where good coverage exists, analysts and journalists are, I think, misunderstanding the nature of the partnership and its broader implications should it take hold. Presently it is generally seen as a do-good project, one that will help fight corruption and hopefully lead to some better governance (both of which I hope will be true). However, the Open Government Partnership isn't just about doing good; it has real strategic and geopolitical purposes.

In fact, the OGP is, in part, about a 21st century containment strategy.

For those unfamiliar with 20th century containment, a brief refresher. Containment refers to a strategy outlined by a US diplomat, George Kennan, who, while posted in Moscow, wrote the famous Long Telegram, in which he outlined the need for a more aggressive policy to deal with an expansionist post-WWII Soviet Union. He argued that such a policy would need to isolate the USSR politically and strategically, in part by positioning the United States as an example in the world that other countries would want to work with. While discussions of "containment" often focus on its military aspects and the eventual arms race, it was equally influential in prompting the ideological battle between the USA and the USSR as they sought to demonstrate whose "system" was superior.

So I repeat: the OGP is part of a 21st century containment policy. And I'd go further – it is an effort to forge a new axis around which America specifically, and a broader democratic camp more generally, may seek to organize allies and rally its camp. It abandons the now outdated free-market/democratic vs. state-controlled/communist axis in favour of a more subtle, but more appropriate, one: open vs. closed.

The former axis makes little sense in a world where authoritarian governments often embrace (quasi) free markets to stay in power, and even adopt some of the basic trappings of a democracy. The Open Government Partnership is part of an effort to redefine and shift the goalposts around what makes for a free-market democracy. Elections and a marketplace clearly no longer suffice; the OGP essentially sets a new bar at which a state must (in theory) allow itself to be transparent enough to provide its citizens with information (and thus power). In short: a state can't simply have some of the trappings of a democracy, it must be democratic and open.

But that also leaves the larger question. Who is being contained? To find out that answer take a look at the list of OGP participants. And then consider who isn’t, and likely never could be, invited to the party.

OGP members: Albania, Azerbaijan, Brazil, Bulgaria, Canada, Chile, Colombia, Croatia, Czech Republic, Dominican Republic, El Salvador, Estonia, Georgia, Ghana, Guatemala, Honduras, Indonesia, Israel, Italy, Jordan, Kenya, Korea, Latvia, Liberia, Lithuania, Macedonia, Malta, Mexico, Moldova, Mongolia, Montenegro, Netherlands, Norway, Peru, Philippines, Romania, Slovak Republic, South Africa, Spain, Sweden, Tanzania, Turkey, Ukraine, United Kingdom, United States, Uruguay.

Notably absent: China, Iran, Russia, Saudi Arabia (indeed, much of the Middle East), Pakistan.

*India is not part of the OGP but was involved in much of the initial work; while it has withdrawn (for domestic political reasons) I suspect it will stay involved tangentially.

So first, what you have here is a group of countries that are broadly democratic. Indeed, if you were going to have a democratic caucus in the United Nations, it might look something like this (there are some players in that list that are struggling, but for them the OGP is another opportunity to consolidate and reinforce the gains they’ve made as well as push for new ones).

In this regard, the OGP should be seen as an effort by the United States and some allies to find common ground as well as a philosophical touch point that not only separates them from rivals, but makes their camp more attractive to deal with. It is no trivial coincidence that on the day of the OGP launch the President announced that the United States' first fulfilled commitment would be its decision to join the Extractive Industries Transparency Initiative (EITI). The EITI commits American oil, gas and mining companies to disclose payments made to foreign governments, which would make corruption much more difficult.

This is America essentially signalling to African people and their leaders: do business with us, and we will help prevent corruption in your country. We will let you know if officials get paid off by our corporations. The obvious counterpoint is... the Chinese won't.

It’s also why Brazil is a co-chair, and the idea was prompted during a meeting with India. This is an effort to bring the most important BRIC countries into the fold.

But even outside the BRICs, the second thing you'll notice about the list is the number of Latin American, and in particular African, countries included. Between the OGP, the fact that the UK is making government transparency a criterion for its foreign aid, and the World Bank increasingly moving in the same direction, the forces for "open" are laying out one path for development and aid in Africa. One that rewards good governance and, ideally, creates opportunities for African citizens. Again, the obvious counterpoint is... the Chinese won't.

It may sound hard to believe, but the OGP is much more than a simple pact designed to make heads of state look good. I believe it has real geopolitical aims and may be the first overt, ideological salvo in what I believe will be the geopolitical axis of Open versus Closed. This is about finding ways to compete for the hearts and minds of the world in a way that China, Russia, Iran and others simply cannot. And while we can debate the "openness" of the various signing countries, I like the idea of a world in which states compete to be more open. We could do worse.

Canada Joins the Open Government Partnership

I’m in New York today for the launch of the Open Government Partnership, and it looks as though Canada is now a signatory (or at least has signed a letter of intent).

No commitments are outlined, but I will link to them when they are posted.

The Open Government Partnership was launched by the White House and the State Department earlier this year with 8 founding countries. The goal is to get a coalition of governments around the world to commit to implementing a series of initiatives to improve government transparency, effectiveness and accountability. You can read more here.

For those interested, the launch event will be livestreamed here. If you’re at the event, I’ll be hosting the lunch session on “How to identify and prioritize core classes of information for public disclosure.”

Updated: here’s a video…

Interview with Charles Leadbeater – Monday September 19th

I’m excited to share that I’ll be interviewing British public policy and open innovation expert Charles Leadbeater on September 19th as part of SIG’s webinar series. For readers not familiar with Charles Leadbeater, he is the author of We-Think and numerous other books, pamphlets and articles, ranging in focus from social innovation, to entrepreneurship, to public sector reform. He served as an adviser to Tony Blair and has a long-standing relationship with the British think tank Demos.

Our conversation will initially focus on open innovation, but I’m sure it will range all over, touching on the impact of open source methodologies on the private, non-profit and public sectors, the future of government services and, of course, the challenges and opportunities around open data.

If you are interested in participating in the webinar you can register here. I’m told a small fee is being charged to recover some of the costs of running the event.

If you are participating and have a question you’d like to see asked, or a theme or topic you’d like to see covered, please feel free to comment below or, if you prefer more discretion, send me an email.

The Economics of Open Data – Mini-Case, Transit Data & TransLink

TransLink, the authority that runs public transit in the region where I live (Vancouver/Lower Mainland), has launched a real-time bus tracking app that uses GPS data to figure out how far away the bus you are waiting for really is. This is great news for everyone.

Of course for those interested in government innovation and public policy it also leads to another question. Will this GPS data be open data?

Presently TransLink does make its transit schedule “open” under a non-commercial license (you can download it here). I can imagine a number of senior TransLink officials (and the board) scratching their heads, asking: “Why, when we are short of money, would we make our data freely available?”
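That schedule download uses the standard GTFS format: a zip of CSV files (routes.txt, trips.txt, stop_times.txt and so on). A minimal sketch of what any developer can do with it, using a tiny made-up sample rather than the real file:

```python
import csv
import io

# A tiny, invented sample of a GTFS stop_times.txt file. The real file in
# TransLink's download lists every scheduled stop in the region.
SAMPLE_STOP_TIMES = """trip_id,arrival_time,departure_time,stop_id,stop_sequence
9031,07:15:00,07:15:00,50011,1
9031,07:19:00,07:19:00,50012,2
9032,07:45:00,07:45:00,50011,1
"""

def departures_for_stop(stop_times_csv, stop_id):
    """Return the scheduled departure times at a given stop, sorted."""
    reader = csv.DictReader(io.StringIO(stop_times_csv))
    times = [row["departure_time"] for row in reader if row["stop_id"] == stop_id]
    return sorted(times)

print(departures_for_stop(SAMPLE_STOP_TIMES, "50011"))
# → ['07:15:00', '07:45:00']
```

The stop IDs and times above are invented; the column names are the standard GTFS ones. This is exactly the kind of ten-line utility that a commercial-friendly license would let local developers build on.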

The answer is that TransLink should make its current data, as well as its upcoming GPS data, open and available under a license that allows for both non-commercial and commercial re-use, not just because it is the right thing to do, but because the economics of it make WAY MORE SENSE FOR TRANSLINK.

Let me explain.

First, there are not a lot of obvious ways TransLink could generate wealth directly from its data. But let’s take two possible opportunities: the first involves selling a transit app to the public (or advertising in such an app); the second involves selling a “next bus” service to companies (say coffee shops or organizations) that believe showing this information might be a convenience to their employees or customers.

TransLink has already abandoned paid apps (instead it maintains a mobile website at m.translink.ca), but even if it created an app and charged $1 per download, the revenue would be pitiful. Assuming a very generous customer base of 100,000 users, TransLink would generate maybe $85,000 (once Apple takes its cut from the iPhone downloads, and assuming no cut on Android). But remember, this is not a yearly revenue stream; it is one-time. Maybe 10-20,000 people upgrade their phone or arrive in Vancouver and decide to download every year, so year-on-year revenue is maybe $15K. Over a five-year period, then, TransLink ends up with an extra, say, $145,000. Nothing to sneeze at, but not notable.

In contrast, a free application encourages use, so there is also a cost to not giving it away. Having transit data more readily available might cause some people to choose transit over, say, walking, taking a taxi or driving. Last year TransLink handled 211.3 million trips. Let’s assume that wider access to the data meant a 0.1% increase in the number of trips. A tiny increase, but it means 211,300 more trips. Assuming each rider pays a one-zone $2.50 fare, that translates into additional revenue of $528,250 a year. Over the same five-year period cited above, that’s revenue of $2,641,250, much better than $145,000. And this is just counting the money, to say nothing of less congested roads, less smog and a lower carbon footprint for the region…

When this analysis is applied to licensing data it produces the same result. Will UBC pay to have TransLink’s real-time data on terminals in the Student Union building? I doubt it. Would some strategically placed coffee shops? Possibly. Organizations would have to pay for the signs anyway, and adding an annual “data license fee” on top of the display’s cost would cause some to opt out. Once you take into account managing the signs, legal fees, dealing with contracts and going through the sales process, it is almost inconceivable that TransLink would make more money from these agreements than it would from simply having more signs everywhere, created by other people, generating more customers for its actual core business: moving people from A to B for a fee. Just to show you the numbers: if shops that weren’t willing to pay for the data put up “next bus” screens that generated a mere 1,000 new regular bus users, each taking only 40 one-way trips a year (40,000 new trips), this would equal revenue of $100,000 every year at no cost to TransLink. Someone else would install and maintain the signs, and no contracts or licenses would need to be managed.
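The back-of-envelope comparison above reduces to a few lines of arithmetic. All the inputs are the rough assumptions from this post, not TransLink figures:

```python
# Scenario 1: paid app. ~100,000 downloads at $1, less Apple's cut on
# iPhone sales (rounded here, as in the text, to ~$85,000 net), then
# roughly $15,000/year in ongoing downloads for four more years.
paid_app_5yr = 85_000 + 15_000 * 4            # $145,000 over five years

# Scenario 2: free, open data nudges ridership up by 0.1%.
annual_trips = 211_300_000                    # trips handled last year
new_trips = annual_trips * 0.001              # 211,300 extra trips/year
one_zone_fare = 2.50
free_data_annual = new_trips * one_zone_fare  # $528,250 per year
free_data_5yr = free_data_annual * 5          # $2,641,250 over five years

# Scenario 3: third-party "next bus" signs create 1,000 new regular
# riders taking 40 one-way trips a year each.
sign_annual = 1_000 * 40 * one_zone_fare      # $100,000 per year

print(paid_app_5yr, free_data_annual, free_data_5yr, sign_annual)
```

Even with these deliberately conservative assumptions, the open-data scenarios dwarf the paid-app one.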

From a cost recovery perspective it is almost impossible to imagine a scenario where TransLink is better off not allowing commercial re-use of its data.

My point is that TransLink should not be focused on making a few bucks from licensing its data (which it doesn’t do right now anyway). It should be focused on shifting the competitive value in the marketplace from access to accessibility.

Being the monopoly holder of transit data does not benefit TransLink. All it means is that fewer people see and engage with its data. When the data is made open and available, “access” is no longer the defining advantage. When anybody (e.g. TransLink, Google, independent developers) can access the data, the marketplace shifts from competing on access to competing on accessibility. Consumers don’t turn to whoever has the data; they turn to whoever makes the data easiest to use.

For example, TransLink has noted that 2011 will be a record year for trips. Part of me wonders to what degree the increase in trips over the past few years is a result of making transit data accessible in Google Maps. (Has anyone done a study on this in any jurisdiction?) The simple fact is that Google Maps is radically easier to use for planning transit journeys than TransLink’s own website, AND THAT IS A GOOD THING FOR TRANSLINK. Now imagine if lots of organizations were sharing TransLink’s data: the local Starbucks and Blenz Coffee, colleges and universities, busy buildings downtown. Indeed, the real crime right now is that TransLink has handed Google a de facto monopoly. Google is allowed to use the data for commercial re-use. Local tax-paying developers? Not so, according to the license they have to click through.

TransLink, you want a world where everyone is competing (including against you) on accessibility. In the end you win, with greater use and revenue.

But let me go further: there are other benefits to having TransLink share its data for commercial re-use.

Procurement

Some riders will note that there are already bus stops in Vancouver which display “next bus” data (e.g. how many minutes away the next bus is). If TransLink made its next bus data freely available via an API it could conceivably alter the procurement process for buying and maintaining these signs. Any vendor could see how the data is structured and so take over the management of the signs, and/or experiment with creating more innovative or cheaper ways of manufacturing them.
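For the procurement scenario to work, a sign vendor only needs a documented data shape. A sketch of what a next-bus payload and a sign’s rendering of it might look like. To be clear, this JSON schema is entirely hypothetical; TransLink has not published such an API:

```python
import json

# Hypothetical payload a "next bus" API might return for one stop.
# Field names here are illustrative, not TransLink's actual schema.
payload = json.loads("""
{
  "stop_id": "50011",
  "predictions": [
    {"route": "99", "minutes_away": 3},
    {"route": "99", "minutes_away": 11},
    {"route": "9",  "minutes_away": 7}
  ]
}
""")

def render_sign(data):
    """Format predictions the way a street-side display might show them,
    soonest bus first."""
    ordered = sorted(data["predictions"], key=lambda p: p["minutes_away"])
    return "\n".join(f"{p['route']}: {p['minutes_away']} min" for p in ordered)

print(render_sign(payload))
```

With something this simple published openly, any vendor could prototype a sign without a contract, which is exactly what widens the pool of bidders.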

The same is true of creating the RFP for TransLink’s website. With the data publicly available, TransLink could simply ask developers to mock up what they think is the most effective way of displaying it. More development houses might be enticed to respond to the RFP, increasing the likelihood of innovation and putting downward pressure on fees.

Analysis

Of course, making GPS data free could have an additional benefit. Local news companies might be able to use the buses’ GPS data to calculate traffic flow rates and so predict traffic jams. Might they be willing to pay TransLink for the data? Maybe, but again probably not enough to justify the legal and sales overhead. Moreover, TransLink would benefit from this analysis, as it could use the reports to adjust its schedules and notify its drivers of problems beforehand. Everyone else would benefit too, as better-informed commuters might change their behaviour (including taking transit!), reducing congestion, smog, carbon footprint, etc.

Indeed, the analysis opportunities using GPS data are potentially endless, and much of the work might be done by bloggers and university students. One could imagine correlating actual bus/subway times with any number of other data sets (crime, commute times, weather) to yield information that could help TransLink with its planning. There is no world in which TransLink has the resources to do all this analysis itself, so enabling others to do it can only benefit it.
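The kind of analysis a blogger or student could do with open GPS data is easy to sketch. With invented scheduled-versus-actual arrival times for one route (every number below is made up for illustration):

```python
from datetime import datetime

# Invented (scheduled, actual) arrival pairs for one bus over a morning.
observations = [
    ("07:15", "07:18"),  # 3 minutes late
    ("07:45", "07:44"),  # 1 minute early
    ("08:15", "08:27"),  # 12 minutes late
]

def delay_minutes(scheduled, actual):
    """Actual minus scheduled arrival, in minutes (negative = early)."""
    fmt = "%H:%M"
    delta = datetime.strptime(actual, fmt) - datetime.strptime(scheduled, fmt)
    return delta.total_seconds() / 60

delays = [delay_minutes(s, a) for s, a in observations]
avg_delay = sum(delays) / len(delays)
print(f"average delay: {avg_delay:.1f} minutes")
# → average delay: 4.7 minutes
```

Join a table like this against weather or traffic data and you have exactly the sort of free planning insight the post describes.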

Conclusion

So if you are at TransLink/Coast Mountain Bus Company (or any transit authority in the world), this post is for you. Here’s what I suggest as next steps:

1) Add GPS bus tracking API to your open data portal.

2) Change your license. Drop the non-commercial clause. It hurts your business more than you realize and is anti-competitive (why can Google use the data for a commercial application while residents of the Lower Mainland cannot?). My suggestion: adopt the BC Government Open Government License or the PDDL.

3) Add an RSS feed to your GTFS data. Like Google, we’d all like to know when you update your data. Given we live here and are users, it would be nice to extend the same service to us as you do to them.

4) Maybe hold a Transit Data Camp where you could invite local developers and entrepreneurs to meet your staff and encourage people to find ways to get transit data into the hands of more Lower Mainlanders and drive up ridership!