Category Archives: technology

Using Open Data to drive good policy outcomes – Vancouver’s Rental Database

One of the best signs for open data is when governments start to grasp its potential to achieve policy objectives. Rather than just being about compliance, it is seen as a tool that can support the growth and management of a jurisdiction.

This is why I was excited to see Vision Vancouver (in which I'm involved generally, though I was not involved in developing this policy proposal) announce the other day that, if elected, it intends to create a comprehensive online registry that will track work orders and property violations in Vancouver apartments, highlighting negligent landlords and giving renters a new tool to empower themselves.

As the press release goes on to state, the database is “Modeled after a successful on-line watchlist created by New York City’s Public Advocate, the database will allow Vancouver residents to search out landlords and identify any building or safety violations issued by the City of Vancouver to specific rental buildings.”

Much like the pieces I've written around restaurant inspection and product recall data, this is a great example of a data set that, when shared the right way, can empower citizens to make better choices and foster better behaviour from landlords.

My main hope is that, in implementing this proposal, the city does the right thing and doesn't just create a searchable database on its own website, but actually creates an API that software developers and others can tap into. If it does, someone may develop a mobile app for renters that shows you the repair record of the building you are standing in front of, or in. This could be very helpful for renters; one could even imagine a service where you SMS the postal code of a rental building and it sends you back some basic information. Also exciting to me is the possibility that a university student might look for trends in the data over time; maybe there is an analysis that may yield an insight that could help landlords mitigate against problems and reduce the number of repairs they have to make (and so help reduce their costs).
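To make the SMS lookup idea concrete, here is a minimal sketch of what a renter-facing query against such a dataset might look like. Everything here is an assumption: the city has published no schema, so the field names ("postal_code", "address", "violations") and the sample records are invented for illustration.

```python
# Hypothetical sketch of an SMS-style lookup over an open
# rental-violations dataset. Field names and records are invented.

def violations_for_postal_code(records, postal_code):
    """Return a short text summary of recorded violations for a postal
    code, the kind of reply an SMS lookup service could send back."""
    matches = [r for r in records if r["postal_code"] == postal_code]
    if not matches:
        return "No buildings with recorded violations at %s" % postal_code
    lines = []
    for r in matches:
        lines.append("%s: %d open violation(s)" % (r["address"], len(r["violations"])))
    return "\n".join(lines)

sample = [
    {"postal_code": "V5K 0A1", "address": "123 Example St",
     "violations": ["no heat", "broken elevator"]},
    {"postal_code": "V6B 1A1", "address": "456 Sample Ave", "violations": []},
]

print(violations_for_postal_code(sample, "V5K 0A1"))
# 123 Example St: 2 open violation(s)
```

The point is less the code than the shape of it: once the data is available through an API, a service like this is a weekend project for a developer, not a city IT procurement.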

But if Vancouver and New York actually structured the data in the same way, it might create an incentive for other cities to do the same. That might entice some of the better-known services to use the data to augment their offerings as well. Imagine if PadMapper, in addition to letting a prospective renter search for apartments based on rent and number of rooms, also let them search based on number of infractions?


That might have a salutary effect on some (but sadly not all) landlords. All in all, an exciting step forward from my friends at Vision, who brought open data to Canada.

As Canada Searches for its Open Government Partnership Commitments: A Proposal

Just before its launch in New York on September 20th, the Canadian Government agreed to be a signatory of the Open Government Partnership (OGP). Composed of over 40 countries, the OGP requires signatories to create a list of commitments they promise to implement. Because Canada signed on just before the deadline it has not – to date – submitted its commitments. As a result, there is a fantastic window for the government to do something interesting with this opportunity.

So what should we do? Here are the top 10 suggestions I propose for Canada's OGP commitments:

Brief Background on Criteria:

Before diving in, it is worth letting readers know that there are some criteria for making commitments. Specifically, any commitment must tackle at least one of the five “core” challenges: improve public services, increase public integrity, more effectively manage public resources, create safer communities, and increase corporate accountability.

In addition, each recommendation should reflect at least one of the core OGP principles, which are: transparency, citizen participation, accountability, and technology and innovation.

The Top Ten

Having reviewed several other countries' commitments, and being familiar with both what Canada has already done and what it could do, here are 10 commitments I would like to see our government make to the OGP.

1. Be open about developing the commitments

Obviously there are a number of commitments the government is going to make simply because they are actions or programs it was going to launch anyway. In addition, there will be some new ideas that public servants or politicians have been looking for an opportunity to champion and now have an excuse to. This is all fine, and part of the traditional way government works.

But wouldn’t it be nice if – as part of the open government partnership – we asked citizens what they thought the commitments should be? That would make the process nicely consistent with the principles and goals of the OGP.

Thus the government should launch a two-week crowdsourced idea generator, much like it did during the Digital Economy consultations. This is not to suggest that the ideas submitted must become part of the commitments, but they should inform the choices. This would be a wonderful opportunity to hear what Canadians have to say. In addition, the government could add some of its own proposals into the mix and see what type of response they get from Canadians.

2. Redefine Public as Digital: Pass an Online Information Act

At this year's Open Government Data Camp in Warsaw, the always excellent Tom Steinberg noted that creating a transparent government and putting in place the information foundations of a digital economy will be impossible so long as access to government data is a gift from government (that can be taken away) rather than a right every citizen has. At the same time, Andrew Rasiej of techPresident advocated that we must redefine public as digital. A paper printout in a small office in the middle of nowhere does not make for "public disclosure" in the 21st century. It's bad for democracy, it's bad for transparency, and it is grossly inefficient for government.

Thus, the government should agree to pass an Online Information Act, perhaps modeled on the one proposed in the US Senate, which would require that:

a) Any document it produces should be available digitally, in a machine-readable format. The sham whereby the government can produce 3,000-10,000 printed pages about Afghan detainees or the F-35 and claim it is publicly disclosing information must end.

b) Any data collected for legislative reasons must be made available – in machine-readable formats – via a government open data portal.

c) Any information that is ATIPable must be made available in a digital format, with any excess costs of generating that information borne by the requester up until a certain date (say 2015), at which point the excess costs would be borne by the ministry responsible. There is no reason why, in a digital world, there should be any cost to extracting information – indeed, I fear a world where the government can't cheaply locate and copy its own information for an ATIP request, as it would suggest it can't get that information for its own operations.

3. Sign the Extractive Industries Transparency Initiative

As a leader in the field of resource extraction, it is critical that Canada push for the highest standards in a sector that all too often sees money that should be destined for the public good diverted into the hands of a few well-connected individuals. Canada's reputation internationally has suffered as our extractive resource sector is seen as engaging in a number of problematic practices, such as bribing public officials – this runs counter to the Prime Minister's efforts to promote democracy.

As a result, Canada should sign, without delay, the Extractive Industries Transparency Initiative, much like the United States did in September. This would help signal our desire for a transparent extractive industry, one in which we play a significant role.

4. Sign on to the International Aid Transparency Initiative

Canada has already taken significant steps toward publishing its aid data online, in machine-readable formats. This should be applauded. The next step is to do so in a way that conforms with international standards, so that this data can be assessed against the work of other donors.

The International Aid Transparency Initiative (IATI) offers an opportunity to increase transparency in foreign aid, better enable the public to understand its aid budget, compare the country’s effectiveness against others and identify duplication (and thus poorly used resources) among donors. Canada should agree to implement IATI immediately. In addition, it should request that the organizations it funds also disclose their work in ways that are compliant with IATI.

5. Use Open Data to drive efficiency in Government Services: Require the provinces to share health data – particularly hospital performance – as part of the next funding agreement under the Canada Health Act.

Comparing hospitals to one another is always a difficult task, and open data is not a panacea. However, more data about hospitals is rarely harmful, and there are a number of issues on which it would be downright beneficial. The most obvious of these is deaths caused by infection. The number of deaths that occur due to infections in Canadian hospitals is a growing problem (sigh, if only open data could help ban the antibacterial wipes that are helping propagate them). Having open data that allows for league tables showing the scope and location of the problem would likely cause many hospitals to rethink processes and, I suspect, save lives.

Open data can supply some of the competitive pressure that is often lacking in a public healthcare system. It could also better educate Canadians about their options within that system, as well as make them more aware of its benefits.

6. Reduce Fraud: Create a Death List

In an era where online identity is a problem, it is surprising to me that I'm unable to locate a database of expired social insurance numbers. Being able to query a list of social insurance numbers that belong to dead people might be a simple way to prevent fraud. Interestingly, the United States has just such a list available for free online. (Side fact: known as the Social Security Death Index, this database is also beloved by genealogists, who use it to trace ancestry.)
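The check itself is trivial once the list exists, which is rather the point. A sketch, with entirely made-up numbers (no such Canadian dataset exists today):

```python
# Sketch of the fraud check described above: before accepting an
# identity, verify the social insurance number is not on a published
# list of numbers belonging to deceased people. Numbers are invented.

def is_expired(sin, death_index):
    # Normalize formatting ("123-456-789" vs "123456789") so lookups
    # are not defeated by punctuation.
    return sin.replace("-", "").replace(" ", "") in death_index

death_index = {"123456789", "987654321"}

print(is_expired("123-456-789", death_index))  # True
print(is_expired("111-222-333", death_index))  # False
```

A bank or telecom could run this check in a few lines at account-opening time; the hard part is the government publishing the list, not the engineering.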

7. Save lives by publishing an API of recall data

Too often, the only time the public finds out about a product recall is after someone has died. This is a terribly tragic, not to mention grossly inefficient, outcome. Indeed, the current approach is a classic example of using 21st-century technology to deliver a service in a 19th-century manner. If the government is interested in using the OGP to improve government services, it should stop merely issuing recall press releases and also create an open data feed of recalled products. I expand on this idea here.

If the government were doubly smart, it would work with major retailers – particularly in the food industry – to ensure that they regularly tap into this data. In an ideal world, any time Save-on-Foods, Walmart, Safeway, or any other retailer scans a product in its inventory, it would immediately be checked against the recall database, allowing bad food to be pulled before it hits the shelves. In addition, customers who use loyalty cards could be called or emailed to be informed that they had bought a product that had been recalled. This would likely be much more effective than hoping the media picks the story up.
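The retailer-side check could be as simple as the sketch below. Since no such government feed currently exists, the feed structure, the UPC, and the product are all assumptions for illustration:

```python
# Minimal sketch of a point-of-sale recall check against an open data
# feed keyed by product code. Feed format and contents are invented.

recall_feed = {
    "0123456789012": {"product": "Example brand ground beef",
                      "reason": "possible E. coli contamination"},
}

def check_scan(upc):
    """Return a recall notice if the scanned UPC has been recalled,
    otherwise None."""
    recall = recall_feed.get(upc)
    if recall is None:
        return None
    return "RECALLED: %s (%s)" % (recall["product"], recall["reason"])

print(check_scan("0123456789012"))  # prints a RECALLED notice
```

In practice the retailer would refresh the feed on a schedule and hook the lookup into its existing inventory scanners; the design choice that matters is that the government publishes machine-readable data rather than press releases.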

8. Open Budget and Actual Spending Data

For almost a year the UK government has published all spending data, month by month, for each government ministry (down to £500 in some, £25,000 in others). Moreover, as an increasing number of local governments are required to share their spending data, it has led to savings, as governments begin to learn what other ministries and governments are paying for similar services.

Another bonus is that it becomes possible to talk about the budget in new and interesting ways. This beautiful graphic was published in the Guardian; while still complicated, it is much easier to understand than any government document about the budget I have ever seen.


9. Allow Government Scientists to speak directly to the media about their research

It has become a recurring embarrassment: scientists who work for Canada publish an internationally recognized, groundbreaking paper that provides some insight about the environment or geography of Canada, and journalists must talk to government scientists from other countries in order to get the details. Why? Because the Canadian government blocks access. Canadians have a right to hear the perspectives of the scientists their tax dollars paid for – and to enjoy the opportunity to be as well informed as the government on these issues.

Thus, lift the ban that blocks government scientists from speaking with the media.

10. Create a steering group of leading Provincial and Municipal CIOs to create common schema for core data about the country.

While open data is good, open data organized the same way across different departments and provinces is even better. When data is organized the same way, it is easier for citizens to compare one jurisdiction against another, and for software solutions and online services to emerge that use that data to enhance the lives of Canadians. The Federal Government should use its convening authority to bring together some of the country's leading government CIOs to establish common data schemas for things like crime, healthcare, procurement, and budget data. The list of what could be worked on is virtually endless, but those four areas all represent data sets that are frequently requested, so they might make for a good starting point.
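To see why a common schema matters in practice: if every jurisdiction publishes, say, crime incidents with the same minimal fields, one validator (and one app) works everywhere. The field list below is invented for illustration, not an actual agreed standard:

```python
# Illustration of a shared minimal schema for crime incident data.
# The required fields here are hypothetical, not an agreed standard.

REQUIRED_FIELDS = {"incident_type", "date", "latitude", "longitude", "jurisdiction"}

def conforms(record):
    """Check that a published record carries the agreed minimal fields.
    Extra, jurisdiction-specific fields are fine."""
    return REQUIRED_FIELDS.issubset(record)

vancouver_record = {"incident_type": "theft", "date": "2011-10-01",
                    "latitude": 49.28, "longitude": -123.12,
                    "jurisdiction": "Vancouver", "neighbourhood": "Downtown"}
toronto_record = {"type": "theft", "when": "2011-10-01"}  # different layout

print(conforms(vancouver_record))  # True
print(conforms(toronto_record))    # False
```

Note the design choice: the schema only fixes a minimal core, so jurisdictions can still publish extra fields. An app built against the core works against every conforming city without modification.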

The State of Open Data 2011

What is the state of the open data movement? Yesterday, during my opening keynote at the Open Government Data Camp (held this year in Warsaw, Poland) I sought to follow up on my talk from last year's conference. Here's my take on where we are today (I'll post/link to a video of the talk as soon as the Open Knowledge Foundation makes it available).

Successes of the Past Year: Crossing the Chasm

1. More Open Data Portals

One of the things that has been amazing to witness in 2011 is the veritable explosion of open data portals around the world. Today there are well over 50 government data catalogs, with more being added all the time. The most notable of these was probably the Kenyan open data catalog, which shows how far, and wide, the open data movement has grown.

2. Better Understanding and More Demand

The thing about all these portals is that they are the result of a larger shift. Specifically, more and more government officials are curious about what open data is. This is not to say that understanding has radically shifted, but many people in government (and in politics) now know the term, believe there is something interesting going on in this space, and want to learn more. Consequently, in a growing number of places there is less and less headwind against us. Rather than screaming from the rooftops, we are increasingly being invited in the front door.

3. More Experimentation

Finally, what's also exciting is the increased experimentation in the open data space. The number of companies and organizations trying to engage open data users is growing. ScraperWiki, the DataHub, BuzzData, Socrata, and Visual.ly are some of the products and resources that have emerged out of the open data space. And the types of research and projects that are emerging – the tracking of the Icelandic volcano eruptions, the emergence of Hacks/Hackers, micro projects (like my own Recollect.net), and research showing that open data could be generating savings of £8.5 million a year for governments in the Greater Manchester area – are deeply encouraging.

The Current State: An Inflection Point

The exciting thing about open data is that increasingly we are helping people – public servants, politicians, business owners and citizens – imagine a different future, one that is more open, efficient and engaging. Our impact is still limited, and the journey is still in its early days. More importantly, thanks to success number 2 above, our role is changing. So what does this mean for the movement right now?

Externally to the movement, the work we are doing is only getting more relevant. We are in an era of institutional failure. From the Tea Party to Occupy Wall St., there is a recognition that our institutions no longer sufficiently serve us. Open data can't solve this problem, but it is part of the solution. The trouble with the old order and the institutions it fostered is that its organizing principle is built around the management (control) of processes; it's been about the application of the industrial production model to government services. This means it can only move so fast and, because of its strong control orientation, can only allow for so much creativity (and adaptation). Open data is about putting the free flow of information at the heart of government – both internally and externally – with the goal of increasing government's metabolism and decentralizing society's capacity to respond to problems. Our role is not obvious to the people in those movements, and we should make it clearer.

Internally to the movement, we have another big challenge. We are at a critical inflection point. For years we have been on the outside, yelling that open data matters. But now we are being invited inside. Some of us want to rush in, keen to make advances; others want to hold back, worried about being co-opted. To succeed, it is essential that we become more skilled at walking this difficult line: engaging with governments and helping them make the right decisions, while not being co-opted or sacrificing our principles. Choosing not to engage would, in my opinion, be to abdicate our responsibility as citizens and open data activists. This is a difficult transition, but it will be made easier if we at least acknowledge it, and support one another in it.

Our Core Challenges: What’s next

Looking across the open data space, my own feeling is that there are three core challenges facing the open data movement that threaten to compromise all the successes we've enjoyed so far.

1. The Compliance Trap

One key risk for open data is that all our work ends up being framed as a transparency initiative, and thus making data available is reduced to a compliance issue for government departments. If this is how our universe is framed, I suspect that in 5-10 years governments, eager to save money and cut services, will cut open data portals as a cost-saving initiative.

Our goal is not to become a compliance issue. Our goal is to make governments understand that they are data management organizations, and that they need to manage their data assets with the same rigour with which they manage physical assets like roads and bridges. We are as much about data governance as we are about open data. This means we need to have a vision for government, one where data becomes a layer of the government architecture. Our goal is to make data a platform – one that not only citizens outside of government can build on, but on top of which government reconstructs its policy apparatus as well as its IT systems. Achieving this will ensure that open data gets hardwired right into government and so cannot easily be shut down.

2. Data Schemas

This year, in the lead-up to the Open Data Camp, the Open Knowledge Foundation created a map of open data portals from around the world. This was fun to look at, and I think it should be the last time we do it.

We are getting to a point where the number of data portals is becoming less and less relevant. Getting more portals isn't going to help open data scale. What is going to allow us to scale is establishing common schemas for data sets that enable them to work across jurisdictions. The single most widely used open government data set is transit data, which, because it has been standardized as GTFS (the General Transit Feed Specification), is available across hundreds of jurisdictions. This standardization has not only put the data into Google Maps (generating millions of uses every day) but has also led to an explosion of transit apps around the world. Common standards will let us scale. We cannot forget this.
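GTFS is a nice illustration of why schemas beat portals: a feed is just a zip of CSV files with agreed column names, so one small parser works against hundreds of agencies. Below, a minimal stops.txt using the real GTFS column names (the stops themselves are made up):

```python
import csv
import io

# A tiny, invented stops.txt in the real GTFS format. Because the
# column names (stop_id, stop_name, stop_lat, stop_lon) are fixed by
# the spec, this parser works unchanged for any agency's feed.
stops_txt = """stop_id,stop_name,stop_lat,stop_lon
S1,Main St at 1st Ave,49.2827,-123.1207
S2,Main St at 2nd Ave,49.2833,-123.1210
"""

stops = list(csv.DictReader(io.StringIO(stops_txt)))
for stop in stops:
    print(stop["stop_name"], float(stop["stop_lat"]), float(stop["stop_lon"]))
```

Swap in the stops.txt from any transit agency that publishes GTFS and the same three lines of parsing still work; that portability is exactly what a common crime or budget schema would buy us.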

So let’s stop mapping open data portals, and start mapping datasets that adhere to common schemas. Given that open data is increasingly looked upon favourably by governments, creating these schemas is, I believe, now the central challenge to the open data movement.

3. Broadening the Movement

I’m impressed by the hundreds and hundreds of people here at the Open Data Camp in Warsaw. It is fun to be able to recognize so many of the faces here, the problem is that I can recognize too many of them. We need to grow this movement. There is a risk that we will become complacent, that we’ll enjoy the movement we’ve created and, more importantly, our roles within it. If that happens we are in trouble. Despite our successes we are far from reaching critical mass.

The simple question I have for us is: where are the United Way, Google, Microsoft, the Salvation Army, Oxfam, and Greenpeace? We'll know we are making progress when companies – large and small – as well as non-profits start understanding how open government data can change their world for the better, and so want to help us advance the cause.

Each of us needs to go out and start engaging these types of organizations and helping them see this new world and the potential it creates for them to make money or advance their own issues. The more we can embed ourselves into others' networks, the more allies we will recruit and the stronger we will be.

 

The Science of Community Management: DjangoCon Keynote

At OSCON this year, Jono Bacon argued that we are entering an era of renaissance in open source community management – that increasingly we don't just have to share stories, but that repeatable, scientific approaches are available to us. In short, the art of community management is shifting to a science.

With an enormous debt to Jono, I contend we are already there. Indeed, the tools to enable a science of community management have existed for at least five years. All that is needed is an effort to implement them.

A few weeks ago the organizers of DjangoCon were kind enough to invite me to give the keynote at their conference in Portland and I made these ideas the centerpiece of my talk.

Embedded below is the result: a talk that starts slowly, but grows in passion and engagement as it progresses. I really want to thank the audience for the excellent Q&A and for engaging with me and the ideas as much as they did. As someone from outside their community, I'm grateful.

My hope in the next few weeks is to write this talk up in a series of blog posts or something more significant and, hopefully, to redo the presentation on SlideShare (although I'm going to have to get my hands on the audio of this). I'll also be giving a version of this talk at the Drupal Pacific Northwest Summit in a few weeks. Feedback, as always, is not only welcome but gratefully received. None of this happens in a vacuum; it is always your insights that help me get better, smarter and more on target.

Big thanks to Diederik van Liere and Lauren Bacon for inspiration and help, as well as Mike Beltzner, Daniel Einspanjer, David Ascher and Dan Mosedale (among many others) at Mozilla, who've been supportive and a big assistance.

In the meantime, I hope this is enjoyable, challenging and spurs good thoughts.

The Geopolitics of the Open Government Partnership: the beginning of Open vs. Closed

Aside from one or two notable exceptions, there hasn't been a ton of press about the Open Government Partnership (OGP). This is hardly surprising. The press likes to talk about corruption and bad government; people getting together to actually address these things is far less sexy.

But even where good coverage exists, analysts and journalists are, I think, misunderstanding the nature of the partnership and its broader implications should it take hold. Presently it is generally seen as a do-good project, one that will help fight corruption and hopefully lead to some better governance (both of which I hope will be true). However, the Open Government Partnership isn't just about doing good; it has real strategic and geopolitical purposes.

In fact, the OGP is, in part, about a 21st century containment strategy.

For those unfamiliar with 20th-century containment, a brief refresher. Containment refers to a strategy outlined by a US diplomat, George Kennan, who, while posted in Moscow, wrote the famous Long Telegram, in which he outlined the need for a more aggressive policy to deal with an expansionist post-WWII Soviet Union. He argued that such a policy would need to isolate the USSR politically and strategically, in part by positioning the United States as an example in the world that other countries would want to work with. While discussions of "containment" often focus on its military aspects and the eventual arms race, it was equally influential in prompting the ideological battle between the USA and USSR as they sought to demonstrate whose "system" was superior.

So I repeat: the OGP is part of a 21st-century containment policy. And I'd go further – it is an effort to forge a new axis around which America specifically, and a broader democratic camp more generally, may seek to organize allies and rally its camp. It abandons the now outdated free-market/democratic vs. state-controlled/communist axis in favour of a more subtle, but more appropriate, one: open vs. closed.

The former axis makes little sense in a world where authoritarian governments often embrace (quasi) free markets to sustain their reign, and even adopt some of the basic trappings of a democracy. The Open Government Partnership is part of an effort to redefine and shift the goal posts around what makes for a free-market democracy. Elections and a marketplace clearly no longer suffice; the OGP essentially sets a new bar, at which a state must (in theory) allow itself to be transparent enough to provide its citizens with information (and thus power). In short: a state can't simply have some of the trappings of a democracy, it must be democratic and open.

But that also leaves a larger question: who is being contained? To find the answer, take a look at the list of OGP participants. And then consider who isn't, and likely never could be, invited to the party.

OGP members:

Albania
Azerbaijan
Brazil
Bulgaria
Canada
Chile
Colombia
Croatia
Czech Republic
Dominican Republic
El Salvador
Estonia
Georgia
Ghana
Guatemala
Honduras
Indonesia
Israel
Italy
Jordan
Kenya
Korea
Latvia
Liberia
Lithuania
Macedonia
Malta
Mexico
Moldova
Mongolia
Montenegro
Netherlands
Norway
Peru
Philippines
Romania
Slovak Republic
South Africa
Spain
Sweden
Tanzania
Turkey
Ukraine
United Kingdom
United States
Uruguay

Notably absent:

China
Iran
Russia
Saudi Arabia
(indeed, much of the Middle East)
Pakistan

*India is not part of the OGP but was involved in much of the initial work; while it has withdrawn (for domestic political reasons), I suspect it will stay involved tangentially.

So first, what you have here is a group of countries that are broadly democratic. Indeed, if you were going to have a democratic caucus in the United Nations, it might look something like this (there are some players on that list that are struggling, but for them the OGP is another opportunity to consolidate and reinforce the gains they've made, as well as push for new ones).

In this regard, the OGP should be seen as an effort by the United States and some allies to find common ground, as well as a philosophical touch point that not only separates them from rivals, but makes their camp more attractive to deal with. It's no trivial coincidence that on the day of the OGP launch the President announced that the United States' first fulfilled commitment would be its decision to join the Extractive Industries Transparency Initiative (EITI). The EITI commits American oil, gas and mining companies to disclose payments made to foreign governments, which would make corruption much more difficult.

This is America essentially signalling to African people and their leaders: do business with us, and we will help prevent corruption in your country. We will let you know if officials get paid off by our corporations. The obvious counterpoint to this is… the Chinese won't.

It’s also why Brazil is a co-chair, and the idea was prompted during a meeting with India. This is an effort to bring the most important BRIC countries into the fold.

But even outside the BRICs, the second thing you'll notice about the list is the number of Latin American and, in particular, African countries included. Between the OGP, the fact that the UK is making government transparency a criterion for its foreign aid, and the fact that the World Bank is increasingly moving in the same direction, the forces for "open" are laying out one path for development and aid in Africa – one that rewards governance and, ideally, creates opportunities for African citizens. Again, the obvious counterpoint is… the Chinese won't.

It may sound hard to believe, but the OGP is much more than a simple pact designed to make heads of state look good. I believe it has real geopolitical aims and may be the first overt, ideological salvo in what I believe will be the geopolitical axis of Open versus Closed. This is about finding ways to compete for the hearts and minds of the world in a way that China, Russia, Iran and others simply cannot. And while I agree we can debate the "openness" of the various signing countries, I like the idea of a world in which states compete to be more open. We could do worse.

Neo-Progressive Watch: Rahm Emanuel vs. Teachers Union

Anyone who has read Obama's book The Audacity of Hope will have been struck by the amount of time the then aspiring presidential candidate spent writing about public education policy. More notably, he seemed to acknowledge that any effort at education reform was, at some point, going to butt heads with teachers' unions, and that new approaches were either going to have to be negotiated or imposed. It was a point of tension that wasn't much talked about in the reviews I read. But it always struck me as interesting that here was Obama, a next-generation progressive, railing against the conservatism of what is possibly the original pillar of the progressive movement: public education.

All of this has, of course, been decidedly forgotten, given both the bigger problems the president has faced and the fact that he's been basically uninterested in monkeying around in public education policy since taking office. That's why it is all the more fascinating to see what his disciples are doing as they get involved in levels of government that are in more direct contact with this policy area. Here, none is more interesting to watch than Rahm Emanuel.

This Saturday my friend Amy L. pointed me to a New York Times article outlining the most recent battle between Rahm Emanuel and the teachers' union. My own take is that the specifics of the article are irrelevant; what matters is the broad theme. In short, Rahm Emanuel is on a short timeline. He needs to produce results immediately, since local elections happen more frequently and local government is much, much closer to the citizen. That said, he doesn't have to deliver uniform results; progress, in and of itself, may be sufficient. Indeed, a little experimentation is profoundly good, given it can tease out faster and cheaper ways to deliver said results.

In contrast, the teachers' union faces few of the pressures experienced by Rahm. It can afford to move at a slower pace and, more importantly, wants a uniform level of treatment across the entire system. Indeed, its entire structure is built around the guarantee of uniform treatment for its members. This uniformity is a value that evolved parallel to, but not out of, progressive thinking. It is an artifact of industrial production that gets confused with progressive thought because of their common temporal lineage.

This skirmish offers a window into the major battle that is going to dominate our politics in about a decade. I increasingly suspect we are moving into a world where the possibilities for education, thanks to the web and social networks, are going to be completely altered. What we deem possible, what parents demand, and the skills that are seen as essential are all going to shift. Our educational system – its schools, the school boards and, of course, the unions – is still bound in a world of mass production: shifting students from room to room to prepare them for the labour and production jobs of the 20th century. No matter how gifted the teachers (and there are many who are exceedingly gifted), they remain bound by the structure that the education system, the school boards, and the unions have built and enforce.

Of course, what is going to be in demand are students who can thrive in the world of mass collaboration and peer production of the 21st century – behaviours that are generally viewed as “cheating” in the current model. And parents who are successful in 21st century jobs are going to be the first to ensure their children get the “right” kind of education, which is going to put them at odds with the current education system.

All this is to say that the real question is: how quickly will educational systems be able to adapt? Here both the school boards and the unions play an enormous role, but it is the unions that, it would appear, may be the constraining factor. If they treat having Rahm engage schools directly as a threat, I suspect they are going to find the next 20 years a rough, rough ride – something akin to how the newspapers have felt about the arrival of the internet and craigslist.

What terrifies me most is that unless we can devise a system where teachers are measured – so good results can be both rewarded and shared – and where parents and students have more choices around education, families (that can afford to) are going to vote with their feet. In fact, you can already see it in my home town.

The myth in Vancouver is that high property values are driving families – and thus children – out of the city. But this is patently not true. The fantastic guys over at Bing Thom Architects wrote a report on student populations in Vancouver. According to their research, in the last 10 years the estimated number of elementary and secondary aged children in Vancouver has risen by 3% (around 2,513 new students). And yet the number of students enrolled in public education facilities has declined by 5.46% (around 3,092 students). In fact, the Vancouver School Board’s numbers seem to indicate the decline may be more pronounced.

In the meantime the number of private/independent schools has exploded by 43%, going from 39 to 68, with enrollment increasing 13.8%. (Yes, that does leave a surplus of students unaccounted for; I suspect they are also in private/independent schools, but outside the City of Vancouver’s boundaries.) As a public school graduate myself – one who had truly fantastic teachers but who also benefited from enormous choice (IB, French Immersion) – I find the numbers of the past decade fascinating to dig into.

Correct or incorrect, it would seem parents are opting for schools that offer a range of choices around education. Of course, it is only the parents who can afford to do this who are doing it. But that makes the outcome worse, not better. With or without the unions, education is going to get radically rethought. It would be nice if it were the public sector that led that revolution, or at least was in the vanguard of it. But if our public sector managers and teachers are caught arguing over how to adjust the status quo by increments, it is hard to see how our education policy is going to make a quantum leap into the 21st century.

The Economics of Open Data – Mini-Case, Transit Data & TransLink

TransLink, the company that runs public transit in the region where I live (Vancouver/Lower Mainland), has launched a real time bus tracking app that uses GPS data to figure out how far away the bus you are waiting for really is. This is great news for everyone.

Of course for those interested in government innovation and public policy it also leads to another question. Will this GPS data be open data?

Presently TransLink does make its transit schedule “open” under a non-commercial license (you can download it here). I can imagine a number of senior TransLink officials (and the board) scratching their heads and asking: “Why, when we are short of money, would we make our data freely available?”
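For developers wondering what that download actually is: it’s a GTFS feed (the same schedule format mentioned in the recommendations below), which is just a zip of plain CSV files, so even the standard library can read it. A minimal sketch – the file name and columns follow the GTFS spec, but the sample stop below is made up for illustration:

```python
import csv
import io
import zipfile

def load_stops(gtfs_zip):
    """Read stops.txt from a GTFS feed: {stop_id: (name, lat, lon)}."""
    with zipfile.ZipFile(gtfs_zip) as z:
        with z.open("stops.txt") as f:
            reader = csv.DictReader(io.TextIOWrapper(f, encoding="utf-8-sig"))
            return {row["stop_id"]: (row["stop_name"],
                                     float(row["stop_lat"]),
                                     float(row["stop_lon"]))
                    for row in reader}

# A tiny in-memory feed so the sketch runs without the real download;
# with the actual TransLink file you would pass its path instead.
sample = io.BytesIO()
with zipfile.ZipFile(sample, "w") as z:
    z.writestr("stops.txt",
               "stop_id,stop_name,stop_lat,stop_lon\n"
               "50001,WB W Hastings St,49.284,-123.111\n")

stops = load_stops(sample)
print(stops["50001"][0])  # WB W Hastings St
```

The same pattern works for routes.txt, trips.txt and stop_times.txt, which is why GTFS has become the de facto interchange format for transit apps.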

The answer is that TransLink should make its current data, as well as its upcoming GPS data, open and available under a license that allows for both non-commercial and commercial re-use, not just because it is the right thing to do, but because the economics of it make WAY MORE SENSE FOR TRANSLINK.

Let me explain.

First, there are not a lot of obvious ways TransLink could generate wealth directly from its data. But let’s take two possible opportunities: the first involves selling a transit app to the public (or advertising in such an app); the second involves selling a “next bus” service to companies or organizations (say, coffee shops) that believe showing this information might be a convenience to their employees or customers.

TransLink has already abandoned paid apps – instead it maintains a mobile website at m.translink.ca – but even if it created an app and charged $1 per download, the revenue would be pitiful. Assuming a very generous customer base of 100,000 users, TransLink would generate maybe $85,000 (once Apple takes its cut from the iPhone downloads, and assuming no cut on Android). But remember, this is not a yearly revenue stream; it is one time. Maybe 10-20,000 people upgrade their phone or arrive in Vancouver and decide to download each year, so your year-on-year revenue is maybe $15K. Over a five-year period, TransLink ends up with an extra $145,000 or so. Nothing to sneeze at, but hardly notable.

In contrast, a free application encourages use – so there is also a cost to not giving the data away. Having transit data more readily available might cause some people to choose transit over, say, walking, taking a taxi or driving. Last year TransLink handled 211.3 million trips. Let’s assume that wider access to the data meant a 0.1% increase in the number of trips – an infinitesimally small increase, but it means 211,300 more trips. Assuming each rider pays a one-zone $2.50 fare, that translates into additional revenue of $528,250 a year. Over the same five-year period cited above, that’s revenue of $2.641M – much better than $145,000. And this is just the money, to say nothing of less congested roads, less smog and a lower carbon footprint for the region…

When this analysis is applied to licensing data, it produces the same result. Will UBC pay to have TransLink’s real time data on terminals in the Student Union building? I doubt it. Would some strategically placed coffee shops? Possibly. Obviously organizations would have to pay for the signs themselves, but adding an annual “data license fee” to a display’s cost would cause some to opt out. And once you take into account managing the signs, legal fees, dealing with contracts and going through the sales process, it is almost inconceivable that TransLink would make more money from these agreements than from simply having more signs everywhere – created by other people – generating more customers for its actual core business: moving people from A to B for a fee. Just to run the numbers: if shops that weren’t willing to pay for the data put up “next bus” screens that generated a mere 1,000 new regular bus users doing only 40 one-way trips a year (or 40,000 new trips), this would equal revenue of $100,000 every year at no cost to TransLink. Someone else would install and maintain the signs; no contracts or licenses would need to be managed.
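The back-of-envelope comparison above is easy to make concrete. A quick sketch, using only the figures already given in this post (fare, ridership, and the download and sign assumptions):

```python
# Back-of-envelope revenue comparison, using only the figures in the post.
FARE = 2.50  # one-zone fare, dollars

# Option A: a $1 paid app. ~$85K up front (after Apple's cut on 100K
# downloads), then ~$15K/year from newcomers for four more years.
paid_app_5yr = 85_000 + 4 * 15_000

# Option B: open data nudges ridership up 0.1% on 211.3M annual trips.
extra_trips = 211_300_000 // 1000          # 211,300 extra trips a year
open_data_5yr = extra_trips * FARE * 5     # five years of extra fares

# Option C: third-party "next bus" screens bring 1,000 new riders making
# 40 one-way trips a year, at no cost to TransLink.
signs_annual = 1_000 * 40 * FARE

print(paid_app_5yr)    # 145000
print(open_data_5yr)   # 2641250.0
print(signs_annual)    # 100000.0
```

Even under these deliberately modest assumptions, the fare revenue from open data dwarfs anything the licensing routes could plausibly produce.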

From a cost recovery perspective it is almost impossible to imagine a scenario where TransLink is better off not allowing commercial re-use of its data.

My point is that TransLink should not be focused on creating a few bucks from licensing its data (which it doesn’t do right now anyway). It should be focused on shifting the competitive value in the marketplace from access to accessibility.

Being the monopoly holder of transit data does not benefit TransLink. All it means is that fewer people see and engage with its data. When it makes the data open and available, “access” no longer remains the defining advantage. When anybody (e.g. TransLink, Google, independent developers) can access the data, the marketplace shifts from competing on access to competing on accessibility. Consumers don’t turn to whoever has the data; they turn to whoever makes the data easiest to use.

For example, TransLink has noted that 2011 will see a record number of trips. Part of me wonders to what degree the increase in trips over the past few years is a result of making transit data accessible in Google Maps. (Has anyone done a study on this in any jurisdiction?) The simple fact is that Google Maps is radically easier to use for planning transit journeys than TransLink’s own website AND THAT IS A GOOD THING FOR TRANSLINK. Now imagine if lots of organizations were sharing TransLink’s data – from the local Starbucks and Blenz Coffee to colleges, universities and busy buildings downtown. Indeed, the real crime right now is that TransLink has handed Google a de facto monopoly: Google is allowed to use the data for commercial re-use. Local tax-paying developers? Not so, according to the license they have to click through.

TransLink, you want a world where everyone is competing (including against you) on accessibility. In the end, you win with greater use and revenue.

But let me go further. There are other benefits to having TransLink share its data for commercial re-use.

Procurement

Some riders will note that there are already bus stops in Vancouver which display “next bus” data (e.g. how many minutes away the next bus is). If TransLink made its next bus data freely available via an API it could conceivably alter the procurement process for buying and maintaining these signs. Any vendor could see how the data is structured and so take over the management of the signs, and/or experiment with creating more innovative or cheaper ways of manufacturing them.
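To illustrate how small the integration burden on a sign vendor would be, here is a sketch of client code for such a service. The payload shape and field names are purely hypothetical – they are not TransLink’s actual API – but once the data structure is public, any vendor could write something like this:

```python
import json

# A purely hypothetical "next bus" payload; a real TransLink API would
# define its own endpoint and field names.
sample_payload = json.dumps({
    "stop_id": "50001",
    "predictions": [
        {"route": "004", "destination": "UBC", "minutes": 3},
        {"route": "007", "destination": "Dunbar", "minutes": 9},
    ],
})

def render_sign(payload_json, width=24):
    """Format predictions as fixed-width lines for a simple LED sign."""
    data = json.loads(payload_json)
    lines = []
    for p in data["predictions"]:
        right = f"{p['minutes']} min"
        left = f"{p['route']} {p['destination']}"[: width - len(right) - 1]
        lines.append(left.ljust(width - len(right)) + right)
    return lines

for line in render_sign(sample_payload):
    print(line)
```

The point is that the hard part of the sign business – hardware, installation, maintenance – has nothing to do with the data; the data side is a few dozen lines of code any vendor could compete on.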

The same is true of creating the RFP for TransLink’s website. With the data publicly available, TransLink could simply ask developers to mock up what they think is the most effective way of displaying the data. More development houses might be enticed to respond to the RFP, increasing the likelihood of innovations and putting downward pressure on fees.

Analysis

Of course, making GPS data free could have an additional benefit. Local news companies might be able to use the buses’ GPS data to calculate traffic flow rates and so predict traffic jams. Might they be willing to pay TransLink for the data? Maybe, but again probably not enough to justify the legal and sales overhead. Moreover, TransLink would benefit from this analysis – it could use the reports to adjust its schedules and notify its drivers of problems beforehand. Everyone else would benefit too, as better-informed commuters might change their behaviour (including taking transit!), reducing congestion, smog, carbon footprint, etc.

Indeed, the analysis opportunities using GPS data are potentially endless – and much of the analysis might be done by bloggers and university students. One could imagine correlating actual bus/subway times with any number of other data sets (crime, commute times, weather) to yield information that could help TransLink with its planning. There is no world in which TransLink has the resources to do all this analysis itself, so enabling others to do it can only benefit it.

Conclusion

So if you are at TransLink/Coast Mountain Bus Company (or any transit authority in the world), this post is for you. Here’s what I suggest as next steps:

1) Add a GPS bus tracking API to your open data portal.

2) Change your license. Drop the non-commercial clause. It hurts your business more than you realize and is anti-competitive (why can Google use the data for a commercial application while residents of the Lower Mainland cannot?). My suggestion: adopt the BC Government Open Government License or the PDDL.

3) Add an RSS feed to your GTFS data. Like Google, we’d all like to know when you update your data. Given we live here and are users, it would be nice to extend the same service to us as you do to them.

4) Maybe hold a Transit Data Camp where you could invite local developers and entrepreneurs to meet your staff and encourage people to find ways to get transit data into the hands of more Lower Mainlanders and drive up ridership!


Smarter Ways to Have School Boards Update Parents

Earlier this month the Vancouver School Board (VSB) released an iPhone app that – helpfully – will use push notifications to inform parents about school holidays, parent interviews, and scheduling disruptions such as snow days. The app is okay, if a little clunky to use, and a lot of the data – such as professional days – while helpful in an app, would be even more helpful as an iCal feed parents could subscribe to in their calendars.

That said, the VSB deserves credit for having the vision to develop an app. Encouragingly, the VSB app team hopes to add new features, such as letting parents know about after-school activities like concerts, plays and sporting events.

This is a great innovation and, without a doubt, other school boards will want apps of their own. The problem is that this is very likely to lead to an enormous amount of waste and duplication. The last thing citizens want is every school board spending $15-50K developing its own iPhone app.

Which leads to a broader opportunity for the Minister of Education.

Were I the Education Minister, I’d have my technology team recreate the specs of the VSB app and issue an RFP for it, but under an open source license and using PhoneGap so it would work on both iPhone and Android. In addition, I’d ensure it could offer reminders – like we do at recollect.net – so that people could get email or text messages without a smartphone at all.

I would then propose the ministry cover 60 percent of the development and yearly upkeep costs. The other 40% would be covered by the school boards interested in joining the project. Thus, assuming the app had a development cost of $40K and a yearly upkeep of $5K, if only one school board signed up it would pay $16K for the app (a pretty good deal) and $2K a year in upkeep. But if five school districts signed up, each would pay only $3.2K in development costs and $400 a year in upkeep. Better still, the more that sign up, the cheaper it gets for each of them. I’d also propose a governance model in which those who contribute money for development would have the right to elect a sub-group to oversee the feature roadmap.
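The cost-sharing arithmetic above generalizes to any number of boards. A small sketch, where the 60/40 split and the $40K/$5K figures are the assumptions from this proposal:

```python
def per_board_cost(dev_cost, upkeep, n_boards, ministry_pct=60):
    """Split the boards' share of build and upkeep costs evenly."""
    board_pct = 100 - ministry_pct
    return (dev_cost * board_pct // 100 / n_boards,
            upkeep * board_pct // 100 / n_boards)

# Figures from the post: $40K to build, $5K/year to maintain, 60/40 split.
print(per_board_cost(40_000, 5_000, n_boards=1))  # (16000.0, 2000.0)
print(per_board_cost(40_000, 5_000, n_boards=5))  # (3200.0, 400.0)
```

The incentive structure is the interesting design choice: because every new board divides the fixed 40% share further, early joiners benefit from recruiting later ones rather than resenting them.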

Since the code would be open source other provinces, school districts and private schools could also use the app (although not participate in the development roadmap), and any improvements they made to the code base would be shared back to the benefit of BC school districts.

Of course, by signing up to the app project school boards would be committing to ensure their schools share up-to-date notifications about the relevant information – probably a best practice they should be following anyway. This process change is where the real work lies. However, a simple web form (included in the price) would cover much of the technical side of that problem. Better still, the Ministry of Education could offer its infrastructure for hosting and managing any data the school boards wish to collect and share, further reducing costs and, equally important, ensuring the data was standardized across the participating school boards.

So why should the Ministry of Education care?

First, creating new ways to update parents about important events – like when report cards are issued, so that parents know to ask for them – helps improve education outcomes. That alone should probably be reason enough, but there are other reasons as well.

Second, it would allow the ministry, and the school boards, to collect some new data: professional day dates, average number of snow days, frequency of emergency disruptions, number of parents in a district interested in these types of notifications. Over time, this data could reveal important information about educational outcomes and be helpful.

But the real benefit would be in both cost savings and in enabling less well-resourced school districts to benefit from technological innovation that wealthier districts will likely pursue if left to their own devices. Given there are 59 English school districts in BC, if even half of them spent $30K developing their own iPhone apps, almost $1M would collectively be spent on software development. By spending $24K, the ministry ensures that this $1M instead gets spent on teachers, resources and schools. Equally important, less tech-savvy or well-equipped school districts would be able to participate and benefit.

Of course, if the Vancouver school district were smart, it would open source its app, approach the Ministry of Education and offer it as the basis of such a venture. Doing that wouldn’t just make them head of the class, it’d be helping everyone get smarter, faster.

DataBC Hackathon this Saturday – inviting the public.

This Saturday, August 27, 2011, the Province of British Columbia is partnering with the Mozilla Foundation and OpenDataBC to host an open data hackathon.

The hackathon will be taking place at Mozilla Labs Vancouver. Their address is:
163 West Hastings Street, suite-200
Vancouver, BC V6B 1H5
(in the very beautiful Flack Building)

So three things:

First, as many of you are probably aware, the province recently launched a data portal, so there is a lot of new data to play with. In addition, the City of Vancouver continues to update its open data portal, so there is new data there as well. It will be interesting to see what people want to work on.

Second, please do not fret if you are not a developer. As we pointed out last year in the lead up to the open data day international hackathon, there are lots of ways non-developers can contribute – the easiest being… having ideas! So please come on by. I’m definitely going to be there (and I’m no coder) and look forward to seeing familiar and new faces.

Finally, and more to the point, if you are a company, non-profit, or citizen who has a mashup, analysis, research paper, product, app or pretty much anything else you’d like to create but need data from the province, definitely swing by. I’m sure the staff on hand will be very keen to hear about what you want to do and see if they can make the data available in the near future.

The organizers are hoping that people will RSVP through their contact form (use the ‘other’ subject line), but if you decide at the last minute to come join, don’t be shy.

Hope to see you there!


Edmonton Heads for the Cloud

I’m confident that somewhere in Canada some resource-strapped, innovative small town has abandoned desktop software and uses a cloud-based service, but so far no city of any real size has even publicly said it was considering the possibility.

That is, until today.

Looks like Edmonton’s IT group – which is not just one of the most forward-looking in the country but one that continues to make the rubber hit the road – is moving its email and office suite to the cloud. (I’ve posted the entire doc below since it isn’t easy to link to.)

They aren’t the first city in the world to do this: Washington D.C., Orlando and Los Angeles have all moved to Google Apps (in each case displacing Microsoft Office), but they are the first in Canada – a country not known for its risk-taking IT departments.

I can imagine that a lot of government IT people will be watching closely. And that’s too bad. There is far too much watching in Canada when there could be a lot of innovating and saving. While some will cite LA’s bumpy transition, Orlando’s and DC’s were relatively smooth, and both are still cities far larger than most of their Canadian counterparts. LA is more akin to transitioning a province (or Toronto). Nobody else gets that pass.

Two things:

1) I’ve highlighted what I think are some of the interesting points in the document being presented to council.

2) A lot of IT staff in other cities will claim that it is “too early” to know if this is going to work.

People. Wake up. It is really hard to imagine you won’t be moving to the cloud at some point in the VERY near future. I frankly don’t care which cloud solution you choose (Google vs. Microsoft); that choice is less important than actually making the move. Is Edmonton taking some risks? Yes. But it is also going to be the first city to learn the lessons, change its job descriptions, workflows, processes and the zillion other things that will come out of this. This means it’ll have a cost and productivity advantage over other cities as they play catch-up. And I suspect that there will never be a catch-up, as Edmonton will already be doing the next obvious thing.

If you’re an IT person in a city, the question is no longer whether you lead or follow. It is merely: how far behind are you going to be comfortable being?

6.13

Workspace Edmonton

Sole Source Agreement

Recommendation:

That, subject to the necessary funding being made available, Administration enter into a sole source agreement, in an amount not exceeding $5 million, and a period not exceeding five years, with Google Inc., for the provision of computing productivity tools, and that the contract be in form and content acceptable to the City Manager.

Report Summary

The IT Branch undertook a technical assessment of seven options for the delivery of desktop productivity tools. Software as a Service (‘cloud computing’) was identified as the preferred direction as it allows the corporation to work from any time, place or device. Google Mail and Google Apps were determined to provide the best solution. The change will ensure ongoing sustainability of the services, provide opportunities for service and productivity gains, and align IT services with key principles in The Way We Green, The Way We Live and The Way We Move.

Report

The City Administration Bylaw 12005 requires approval from Executive Committee for Single Source Contracts (contracts to be awarded without tendering) in excess of $500,000, and those contracts that may exceed ten years in duration.

The Workspace Edmonton Program consists of two initiatives, which will allow the delivery of information technology software and services to employees, contractors and third party partners at any time and place, and on any device. In order to accomplish this, the administration is proposing moving away from a model where software is installed on every computer to a solution where the software is housed on the internet (‘cloud computing’).

Administration is recommending the implementation of Google Apps Premier Edition as the primary computing productivity tool, with targeted use of Microsoft Office and SharePoint. The recommended direction will allow the City to move to Google Mail as the corporate messaging tool and Google Apps as the primary office productivity tools. It will also allow the corporation access to other applications offered by Google Inc. and partners to Google Inc. Microsoft Office and SharePoint will remain as the secondary office productivity tools for business areas that require these applications for specific business needs. Use of the Microsoft tools will require completion of the appropriate use case and approval by the Chief Information Officer.

Administration is requesting approval to proceed to negotiation of a contract with Google Inc. The sole source agreement is required at this time to allow the program to be developed in 2011. This is foundational work that will allow the program to proceed to implementation in 2012. The contract is also required in order to complete the Privacy Impact Assessment and develop implementation plans.

Benefits

Workspace Edmonton creates the opportunity for the City of Edmonton to significantly change the way we work. Administration will have increased options for delivering services to citizens, including enhanced mobile field services and new opportunities for community consultation and collaboration. The consumer version of Google is free to private citizens and not-for-profit groups and would allow additional options for collaboration with organizations such as community leagues with no net cost to the corporation or organization.

The move to G-Mail will allow the corporation to extend email access to all city employees, improving access to information and communications. It will also allow for implementation of a number of services without additional licensing costs, including:

  • audio and video chat
  • group sites to allow improved collaboration with external
    partners and community groups
  • internal Youtube for training and information sharing
  • increased collaboration through document sharing and simultaneous authoring capabilities

The program presents the opportunity for the City to better address the expectations of the next generation of workers by providing options to bring your device and to work with software many already use. Both Edmonton Public Schools and the University of Alberta have implemented Google Apps.

In addition, the implementation of Google Apps will include an e-records solution for documents stored in Google Apps. This will be implemented in partnership with the Office of the City Clerk, the benefit being alignment with legislated and corporate requirements for records retention, retrieval, and disposal.

Moving to the Software as a Service Model (‘cloud computing’) through the internet will avoid additional hardware and support costs associated with increased service demands due to growth. This solution provides a more sustainable business model, reducing demands on resources for regular product upgrades and services support. Finally, the relocation of software and data to multiple secure data centres facilitates continuation of services during emergencies such as natural disasters and pandemics. City employees will be able to access email and documents through the internet from any office or home computer.

Solution Assessments

The IT Branch undertook a technical assessment of seven office productivity software and service delivery options. A financial assessment of the top three options was subsequently completed and the recommended direction to move to Google Inc. as the service provider was based on these assessments. Following this, the IT Branch undertook a security assessment to ensure the option chosen met security requirements and industry standards. A Privacy Impact Assessment has been initiated and will be completed upon negotiation of an agreement. Precedent in Alberta has been set with both the Edmonton Public School Board and the University of Alberta entering into agreements with Google Inc.

Strategic Direction

The Workspace Edmonton Program supports Council’s strategic direction for innovation and a well managed city, as well as key principles in The Way We Green, The Way We Move, and the Way We Live.

Budget/Financial Implications

Google Messaging and Apps will replace the existing Microsoft Exchange and majority of Office licenses. The funding currently in place for Microsoft license maintenance will be sufficient to fund the annual Google services.

2011 funding for the implementation of the overall Workspace Edmonton Program is within the current IT budgets, which will be the source of funding. Funding for 2012 will be included in the 2012 budget request. A business case for this initiative was completed and is available for review.

The Workspace Edmonton model aligns with and complements the corporate initiative of Transforming Edmonton. The administration will look for opportunities to integrate the programs and utilize a portion of the funding for Transforming Edmonton to fund Workspace Edmonton change and transition requirements.

Risks

If the recommendation is not supported, Workspace Edmonton will stop and the corporation will be required to either go to Request For Proposal or remain on the existing platform. Remaining on the existing platform will require additional funding in future years to support continued maintenance costs and future growth. (Extending email only to city staff who do not currently have email accounts would cost the corporation approximately $900,000 per year with the existing solution.) Delaying the implementation to 2012 would result in delays to return on investment and achievement of the benefits.

Justification of Recommendation

Technical, financial and security assessments have been completed. The recommended solution meets business requirements, provides opportunities to increase and improve service delivery and is projected to garner a return on investment within 18 to 24 months of implementation. Approval of this recommendation will allow Administration to proceed to negotiation of a contract.

Others Reviewing this Report
• L. Rosen, Chief Financial Officer and Treasurer

WRITTEN BY – D. Kronewitt-Martin | August 24, 2011 – Corporate Services 2011COT006