Category Archives: vancouver

The Curious Case of Media Opposing Government Transparency

My gosh there is a lot going on. Republicans – REPUBLICANS(!) – who were in charge of America’s prison system are warning Canada not to follow the Conservatives’ plan on prisons, the Prime Minister has renamed the government after himself, and my friends at Samara hosted the Guardian’s Emily Bell in Toronto to talk WikiLeaks and data journalism (wish I could have been there).

It’s all very interesting… and there is a media story brewing here in British Columbia in which a number of journalists have become upset about a government that has become “too” transparent.

It’s an important case as it highlights some of the tensions that will be emerging in different places as governments rethink how they share information.

The case involves BC Ferries, a crown corporation that runs ferries along critical routes around the province. For many years the company was not subject to the province’s Freedom of Information legislation. However, a few months ago the government stated the crown corporation would need to comply with the act. This has not pleased the corporation’s president.

To comply with the act, BC Ferries has created an FOI tracker website on which it posts the text of FOI requests received. Once the records are processed they are posted online and sent to some relevant listservs. As a result they can be read by an audience (that cares).

Broadly, journalists are up in arms for two reasons. One bad, the other even worse.

The truly terrible reason was raised by Chad Skelton (who’s a great reporter for whom I have a lot of respect and whose column should be read regularly).

Skelton argues that BC Ferries deserves part of the blame for stories with errors, as the process led news agencies to rush (carelessly) in order to beat each other to releasing the story. This is a disappointing position. It’s the news media’s job to get the facts right. (It’s also worth noting here that Skelton’s own media organization did not make the mistakes in question.) Claiming that BC Ferries is even partly responsible seems beyond problematic, since the corporation is in no way involved in the fact-checking and error-checking process. We trust the media (and assess it) to get facts right in fast-moving situations… why should this be different?

More interesting is the critique that this model of transparency undermines the ability of journalists to get a scoop and thus undermines the business model of traditional media.

What makes this so interesting is that it is neither true nor, more importantly, relevant.

First, it’s not the job of government to support the business model of the media. The goal of government should be to be as transparent as possible about its operations. This can, and should, include its FOI requests. Indeed, one thing I like about this process is that an FOI request that is made but isn’t addressed starts to linger on the site – meaning the organization can be held to account, publicly, for the delay. More importantly, however, I’m confident that the media will find new ways to exploit the process and that, while painful, new business models will emerge.

Second, the media is not the only user of FOI. It strikes me as problematic to expect that the FOI system should somehow be tailored to meet the media’s needs alone. Individuals, non-profits, businesses, opposition politicians and others all use the FOI process. Indeed, the policy strengthens many of these use cases since, as mentioned above, delays in processing will be visible and open the organization up to greater pressure and scrutiny. Why are the use cases of all these other actors somehow secondary to those of journalists and the media? Indeed, the most important use case – that of the citizen – is better served. Isn’t that the most important outcome?

Third, this form of transparency could make for better media. One of my favourite quotes (which I got via Tim O’Reilly) comes from Clayton Christensen in a 2005 Harvard Business Review article:

“When attractive profits disappear at one stage in the value chain because a product becomes modular and commoditized, the opportunity to earn attractive profits with proprietary products will usually emerge at an adjacent stage.”

So BC Ferries has effectively commoditized FOI requests. That simply means that value will shift elsewhere. One place it could shift to is analysis. And wouldn’t that be a good thing to have the media compete on? Rather than rewarding whoever got the facts fastest (a somewhat silly model in the age of the internet), readers would instead reward the organization with the best insights. Indeed, it makes me think that on superficial issues – say, the salary of an employee – it may be hard for one individual or organization to scoop another. But most often the value of these stories is also pretty low. On a more significant story, one that requires research, digging and knowledge of the issue, it’s unclear that transparency around FOI requests will allow others to compete. More interestingly, some media organizations, now that they have access to all FOI requests, might start analyzing them for deeper, more significant patterns or trends that could reveal problems the current scattered approach to FOI might never surface.
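To make the “analyze the requests themselves” idea concrete, here is a rough sketch of how a newsroom might tally recurring topics across publicly posted FOI request texts to spot trends. The sample requests and stopword list are invented placeholders, not real BC Ferries data:

```python
# Tally recurring topic words across a corpus of FOI request texts.
# The request texts below are made up for illustration only.
from collections import Counter
import re

requests = [
    "All records relating to executive salary and bonuses",
    "Correspondence about vessel fuel costs, 2009-2010",
    "Records of executive travel expenses",
]

# Words too generic to signal a topic (an arbitrary, illustrative list).
STOPWORDS = {"all", "records", "relating", "to", "and", "for", "about", "of"}

words = Counter(
    w
    for text in requests
    for w in re.findall(r"[a-z]+", text.lower())
    if w not in STOPWORDS
)

print(words.most_common(3))  # "executive" shows up twice: a pattern worth a look
```

With every request public, this kind of aggregate view is available to anyone, not just the original requester.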

What’s also been interesting is the reception that stories by journalists complaining about this issue have received. It fits nicely with the piece I wrote a while ago (and now published as part of a journalism textbook) about Journalism in an Open Era. The fact is, public trust in opaque institutions is in decline – and the media is itself a pretty opaque institution. Consider these three separate comments people wrote after the stories I’ve linked to above:

“I wonder over the years how many nuggets of information reporters got through FOI but the public never heard about because they didn’t deem it “newsworthy”. Or worse, that it was newsworthy but didn’t follow their storyline.” (found here)

“And the media whining about losing scoops — well, tough beans. If they post it all online and give it to everyone, they are serving the public –the media isn’t the public, and never has been.” (found here)

“The media’s track record, in general, for owning up to its blunders continues to be abysmal. Front page screw-ups are fixed several days (or weeks) later with a little “setting it straight” box buried at the bottom of P. 2 — and you think that’s good enough. If the media were more open and honest about fixing its mistakes, I might cut you a little slack over the BC Ferries’ policy of making your life difficult. But whining about it is going to be counterproductive, as you can see from most of the comments so far.” (found here)

While some comments were supportive of the articles, the majority have not been, suggesting that, at a minimum, the public does not share the media’s view that this new policy is “controversial.”

This is not, of course, to say that BC Ferries implemented its policy because it sought to do the right thing. I’m sure its president would love for there to be fewer requests and to impede the efforts of journalists. I just happen to think he will fail. Dismally. More concerning is the fact that FOI requests are not archived on the site and are removed after a few months. This is what should get the media, the public and, yes, the Information and Privacy Commissioner up in arms.

Vancouver International Open Data Hackathon Event Agenda/Invite

Tomorrow is December 4th. The International Open Data Hackathon will be taking place around the world. Here in Vancouver, we’ll be contributing as well.
Here are some details:
Goals/Important points:
  • This is about having fun and working on something that makes you feel good. If, at any point you aren’t feeling that… then start doing something that does or come talk to me. :)
  • Our main goal is to create artifacts that will help strengthen our democracy, be fun to use, or just make life a little better – we want more open data, let’s show the world why it matters.
  • Our other goal is to build community, both here in Vancouver, and around the world, so let’s help one another, both in city, and those elsewhere…
  • Remember, this is not just for programmers. Any project will need a variety of skills.
  • If you don’t think you can do anything helpful, trust me, that is not the case.
Location:
  • W2/Storyeum 151 W. Cordova (map) in Vancouver
  • Phone: 604-689-9896
Schedule:
  • 9:30-10am People can start arriving any time after 9:30am
  • 10:00-10:30am We’ll be starting at 10am. We’ll begin with brief introductions and give those with ideas an opportunity to share them
  • 10:30 Sort into teams. If you haven’t already chosen a project to work on for the day… now is the time.
  • 10:30-12:00pm Hacking. Lots here to do for designers, developers, citizens to get materials organized, write copy or code, etc…
  • 12:00pm Check in. 5 minutes for teams to share progress, challenges, ask for suggestions
  • 12:30-3:30pm More hacking goodness
  • 3:30 project update/presentations

Vancouver Hack Space will be opening its doors after the hackathon for people who want to keep hacking over there. Great people at VHS so it should be good…

What to bring & expect:
  • A laptop.  If you absolutely can’t bring a laptop, please come anyway, there will be things to do.
  • We have a number of ideas that can be worked on. If you have one… great! If you are looking for cool people to work with… you’re coming to the right place.
  • Ideas that no one wants to work on will be removed.
  • We will ask people to vote with their feet, gathering into self forming teams for each project.

Open Data planning session at BarCamp Vancouver

With the International Open Data Hackathon a little more than 2 weeks away, a lot has happened.

On the organizing wiki, people in over 50 cities in 21 countries on 4 continents have offered to organize local events. Open data sets that people can use have been posted to a specially created page, a few nascent app ideas have been shared, as has advice on how to run a hackathon. (On Twitter, the event hashtag is #odhd.)

In Vancouver, the local BarCamp will be taking place this weekend. I’m not in town, however, Aaron Gladders, local hacker with a ton of experience working with and opening up data sets, contacted me to let me know he’d like to do a planning session for the hackathon at Barcamp. If you’re in Vancouver I hope you can attend.

Why? Because this is a great opportunity. And it has lessons for the hackathons around the world.

I love it because it means people can share ideas and projects they would like to hack on, recruit others, hear feedback about challenges, obstacles and alternative approaches, and think about all of this for two weeks before the hackathon. A planning session also has an even bigger benefit: it means more people are likely to arrive on the day with something specific ready to work on. I want the hackathons to be social. But they can’t be exclusively so. It is important that we actually try to create some real products that are useful to us and/or our fellow citizens.

For those elsewhere in the world who are also thinking about December 4th I hope that some of us will start reaching out to one another and thinking about how we will spend the day. A few thoughts on this:

1. Take a look at the data sets that are out there before Dec 4th. People have been putting together a pretty good list here.

2. Localization. I think some of the best wins will be around localizing successful apps from other places. For example, I’ve been encouraging the team in Bangalore to consider localizing Michael Mulley’s OpenParliament.ca application (the source code for which is here). If you have an application you think others might want to localize, add it to the application page on the wiki. If there is an app out there you’d like to localize, write its author/developer team. Ask them if they might be willing to share the code.

3. Get together with 2-3 friends and come up with a plan. What do you want to accomplish on the 4th?

4. If you are looking for a project, let people know on the wiki; leave a twitter handle or some other way for people with ideas to contact you before the 4th.

Okay, that’s it for now. I’m really excited about how much progress we’ve made in a few short weeks. Ideally, at the end of the 4th I’d love for some cities to be able to showcase to the world some apps that they’ve created. We have an opportunity to show the media, politicians, public servants, our fellow citizens, but most importantly, each other, just what is possible with open data.

Links from Gov2.0 Summit talk and bonus material

My 5 minute, lightning fast, jam packed talk (do I do other formats? answer… yes) from yesterday’s Gov2.0 Summit has just been posted to YouTube. I love that this year the videos have the slides integrated into them.

For those who were, and were not, there yesterday, I wanted to share links to all the great sites and organizations I cited during my talk. I also wanted to share one or two quick stories I didn’t have time to dive into:

VanTrash and 311:

As one of the more mature apps in Vancouver using open data, Vantrash keeps showing us how these types of innovations give back in new and interesting ways.

In addition to being used by over 3,000 households (despite never being advertised – this is all word of mouth), it turns out that city staff are also finding a use for Vantrash.

I was recently told that 311 call staff use Vantrash to help troubleshoot incoming calls from residents who are having problems with garbage collection. The first thing one needs to do in such a situation is identify which collection zone the caller lives in – and it turns out VanTrash is the fastest and most effective way to accomplish this. Simply input the caller’s address into the top right-hand field and presto – you know their zone and schedule. Much better than trying to find their address on a physical map that you may or may not have near your station.
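Under the hood, a lookup like this boils down to a point-in-polygon test: geocode the address, then check which zone boundary (published in the city’s open data catalog) contains the point. A minimal sketch, using the classic ray-casting algorithm and made-up zone coordinates rather than the real collection-zone shapefiles:

```python
# Find which collection zone contains a point, via ray casting.
# ZONES uses invented placeholder polygons, not real city boundaries.

def point_in_polygon(x, y, polygon):
    """Count how many polygon edges a rightward ray from (x, y) crosses."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):  # edge straddles the horizontal line at y
            # x-coordinate where the edge crosses that line
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

ZONES = {
    "North Blue":  [(0, 0), (4, 0), (4, 4), (0, 4)],
    "South Green": [(0, -4), (4, -4), (4, 0), (0, 0)],
}

def zone_for(x, y):
    for name, poly in ZONES.items():
        if point_in_polygon(x, y, poly):
            return name
    return None

print(zone_for(2, 2))  # → North Blue
```

Swap in the city’s real zone geometry and a geocoder and you have, in essence, the lookup the 311 staff are doing through VanTrash’s address field.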

TaxiCity, Open Data and Game Development

Another interesting spin-off of open data. The TaxiCity development team, which recreated downtown Vancouver in 2-D using data from the open data catalog, noted that creating virtual cities in games could be a lot easier with open data. You could simply randomize the height of buildings and presto – an instant virtual city. While the buildings would still need to be skinned, one could quickly recreate cities people know, or create fake cities that feel realistic since they’d be based on real plans. More importantly, this process could help reduce the time and resources needed to create virtual cities in games – an innovation that may be of interest to those in the video game industry. Given that Vancouver is a hub for video game development, these are exactly the types of innovations the city wishes to foster, and they will help sustain Vancouver’s competitive advantage.

Links (in order of appearance in my talk)

The Code for America shirt designs can be seen in all their glory here and can be ordered here. As a fun aside, I literally took that shirt off Tim O’Reilly’s back! I saw it the day before and said I’d wear it on stage. Tim overheard me and said he’d give me his if I was serious…

Vancouver’s Open Motion (or Open3, as it is internally referred to by staff) can be read in the city’s PDF version or an HTML version from my blog.

Vancouver’s Open Data Portal is here. Keep an eye on this page as new data sets and features are added. You can get RSS feed or email updates on the page, as well as see its update history.

Vantrash the garbage reminder service’s website is here. There’s a distinct mobile interface if you are using your phone to browse.

ParkingMobility is an app that crowdsources the location of disabled parking spaces and enables users to take pictures of cars illegally parked in disabled spots to assist in enforcement.

TaxiCity, the Centre for Digital Media project sponsored by Bing and Microsoft, has its project page here. Links to the source code, documentation, and a ton of other content are also available. Really proud of these guys.

Microsoft’s internal Vancouver Open Data Challenge fostered a number of apps. Most have been open-sourced, so you can get access to the code as well. The apps include:

The Graffiti Analysis, written by University of British Columbia undergraduate students, can be downloaded from the blog post I wrote about their project.

BTA Works – the research arm of Bing Thom Architects has a great website here. You can’t download their report about the future of Vancouver yet (it is still being peer-reviewed) but you can read about it in this local newspaper article.

Long Tail of Public Policy – I talk about this idea in some detail in my chapter on O’Reilly Media’s Open Government. There is also a brief blog post and slide from my blog here.

Vancouver’s Open Data License is here. Edmonton, Ottawa and Toronto use essentially the exact same thing. Lots that could still be done on this front, mind you… Indeed, getting all these cities on a single standard license should be a priority.

Vancouver Data Discussion Group is here. You need to sign in to join but it is open to anyone.

Okay, hope those are interesting and helpful.

Creating Open Data Apps: Lessons from Vantrash Creator Luke Closs

Last week, as part of the Apps for Climate Action competition (which is open to anyone in Canada), I interviewed the always awesome Luke Closs. Luke, along with Kevin Jones, created VanTrash, a garbage pick up reminder app that uses open data from the City of Vancouver. In it, Luke shares some of the lessons learned while creating an application using open data.

As the deadline for the Apps for Climate Action competition approaches (August 8th), we thought this might help those who are thinking about throwing their hat in the ring at the last minute.

Some key lessons from Luke:

  • Don’t boil the ocean: Keep it simple – do one thing really, really well.
  • Get a beta up fast: Try to scope something you can get a rough version of working in a day or evening – that is a sure sign that it is doable.
  • Beta test: On friends and family. A lot.
  • Keep it fun: do something that develops a skill or lets you explore a technology you’re interested in.

Apps for Climate Action Update – Lessons and some new sexy data

Okay, so I’ll be the first to say that the Apps4Climate Action data catalog has not always been the easiest to navigate, and some of the data sets have not been machine readable, or even data at all.

That however, is starting to change.

Indeed, the good news is three fold.

First, the data catalog has been tweaked: it has better search and an improved capacity to sort out non-machine-readable data sets. A great example of a government starting to think like the web, iterating and learning as the program progresses.

Second, and more importantly, new and better sets are starting to be added to the catalog. Most recently, the Community Energy and Emissions Inventories were released in Excel format. This data shows carbon emissions for all sorts of activities and infrastructure at a very granular level. Want to compare the GHG emissions of a duplex in Vancouver versus a duplex in Prince George? Now you can.

Moreover, this is the first time any government has released this type of data at all, not to mention making it machine readable. So not only have the app possibilities (how green is your neighborhood, rate my city, calculate my GHG emissions) all become much more realizable, but any app using this data will be among the first in the world.

Finally, probably one of the most positive outcomes of the app competition to date is largely hidden from the public. The fact that members of the public have been asking for better data or even for data sets at all(!) has made a number of public servants realize the value of making this information public.

Prior to the competition, making data public was a compliance problem: something you did while figuring no one would ever look at or read it. Now, for a growing number of public servants, it is an innovation opportunity. Someone may take what the government produces and do something interesting with it. Even if they don’t, someone is nonetheless taking an interest in your work – something that has rewards in and of itself. This, of course, doesn’t mean that things will improve overnight, but it does help advance the goal of getting government to share more machine-readable data.

Better still, the government is reaching out to stakeholders in the development community and soliciting advice on how to improve the site and the program, all in a cost-effective manner.

So even within the Apps4Climate Action project we see some of the changes the promise of Government 2.0 holds for us:

  • Feedback from community participants driving the project to adapt
  • Iterations of development conducted “on the fly” during a project or program
  • Successes and failures quickly resulting in improvements (release of more data, a better website)
  • Shifting culture around disclosure and cross sector innovation
  • All on a timeline that can be measured in weeks

Once this project is over I’ll write more on it, but wanted to update people, especially given some of the new data sets that have become available.

And if you are a developer or someone who would like to do a cool visualization with the data, check out the Apps4Climate Action website or drop me an email, happy to talk you through your idea.

Saving Millions: Why Cities should Fork the Kuali Foundation

For those interested in my writing on open source, municipal issues and technology, I want to be blunt: I consider this to be one of the most important posts I’ll write this year.

A few months ago I wrote an article and blog post about “Muniforge,” an idea based on a speech I’d given at a conference in 2009, in which I advocated that cities with common needs should band together and co-develop software to reduce procurement costs and better meet requirements. I continued to believe in the idea, but recognized that cultural barriers would likely make it difficult to realize.

Last month that all changed. While at Northern Voice I ended up talking to Jens Haeusser, an IT strategist at the University of British Columbia, and confirmed something I’d long suspected: some people much smarter than me had already had the same idea and had made it a reality… not among cities but among academic institutions.

The result? The Kuali Foundation: “…a growing community of universities, colleges, businesses, and other organizations that have partnered to build and sustain open-source administrative software for higher education, by higher education.”

In other words, for the past 5 years, over 35 universities in the United States, Canada, Australia and South Africa have been successfully co-developing software.

For cities everywhere interested in controlling spending or reducing costs, this should be an earth-shattering revelation – a wake-up call – for several reasons:

  • First, a viable working model for muniforge has existed for 5 years and has been a demonstrable success, both in creating high quality software and in saving the participating institutions significant money. Devising a methodology to calculate how much a city could save by co-developing software with an open source license is probably very, very easy.
  • Second, universities suffer from many of the same challenges as cities. Both have conservative bureaucracies, limited budgets, and significant legacy systems. In addition, neither has IT as a core competency, and both are frequently concerned with licenses, liability and “owning” intellectual property.
  • Third, and possibly best of all, the Kuali Foundation has already addressed the critical obstacles to such an endeavour: it has developed the licensing agreements, policies, decision-making structures, and workflow processes necessary for success. Moreover, all of this legal, policy and work infrastructure is itself available to be copied. For free. Right now.
  • Fourth, the Kuali Foundation is not a bunch of free-software hippies who depend on the kindness of strangers to patch their software (a stereotype that really must end). Quite the opposite. The Kuali Foundation has helped spawn 10 different companies that specialize in implementing and supporting (through SLAs) the software the foundation develops. In other words, the universities have created a group of competing firms dedicated to serving their niche market. Think about that. Rather than dealing with vendors who specialize in serving large multinationals and who’ve tweaked their software to (somewhat) work for cities, the foundation has fostered competing service providers (to say it again) within the higher-education niche.

As a result, I believe a group of forward-thinking cities – perhaps starting with those in North America – should fork the Kuali Foundation. That is, they should copy Kuali’s bylaws, its structure, its licenses and pretty much everything else – possibly even the source code for some of its projects – and create a Kuali for cities. Call it Muniforge, or Communiforge, or CivicHub, or whatever… but create it.

We can radically reduce the cost of software to cities, improve support by creating the right market incentives to foster companies whose interests are directly aligned with cities, and create better software that meets cities’ unique needs. The question is… will we? All that is required is for CIOs to begin networking and for a few to discover some common needs. One idea I have immediately: the City of Nanaimo could apply the Kuali modified Apache license to the council-monitoring software package it developed in-house, and upload it to GitHub. That would be a great start – one that could collectively save cities millions.

If you are a city CIO/CTO/Technology Director and are interested in this idea, please check out these links:

The Kuali Foundation homepage

Open Source Collaboration in Higher Education: Guidelines and Report of the Licensing and Policy Framework Summit for Software Sharing in Higher Education by Brad Wheeler and Daniel Greenstein (key architects behind Kuali)

Open Source 2010: Reflections on 2007 by Brad Wheeler (a must read, lots of great tips in here)

Heck, I suggest looking at all of Brad Wheeler’s articles and presentations.

Another overview article on Kuali by University Business

Phillip Ashlock of Open Plans has an overview article of where some cities are heading re open source.

And again, my original article on Muniforge.

If you aren’t already, consider reading the OpenSF blog – these guys are leaders and one way or another will be part of the mix.

Also, if you’re on twitter, consider following Jay Nath and Philip Ashlock.

Open Data: An Example of the Long Tail of Public Policy at Work

As many readers know, Vancouver passed what has locally been termed the Open3 motion a year ago and has had an open data portal up and running for several months.

Around the world, much of the attention around open data initiatives has focused on the development of applications like Vancouver’s Vantrash, Washington DC’s Stumble Safely or Toronto’s Childcare locator. But the other use of data portals is to better understand and analyze phenomena in a city – all of which can potentially lead to a broader diversity of perspectives, better public policy and a more informed public and/or decision makers.

I was thus pleased to find out about another example of what I’ve been calling the Long Tail of Public Policy when I received an email from Victor Ngo, a student at the University of British Columbia who just completed his 2nd year in the Human Geography program with an Urban Studies focus. (He’s also a co-op student looking for a summer job – nudge to the City of Vancouver.)

It turns out that last month, he and two classmates did a project on graffiti occurrence and its relationship to land use, crime rates, and socio-economic variables. As Victor shared with me:

It was a group project I did with two other members in March/April. It was for an introductory GIS class and given our knowledge, our analysis was certainly not as robust and refined as it could have been. But having been responsible for GIS analysis part of the project, I’m proud of what we accomplished.

The “Graffiti sites” shapefile was very instrumental to my project. I’m a big fan of the site and I’ll be using it more in the future as I continue my studies.

So here we have University students in Vancouver using real city data to work on projects that could provide some insights, all while learning. This is another small example of why open data matters. This is the future of public policy development. Today Victor may be a student, less certain about the quality of his work (don’t underestimate yourself, Victor) but tomorrow he could be working for government, a think tank, a consulting firm, an insurance company or a citizen advocacy group. But wherever he is, the open data portal will be a resource he will want to turn to.

With Victor’s permission I’ve uploaded his report, Graffiti in the Urban Everyday – Comparing Graffiti Occurrence with Crime Rates, Land Use, and Socio-Economic Indicators in Vancouver, to my site so anyone can download it. Victor has said he’d love to get people’s feedback on it.

And what was the main drawback of using the open data? There wasn’t enough of it.

…one thing I would have liked was better crime statistics, in particular, the data for the actual location of crime occurrence. It would have certainly made our analysis more refined. The weekly Crime Maps that the VPD publishes is an example of what I mean:

http://vancouver.ca/police/CrimeMaps/index.htm

You’re able to see the actual location where the crime was committed. We had to tabulate data from summary tables found at:

http://vancouver.ca/police/organization/planning-research-audit/neighbourhood-statistics.html

To translate: essentially, the city releases this information in a non-machine-readable format, meaning that citizens, public servants at other levels of government and (I’m willing to wager) City of Vancouver public servants outside the police department have to recreate the data in a digital format. What a colossal waste of time and energy. Why not just share the data in a structured, digital way? The city already makes it public; why not make it useful as well? This is what Washington DC (search crime) and San Francisco have done.
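To see what “useful as well as public” means in practice: if the city shipped incident-level crime data as a simple CSV, anyone could aggregate it in a few lines instead of retyping summary tables by hand. The column names and records below are invented stand-ins for what such a file might contain:

```python
# Aggregate hypothetical incident-level crime data shipped as CSV.
# The file contents here are invented for illustration.
import csv
import io
from collections import Counter

raw = """type,neighbourhood,date
Mischief,Strathcona,2010-05-01
Theft from Auto,Strathcona,2010-05-01
Mischief,West End,2010-05-02
"""

incidents = list(csv.DictReader(io.StringIO(raw)))
by_hood = Counter(row["neighbourhood"] for row in incidents)
print(by_hood.most_common())  # [('Strathcona', 2), ('West End', 1)]
```

That is the entire gap between a PDF summary table and a machine-readable release: three lines of code versus an afternoon of data entry.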

I hope that more apps get created in Vancouver, but as a public policy geek, I’m also hoping that more reports like these (and the one Bing Thom architects published on the future of Vancouver also using data from the open data catalog) get published. Ultimately, more people learning, thinking, writing and seeking solutions to our challenges will create a smarter, more vibrant and more successful city. Isn’t that what you’d want your city government (or any government, really…) to do?

On Journalism & Crowdsourcing: the good, the bad, the ugly

Last week the Vancouver Sun (my local paper) launched a laudable experiment. They took all of the campaign finance data from the last round of municipal elections in the Lower Mainland (the Greater Vancouver area in Canada) and posted a significant amount of it on their website. This is exactly the type of thing I’ve been hoping that newspapers would do more of in Canada (much like British newspapers – especially The Guardian – have done). I do think there are some instructive lessons, so here is a brief list of what I think is good, bad and ugly about the experiment.

The Good:

That it is being done at all. For newspapers in Canada to do anything other than simply repackage text that was (or wasn’t) going to end up in newsprint sadly still counts as innovation here. Seriously, someone should be applauding the Vancouver Sun team. I am. I hope you will too. Moreover, enabling people to do some rudimentary searches is interesting – mostly as people will want to see who the biggest donors are. Of course, it is no surprise to learn that in many cases the biggest donors in municipal elections (developers) give to all the major parties or players… just to cover their bets. Also interesting is that they’ve invited readers to participate – “If you find something interesting in the database that you want to share with other readers, go to The Sun’s Money & Influence blog at vancouversun.com/influence and post a comment” – and are looking for people to sniff out news stories.

The Bad:

While it is great that the Vancouver Sun has compiled this data, it will be interesting to see who, if anyone, uses it. A major barrier here is the social contract between the paper and those it is looking to engage. The paper won’t actually let you access the data – only run basic searches. This is because they don’t want readers running off and doing something interesting with the data on another website. But this constraint also means you can’t visualize it (for example, put it into a spreadsheet and graph it) or try to analyze it in some interesting ways. Increasingly, ours isn’t a world where we tell the story only in words; we tell it visually, with graphs, charts and visuals… that is the real opportunity here.
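For instance, if the raw data were downloadable as a simple CSV, even a few lines of Python would let a reader answer the "who are the biggest donors?" question directly. A rough sketch, using hypothetical column names rather than the Sun's actual schema:

```python
import csv
import io
from collections import defaultdict

# Hypothetical extract of campaign finance data; the donors, candidates
# and amounts below are made up for illustration.
sample = """donor,candidate,amount
Acme Developments,Candidate A,5000
Acme Developments,Candidate B,5000
Jane Smith,Candidate A,100
"""

# Total donations per donor across all candidates.
totals = defaultdict(float)
for row in csv.DictReader(io.StringIO(sample)):
    totals[row["donor"]] += float(row["amount"])

# Biggest donors first – the question readers most want answered.
ranking = sorted(totals.items(), key=lambda kv: -kv[1])
print(ranking)
```

From there it is one short step to a chart – which is exactly the kind of reader-generated content a search box can never produce.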

I know a few people who would love to do something interesting with the data (like John Jensen or Luke Closs), if they could access it. I also understand that the Vancouver Sun wants the discussion to take place on their page. But if you want people to use the data and do something interesting with it, you have to let them access it: that means allowing downloads or offering up an API (this is what The Guardian, a newspaper that is serious about letting people use its data, does). What the Sun could have done was distribute the data with an attribution license, so that anybody who used it had to at least link back to The Sun. But I don’t know a single person out there who, with or without a license, wouldn’t have linked back to the Sun, thanked them, and driven a bunch of traffic to them. Moreover, if The Sun had taken a more open approach, it could likely have even enlisted people to do data entry on campaign donations in other districts around the province. Instead, many of the pages for this story sit blank. There are few comments – some (like these two) are not relevant, though there is the occasional gem (like this one). There is also one from John Jensen, an open data hackathon regular who has been trying to visualize this data for months but has been unable to, since typing up all the data is time-consuming.

At the end of the day, if you want readers to create content for you, to sniff out stories and sift through data, you have to trust them, and that means giving them real access. I can imagine that feels scary. But I think it would work out.

The Ugly:

The really ugly part about this story is that the Vancouver Sun needed to do all this data entry in the first place. Since campaigns are legally required to track donations, most track them using… Microsoft Excel. Then, because the province requires that candidates disclose donations, the city in which the candidate is running insists that they submit the list of donations in print. That form then gets scanned and saved as a PDF. If, of course, the province’s campaign finance laws were changed to require that donations be submitted in an electronic format, all of the data entry the Sun had to do would disappear and suddenly anyone could search and analyze campaign donations. In short, even though this system is supposed to create transparency, we’ve architected it to be opaque. The information is all disclosed; we’ve just ensured that it is very difficult and expensive to sort through. I’m, sadly, not confident that the BC Election Task Force is going to change that, although I did submit this as a recommendation.

Some Ideas:

1) I’d encourage the Vancouver Sun to make available the database they’ve cobbled together. If they did, I know I would be willing to help bring together volunteers to add donation data from more municipalities and to help create some nice visualizations of the data. I also think it would spark a larger discussion, both on their website and elsewhere across the internet (and possibly even in other mediums) around the province. This could become a major issue. I even suspect that a number of people at the next open data hackathon would take this issue up.

2) Less appealing is to scrape the data set off the Vancouver Sun’s website and then do something interesting with it. I would, of course, encourage whoever did that to attribute the good work of the Vancouver Sun, link back to them and even encourage readers to go and participate in their discussion forum.
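As a sketch of how little effort such scraping takes, here is a minimal table scraper using only Python's standard library. The markup below is hypothetical – anyone actually doing this would adapt it to the real structure of the Sun's pages (and, as suggested above, attribute and link back):

```python
from html.parser import HTMLParser

# A minimal sketch: collect the cells of every table row on a page.
# The sample markup fed in below is invented for illustration.
class TableScraper(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_cell = False
        self.row = []
        self.rows = []

    def handle_starttag(self, tag, attrs):
        if tag == "td":
            self.in_cell = True
        elif tag == "tr":
            self.row = []

    def handle_endtag(self, tag):
        if tag == "td":
            self.in_cell = False
        elif tag == "tr" and self.row:
            self.rows.append(self.row)

    def handle_data(self, data):
        if self.in_cell:
            self.row.append(data.strip())

scraper = TableScraper()
scraper.feed("<table><tr><td>Acme Developments</td><td>5000</td></tr></table>")
print(scraper.rows)
```

That this is a weekend project for any developer is precisely why opening the data voluntarily (option 1) is the better path.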

Case Study: 3 Ways Open Data are making Vancouver better

It is still early days around the use of Open Data in Vancouver but already a number of interesting things are afoot.

Everybody here knows about Vantrash – which has just garnered its 1500th user. Our goal was to get to 2500 users, as this would represent 1% of the city’s households – really more like 3% market penetration, given that many households have private garbage contractors. All this without any advertising or marketing.

But Vantrash is no longer the only example of open data hard at work. Three other stories have emerged – each equally interesting:

Big Players Start to Experiment – Microsoft:

Microsoft recently held an internal apps competition – I served as a judge – and many of the winners I blogged about back in February have been released and updated for public use (and the code, so that others can fork or improve the applications). Indeed, on Thursday at Goldfish in Yaletown, Microsoft held a demo event so people could see what they’ve been up to. (There’s a full article here.)

My favourite was VanPark2010 – an application for finding parking spaces and parking meter costs/hr around the city. One of the things I loved about this app is how it prompted other actors – like the various parking companies – to share some of their data as well.

Also of interest is VanGuide (also available on the iPhone – yes, a Microsoft app coded for the iPhone…) – a platform any number of companies could use to create mashups of whatever they wanted around a map of Vancouver. Personally, I like the geo-tagged tweet indicator – it lets you see what people who geo-tag their tweets within Vancouver are talking about.

The linked news article above also talks about FreeFinders (another app that some local newspapers or arts groups should consider looking at), which lets you see what free events are taking place around the city; MoBuddy (for planning trips and then caching your trip plans so you don’t have to use data roaming when traveling); and Mapway.

The lesson: A large company like Microsoft can see open data as a catalyst for new applications and services, and for getting developers excited about Microsoft’s tools. They are willing to experiment and see open data as part of the future of a software/service ecosystem.

Open Data Drives Research and Development:

Over at the Centre for Digital Media at the Great Northern Way campus, a group of students has been experimenting with the city’s open data catalog and Bing Maps, and has created a taxi simulator that allows you to drive through the streets of downtown Vancouver. This is exactly the type of early R&D that cities that do open data get to capitalize on. In the future I can imagine not only video games being developed that use open data, but also driving or even traffic simulators. I’m really pumped about the great work the Taxicity team at GNW has been doing (and, full disclosure, it has been a real pleasure advising them). Check out their website here – and yes, that is me in the Ryerson sweatshirt…

Open Data Allows for Better Policy-Making and Research:

As a policy wonk, I’m really excited about this last example. The Bing Thom Architects Foundation released a report analyzing the impact of rising sea levels on the City of Vancouver. In a recent Georgia Straight article on the report, the researchers explained how:

The firm was able to conduct this research thanks to the city’s open-data catalogue, which makes information about the shoreline available on the city’s Web site. Heeney, Keenan, and Yan recently visited the Georgia Straight office to talk about their work, which examined the impact of sea level rising in one-metre increments up to seven metres.

Now city councilors are better able to assess the risks and costs around rising sea levels thanks, in part, to open data. This is the type of analysis and knowledge I hoped open data would enable – so great to see it happening so quickly. (sorry for the lack of link – I’ve been unable to find a link to the report, will post it as soon as I find it)