Monthly Archives: November 2011

Canada’s Foreign Aid Agency signs on to IATI: Aid data gets more transparent

Last night, while speaking at the High Level Forum on Aid Effectiveness in Busan, Korea, Minister of International Cooperation Bev Oda announced that Canada would be signing on to the International Aid Transparency Initiative (IATI).

So what is IATI and why does this matter?

IATI has developed a common, open and international standard for sharing foreign aid data. By signing on to IATI, Canada is agreeing to publish all the data about its projects, and who it funds, in a form and structure that makes it easy to compare with data from others who use the IATI standard. This should make it easier to understand where Canadian aid money ends up, in turn allowing analysts to spot efficiencies and to compare funding and efforts across donor countries, recipient countries and other stakeholders. In short, aid data should become easier to understand, to compare, and to use.
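The IATI standard is XML-based, so “easy to compare” has a concrete meaning: every signatory’s records can be read by the same small amount of code. As a rough sketch in Python – the activity record below is invented, and real IATI files are far richer – aggregating spending by recipient country looks something like this:

```python
import xml.etree.ElementTree as ET

# A simplified, hypothetical IATI-style activity record.
# Element names follow the general shape of the standard,
# but this record is illustrative only.
SAMPLE = """
<iati-activities>
  <iati-activity>
    <iati-identifier>CA-1-EXAMPLE-001</iati-identifier>
    <title>Illustrative water project</title>
    <recipient-country code="KE"/>
    <transaction>
      <value currency="CAD">250000</value>
    </transaction>
  </iati-activity>
</iati-activities>
"""

def total_by_country(xml_text):
    """Sum transaction values per recipient country code."""
    root = ET.fromstring(xml_text)
    totals = {}
    for activity in root.findall("iati-activity"):
        country = activity.find("recipient-country").get("code")
        for value in activity.findall("transaction/value"):
            totals[country] = totals.get(country, 0) + float(value.text)
    return totals

print(total_by_country(SAMPLE))  # {'KE': 250000.0}
```

The point is that the same function would work, unchanged, on any publisher’s IATI file – which is exactly what a common standard buys you.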

In the medium term it should also make the data available on CIDA’s open data portal (already helpful to non-profits, development groups and students) even more useful.

This is an enormous win for the good people at Engineers Without Borders, as well as the team at Publish What You Fund. Both groups have been working hard for over a year talking Canadian politicians and public servants through the ins and outs – as well as the benefits – of signing onto IATI. I’ve been working with both groups as well, pushing IATI when meeting with Federal Ministers (I recommended we make it part of our Open Government Partnership goals) and writing supportive op-eds in newspapers, so needless to say I’m excited about this development.

This really is good news. As governments become increasingly aware of the power data can have in facilitating cooperation and coordination as well as in improving effectiveness and efficiency, it will be critical to push standards around structuring and sharing data so that such coordination can happen easily across and between jurisdictions. IATI is a great example of such an effort and I hope there are more of these, with Canada taking an early lead, in the months and years ahead.


International Open Data Hackathon, Dec 3rd. It's coming together.

So a number of things have started to really come together for this Saturday, Dec 3rd. I’ve noticed a number of new cities being tweeted about (hello Kuala Lumpur & Oakland!) and others adding themselves to the wiki. Indeed, we seem to be above 40 cities. It is hard to know how many people will show up in each city, but in Vancouver I know that we already have over 20 registered, while in Ottawa they are well above 40. If other cities have similar numbers it’s a great testament to the size of the community out there interested in playing with open government data.

A few thoughts to share with people as we get ready for the big day.

1. Leverage existing projects.

I’ve mentioned a few times that there are some great existing projects out there that can be easily leveraged.

In that vein I’ve noticed the good people at the Open Knowledge Foundation, who are behind OpenSpending (the project that powers WhereDoesMyMoneyGo.org), have not only made their software easier to use but have put up some helpful instructions on the wiki for creating your own instance. One hope I have for Saturday is that a number of different places might be able to visualize local budgets in much easier-to-understand ways. OpenSpending has the potential of being an enormously helpful tool for communities trying to understand their budget – hopefully we can provide some great examples and feedback for its creators.

In addition, the folks at MySociety have provided some helpful advice on the wiki for those interested in spinning up a version of MapIt for their country.

2. Get Data Now, Not on Saturday!

Here in Vancouver, my friend Luke C asked if we could get bicycle accident data for the city or province as he wanted to play around with it and maybe visualize it on December 3rd. It just so happened I had a contact at the Insurance Corporation of British Columbia (ICBC), which insures every vehicle in the province. I reached out and, after going through their request process, now have the data set to share with Luke.

The key piece here: now is the time to check and see if the data you are interested in is available, investigate what is out there, and request it from various stakeholders if it is not.

3. Share Your Code, Share Your Data

Indeed, one advantage of having the BC bicycle accident data early is that I can start sharing it with people immediately. I’ve already uploaded the data set (all 6,400 lines) to BuzzData’s site here so others can download it, clone it, and share their own work on it. That way, even if Luke and I get separated, he’s still got something to hack on!
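One nice side effect of sharing the raw file is that anyone can start poking at it before the day itself. A first pass, sketched in Python, might look like the following – note that the column names here are hypothetical stand-ins, so check them against the actual ICBC extract:

```python
import csv
import io
from collections import Counter

# A tiny stand-in for the ICBC bicycle accident extract.
# The column names are hypothetical -- check the real file's header.
SAMPLE_CSV = """year,municipality,crashes
2009,Vancouver,412
2009,Victoria,88
2010,Vancouver,397
"""

def crashes_by_year(csv_text):
    """Tally reported crashes per year -- a typical first pass
    before trying to visualize a data set like this."""
    totals = Counter()
    for row in csv.DictReader(io.StringIO(csv_text)):
        totals[row["year"]] += int(row["crashes"])
    return dict(totals)

print(crashes_by_year(SAMPLE_CSV))  # {'2009': 500, '2010': 397}
```

Even a crude tally like this helps you spot gaps or oddities in the data before the hackathon clock starts running.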

So please do let people know where they can find the data you are hacking on, as well as the projects you’re hacking on. The Open Data Day Projects 2011 wiki page currently sits empty (as should be expected). But swing by the 2010 projects page and notice how full it is… I’d love to see us replicate this success. I’m hoping people link not just to their projects, but also to GitHub repos, ScraperWiki creations, BuzzData accounts and other places.

If you have a project and you think people in open data day hackathons in other cities might be interested, put it in the project page and tweet about it using the #odhd hashtag. You may discover there are people out there who feel as passionately about your project as you do!

4. Let’s Get Connected

Speaking of sharing, my friend Edward O-G, who is organizing the hackathon in Ottawa, did a great job last year setting up some infrastructure so people from different hackathons could video conference with one another. This year I think we’ll try using Google Hangouts on Google+. However, there is a non-trivial risk that this will not scale well.

So…

Edward also suggested (brilliantly) that people create YouTube videos of whatever they create during the hackathon or in the days and weeks that follow. Please post those links to the Open Data Day Projects 2011 wiki page as well. There were a few projects last year that had YouTube videos and they were very helpful, particularly when a project wasn’t quite ready for prime time. A video gives us a taste of what will be available, and it becomes something we can point people to.

5. Have Fun, Do What Is Interesting

Remember, Open Data Day is about meeting people, learning about open data, and working on something that you feel passionate about. This is all very decentralized and informal – no one is going to come and save your hackathon… it is up to you! So make sure you find something you think is worth caring about and work on it. Share your idea, and your passion, with others – that’s what makes this fun.

Can’t wait to hear what people are up to. Please feel free to email or tweet at me about what you’re working on. I’d love to hear about it and blog about it.


Here in Vancouver, the open data hackathon will be happening at the offices of FoodTree, which has some of its own developers working on Open Food data. (If you haven’t signed up yet, definitely do so here.)


Statistics Canada Data to become OpenData – Background, Winners and Next Steps

As some of you learned last night, Embassy Magazine broke the story that all of Statistics Canada’s online data will not only be made free, but will be released under the Government of Canada’s Open Data License Agreement (updated and reviewed earlier this week), which allows for commercial re-use.

This decision has been in the works for months, and while it does not appear to have been formally announced, Embassy Magazine does appear to have gotten a Statistics Canada spokesperson to confirm it is true. I have a few thoughts about this story: some background, who wins from this decision and, most importantly, some hope for what it will, and won’t, lead to next.

Background

In the Embassy article, the spokesperson claimed this decision had been in the works for years, something that is probably technically true. Such a decision – or something akin to it – has likely been contemplated a number of times, and there have been a number of trials and projects that have made some data accessible, albeit under fairly restrictive licenses.

But it is less clear that the culture of open data has arrived at StatsCan, and less clear to me that this decision was internally driven. I’ve met many a StatsCan employee who encountered enormous resistance while advocating for open data. I remember pressing the issue during a talk at one of the department’s middle managers’ conferences in November of 2008 and seeing half the room nod vigorously in agreement, while the other half crossed its arms in strong disapproval.

Consequently, with the federal government increasingly interested in open data, a desire for a good news story coming out of StatsCan after last summer’s census debacle, and many decisions in Ottawa happening centrally, I suspect this decision occurred outside the department. This does not diminish its positive impact, but it does mean that a number of the next steps, many of which will require StatsCan to adapt its role, may not happen as quickly as some will hope, as the organization may take some time to come to terms with the new reality and the culture shift it will entail.

This may be compounded by the fact that there may be tougher news on the horizon for StatsCan. With every department required to have submitted proposals to cut its budget by either 5% or 10%, and with StatsCan having already seen a number of its programs cut, there may be fewer resources in the organization to take advantage of the opportunity that making its data open creates, or even just to adjust to what has happened.

Winners (briefly)

The winners from this decision are, of course, consumers of StatsCan’s data. Indirectly, this includes all of us, since provincial and local governments are big consumers of StatsCan data and so now – assuming it is structured in such a manner – they will have easier (and cheaper) access to it. This is also true of large companies and non-profits that have used StatsCan data to locate stores, target services and generally allocate resources more efficiently. The opportunity now opens for smaller players to benefit as well.

Indeed, this is the real hope: that a whole new category of winners emerges; that the barrier to use for software developers, entrepreneurs, students, academics, smaller companies and non-profits will be lowered in a manner that enables a larger community to make use of the data and therefore create economic or social goods.

Such a community, however, will take time to evolve, and will benefit from support.

And finally, I think StatsCan is a winner. This decision brings it more profoundly into the digital age. It opens up new possibilities and, frankly, pushes a culture change that I believe is long overdue. I suspect times are tough at StatsCan – although not as a result of this decision – but this decision creates room to rethink how the department works and thinks.

Next Steps

The first thing everybody will be waiting for is to see exactly what data gets shared, in what structure and at what level of detail. Indeed this question arose a number of times on Twitter, with people posting tweets such as “Cool. This is all sorts of awesome. Are geo boundary files included too, like Census Tracts and postcodes?” We shall see. My hope is yes and I think the odds are good. But I could be wrong, at which point all this could turn into the most over-hyped data story of the year. (Which actually matters now that data analyst is one of the fastest growing job categories in North America.)

Second, open data creates an opportunity for a new role for StatsCan, one relevant to a broader set of Canadians. Someone from StatsCan should talk to the data group at the World Bank about their transformation after they launched their open data portal (I’d be happy to make the introduction). That data portal now accounts for a significant portion of all the Bank’s web traffic, and the group is going through a dramatic transformation, realizing they are no longer curators of data for bank staff and a small elite group of clients around the world but curators of economic data for the world. I’m told that, while the change has not been easy, a broader set of users has brought a new sense of purpose and identity. The same could be true of StatsCan. Rather than just an organization that serves the Government of Canada and a select group of clients, StatsCan could become the curator of data for all Canadians. This is a much more ambitious, but I’d argue more democratic and important, goal.

And it is here that I hope other next steps will unfold. In the United States (which has had free census data for as long as anyone I talked to can remember), whenever new data is released the Census Bureau runs workshops around the country, educating people on how to use and work with its data. StatsCan and a number of other partners already do some of this, but my hope is that there will be much, much more of it. We need a society that is significantly more data literate, and StatsCan, along with the universities, colleges and schools, could have a powerful role in cultivating this. Tracey Lauriault over at the DataLibre blog has been a fantastic advocate of such an approach.

I also hope that StatsCan will take its role as data curator for the country very seriously and think of new ways its products can foster economic and social development. Offering APIs into its data sets would be a logical next step – something that would allow developers to embed census data right into their applications and ensure the data is always up to date. No one is expecting this to happen right away, but it was another question that arose on Twitter after the story broke, so one can see that new types of users will be interested in new, and more efficient, ways of accessing the data.
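To make that concrete: no such StatsCan API exists today, but here is a hedged sketch of what consuming one might look like. The endpoint shape, field names and figures are all invented for illustration, and a canned JSON payload stands in for a live HTTP response:

```python
import json

# StatsCan offers no such API today -- this is a sketch of how a
# developer might consume a hypothetical JSON census endpoint.
# The payload shape, field names and values are invented.
FAKE_RESPONSE = json.dumps({
    "geography": "Vancouver CMA",
    "indicator": "population",
    "observations": [
        {"year": 2006, "value": 2100000},
        {"year": 2011, "value": 2300000},
    ],
})

def latest_value(payload):
    """Return (geography, year, value) for the newest observation."""
    data = json.loads(payload)
    newest = max(data["observations"], key=lambda o: o["year"])
    return data["geography"], newest["year"], newest["value"]

print(latest_value(FAKE_RESPONSE))  # ('Vancouver CMA', 2011, 2300000)
```

The appeal for application developers is exactly this: instead of downloading and re-importing tables every census cycle, an app could pull the latest figures on demand.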

But I think most importantly, the next step will need to come from us citizens. This announcement marks a major change in how StatsCan works. We need to be supportive, particularly at a time of budget cuts. While we are grateful for open data, it would be a shame if the institution that makes it all possible was reduced to a shell of its former self. Good quality data – and the analysis that informs public policy – is essential to a modern economy, society and government. Now that we will have free access to what our tax dollars have already paid for, let’s make sure it stays that way, by ensuring both that the data continues to be available and that there continues to be a quality institution capable of collecting and analyzing it.


The Canadian Government's New Web 2.0 Guidelines: the Good, the Bad & the Ugly

Yesterday, the government of Canada released its new Guidelines for External Use of Web 2.0. For the 99.99% of you unfamiliar with what this is, it’s the guidelines (rules) that govern how, and when, public servants may use Web 2.0 tools such as Twitter and Facebook.

You, of course, likely work in an organization that survives without such documents. Congratulations. You work in a place where the general rule is “don’t be an idiot” and your bosses trust your sense of judgement. That said, you probably also don’t work somewhere where disgruntled former employees and the CBC are trolling the essentially personal online statements of your summer interns so they can turn them into a scandal. (Yes, summer student border guards have political opinions, don’t like guns and enjoy partying. Shocker.) All this to say, there are good and rational reasons why the public service creates guidelines: to protect not just the government, but public servants.

So for those uninterested in reading the 31-page, 12,055-word guidelines document, here’s a review:

The Good

Sending the right message

First off, the document, for all its faults, does get one overarching piece right. Almost right off the bat (top of section 3.2) it shares that ministries should be using Web 2.0 tools:

Government of Canada departments are encouraged to use Web 2.0 tools and services as an efficient and effective additional channel to interact with the public. A large number of Canadians are now regularly using Web 2.0 tools and services to find information about, and interact with, individuals and organizations.

Given the paucity of Web 2.0 use in the federal government, internally or externally, this clear message from Treasury Board, and from a government minister, is the type of encouragement needed to bring government communications into 2008 (the British government, with its amazing Power of Information Taskforce, has been there for years).

Note: there is a very, very, ugly counterpart to this point. See below.

Good stuff for the little guy

Second, the rules for Professional Networking & Personal Use are fairly reasonable. There are some challenges (notes below), but if any public servant ever finds them or has the energy to read the document, they are completely workable.

The medium is the message

Finally, the document acknowledges that the Web 2.0 world is constantly evolving and references a Web 2.0 tool through which public servants can find ways to adapt. THIS IS EXACTLY THE RIGHT APPROACH. You don’t deal with a fast-evolving social media environment by handing out decrees on stone tablets; you manage it by offering people communities of practice where they can get the latest and best information. Hence this line:

Additional guidance on the use of Web 2.0 tools and services is in various stages of development by communities of expertise and Web 2.0 practitioners within the Government of Canada. Many of these resources are available to public servants on the Government of Canada’s internal wiki, GCpedia. While these resources are not official Government of Canada policies or guidelines, they are valuable sources of information in this rapidly evolving environment.

This represents a genuinely exciting development in the glacially paced evolution of government procedures: the use of social media (GCPEDIA) to manage social media.

Indeed, still more exciting for me is that this was the first time I’ve seen an official government document reference GCPEDIA as a canonical source of information. And it did it twice: once, above, pointing to a community of practice, and a second time pointing to the GCPEDIA “Social media procurement process” page. Getting government to use social media internally is, I think, the biggest challenge at the moment, and this document does it.

The Bad

Too big to succeed

The biggest problem with the document is its structure. It is so long, and so filled with various forms of compliance, that only the most dedicated public servant (read: a communications officer tasked with a social media project) will ever read it. Indeed, for a document that is supposed to encourage public servants to use social media, I suspect it will do just the opposite. Its density and list of controls will cause many who were on the fence to stay there – if not retreat further. While the directions for departments are more clear, for the little guy… (see next piece).

Sledgehammers for nails

The document’s main problem is that it tries to address all uses of social media. Helpfully, it acknowledges there are broadly two types of uses: “Departmental Web 2.0 initiatives” (e.g. a Facebook group for an employment insurance program) and “personal/professional use” (e.g. an individual public servant’s use of Twitter or LinkedIn to do their job). Unhelpfully, it addresses both of them.

In my mind, 95% of the document relates to departmental uses… this is about ensuring that someone claiming to represent the government in an official capacity does not screw up. The problem is, all those policies aren’t as relevant to Joe/Jane public servant in their cubicle trying to find an old colleague on LinkedIn (assuming they can access LinkedIn). It’s overkill. These should be separate documents; that way the personal use document could be smaller, more accessible and far less intimidating. Indeed, as the guidelines suggest, all it should really have to do is reference the Values and Ethics Code for the Public Service (essentially the “idiot’s guide to how not to be an idiot on the job” for public servants) and that would have been sufficient. Happily, most public servants are already familiar with this document, so simply understanding that those guidelines apply online as much as offline gets us 90% of the way there.

In summary, despite a worthy effort, it seems unlikely this document will encourage public servants to use Web 2.0 tools in their jobs. For a (Canadian) comparison, consider the BC government’s guidelines document, the dryly named “Policy No. 33: Use of Social Media in the B.C. Public Service.” Despite engaging both use cases, it manages to cover all the bases, is straightforward and encouraging, and treats the employee with an enormous amount of respect. All this in a nifty 2 pages and 1,394 words. Pretty much exactly what a public servant is looking for.

The Ugly

Sadly, there is some ugliness.

Suggestions, not change

In the good section I mentioned that the government is encouraging ministries to use social media… this is true. But it is not mandating it. Nor do these guidelines say anything to ministerial IT staff, most of whom are blocking public servants’ access to sites like Facebook, Twitter and, in many cases, my blog. The sad fact is, there may now be guidelines that allow public servants to use these tools, but in most cases they’d have to go home, or to a local coffee shop (many do), in order to actually make use of them. For most public servants, much of the internet remains beyond their reach, causing them to fall further and further behind in understanding how technology will affect their jobs and their department or program’s function in society.

It’s not about communication, it’s about control

In his speech at PSEngage yesterday, the Treasury Board minister talked about how technology can help the public service reinvent how it collaborates:

The Government encourages the use of new Web 2.0 tools and technologies such as blogs, wikis, Facebook, Twitter and YouTube. These tools help create a more modern, open and collaborative workplace and lead to more “just-in-time” communications with the public.

This is great news. And I believe the Minister believes it too. He’s definitely a fan of technology in all the right ways. However, the guidelines are mostly about control. Consider this paragraph:

Departments should designate a senior official accountable and responsible for the coordination of all Web 2.0 activities as well as an appropriate governance structure. It is recommended that the Head of Communications be the designated official. This designate should collaborate with departmental personnel who have expertise in using and executing Web 2.0 initiatives, as well as with representatives from the following fields in their governance structure: information management, information technology, communications, official languages, the Federal Identity Program, legal services, access to information and privacy, security, values and ethics, programs and services, human resources, the user community, as well as the Senior Departmental Official as established by the Standard on Web Accessibility. A multidisciplinary team is particularly important so that policy interpretations are appropriately made and followed when managing information resources through Web 2.0 tools and services.

You get all that? That’s at least 11 variables that need to be managed. Or, put another way, 11 different manuals you need to have at your desk when using social media for departmental purposes. That makes for a pretty constricted hole for information to get out through, and I suspect it pretty much kills most of the spontaneity, rapid response time and personal voice that make social media effective. Moreover, with one person accountable, and this area of communications still relatively new, I suspect that the person in charge, given all these requirements, is going to have a fairly low tolerance for risk. Even I might conclude it is safer to just post an ad in the newspaper and let the phone operators at Service Canada deal with the public.

Conclusion

So it ain’t all bad. Indeed, there is much that is commendable and could be worked with. I think, in the end, 80% of the problems with the document could be resolved if the government simply created two versions: one for official departmental uses, the other for individual public servants. If it could then restrain the lawyers from repeating everything in the Values and Ethics Code all over again, you’d have something that social media activists in the public service could seize upon.

My sense is that the Minister is genuinely interested in enabling public servants to use technology to do their jobs better – he knows from personal experience how helpful social media can be. This is great news for those who care about these issues, and it means that pressing for a better revised version might yield a positive outcome. Better to try now, with a true ally in the president’s office than with someone who probably won’t care.


The New Government of Canada Open Data License: The OGL by another name

Last week Minister Clement issued a press release announcing some of the progress the government has made on its Open Government Initiatives. Three things caught my eye.

First, it appears the government continues to revise its open data license with things continuing to trend in the right direction.

As some of you will remember, when the government first launched data.gc.ca it had a license so onerous it was laughable. While several provisions were problematic, my favourite was the sweeping “only-make-us-look-good” clause, which said, word for word: “You shall not use the data made available through the GC Open Data Portal in any way which, in the opinion of Canada, may bring disrepute to or prejudice the reputation of Canada.”

After I pointed out the problems with this clause to then-Minister Day, he managed to have it revoked within hours – very much to his credit. But it is a good reminder of the starting point of the government license and of the mindset of Government of Canada lawyers.

With the new license, almost all the clauses that would obstruct commercial and non-profit reuse have effectively been eliminated. It is no longer problematic to identify individual companies and the attribution clauses have been rendered slightly easier. Indeed, I would argue that the new license has virtually the same constraints as the UK Open Government License (OGL) and even the Creative Commons CC-BY license.

All this raises the question… why not simply use the language and structure of the OGL, in much the same manner the British Columbia government did with its own BC OGL? Such a standardized license across jurisdictions might be helpful; it would certainly simplify life for think tanks, academics, developers and other users of the data. This is something I’m pushing for and hope that we might see progress on.

Second, the idea that the government is going to post completed access to information (ATIP) requests online is also a move in the right direction. I suspect that the most common ATIP request is one that someone else has already made. Being able to search through previous requests would enable you to find what you are looking for without having to wait weeks or make public servants redo the entire search and clearing process. What I don’t understand is why only post the summaries? In a digital world it would be better for citizens, and cheaper for the government, to simply post the entire request whenever privacy policies wouldn’t prevent it.

Third, and perhaps most important, were the lines noting that “That number (of data sets) will continue to grow as the project expands and more federal departments and agencies come onboard. During this pilot project, the Government will also continue to monitor and consider national and international best practices, as well as user feedback, in the licensing of federal open data.”

This means we should expect more data to hit the site. It seems as though more departments are being asked to figure out what data they can share – hopefully this means that real, interesting data sets will be made public. In particular, one hopes that data sets which legislation mandates the government collect will be high on the list of priorities. Also interesting in this statement is the suggestion that the government will consider national and international best practices. I’ve talked to both the Minister and officials about the need to create common standards and structures for open data across jurisdictions. Fostering and pushing these is an area where the government could take a leadership role, and it looks like there may be interest in this.


If you are in Vancouver, Vote Open Data, Vote Vision

If you are a Vancouver resident, tomorrow is election day. I’m hoping that if you are a resident and a reader of this blog, you’ll consider voting for Vision Vancouver.

As many of you know, just over two years ago the city launched Vancouver’s open data portal – the first of its kind in Canada and the second municipal open data portal in the world.

This didn’t happen by chance. It took leadership from politicians who were a) willing to see and grasp a good idea long before it was widely celebrated, and b) able to drive it through city hall. It is a testament to those leaders – and the city staff who worked on it – that the city went from talking about the idea to implementing it in less than 3 months.

I know this is not an issue that everyone cares about. Personally, I believe open data is going to play a critical role in helping us rethink how government works, enabling it to be more effective and efficient while also empowering citizens to better understand and contribute to policy discussions. All that isn’t going to capture people’s imagination as much as ensuring the garbage gets collected regularly and on time (that’s important too!), but I do think the open data issue is a proxy, something that shows which leaders are willing to engage in innovative approaches and work with new technologies.

So if you are in Vancouver and you care about this issue, please vote Vision tomorrow. This is the party that made open data happen in Vancouver – and cleared the way for it in Canada. (The location of voting stations is available here.)

International Open Data Hackathon Updates and Apps

With the International Open Data Hackathon getting closer, I’m getting excited. There’s been a real expansion on the wiki in the number of cities where people are, sometimes humbly, sometimes grandly, putting together events. I’m seeing Nairobi, Dublin, Sydney, Warsaw and Madrid among the cities with newly added information. Exciting!

I’ve been thinking more and more about applications people can hack on that would be fun, engage a broad number of people, and help foster communities around viable, self-sustaining projects.

I’m, of course, all in favour of people working on whatever piques their interest, but here are a few projects I’m encouraging people to look at:

1. Openspending.org

What I really like about openspending.org is that there are lots of ways non-coders can contribute. Specifically, finding, scraping and categorizing budget data, which (sadly) is often very messy, are things almost anyone with a laptop can do, and they are essential to getting this project off the ground. In addition, the reward can be significant: a nice visualization of whatever budget you have data for – a perfect tool for helping people better understand where their money (or taxes) goes. Another big factor in its favour: openspending.org – a project of the Open Knowledge Foundation, who’ve been big supporters and sponsors of the international open data hackathon – is the type of project that, if all goes well, a group can complete in one day.

So I hope that some people try playing with the website using their own local data. It would be wonderful to see the openspending.org community grow.
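To give a sense of what the categorizing work involves, here is a minimal sketch in Python. The budget rows, category keywords and dollar amounts below are all invented for illustration; this is not real openspending.org data or code, just the kind of cleanup step a non-coder-plus-one-laptop team might automate:

```python
import csv
import io

# Invented sample of the kind of messy rows a budget scraper produces:
# inconsistent department labels, currency symbols, thousands separators.
RAW = """department,description,amount
Parks,Tree maintenance,"$1,200,000"
PARKS & REC,playground repair,"$350,000"
Engineering,Road re-surfacing,"$4,100,000"
"""

# Simple keyword rules for rolling line items up into categories.
CATEGORIES = {
    "parks": "Parks & Recreation",
    "playground": "Parks & Recreation",
    "road": "Transportation",
}

def clean_amount(raw):
    """Turn a string like '$1,200,000' into the integer 1200000."""
    return int(raw.replace("$", "").replace(",", ""))

def categorize(row):
    """Assign a row to the first category whose keyword appears in it."""
    text = (row["department"] + " " + row["description"]).lower()
    for keyword, category in CATEGORIES.items():
        if keyword in text:
            return category
    return "Uncategorized"

totals = {}
for row in csv.DictReader(io.StringIO(RAW)):
    category = categorize(row)
    totals[category] = totals.get(category, 0) + clean_amount(row["amount"])

print(totals)
```

Once rows are clean and categorized like this, they are ready to feed into a visualization of where the money goes.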

2. Adopt a Hydrant

Some of you have already seen me blog about this app, a project that comes out of Code for America. If you know of a government agency, or non-profit, that has lat/long information for a resource it wants people to help take care of, then Adopt a Hydrant could be for you. Essentially, Adopt a Hydrant (which can be changed to adopt-an-anything) allows people to sign up and “adopt” whatever the application tracks. Could be trees, hydrants, playgrounds… you name it.

Some of you may be wondering… why adopt a hydrant? Because in colder places, like Boston, MA, the app was created in the hope that citizens who adopt a hydrant would agree to keep it clear of snow. That way, in case there is a fire, the emergency responders don’t waste valuable minutes locating, and then digging out, the hydrant. Cool eh?

I think Adopt a Hydrant has the potential of becoming a significant open source project, one widely used by cities and non-profits. It would be great to see some people turned on to it!

3. Mapit

What I love about MapIt is that it is the kind of application that can help foster other open data applications. Created by the wonderful people over at Mysociety.org, this open source software essentially serves as a mapping layer so that you can find out what jurisdictions a given address, postal code or GPS position sits in (e.g. what riding, ward, city, province, county or state am I in?). This is insanely useful for developers trying to build websites and apps that tell their users something useful about a given address or about where they are standing. Indeed, I’m told that most of Mysociety.org’s projects use their instance of MapIt to function.
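To make the idea concrete, here is a small sketch of consuming a MapIt-style point lookup in Python. MapIt answers a “what am I standing in?” query with a JSON object keyed by area id; the specific areas, ids and field values below are my own invention for illustration, not real MapIt output:

```python
import json

# A canned response mimicking the shape of a MapIt point lookup
# (the areas, ids and values here are illustrative, not real data).
SAMPLE_RESPONSE = json.loads("""{
  "1001": {"name": "Vancouver", "type_name": "City"},
  "1002": {"name": "British Columbia", "type_name": "Province"},
  "1003": {"name": "Vancouver Centre", "type_name": "Federal riding"}
}""")

def jurisdictions(mapit_response):
    """List the (type, name) pairs covering a point, sorted by type."""
    return sorted(
        (area["type_name"], area["name"])
        for area in mapit_response.values()
    )

for type_name, name in jurisdictions(SAMPLE_RESPONSE):
    print(f"{type_name}: {name}")
```

An app builds on this by fetching the response for the user’s coordinates and then showing them the representatives, services or data relevant to each jurisdiction returned.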

This project is for those seeking a more ambitious challenge, but I love the idea that this service might exist in multiple countries and that a community might emerge around another one of mysociety.org’s projects.

No matter what you intend to work on, drop me a line! Post it to the open data day mailing list and let me know about it. I’d love to share it with the world.

How Architecture Made SFU Vancouver’s University

For those unfamiliar with Vancouver, it is a city that enjoys a healthy one-way rivalry between two universities: the University of British Columbia (UBC) and Simon Fraser University (SFU).

Growing up here I didn’t think much of Simon Fraser. I don’t mean that in a disparaging way, I mean it literally. SFU was simply never on my radar. UBC I knew. As high school students we would sneak out to its libraries to study for finals and pretend we were more mature than we were. But SFU? It was far away. Too remote. Too inaccessible by public transit. Heck, too inaccessible by car(!).

And yet today, when I think of Vancouver’s two universities, UBC is the one that is never on my radar. Noticing that several friends will be on a panel tonight on How Social Media is Changing Politics at UBC’s downtown campus reminded me that UBC even has a downtown campus. It may be the most underutilized and unloved space in the university, despite sitting in the heart of Vancouver on some of the most prime real estate in the city. In fact, I don’t think I’ve ever actually been to UBC’s downtown campus.

In contrast, I can’t count the number of times I’ve been to SFU’s downtown campus. And the reason is simple: architecture. It’s not simply that SFU invests in its downtown campus and makes it part of the university, it’s that it invested in Vancouver by building one of the most remarkable buildings in the city. If you are in Vancouver, go visit the Wosk Centre for Dialogue. It is amazing. Indeed, I feel so strongly about it that I included it in my top ten favourite places when Google Maps added me to their list of Lat/Long experts for the 2010 Winter Olympics.

Photo by Bryan Hughes

What makes the Wosk Centre so fantastic? It seats 180 or so people in concentric circles, each seat with its own mic. It may be the only place where I’ve felt a genuine conversation can take place with such a large group. I’ve seen professors lecture, drug addicts share stories, environmentalists argue and friends debate, and it has always been eye opening. Here, in the heart of the city, is a disarming space where stakeholders, experts, citizens, anyone, can gather to share ideas and explore their differences in a respectful manner. Moreover, the place just looks beautiful.


The building is a testament to how architecture and design can fundamentally alter the relationship between an institution and the city within which it resides. Without the Wosk Centre I’m confident SFU’s downtown presence would have meant much less to me. Moreover, I’m fully willing to agree that UBC is the better university. It ranks higher in virtually every survey, it has global ambitions that may even be achievable, and it likely does not want to be tied to the city. That’s a strategic choice I can, on one level, respect. But on a basic level, the Wosk Centre makes SFU relevant to Vancouverites and, in doing so, allows the university to punch above its weight, at least locally. And that has real impact, for the city’s residents certainly, but I think for the university as well.

Reading the always excellent Steven Johnson’s Where Good Ideas Come From, I can’t help but think that UBC is missing out on something larger. As Johnson observes, good ideas arise from tensions, from the remixing of other ideas, particularly those from disparate places. They rarely come from the deep thinker isolated out in the woods (UBC lies at the edge of Vancouver beyond a large park) or meditating on a mountain top (SFU’s core campus is atop a small mountain) but out of dense networks where ideas, hunches and thoughts can find one another. Quiet meditation is important. But so too is engagement. Being in the heart of a bustling city is perhaps a distraction, but that may be the point. Those distractions create opportunities, new avenues for exploration and, for universities concerned with raising money from their intellectual capital, a chance to find problems in search of solutions. Raising a structure explicitly designed to let tensions and conflicts play out strikes me as a real commitment to growth and innovation, one that not only gives back to the host community but positions the university to innovate in a manner, and at a pace, the 21st century demands.

As such, the Wosk Centre, while maybe a shade formal, is a feat of architecture and design, a building that I hope enables a university to rethink itself, but that has definitely become a core part of the social infrastructure of the city and redrawn at least my own relationship with SFU.

Open Data Day – a project I'd like to be doing

As some readers and International Open Data Hackathon participants know, I’m really keen on developers reusing each other’s code. All too often in hackathons we like to build something from scratch (which can be fun), but I’ve always liked the idea of hackathons either spurring genuine projects that others can reuse, or serving as an excuse to find an existing project worth supporting and contributing to.

That’s why I’ve been really encouraging people to find open source projects out there that they’d find interesting and that will support others’ efforts. This is a big reason I’ve been thinking about MapIt and the Open Knowledge Foundation’s Where Does My Money Go project.

In Vancouver, one project I’m eventually hoping we can contribute to is Adopt-a-Hydrant, a project out of Code for America. The great thing about Adopt-a-Hydrant is that it can be adapted to become an adopt-an-anything app. That’s the end goal of a project I’m hoping to plan out and start on during the hackathon.

Here in Vancouver, I’ve been talking with the Parks Board about getting a database of all the city’s trees opened up. Interestingly, this dataset does not include location data (lat/long) for each tree. So what would initially be great is to build a mobile phone app that shows the user a list of trees near their current address, and then lets them use their phone’s GPS to add the lat/long data to the database. That way we can help augment the city’s database. Once the lat/long data starts accumulating, you could map the trees in Adopt-a-Hydrant and create an Adopt-a-Tree app. Citizens could then sign up to adopt a tree, offer to take care of it, and maybe notify the Parks Board if something is wrong.
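A minimal sketch of that crowdsourced augmentation step might look like the following. The tree records, addresses and coordinates are all hypothetical (the real Parks Board data will have its own fields), and a real app would match on GPS proximity rather than exact address strings:

```python
# Hypothetical tree records as the Parks Board might release them:
# species and street address, but no coordinates yet.
trees = [
    {"id": 1, "species": "Red maple", "address": "800 block W 12th Ave",
     "lat": None, "lon": None},
    {"id": 2, "species": "Cherry", "address": "800 block W 12th Ave",
     "lat": None, "lon": None},
]

def trees_near(address, records):
    """Naive lookup: match on the address string until coordinates exist."""
    return [t for t in records if t["address"] == address and t["lat"] is None]

def tag_tree(tree, lat, lon):
    """Attach a phone's GPS fix to a tree record, augmenting the dataset."""
    tree["lat"], tree["lon"] = lat, lon
    return tree

# A user standing on the 800 block picks tree 1 and submits their GPS fix.
candidates = trees_near("800 block W 12th Ave", trees)
tag_tree(candidates[0], 49.2615, -123.1270)
print(trees[0])
```

Once enough records carry coordinates, the same data can be loaded into an Adopt-a-Hydrant instance as the things citizens sign up to adopt.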

I consider this a fairly ambitious project, but it could end up engaging a number of stakeholders – students, arborists, retirees, and others – that don’t normally engage in open data.

I know the crew organizing a hackathon in Hamilton, Ontario are also looking to create an instance of Adopt-a-Hydrant, which is awesome. We should both track what worked and what didn’t so that the kinks in Adopt-a-Hydrant can be worked out. More users and developers like us will help refine it further.

If you are planning a hackathon for the Dec 3rd International Open Data Hackathon, please be sure to update the wiki and join the mailing list. And if you have a project you are planning on working on, please email the list, or me directly; I’d love to blog about it!


Weaving Foreign Ministries into the Digital Era: Three ideas

Last week I was in Ottawa giving a talk at the Department of Foreign Affairs about how technology, new media and open innovation will impact the department’s work internally, across Ottawa and around the world.

While there is lots to share, here are three ideas I’ve been stewing on:

Keep more citizens safe when abroad – better danger zone notification

Some people believe that open data isn’t relevant to departments like Foreign Affairs or the State Department. Nothing could be further from the truth.

One challenge the department has is getting Canadians to register with it when they visit or live in a country its travel reports label as problematic to travel in (sample here). As you might suspect, few Canadians register with the embassy: they are likely unaware of the program, or they travel a lot and simply never get around to it.

There are other ways of tackling this problem that might yield broader participation.

Why not turn the Travel Report system into open data with an API? I’d tackle this by approaching a company like TripIt. Every time I book an airplane ticket or a hotel, I simply forward TripIt the reservation, which it scans and turns into events that automatically appear in my calendar. Since TripIt scans my travel plans, it knows which country, city and hotel I’m staying in; it also knows where I live and could easily ask me for my citizenship. Working with companies like TripIt (or Travelocity, Expedia, etc.), DFAIT could co-design an API into the department’s travel report data that would be useful to them. Specifically, if TripIt could query all my trips against those reports, then any time it noticed I was travelling somewhere the Foreign Ministry had labelled “exercise a high degree of caution” or worse, TripIt could ask me if I’d be willing to let it forward my itinerary to the department. That way I could register my travel automatically, making the service more convenient for me and getting the department more of the information it believes to be critical.
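To sketch what that matching step might look like in Python: none of this reflects a real TripIt or DFAIT API; the advisory levels, country codes, record shapes and function names are all hypothetical, invented to show the core logic of checking an itinerary against an advisory feed.

```python
# Hypothetical advisory feed: country code -> advisory level, the kind
# of data a Travel Report API might expose.
ADVISORIES = {
    "XX": "exercise a high degree of caution",
    "YY": "avoid non-essential travel",
}

# Advisory levels at which the traveller should be prompted to register.
LEVELS_WORTH_FLAGGING = {
    "exercise a high degree of caution",
    "avoid non-essential travel",
    "avoid all travel",
}

def trips_to_flag(itinerary, advisories):
    """Return the trips whose destination carries a notable advisory."""
    return [
        trip for trip in itinerary
        if advisories.get(trip["country"]) in LEVELS_WORTH_FLAGGING
    ]

# A hypothetical itinerary, as a travel service might hold it.
itinerary = [
    {"city": "Safeville", "country": "ZZ"},
    {"city": "Riskton", "country": "XX"},
]

for trip in trips_to_flag(itinerary, ADVISORIES):
    # In the real flow, this is where the service would ask the traveller
    # whether to forward the itinerary to the department's registry.
    print(f"Advisory for {trip['city']}: {ADVISORIES[trip['country']]}")
```

The design point is that the department only needs to publish the advisory data in a structured, queryable form; the matching and the opt-in prompt live with the travel services that already hold the itineraries.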

Of course, it might be wise to work with the State Department so that their travel advisories used a similarly structured API (since I assume TripIt will be more interested in the larger US market than the Canadian one). But facilitating that conversation would be nothing but wins for the department.

More bang for buck in election monitoring

One question that arose during my talk came from an official interested in election monitoring. In my mind, one thing the department should be considering is a fund to help local democracy groups spin up installations of Ushahidi in countries with fragile democracies that are gearing up for elections. For those unfamiliar with Ushahidi, it is a platform developed after the disputed 2007 presidential election in Kenya that plotted eyewitness reports of violence, sent in by email and text message, on a Google map.

Today it is used to track a number of issues, but problems with elections remain one of its core purposes. The department should think about grants that would help spin up a Ushahidi install to enable citizens of the country to register concerns and allegations around fraud, violence, intimidation and so on. It could then verify and inspect issues flagged by the country’s citizens. This would allow the department to deploy its resources more effectively and ensure that its work was speaking to concerns raised by citizens.

A Developer version of DART?

One of the most popular programs the Canadian government runs around international issues is the Disaster Assistance Response Team (DART). In particular, Canadians have often been big fans of DART’s work purifying water after the Boxing Day tsunami in Asia, as well as its work in Haiti. Maybe the department could have a digital DART team: a group of developers that, in an emergency, could help spin up Ushahidi, FixMyStreet or OpenMRS installations to provide some quick but critical shared infrastructure for Canadians, other countries’ response teams and non-profits. During periods of non-crisis the team could work on these projects or support groups like CrisisCommons or OpenStreetMap, helping contribute to open source projects that can be instrumental in a humanitarian crisis.