Category Archives: technology

How to Engage Citizens on a Municipal Website…

Sometimes, it’s nice to be small. The City of Nanaimo has been pushing the envelope on open data and open government for a number of years now.

Recently, I was directed to their new Council Agendas and Minutes webpage. I recommend you check it out.

Here’s why.

At first blush the site seems normal. There is the standard video of the council meeting (cue cheesy local cable access public service announcement), but the meeting minutes underneath are actually broken down to the second, and by clicking on an item you can jump straight to that moment in the meeting.
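For the technically curious, the core trick is simple: each minute entry carries a timestamp, and clicking it seeks the video to that offset. A minimal sketch (the URL scheme and `t` parameter here are invented for illustration – I don’t know Nanaimo’s actual implementation):

```python
# Turn a minute entry's "HH:MM:SS" timestamp into a deep link that jumps
# the meeting video straight to that moment. The base URL and the "t"
# query parameter are hypothetical placeholders.

def timestamp_to_seconds(ts: str) -> int:
    """Convert 'HH:MM:SS' (or 'MM:SS') into total seconds."""
    seconds = 0
    for part in ts.split(":"):
        seconds = seconds * 60 + int(part)
    return seconds

def deep_link(base_url: str, ts: str) -> str:
    """Build a link that seeks the video player to the given timestamp."""
    return f"{base_url}?t={timestamp_to_seconds(ts)}"

print(deep_link("https://example.org/council/2009-07-20", "1:02:30"))
# -> https://example.org/council/2009-07-20?t=3750
```

On the player side, the same number just gets fed to the video’s seek function – nothing exotic, which is part of why two staff members could build it in weeks.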

As anyone who’s ever attended a City Council meeting (or the legislature, or parliament) knows, the 80/20 rule is basically always in effect. About 80% of the time the proceedings are dead boring, and about 20% (often much less) of the time they are exciting or, more importantly, pertinent to you. One challenge with getting citizens engaged at the local level is that they often encounter a signal-to-noise problem: the “noise” (issues a given citizen doesn’t care about) drowns out the “signal” (the relatively fewer issues they do care about).

The City of Nanaimo’s website helps address this problem. It enables citizens to find what matters to them without having to watch or scroll through a long and dry council meeting. Better still, they are given a number of options for sharing that relevant moment with friends, neighbours, allies or colleagues via twitter, facebook, delicious or any number of other social media tools.

One might wonder: can my city afford such a whizbang setup?

Excellent question.

Nanaimo’s modest size (78,692 citizens) suggests a modest IT budget. So I asked Chris McLuckie, a City of Nanaimo public servant who worked on the project. He informed me that the system was built in-house by him and another city staff member, that it uses off-the-shelf hardware and software and so cost under $2,000, and that it took two weeks to code up.

Two weeks?

No million-dollar contract? No 8-month timeline? No expensive new software?

No. Instead, if you’re smart, you might find a couple of local creative citizen-hackers to put something together in no time at all.

And what’s more, because Chris and the City of Nanaimo want to help more cities learn how to think like the web, I bet that if the IT director of any city (or legislative body) asked nicely, they would just give them the code.

So how open is your city? And if it isn’t, does it have $2,000 lying around to change that?

Opendata & Opencities: Proposed panel for SXSWi

Over the past year I’ve been inspired by the fact that an increasing number of cities are thinking about how to more effectively share the data they generate with their citizens.

As most readers of this blog are probably aware, I’ve been engrossed in advising the Mayor’s Office here in Vancouver on the subject and am excited about the progress being made on the City’s open data project.

Since there is so much energy around this topic across North America I thought there might be interest among SXSWers on the opportunities, challenges and benefits surrounding open data.

Here’s my proposed panel; if you think it is a good idea, I’d be elated if you took the time to head over to the panel picker website and vote for it!

Title:

OpenData: Creating Cities That Think Like the Web

Level:

Beginner

Category:

Community / Online Community, Government and Technology, Social Issues, User Generated Content, Web Apps / Widgets

Questions:

  1. What is open data?
  2. How can I effectively mobilize people to get my local government to share data?
  3. How can open data be shared most effectively?
  4. What are the benefits of open data?
  5. What business models are emerging around municipal open data?
  6. How can citizens/citizen coders help government bureaucracies share open data?
  7. How do government bureaucracies centered on secrecy and security shift to being interested in open?
  8. How is open data changing the role of government?
  9. How is open data changing the relationship between citizens and government?

Description:

Across North America municipal governments are opening up their data and encouraging citizens to create online applications, mash-ups and tools to improve city services and foster engagement. Panelists from cities leading this open movement will discuss the challenges, lessons, benefits and opportunities of open data and open government.

Some of the people I’d love to have as panelists include:

Kelly Pretzer (@kellypretzer) is a City of San Francisco employee who has been working with a team on an open data initiative with the city. You can track their work here.

Peter Corbett (@corbett3000) is CEO of iStrategyLabs, the organization that ran the Apps for Democracy competition in Washington DC. If Peter can’t make it, we’d hope iStrategyLabs could send a representative.

Ryan Merkley (@ryanmerkley) is a political advisor to the Mayor of Toronto and is helping oversee the Open Toronto Initiative.

Myself! (@david_a_eaves) I’ve been advising the Mayor of Vancouver on open government and open data and co-drafted the Open Motion, passed by the City of Vancouver on May 21st.

It would, of course, be nice to have Vivek Kundra, but I’ll confess, I’m not sure I have that kind of pull…

5 Ways to get to the Next Million Mozillians

Mark Surman, Executive Director of the Mozilla Foundation, has been ruminating on:

“how Mozilla can actively encourage large numbers of people to participate on making the web more open and awesome.”

For a long time I’ve held that supporters of an Open Web are part of a social movement, and that mobilizing them could be a helpful part of a strategy for preserving and promoting the openness of the web. More importantly, I think the rise of open source, and in particular the rise of Mozilla, tracks shockingly well against the structure of a social movement.

So if we want to increase interest in the openness of the web and believe that recruiting the next million Mozillians can help us accomplish that, then I think there are three things any strategy must do:

1. Increase the range of stakeholders involved (this is part of why I write about women in open source so much), as this gives open web supporters more leverage when negotiating with those who threaten the web’s openness or who influence its development

2. Connect nebulous ideas like “security” and “openness” to tangible experiences people (users?) can relate to and to core values they believe in. (This is why Mark’s “seatbelt moment” narrative is so awesome)

3. Outline actions that stakeholders and supporters can take.

So, with the (not always successful) intent to focus on these three objectives, here are five ideas I think could help us:

Idea 1: Partner with Consumer Reports and help shape the criteria by which they evaluate ISPs

One key to ensuring an open web is ensuring that people’s connection to the web is itself open. ISPs are a critical component of this ecosystem. As some observers have noted, ISPs engage in all sorts of nefarious activities such as bandwidth shaping, throttling, etc. Ensuring that the net stays neutral feels like a critical part of ensuring it stays open.

One small way to address this would be to make the neutrality of a network part of the evaluation criteria for Consumer Reports reviews of ISPs. This would help make the openness of an ISP a competitive differentiator, would raise the profile of this problem and would engage a group of people (Consumer Reports readers) who are probably not generally part of the Mozilla community.

Idea 2: Invest in an enterprise-level support company for Firefox & Thunderbird.

Having a million citizens supporting Firefox and Mozilla is great, but if each of those supporters looks and acts the same then their impact is limited. Successful movements are not just large, they are also diverse. This means having a range of stakeholders to help advocate for the open web. One powerful group of stakeholders is large enterprises and governments. They have money, they have clout and they have large user bases. They are also – as far as I can tell – one of the groups with which Firefox has had the hardest time achieving market share.

From my limited experience working with governments, adopting Firefox is difficult. There is no one to sign an SLA with, no dedicated support desk and no assurance that problems will be escalated to the right developer within a fixed time period. (Many of these challenges are highlighted by Tauvix in the comment section of this post.) We could spend our time arguing about whether these issues are legitimate or whether those large organizations simply need a culture shift. But such a shift will take a LONG time to materialize, if it ever does.

Finding a way to satisfy the concerns of large organizations – perhaps through a Red Hat-type model – might be a good way to invest Mozilla Foundation money. Not only could there be a solid return on this investment, but it could bring a number of large, powerful companies and governments into the Mozilla camp. These would be important allies in the quest for an open web.

Idea 3: Promote add-ons that increase security, privacy and control in the cloud.

One reason behind Mozilla’s enormous success is that the community has always provided innovative technical solutions to policy/privacy/openness problems. Don’t like the way Microsoft is trying to shape the internet? Here, use a better browser. Don’t want to receive targeted advertising on websites? Here, download this plug-in. Don’t want to see any advertising? Here, download this plug-in. Not sure if a website is safe? Here, use this plug-in. In short, Mozilla has allowed its software to serve as a laboratory for experimenting with new and interesting ways to let users control their browsing experience.

While not a complete solution, it might be interesting to push the community to explore how Greasemonkey scripts, Jetpack plug-ins, or ordinary plug-ins might provide users with greater control over the cloud. For example, could a plug-in create automatic local backups of Google Docs on your computer? Could a Thunderbird plug-in scan Facebook messages and let users choose a medium (say, email) to respond with? Fostering a “product line” of cloud-specific plug-ins that increase users’ control over their experience might be an interesting place to start.
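As a sketch of the first idea – automatic local backups – the logic is little more than “fetch an export URL on a schedule and write timestamped copies to disk.” Everything below (document names, export URLs) is a hypothetical placeholder, not a real service API:

```python
# Sketch of the local-backup idea: save timestamped copies of cloud
# documents to disk. The document list and export URLs are invented
# placeholders -- a real plug-in would use the service's actual export
# mechanism and run this on a timer.
import datetime
import pathlib
import urllib.request

DOCS = {
    "meeting-notes": "https://docs.example.com/meeting-notes/export?format=txt",
}
BACKUP_DIR = pathlib.Path("cloud_backups")

def backup_all(fetch=lambda url: urllib.request.urlopen(url).read()) -> list:
    """Fetch each document and write a timestamped copy; return paths written."""
    BACKUP_DIR.mkdir(exist_ok=True)
    stamp = datetime.datetime.now().strftime("%Y%m%d-%H%M%S")
    written = []
    for name, url in DOCS.items():
        path = BACKUP_DIR / f"{name}-{stamp}.txt"
        path.write_bytes(fetch(url))  # fetch is injectable for testing
        written.append(path)
    return written
```

The point isn’t the ten lines of code – it’s that an add-on ecosystem makes this kind of user-side control cheap to experiment with.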

Idea 4: Create and brand the idea of an openness audit

As more and more personal data ends up on servers controlled by companies, governments and non-profits, there are real concerns about how secure and private this information is. Do you know that Google isn’t peeking at your Google Docs every once in a while? Do you know if you’ll ever be able to delete your personal information from Facebook?

These are legitimate questions. Outlining guidelines for how companies manage privacy and security, and then creating an audit system, might be an interesting way to nudge companies towards adopting stronger standards and policies in the cloud. This might also increase public awareness and encourage an upward spiral among competing service providers. Working with companies like KPMG and Deloitte, Mozilla and others could help foster a new type of audit, one that would allow consumers to easily distinguish between cloud service providers that respect their rights and those that don’t.

Idea 5: Let’s use that Firefox launch screen to create the next million Mozillians

At the moment, when you download and install Firefox, the first website you see when you load the program congratulates you on downloading it, tells you that you are helping keep the internet open and outlines some of Firefox’s new features. We could do more. Why not prompt people to join a “Mozillians” club where they will be kept up to date on threats and opportunities around the open web? Or maybe we should list three actions (with hyperlinks) they can take to increase the openness of the web (say, upgrade a friend, send a form letter to their member of congress and read an intro article on internet security).

With maybe 300+ million people likely to download Firefox 3.5, that’s a lot of people we could be mobilizing to be more active, technically, socially and politically, around an open web.

That’s a start… I’ll keep brainstorming more ideas, but in the interim please feel free to let me know if you think any of these have real problems and/or are bunk.

Creating the Open Data Bargain in Cities

Embedded below is the talk I’ve given to both community hackers (at Open Web Vancouver) as well as City of Vancouver Staff regarding the opportunities and challenges around open data and the open motion. (Here’s an update on where Vancouver is at courtesy of some amazing work by city staff).

For those willing to brave the presentation (or simply fast-forward to the end), the piece I feel is most important is the talk’s last section, which outlines what I term “The Bargain,” a reference to the informal contract Clay Shirky says exists between every Web 2.0 site and its users.

The bargain comes last, because it matters only if there is a promise (open and shared data) and a set of tools (applications languages) that are already working together. The bargain is also the most complex aspect of a functioning group, in part because it is the least explicit aspect and in part because it is the one the users have the biggest hand in creating, which means it can’t be completely determined in advance… A bargain helps clarify what you can expect of others and what they can expect of you.

Clay Shirky in Here Comes Everybody (my italics, page 270)

I believe that in an open city, a similar bargain exists between a government and its citizens. To make open data a success and to engage the community, a city must listen, engage, ask for help and, of course, fulfil its promise to open data as quickly as possible. But this bargain runs both ways. The city must do its part, but so, on the flip side, must the local tech community. They must participate, be patient (cities move slower than tech companies), offer help and, most importantly, make the data come alive for each other, policy makers and citizens through applications and shared analysis.

Articles I'm digesting 24/7/2009

Been a while since I’ve done one of these and I’ve got a lot of great pieces I’ve been reading. So let’s get to it.

Designs on Policy by Allison Arieff (via David B.) and TED Talk: Are we in Control of our own Decisions? by Dan Ariely

I keep hearing about the interaction between policy and design (most flatteringly, an architecture professor said I had “a designer’s mind” the other day), and so over the past few years I have tried (with some success) to read as much as I can about design. David B sent me the Arieff piece, which, of course, weds my passion for public policy with design. One thing I like is the way the piece doesn’t try to boil the ocean – it doesn’t claim (as in other places) that good design will solve every problem, just that it will help mitigate it. Most intriguing for me is this line:

“It feels weird to have to defend design’s importance, yet also completely necessary. The United Kingdom has had a policy in place since 1949; Japan since 1956. In countries like Finland, Sweden, South Korea and the Netherlands, design is a no-brainer, reflected by the impeccable elegance, usability and readability of everything in those countries from currency to airport signage.”

A design policy? How civilized. That’s something I could get behind – especially after listening to Dan Ariely’s TED talk, which is downright frightening at moments, given how susceptible our decisions (and, most disconcertingly, the decisions of our doctors, dates and who knows who else) are to the layout and presentation of the choice.

Lost in the Cloud by Jonathan Zittrain

A few months ago I was in Ottawa and – surprisingly and unplanned – ended up at a pub with Richard Stallman. I asked him what he thought of Cloud Computing (a term he believes is too vague to be helpful); he was nonetheless viscerally opposed to it. Many of the reasons he cites are covered by Zittrain in this thoughtful piece. The fact is, Cloud Computing (or whatever term you wish to use) is very convenient, and it carries with it huge privacy, security and access challenges. This is potentially the next big challenge for those of us who support an Open Internet – the possibility of the internet being segmented into a series of walled gardens controlled by those who run the cloud servers is real and must be thought through. If you care about public policy and/or are a geek, read this.

Is it Time to Get Rid of the Foreign Service Designation?

Am I reading my own articles? No. I am, however, absorbed by the fascinating and constructive conversation taking place – mostly involving public servants – in the comments section underneath. Here are just some snippets:

  • “For 8 years I worked at DFAIT, observing and participating in the culture within the walls of a building named after a diplomat that Wikipedia states “is generally considered among the most influential Canadians of the 20th century.” Sadly, the elitism (whether earned or not) is only the cause of a bigger problem; lack of desire to collaborate, and almost no desire to change in an era where the only constant is change.”
  • “…as I left the issue of the FS classification was quietly but passionately part of the watercooler discussion. From my perspective, in spite of a nasty AG report on the dismal state of affairs of HR at DFAIT, the department has more pressing problems, such as credibility with central agencies, a coherent sense of mission and talent attraction and retention.”
  • “I am also a bit puzzled by people who saw your piece as an attack on DFAIT – you’re advocating for human resource reform to improve the department, after all. I’m still not sure why you think DFAIT is required though, or why Canadian foreign policy suffers when departments formulate it without involving DFAIT.”
  • “It’s good to see that even Craig Weichel, President of PAFSO, is open to your suggestion that it might be good to have more foreign service officers circulate through other government departments…”

Putting the Cart Before the Horse by Peter Cowan

A great blog post about the lessons of implementing social media in a government agency. Peter Cowan – an Open Everything alum – is part of the team at Natural Resources Canada that has been doing amazing work (NRCan is one of the most forward-looking ministries in the world in this regard). Peter’s piece focuses on how misunderstanding the “business case” for social media often trips up large government bureaucracies. This abbreviated but extended quote on why traditional IT business cases don’t work or aren’t necessary is filled with great thoughts and comments:

“They (Social Media tools) are simple and viral and they cost very little to implement so the traditional requirements for upfront business needs definition to control risk and guide investment are not as important. In fact it would take more time to write a proposal and business case than to just put something out there and see what happens.

More importantly though social media are fundamentally new technologies and the best way to understand their business value is to get them into the hands of the users and have them tell you. To a large degree this is what has happened with the NRCan Wiki. Most of the innovative uses of the wiki came from the employees experimenting. They have not come from a clearly articulated business needs analysis or business case done in advance.

In fact, determining business needs in advance of having a tool in hand may actually lead to status quo approaches and tools. There is the famous Henry Ford quote that goes something like “if I had asked people what they wanted in a car they would have said faster horses”. We social media folks usually deploy this quote to highlight the weakness of focusing too much on responding to people’s perceptions of their existing business needs as a determinant of technology solutions, since people invariably define their needs in terms of improving the way they are already doing things, not how things could be done in a fundamentally new way.

Genius.

Google’s Microsoft Moment by Anil Dash

A fantastic piece about how Google’s self-perception is causing it to make strategically unsound choices at the same time as its public perception may be radically shifting (from cute fuzzy Gizmo into mean nasty Stripe). A thoughtful critique and a great read on how the growth and maturation of a company’s culture needs to match its economic growth. I’ve added Anil Dash to my must-read blogs – he’s got lots of great content.

Open Cities – the Counter Reaction

The Washington Monthly has an interesting piece about how some bureaucracies are reacting (albeit unsurprisingly) against open data initiatives. The article focuses on the data used by one application, Stumble Safely, which “helps you find the best bars and a safe path to stumble home on” by mashing together DC Crime Data, DC Road Polygons, DC Liquor Licenses, DC Water, DC Parks, and DC Metro Stations.

However, arming citizens with precise knowledge doesn’t appear to make one group of people happy: The Washington, D.C. police department. As the article notes:

But a funny thing has happened since Eric Gundersen launched his application: Stumble Safely has become less useful, rather than more so. When you click on the gray and red crime-indicating dots that have appeared on the map in the past few months, you don’t get much information about what exactly happened—all you get is a terse, one-word description of the category of the incident (”assault,” or “theft”) and a time, with no details of whether it was a shootout or just a couple of kids punching each other in an alley.

This isn’t Gundersen’s fault—it’s the cops’. Because while Kundra and the open-data community were fans of opening up the city’s books, it turned out that the Metropolitan Police Department was not. Earlier this year, as apps like Stumble Safely grew in number and quality, the police stopped releasing the detailed incident reports—investigating officers’ write-ups of what happened—into the city’s data feed. The official reason for the change is concern over victims’ and suspects’ privacy. But considering that before the clampdown the reports were already being released with names and addresses redacted, it’s hard to believe that’s the real one. More likely, the idea of information traveling more or less unedited from cops’ keyboards to citizens’ computer screens made the brass skittish, and the department reacted the way bureaucracies usually do: it made public information harder to get. The imperatives of Government 2.0 were thwarted by the instincts of Government 1.0.

This is just one in a long list of ways that old-style government (1.0) is reacting against technology. The sad end result, however, is that the action taken by the police doesn’t reduce crime; it just reduces the public’s confidence in the police force. This is a small example of the next big debate that will take place at all levels of government: will your government try to control information and services, or will it develop trust by being both accountable and open to others building on its work? You can’t have it both ways, and I suspect citizens – particularly creatives – are going to strongly prefer the latter.

This is a crosspost from my Open Cities Blog at CreativeClass.com

Women in Open Source – the canary in the coal mine

The other month I had the pleasure of watching Angie Byron give the keynote lecture at Open Web Vancouver on women in open source. The synopsis from Open Web Vancouver:

The open source world is rich with opportunities: Working with people of all cultures from all over the world; Collaborating with some of the biggest and brightest minds on the ultimate solutions to complicated problems; Changing the world by providing free tools for organizations such as non-profits, educational institutions, and governments; Building up marketable skills and practical knowledge.

But yet, so many women are missing out. Why is that? And what can we do to change it? This talk will endeavour to answer these questions, as well as provide tips and strategies for women who want to dip their toe into the waters.

I wish I could embed the video on my blog but alas, it is not possible, so I encourage you to wander over to Angie’s blog and watch the video there.

The important lesson of Angie’s talk is that it isn’t just about women. The power and capacity of an open source community is determined by the quantity and quality of its social capital. If a community fails to invest in either – if it turns away qualified people because its culture (however unintentionally) discriminates against a gender, race or group – then it limits its growth and potential. The same is true of a community culture that doesn’t allow certain groups to improve their social capital. These may seem like large, intangible questions, but they are not. I’m sure every open source community turns some people off. Sometimes the reasons are good – the fit might not be right. But sometimes, I’m certain, the reasons are not good. And the community may never get the feedback it needs, because the moderate, productive person who walks away doesn’t create a scene – they may just quietly disappear (or, worse, never show up to begin with).

So Angie’s talk matters not just because women are missing out (although this is true, important and urgent). It matters because women are just the canary in the coal mine. Millions of people are missing out – people with ideas and the ability to make contributions get turned away by a bad experience, or by a community culture that is off-putting, too aggressive, unwelcoming or unsupportive.

For me it’s opened up a whole new way of thinking about my writing on open source communities. After Angie’s talk I sought her out, as I felt we’d been talking about the same things. I’m interested in developing norms, skills and tools within an open source community that allow more people to participate and collaborate more effectively – in short, in how we think about community management. Angie is talking about developing open source communities that support and engage women. Working towards solving one helps us solve the other. So if you wake up today and notice there are no (other) women on the IRC channel with you… maybe we should, both individually and collectively as a community, engage in a little introspection and think about what we could change. Doing so won’t only make the community more inclusive, it will make it more productive and effective as well.

Open Data at the City of Vancouver – An Update 16/7/2009


For those interested in the progress around the Open Motion (or Open3 as city staff have started to call it) I’ve a little update.

Last week, during a visit to city hall to talk about the motion, I was shown a preview of the website the city has created to distribute and share data sets. For those unsure what such a website looks like, the baseline I’m measuring the city against is, of course, Washington DC’s website. At the moment the city’s prospective website looks more like the (also impressive) beta site the City of Nanaimo set up after ChangeCamp – a little simpler and with far fewer data sets – but it is a first step.

As an aside, kudos to the City of Nanaimo team, which has been pushing open data, and especially geo-data, for quite some time, as this must-read Time magazine piece (yes, a must-read Time magazine piece) will attest.

Anyway… back to Vancouver. The fact that the city has a beta website with a (very) modest amount of data ready to launch is a testament to the hard work and effort of the City’s IT staff. Rarely in my work with governments have I seen something move so quickly, so needless to say… I’m quite excited. At the moment I don’t know when the beta data site will go live – there are still a few remaining issues being worked on – but as soon as it launches I will write a blog post.

In the interim, big kudos should also go to the City Archives, who posted a number of videos online and created their own YouTube channel. They received so much traffic over the videos that the servers almost ground to a halt. Awesome, I say. It just goes to show how much interest there is out there.

Also exciting: my post on how open data makes garbage collection sexy has inspired two local hackers (Luke and Kevin) to scrape the city’s garbage calendar and hand-create digital versions of the city’s garbage maps, in order to build the app I spec’ed out in the blog post. (I’ll have more on that, including links, in a few weeks.) Luke also suggested I start recording other app ideas that come to me, so over at the Vancouver Data Google group – created on the fly by local coders in the audience during my and Andrea’s presentation at Open Web Vancouver – I’ve asked people to share their ideas for applications (mobile or desktop) they’d like to see created with open data.

Sooooo… if there is an app you’d like to see created, please post it to the Google group, send me an email, or write it in the comments below. No guarantees it will be created, but I’m hoping to help organize some hack-a-thons (or, as my city friends prefer, code sprints). Having some ideas for people to sink their teeth into is always helpful.

How to predict the "Fixability" of a Bugzilla Submission

My friend Diederik van Liere has written a very, very cool Jetpack add-on that calculates the probability that a bug report will result in a fixed bug.

The skinny on it is that Diederik’s app bases its prediction on the bug reporter’s experience, their past success rate, the presence of a stack trace and whether the bug reporter is a Mozilla affiliate. These variables appear to be strong and positive predictors of whether a bug will be fixed. The add-on can be downloaded here and its underlying methodology is explained in this blog post.
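For the curious, a toy version of such a scoring model might look like the sketch below. The weights and bias are invented for illustration – Diederik’s actual coefficients aren’t published in this post:

```python
import math

# Illustrative logistic-regression weights over the four predictors the
# post describes. These numbers are placeholders, not the real model.
WEIGHTS = {
    "reports_filed": 0.02,        # reporter's experience
    "past_fix_rate": 2.5,         # share of their past bugs that were fixed
    "has_stack_trace": 1.2,       # 1 if a stack trace is attached
    "is_mozilla_affiliate": 0.8,  # 1 if reporter is a Mozilla affiliate
}
BIAS = -2.0

def fix_probability(bug: dict) -> float:
    """Logistic score: probability this report leads to a fixed bug."""
    z = BIAS + sum(WEIGHTS[k] * bug.get(k, 0) for k in WEIGHTS)
    return 1 / (1 + math.exp(-z))

novice = {"reports_filed": 1, "past_fix_rate": 0.0,
          "has_stack_trace": 0, "is_mozilla_affiliate": 0}
veteran = {"reports_filed": 50, "past_fix_rate": 0.8,
           "has_stack_trace": 1, "is_mozilla_affiliate": 1}
assert fix_probability(veteran) > fix_probability(novice)
```

The interesting design question is less the math than what you do with the score – which is exactly the triage-versus-marginalization tension discussed below.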

One way the add-on could be helpful: it would enable the Mozilla community to focus its resources on the most promising bug reports. Volunteer coders with limited time who want to show up and take ownership of a specific bug would probably find this add-on handy, as it would help them spend their precious volunteer time on bugs that are likely well thought through, documented effectively and submitted by someone who will be accessible and able to provide input if necessary.

The danger, of course, is that a tool like this might further reinforce (what I imagine is) a power-law-like distribution of bug submitters. The add-on would allow those who are already the most effective bug submitters to get still more attention, while first-time submitters, or those who are still learning, may not receive sufficient attention (coaching, feedback, support) to improve. Indeed, one powerful way the tool might be used (and which I’m about to talk to Diederik about) is to determine whether there are classes of bug submitters who are least likely to be successful. If we can find common traits among them, it might be possible to identify ways to better support them and/or enable them to contribute to the community more effectively. Suddenly a group of people who have expressed interest but have been inadvertently marginalized could be brought more effectively into the community. Such a group might be the lowest-hanging fruit in finding the next million Mozillians.

How Open Data even makes Garbage collection sexier, easier and cheaper

At present the City of Vancouver only shares its garbage schedule (which it divides into north and south) as a PDF file. This is a pity, as it means no one can build any apps around it. Imagine a website or iPhone app that mashed up Google Maps with a constantly up-to-date city garbage pickup schedule. With such an application one could:

  • Simply punch in your address and find out your garbage zone (no more guessing if you live on the border of a zone)
  • Get an email or SMS notification 15 minutes, 1 hour, 12 hours (whenever!) before pick up
  • Download the garbage schedule into your calendar using XML, HTML or iCal
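The calendar option is the easiest of these to sketch: iCalendar is just structured text, so generating a feed per zone is a few lines of code. The zone name and dates below are invented for illustration:

```python
import datetime

def ical_feed(zone: str, pickup_dates: list) -> str:
    """Render a list of pickup dates as a minimal iCalendar feed."""
    lines = ["BEGIN:VCALENDAR", "VERSION:2.0",
             "PRODID:-//example//garbage-schedule//EN"]
    for d in pickup_dates:
        lines += [
            "BEGIN:VEVENT",
            f"UID:{zone}-{d.isoformat()}@example.org",
            f"DTSTART;VALUE=DATE:{d.strftime('%Y%m%d')}",
            f"SUMMARY:Garbage pickup ({zone})",
            "END:VEVENT",
        ]
    lines.append("END:VCALENDAR")
    return "\r\n".join(lines)

# Hypothetical north-zone pickup dates
dates = [datetime.date(2009, 7, 20), datetime.date(2009, 7, 27)]
print(ical_feed("north", dates))
```

Point a calendar program at a URL serving this text and every resident gets a self-updating schedule – all the city has to publish is the dates.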


Let’s face it, everyone could do with a garbage day reminder. I can imagine there are thousands of Vancouverites who’d sign up for an email notification.

Maybe this all seems very simple – nice, but unimportant. But this is about more than just creating convenience. What are the implications of such an application?

For citizens:

  • Let’s face it, for many of us it would be convenient and nice to get a reminder of when the garbage is going to be picked up
  • It would improve citizens’ appreciation of city services
  • It might also increase the likelihood citizens will recycle, as the notification would enable them to better plan and prepare for garbage pickup

For the city & taxpayers

  • Every day hundreds of Vancouverites forget to put their garbage out for pickup. If garbage isn’t picked up, it piles up. This creates several new risks and costs for the city, including an increased likelihood of rodents and health hazards, and an increased risk that some residents will dispose of their garbage inappropriately or illegally
  • When garbage is not put out, an expensive set of city assets (the garbage truck, fuel, the crew) passes by unused. This means taxpayers’ money is not used as efficiently as it could be.
  • Moreover, when garbage isn’t put out there will be twice as much at that house the next week. Multiply that by hundreds of houses and very quickly the city must deploy extra garbage trucks to deal with the unusually large volume of garbage – costing city taxpayers.

What would be the total value/savings of such an application? Hard to gauge. But add up the time saved by citizens better able to plan around garbage pickup, a small reduction in illegal garbage disposal, a small increase in recycling and a slight increase in pickup efficiency, and I’m sure the total value would be in the low millions, with direct savings for the city in the low hundreds of thousands of dollars per year. That is not insignificant – especially if all the city had to do was free the data and let an intrepid hacker code up the website.
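To make that estimate concrete, here’s one back-of-envelope calculation – every number below is an illustrative assumption, not city data:

```python
# Rough, purely illustrative estimate of direct savings from reminders.
households = 150_000           # assumed number of serviced households
missed_rate_drop = 0.01        # assume reminders cut missed pickups by 1 point
pickups_per_year = 52          # weekly pickup
avoided_misses = pickups_per_year * households * missed_rate_drop
cost_per_missed_pickup = 2.00  # assumed $: wasted truck time + double volume later
direct_savings = avoided_misses * cost_per_missed_pickup
print(f"~${direct_savings:,.0f}/year in direct savings")
# -> ~$156,000/year in direct savings
```

Swap in your own assumptions and the order of magnitude holds up: even tiny per-household effects add up across a whole city.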

Of course, it doesn’t end there. A reliable source tells me the city collects far more data about garbage collection. For example, when the driver can’t pick up a garbage can, they make an entry on their device explaining why (e.g., it is under a tree). This entry is sent via a wireless connection back to a City database, along with the highly precise coordinates of the truck at that moment. Then, when a resident calls to ask why the crew did not pick up the garbage can from their residence, the operator can bring up the address on a map and pinpoint the problem.
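A self-serve “Why didn’t my garbage get picked up?” lookup could then be as simple as finding the logged skip entry nearest the caller’s coordinates. A sketch with invented records:

```python
import math

# Hypothetical "couldn't pick up" entries logged by the truck:
# (latitude, longitude, reason). Coordinates and reasons are invented.
SKIP_LOG = [
    (49.2610, -123.1139, "can blocked by parked car"),
    (49.2465, -123.0830, "can under low tree branch"),
]

def nearest_skip(lat: float, lon: float):
    """Return the logged skip entry closest to the given address."""
    def dist(rec):
        # Equirectangular approximation is plenty accurate at city scale.
        dx = (rec[1] - lon) * math.cos(math.radians(lat))
        dy = rec[0] - lat
        return math.hypot(dx, dy)
    return min(SKIP_LOG, key=dist)

print(nearest_skip(49.2463, -123.0825)[2])
```

Geocode the resident’s address, look up the nearest record, and the website answers the question before anyone picks up a phone.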

Such data could also be exposed on the website via an option like “Why didn’t my garbage get picked up?” By sharing this data the city could reduce its call volume (and thus costs). With open data, the possibilities, savings and conveniences are endless.