Category Archives: technology

Some theories on why Canadians are the #1 users of YouTube (it's not all good)

In theory I’m on break – trying to recharge my batteries, summit Mount Inbox Zero and finish off a couple of papers I owe various good people – but a few people have sent me links to this story (same content here at the CBC) about how Canadians embrace the web like few other citizens of the world.

Naturally I’m thrilled and unsurprised. Canadians live in a large country and connectivity has always been something that has driven us. Indeed the country as we know it only exists because of a deal on connectivity – my own province of British Columbia agreed to enter the Dominion only if a transcontinental railway was built to connect it with the rest of the emerging country. Connectivity is in our blood.

There is, however, another reason I suspect Canadians have taken to the web, and it has to do with our monopolies and content regulation.

The article notes that Canada is the number one viewer of YouTube videos:

“In Canada, YouTube per capita consumption of video is No. 1 in the world, it’s just absolutely crazy in terms of how passionate Canadians are about YouTube,” said Chris O’Neill, Canada’s country director for Google.

I wonder, however, if this is because of Canada’s proximity to, and familiarity with, American-created content, combined with our limited access to that content. The CRTC restricts Canadians’ access to US channels (and, as a result, TV shows). Consequently, much as I argued that the continued success of Blockbuster in Canada is a sign not of effective corporate management but of poor innovation strategy and telecommunication regulation, Canadians may be flocking to YouTube because they can’t access the content they want through more traditional channels.

If true (and I concede I don’t know what Canadians are watching on YouTube) then, on the bright side, this is good news: Canadian consumers are able to get access to what they want, regardless of how the government tries to shape their tastes. Indeed, I suspect that American content isn’t the only thing driving YouTube traffic. In a country of immigrants, I’m sure that new (and longstanding) Canadians from a range of backgrounds use YouTube to stay on top of culture, shows and other content from their countries of origin. If all this is helping Canadians become more web savvy and appreciative of the benefits of an open web – then all the better!

On the flip side, this could be a sign that a whole series of Canadian companies (and the jobs they create) are imperiled because they refuse to innovate as quickly as Canadians would like. This isn’t a reason to preserve them, but it is a reason for us to start demanding more from the executives of these companies.

The False Choice: Bilingualism vs. Open Government (and accountability)

Last week a disturbing headline crossed my computer screen:

B.C. RCMP zaps old news releases from its website

2,500 releases deleted because they weren’t translated into French

1) The worst of all possible outcomes

This is a terrible outcome for accountability and open government. When we erase history we diminish accountability and erode our capacity to learn. As of today, Canadians have a poorer sense of what the RCMP has stood for, what it has claimed and what it has tried to do in British Columbia.

Consider this. The Vancouver Airport is a bilingual designated detachment. As of today, all press releases that were not translated have been pulled down. This means that any press release related to the national scandal that erupted after the death of Robert Dziekański – the Polish immigrant who was tasered five times by the RCMP – is no longer online. Given the shockingly poor performance of the RCMP in managing (and telling the truth about) this issue, this concerns me.

Indeed, I can’t imagine anyone thinks this is a good idea.

The BC RCMP does not appear to think it is a good idea. Consider their press officer’s line: “We didn’t have a choice, we weren’t compliant.”

I don’t think there are any BC residents who believe they are better served by this policy.

Nor do I think my fellow francophone citizens believe they are better served by this decision. Now no one – francophone or anglophone – can find these press releases online. (More on this below.)

I would be appalled if a similar outcome occurred in Quebec or a francophone community in Manitoba. If the RCMP pulled down all French press releases because they didn’t happen to have English translations, I’d be outraged – even if I didn’t speak French.

That’s because the one thing worse than not having the document in both official languages is not having access to the document at all. (And having it hidden in some binder in a barracks that I have to call or visit doesn’t even hint at being accessible in the 21st century.)

Indeed, I’m willing to bet almost anything that Graham Fraser, the Official Languages Commissioner – who is himself a former journalist – would be deeply troubled by this decision.

2) Guided by Yesterday, Not Preparing for Tomorrow

Of course, what should really anger the Official Languages Commissioner is an attempt to pit open and accountable government against bilingualism. This is a false choice.

I suspect that the current narrative in government is that translating these documents is too expensive. If one relies on government translators, this is probably true. The point is, we no longer have to.

My friend and colleague Luke C. pinged me after I tweeted this story saying “I’d help them automate translating those news releases into french using myGengo. Would be easy.”

Yes, myGengo would make it cheap at 5 cents a word (or 15 if you really want to go overboard). But even smarter would be to approach Google. Google Translate – especially between French and English – has become shockingly good. Perfect… no. Of course, this is what the smart and practical people on the ground at the RCMP were doing until the higher-ups got scared by a French CBC story that was critical of the practice. A practice that was ended even though it did not violate any policies.

The problem is there isn’t going to be more money to do translation – not in a world of multi-billion dollar deficits and in a province that boasts 63,000 French speakers. But Google Translate? It is going to keep getting better and better. Indeed, the more it translates, the better it gets. If the RCMP (or the Canadian government) started putting more documents through Google Translate and correcting them, it would become still more accurate. The best part is… it’s free. I’m willing to bet that if you ran all 2,500 of these press releases through Google Translate right now, 99% of them would come out legible and of a standard good enough to share. (Again, not perfect, but serviceable.) Perhaps the CBC won’t be perfectly happy. But I’m not sure the current outcome makes them happy either. And at least we’ll be building a future in which they will be happy tomorrow.
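To give a sense of how little machinery this would take, here is a minimal sketch (in Python) of batch-translating releases through Google’s Translate REST API. The API key and sample release text are placeholders, and I’m assuming the standard v2 endpoint:

    import requests

    API_KEY = "YOUR_API_KEY"  # placeholder: a real Google Translate API key goes here
    ENDPOINT = "https://translation.googleapis.com/language/translate/v2"

    def translate_release(text, source="en", target="fr"):
        """Send one press release through the API and return a French draft."""
        resp = requests.get(ENDPOINT, params={
            "q": text,
            "source": source,
            "target": target,
            "format": "text",
            "key": API_KEY,
        })
        resp.raise_for_status()
        return resp.json()["data"]["translations"][0]["translatedText"]

    # Machine-translate every release; a human translator then corrects the
    # drafts rather than starting from scratch.
    releases = ["RCMP seek witnesses to a collision on Granville Street..."]  # illustrative
    french_drafts = [translate_release(r) for r in releases]

The script isn’t the point; the point is that the expensive drafting step is now nearly free, leaving humans only the correction pass.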

The point here is that this decision reaffirms a false binary: one based on 20th-century assumptions, where translations were expensive and laborious. It holds us back and makes our government less effective and more expensive. But worse, it ignores an option that embraces a world of possibilities – the reality of tomorrow. By continuing to automatically translate these documents today, we’d continue to learn how to use and integrate this technology now, and push it to get better, faster. Such a choice would serve the interests of both open and accountable government and bilingualism.

Sadly, no one at the head office of the RCMP – or in the federal government – appears to have that vision. So today we are a little more language, information and government poor.

Three asides:

1) I find it fascinating that the media can get mailed a press release that isn’t translated but the public is not allowed to access it on a website until it is – this is a really interesting form of discrimination, one that supports a specific business model and has zero grounding in the law, and indeed may even be illegal given that the media has no special status in Canadian law.

2) Still more fascinating is how the RCMP appears to be completely unresponsive to news stories about inappropriate behavior in its ranks, like, say, the illegal funding of false research to defend the war on drugs, but one story about language politics causes the organization to change practices that aren’t even in violation of its policies. It is sad to see still more evidence that the RCMP is one of the most broken agencies in the federal government.

3) Thank you to Vancouver Sun Reporter Chad Skelton for updating me on the Google aspect of this story.

Visualizing Firefox Plugins Memory Consumption

A few months ago Mozilla Labs and the Metrics Team, together with the growing Mozilla Research initiative, launched an Open Data Visualization Competition.

Using data collected from Test Pilot users (people who agreed to share anonymous usage data with Mozilla and test pilot new features) Mozilla asked its community to think of creative visual answers to the question: “How do people use Firefox?”

As an open data geek and Mozilla supporter the temptation to try to do something was too great. So I teamed up with my old data partner Diederik Van Liere and we set out to create a visualization. Our goals were simple:

  • have fun
  • focus on something interesting
  • create something that would be useful to Firefox developers and/or users
  • advance the cause for creating a Firefox open data portal

What follows is the result.

It turns out that – in our minds – the most interesting data set revolved around plugin memory consumption. Sure, this sounds boring… but plugins (like Adobe Reader, QuickTime or Flash) are an important part of the browser experience – with them we engage with a larger, richer and more diverse set of content. Plugins, however, also impact memory consumption and, consequently, browser performance. Indeed, some plugins can really slow down Firefox (or any browser). If consumers had a better idea of how much performance would be impacted they might be more selective about which plugins they download, and developers might be more aggressive in trying to make their plugins more efficient.

Presently, if you run Firefox you can go to the Plugin Check page to see if your plugins are up to date. We thought: Wouldn’t it be great if that page ALSO showed you memory consumption rates? Maybe something like this (note the Memory Consumption column, it doesn’t exist on the real webpage, and you can see a larger version of this image here):

Firefox data visualization v2

Please understand (and we are quite proud of this): all of the data in this mockup is real. The memory consumption figures are estimates we derived by analyzing the Test Pilot data.

How, you might ask, did we (Diederik) do that?

GEEK OUT EXPLANATION: Well, we (Diederik) built a dataset of about 25,000 different Test Pilot users and parsed the data to see which plugins were installed and how much memory was consumed around the time of initialization. This data was analyzed using ordinary least squares regression, where the dependent variable is memory consumption and the different plugins are the explanatory variables. We only included results that are highly significant.
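For the curious, here is a minimal sketch of that regression in Python. It assumes a table with one row per user, a memory column and a 0/1 indicator column per plugin; the file and column names are hypothetical, not the actual Test Pilot schema:

    import pandas as pd
    import statsmodels.api as sm

    df = pd.read_csv("testpilot_users.csv")  # hypothetical export: one row per user

    y = df["memory_mb"]                                   # memory at initialization
    X = sm.add_constant(df.drop(columns=["memory_mb"]))   # plugin indicators + baseline

    model = sm.OLS(y, X).fit()

    # Keep only highly significant coefficients; each one estimates the marginal
    # memory cost (in MB) of having that plugin installed.
    significant = model.params[model.pvalues < 0.01]
    print(significant.sort_values(ascending=False))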

The following table shows our full results (you can download a bigger version here).

Plugin_memory_consumption_chart v2

Clearly, not all plugins are created equal.

Our point here isn’t that we have created the definitive way of assessing plugin impact on the browser; our point is that creating a solid methodology for doing so is likely within Mozilla’s grasp. More importantly, doing this could help improve the browsing experience. Indeed, it would probably be even wiser to do something like this for Add-ons, which is where I’m guessing the real lag in the browsing experience is created.

Also, with such a small data set we were only able to calculate the memory usage of a limited number of plugins, and generally those that are more obscure. Our methodology required having several data points from people who are, and who aren’t, using a given plugin, and so for many popular plugins we didn’t have enough data from people who weren’t using them… a problem, however, that would likely be easily solved with access to more data.

Finally, I hope this contest and our submission help make the case for why Mozilla needs an open data portal. Mozilla collects an incredible amount of data that it does not have the resources to analyze internally. Making it available to the community would do for data what Mozilla has done for code – enable others to create value that could affect the product and help advance the open web. I had a great meeting earlier this week with a number of Mozilla people about this issue; I hope we can continue to make progress.

International Open Data Hackathon – 63 cities, 25 countries, 5 continents

and counting. Never could any of us have imagined that there would be so many stepping forward to organize an event in their cities.

The clear implication is that Open Data matters. To a lot of people.

To a lot of us.

If you are in the media, a politician or the civil service: pay attention. There are a growing number of people – not just computer programmers and hackers, but ordinary citizens – who’ve come to love and want to help build sites and applications like fixmystreet, wheredoesmymoneygo, emitter.ca or datamasher.

If you are planning to participate in a hackathon, I hope you’ll read the next part (and help continue to grow the wiki).

I think, for the day – December 4th – we all really have three shared goals.

1. Have fun

2. Help foster local supportive and diverse communities of people who advocate for open data

3. Help raise awareness of open data – and why it matters – by building sites and applications

I’m confident that the first two will happen, but as I said in an earlier post… it is important that we have artifacts at the end of the day to share with the world.

In pursuit of that goal, I continue to believe that one of the easiest things we can do is localize cool projects that have happened in other jurisdictions. For example, a team in Bangalore, India as well as a team in Vancouver & Victoria, Canada are contemplating porting openparliament.ca to their respective jurisdictions. It’s a great way to get a huge win and a new, useful site up and running in a (relatively) short period of time.

I write this because I’m thinking there must be tons of interesting and engaging open data applications out there. If you run such an application… (I’m especially looking at you, Sunlight Foundation, Open Knowledge Foundation, MySociety & others…) and you think people in other jurisdictions might want to localize it for their country, state or city… then I’d like you to consider doing the following:

Post to the Apps page of the wiki:

  • the project name,
  • link to the source code repository,
  • any documentation,
  • the various tasks you think will be involved in localizing it
  • things that non-coders can do to advance the project (like research, documentation, graphics, copy text for websites, etc…)
  • and a (very) rough sense of scope and timelines

(Note: I’m hoping to throw a template up shortly, but sadly, right now, I’m hopping on a red-eye flight so can’t do it… with luck, sometime tomorrow I’ll delete this text and have added an example like openparliament.ca. For now Victoria and Vancouver have the beginnings of what I’m thinking of on their wiki pages.)

Nothing would be cooler than having open data apps ported around the world, helping spread citizen engagement, democratic accountability and fun with them.

I know there are some emails flying around about connecting cities for demos as suggested on the opendata hackathon website. Hope to have more on that soon as well.

Also, if you think media or local officials will attend and you’d like to brief them on open data, I have some people at the World Bank – who’ve been helping launch and expand their open data portal – who might be willing to help explain to such an audience why it matters. Could be nice to have the additional help. Up to you. But feel free to let me know if there is interest.

Finally, if you are running a hackathon, please reach out and say hi. I’d love to hear from you.

Excited.

Very, very excited.

Launching datadotgc.ca 2.0 – bigger, better and in the clouds

Back in April of this year we launched datadotgc.ca – an unofficial open data portal for federal government data.

At a time when only a handful of cities had open data portals and the words “open data” were not even being talked about in Ottawa, we saw the site as a way to change the conversation and demonstrate the opportunity in front of us. Our goals were to:

  • Be an innovative platform that demonstrates how government should share data.
  • Create an incentive for government to share more data by showing ministers, public servants and the public which ministries are sharing data, and which are not.
  • Provide a useful service to citizens interested in open data by bringing all the government data together into one place, making it easier to find.

In every way we have achieved these goals. Today the conversation about open data in Ottawa is very different. I’ve demoed datadotgc.ca to the CIOs of the federal government’s ministries and numerous other stakeholders, and an increasing number of people understand that, in many important ways, the policy infrastructure for doing open data already exists, since datadotgc.ca shows the government is already doing open data. More importantly, a growing number of people recognize it is the right thing to do.

Today, I’m pleased to share that thanks to our friends at Microsoft & Raised Eyebrow Web Studio and some key volunteers, we are taking our project to the next level and launching Datadotgc.ca 2.0.

So what is new?

In short, rather than just pointing to the 300 or so data sets that exist on federal government websites, members may now upload datasets to datadotgc.ca, where we can both host them and offer custom APIs. This is made possible because we have integrated Microsoft’s Azure cloud-based Open Government Data Initiative into the website.

So what does this mean? It means people can add government data sets, or even mash up government data sets with their own data, to create interesting visualizations, apps or websites. Already some of our core users have started to experiment with this feature. London, Ontario’s transit data can be found on Datadotgc.ca, making it easier to build mobile apps, and a group of us have taken Environment Canada’s facility pollution data, uploaded it and are using the API to create an interesting app we’ll be launching shortly.
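As a rough illustration of what “using the API” can look like, here is a sketch of pulling records from an OGDI-style REST endpoint in Python. The base URL and dataset path are hypothetical placeholders, not the site’s actual routes:

    import requests

    BASE = "http://api.datadotgc.ca/v1"  # hypothetical base URL, for illustration only
    resp = requests.get(f"{BASE}/ec/facility_pollution", params={"format": "json"})
    resp.raise_for_status()

    # OData-style services conventionally wrap result sets in a "d" key.
    for record in resp.json()["d"]:
        print(record)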

So we are excited. We still have work to do around documentation and tracking down some more federal data sets we know are out there, but we’ve gone live because nothing helps us develop like having users and people telling us what is, and isn’t, working.

But more importantly, we want to go live to show Canadians, and our governments, what is possible. Again, our goal remains the same: to push the government’s thinking about what is possible around open data by modeling what should be done. I believe we’ve already shifted the conversation – with luck, datadotgc.ca v2 will help shift it further and faster.

Finally, I can never thank our partners and volunteers enough for helping make this happen.

Let's do an International Open Data Hackathon

Let’s do it.

Last summer, I met Pedro Markun and Daniela Silva at the Mozilla Summit. During the conversation – feeling the drumbeat vibe of the conference – we agreed it would be fun to do an international event. Something that could draw attention to open data.

A few weeks before, I’d met Edward Ocampo-Gooding, Mary Beth Baker and Daniel Beauchamp at GovCamp Ottawa. Fresh from the success of getting the City of Ottawa to see the wisdom of open data and hosting a huge open data hackathon at city hall, they were thinking “let’s do something international.” Yesterday, I tested the idea on the Open Knowledge Foundation’s listserv and a number of great people from around the world wrote back right away and said… “We’re interested.”

This idea has lots of owners, from Brazil to Europe to Canada, and so my gut check tells me, there is interest. So let’s take the next step. Let’s do it.

Why.

Here’s my take on three great reasons why now is a good time for a global open data hackathon:

1) Build on Success: There are a growing number of places that now have open data. My sense is we need to keep innovating with open data – to show governments and the public why it’s serious, why it’s fun, why it makes life better, and above all, why it’s important. Let’s get some great people together with a common passion and see what we can do.

2) Spread the Word: There are many places without open data. Some places have developed communities of open data activists and hackers, others have nascent communities. In either case these communities should know they are not alone – that there is an international community of developers, hackers and advocates who want to show them material and emotional support. They also need to demonstrate, to their governments and the public, why open data matters. I think a global open data hackathon can’t hurt, and could help a whole lot. Let’s see.

3) Make a Better World: Finally, there is a growing amount of global open data thanks to the World Bank’s open data catalog and its Apps for Development competition. The Bank is asking developers to build apps that, using this data (plus any other data you want), will contribute to reaching the Millennium Development Goals by 2015. No matter who, or where, you are in the world this is a cause I believe we can all support. In addition, for communities with little available open data, the Bank has a catalog that might provide at least some data of interest.

So with that all said, I think we should propose hosting a global open data hackathon that is simple and decentralized: locally organized, but globally connected.

How.

The premise of the event is simple, relying on five basic principles.

1. It will happen on Saturday, Dec 4th. (after a fair bit of canvassing of colleagues around the world this seems to be a date a number can make work). It can be as big or as small, as long or as short, as you’d like it.

2. It should be open. Daniel, Mary Beth and Edward have done an amazing job in Ottawa attracting a diverse crowd of people to hackathons, even having whole families come out. Chris Thorpe in the UK has done similarly amazing work getting a young and diverse group hacking. I love Nat Torkington’s words on the subject. Our movement is stronger when it is broader.

3. Anyone can organize a local event. If you are keen to help organize one in your city, and/or just participate, add your name to the relevant city on this wiki page. Wherever possible, try to keep it to one event per city – let’s build some community and get new people together. Which city or cities you share with is up to you, as is how you do it. But let’s share.

4. You can hack on anything that involves open data. It could be a local app or a global Apps for Development submission; you could scrape data from a government website and make it available in a useful format for others, or create your own catalog of government data.

5. Let’s share ideas across cities on the day. Each city’s hackathon should do at least one demo, brainstorm or proposal – anything that it shares in an interactive way with members of a hackathon in at least one other city. This could be via video stream, Skype, chat… anything, but let’s get to know one another and share the cool projects or ideas we are hacking on. There are some significant challenges to making this work – timezones, languages, culture, technology – but who cares, we are problem solvers, let’s figure out a way to make it work.

Again, let’s not try to boil the ocean. Let’s have a bunch of events, where people care enough to organize them, and try to link them together with a simple short connection/presentation. Above all, let’s raise some awareness, build something and have some fun.

What’s next?

1. If you are interested, sign up on the wiki. We’ll move to something more substantive once we have the numbers.

2. Reach out and connect with others in your city on the wiki. Start thinking about the logistics. And be inclusive. If someone new shows up, let them help too.

3. Share with me your thoughts. What’s got you excited about it? If you love this idea, let me know, and blog/tweet/status update about it. Conversely, tell me what’s wrong with any or all of the above. What’s got you worried? I want to feel positive about this, but I also want to know how we can make it better.

4. If there is interest, let’s get a simple website up with a basic logo that anyone can use to show they are part of this. Something like the OpenDataOttawa website comes to mind, but likely simpler still – just laying out the ground rules and providing links to where events are taking place. Might even just be a wiki. I’ve registered opendataday.org; I’m not wedded to it, but it felt like a good start. If anyone wants to help set that up, please let me know. Would love the help.

5. Localization. If there is bandwidth locally, I’d love for people to translate this blog post and repost it locally. (Let me know, as I’ll try cross-posting it here, or at least link to it.) It is important that this not be an English-language-only event.

6. If people want a place to chat with others about this, feel free to post comments below. Also, the Open Knowledge Foundation’s Open Government mailing list is probably a good resource.

Okay, hopefully this sounds fun to a few committed people. Let me know what you think.

The Social Network and the real villains of the old/new economy

The other week I finally got around to watching The Social Network. It’s great fun and I recommend going out and watching it whether you’re a self-professed social media expert or don’t even have a Facebook account.

Here are some of my thoughts about the movie (don’t worry, no spoilers here).

1. Remember this is a Hollywood movie: Before (or after) you watch it, read Lawrence Lessig’s fantastic critique of the movie. This review is so brilliant and devastating that I’m basically willing to say: if you only have 5 minutes to read, leave my blog right now and go check it out. If you are a government employee working on innovation, copyright or the digital economy, I doubly urge you to read it. Treble that if you happen to be (or work for) the CIO of a major corporation or organization who (still) believes that social media is a passing phase and can’t see its disruptive implications.

2. It isn’t just the legal system that is broken: What struck me about the movie wasn’t just the problems with the legal system – it was how badly the venture capitalists come off. Here is a group of people who are supposed to help support and enable entrepreneurs, and yet they’re directing lawyers to draft contracts that screw some of the original founders. If the story is even remotely true it’s a damning and cautionary tale for anyone starting (or looking to expand) a company. Indeed, in the movie the whole success of Facebook, and the ability of (some of) the owners to retain control over it, rests on the fact that graduates of the first internet bubble who were screwed over by VCs are able to swoop in and protect this second generation of internet entrepreneurs. Of course they – embodied by the character of Sean Parker (co-founder of Napster) – are parodied as irresponsible and paranoid.

One thought I walked away with was: if, say as a result of the rise of cloud computing, the costs of setting up an online business continue to drop, at a certain point the benefits of VC capital will significantly erode or their value proposition will need to significantly change. More importantly, if you are looking to build a robust innovation cluster, having it built on the model that all the companies generated in it have the ultimate goal of being acquired by a large (American) multinational doesn’t seem like a route to economic development.

Interesting questions for policy makers, especially those outside Silicon Valley, who obsess about how to get venture capital money into their economies.

3. Beyond lawyers and VCs, the final thing that struck me about the movie was the lack of women doing anything interesting. I tweeted this right away and, of course, a quick Google search reveals I’m not the only one who noticed it. Indeed, Aaron Sorkin (the film’s screenwriter) wrote a response to questions regarding this issue on Emmy winner Ken Levine’s blog. What I noticed in The Social Network is there isn’t a single innovating or particularly positive female character. Indeed, in both the new and old economy worlds shown in the film, women are largely objects to be enjoyed, whether it is in the elite house parties of Harvard or the makeshift start-up home offices in Palo Alto. Yes, I’m sure the situation is more complicated, but essentially women aren’t thinkers – or drivers – in the movie. It’s a type of sexism that is real, and in case you think it isn’t, just consider a TechCrunch article from the summer titled “Too Few Women In Tech? Stop Blaming The Men” in which the author, Michael Arrington, makes the gobsmacking statement:

The problem isn’t that Silicon Valley is keeping women down, or not doing enough to encourage female entrepreneurs. The opposite is true. No, the problem is that not enough women want to become entrepreneurs.

Really? This is a country (the United States) where women start businesses at twice the rate of men and where 42% of all businesses are women-owned. To say that women don’t want to be entrepreneurs is so profoundly stupid and incorrect that it perfectly reflects the roles women are shoveled into in The Social Network. And that is something the new economy needs to grapple with.

World Bank Discussion on Open Data – lessons for developers, governments and others

Yesterday the World Bank formally launched its Apps For Development competition and Google announced that in addition to integrating the World Bank’s (large and growing) data catalog into searches, it will now do it in 34 languages.

What is fascinating about this announcement, and the recent changes at the Bank, is that it appears to be very serious about open data and even more serious about open development. The repercussions of this shift, especially if the Bank starts demanding that its national partners also disclose data, could be significant.

This, of course, means there is lots to talk about. So, as part of the overall launch of the competition and in an effort to open up the workings of the World Bank, the organization hosted its first Open Forum, in which a panel of guests talked about open development and open data. The Bank was kind enough to invite me, so I ducked out of GTEC a pinch early and flew down to DC to meet some of the amazing people behind the World Bank’s changes and discuss the future of open data and what it means for open development.

Embedded below is the video of the event.

As a little backgrounder here are some links to the bios of the different panelists and people who cycled through the event.

Our host: Molly Wood of CNET.

Andrew McLaughlin, Deputy Chief Technology Officer, The White House (formerly head of Global Public Policy and Government Affairs for Google) (twitter feed)

Stuart Gill, World Bank expert, Disaster Mitigation and Response for LAC

David Eaves, Open Government Writer and Activist

Rakesh Rajani, Founder, Twaweza, an initiative focused on transparency and accountability in East Africa (twitter)

Aleem Walji, Manager, Innovation Practice, World Bank Institute (twitter)

Rethinking Freedom of Information Requests: from Bugzilla to AccessZilla

Last week I gave a talk at the Conference for Parliamentarians hosted by the Office of the Information Commissioner as part of Right to Know Week.

During the panel I noted that, if we are interested in improving response times for Freedom of Information (FOI) requests (or, in Canada, Access to Information (ATIP) requests), why doesn’t the Office of the Information Commissioner use Bugzilla-type software to track requests?

Such a system would have a number of serious advantages, including:

  1. Requests would be public (although the identity of the requester could remain anonymous); this means if numerous people request the same document they could bandwagon onto a single request
  2. Requests would be searchable – this would make it easier to find documents already released and requests already completed
  3. You could track performance in real time – you could see how quickly different ministries, individuals, groups, etc… respond to FOI/ATIP requests, you could even sort performance by keywords, requester or time of the year
  4. You could see who specifically is holding up a request

In short such a system would bring a lot of transparency to the process itself and, I suspect, would provide a powerful incentive for ministries and individuals to improve their performance in responding to requests.
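To make the idea concrete, here is a minimal sketch (in Python) of what one tracked request might look like as a Bugzilla-style ticket. The fields are hypothetical illustrations, not an actual Bugzilla or government schema:

    from dataclasses import dataclass, field
    from datetime import date

    @dataclass
    class AccessRequest:
        request_id: int
        ministry: str
        summary: str                      # public description of the documents sought
        status: str = "submitted"         # e.g. submitted, in process, under review, fixed
        keywords: list = field(default_factory=list)
        assigned_officer: str = ""        # the ATIP officer handling the request
        submitted_on: date = field(default_factory=date.today)
        history: list = field(default_factory=list)  # (date, note) audit log entries

    # A requester files a ticket; every later change is logged and publicly visible.
    req = AccessRequest(42, "RCMP", "All 2010 press releases regarding taser policy",
                        keywords=["taser", "press release"])
    req.history.append((date.today(), "request acknowledged by ATIP officer"))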

For those unfamiliar with Bugzilla, it is an open source software application used by a number of projects to track “bugs” and feature requests in their software. So, for example, if you notice the software has a bug, you register it in Bugzilla, and then, if you are lucky and/or the bug is really important, some intrepid developer will come along and develop a patch for it. Posted below, for example, is a bug I submitted for Thunderbird, an email client developed by Mozilla. It’s not as intuitive as it could be but you can get the general sense of things: when I submitted the bug (2010-01-09), who developed the patch (David Bienvenu), its current status (Fixed), etc…

ATIPPER

Interestingly, an FOI or ATIP request really isn’t that different from a “bug” in a software program. In many ways, Bugzilla is just a complex and collaborative “to do” list manager. I imagine it wouldn’t be that hard to reskin it so that it could be used to manage and monitor access to information requests. Indeed, I suspect there might even be a community of volunteers who would be willing to work with the Office of the Information Commissioner to help make it happen.

Below I’ve done a mock-up of what I think a revamped Bugzilla (renamed AccessZilla) might look like. I’ve put numbers next to some of the features so that I can explain them in detail below.

ATIPPER-OIC1

So what are some of the features I’ve included?

1. Status: Now an ATIP request can be marked with a status; these might be as simple as submitted, in process, under review, fixed and verified fixed (meaning the submitter has confirmed they’ve received it). This alone would allow the Information Commissioner, the submitter and the public to track how long an individual request (or an aggregate of requests) stays in each part of the process.

2. Keywords: Wouldn’t it be nice to search for other FOI/ATIP requests with similar keywords? Perhaps someone has submitted a request for a document that is similar to your own, but not something you knew existed or had thought of… Keywords could be a powerful way to find government documents.

3. Individual accountability: Now you can see who is monitoring the request on behalf of the Office of the Information Commissioner and who is the ATIP officer within the ministry. If the rules permitted, the public servants involved in the document might have their names attached here as well (or maybe this option would only be available to those who log on as ATIP officers).

4. Logs: You would be able to see the last time the request was modified. This might include getting the documents ready, expressing concern about privacy or confidentiality, or simply asking for clarification about the request.

5. Related requests: Like keywords, but more sophisticated. Why not have the software look at the words and people involved in the request and suggest other, completed requests that it thinks might be similar in type and therefore of interest to the user? Seems obvious.

6. Simple and reusable resolution: Once the ATIP officer has the documentation, they can simply upload it as an attachment to the request. This way not only can the original user quickly download the document, but any subsequent user who stumbles upon the request during a search can download it too. Better still, any public servant who has unclassified documents that relate to the request can simply upload them directly as well.

7. Search: This feels pretty obvious… it would certainly make citizens’ lives much easier and is the basic ante for any government that claims to be interested in transparency and accountability.

8. Visualizing it (not shown): The nice thing about all of these features is that the data coming out of them could be visualized. We could generate real-time charts showing average response time by ministry, a list of respondents ranked from slowest to fastest, even something as mundane as the most searched keywords. The point is that with visualizations, a government’s performance around transparency and accountability becomes more accessible to the general public.
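And that last feature is only a few lines of analysis once the tracking data exists. A minimal sketch, assuming the tickets were exported to a CSV with hypothetical ministry, submitted_on and closed_on columns:

    import pandas as pd

    df = pd.read_csv("access_requests.csv", parse_dates=["submitted_on", "closed_on"])
    df["days_to_respond"] = (df["closed_on"] - df["submitted_on"]).dt.days

    # The league table: average response time by ministry, slowest first - exactly
    # the kind of public scorecard that changes incentives.
    print(df.groupby("ministry")["days_to_respond"].mean().sort_values(ascending=False))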

It may be that there is much better software out there for doing this (like JIRA); I’m definitely open to suggestions. What I like about Bugzilla is that it can be self-hosted, it’s free and it’s open source. Mostly, however, software like this creates an opportunity for the Office of the Information Commissioner in Canada, and access to information managers around the world, to alter the incentives for governments to complete FOI/ATIP requests, as well as make it easier for citizens to find out information about their government. It could be a fascinating project to reskin Bugzilla (or some other software platform) to do this. Maybe Information Commissioners from around the world could even pool their funds to sponsor such a reskinning…

From Public Servant to Public Insurgent

Are you a public insurgent?

Today, a generation of young people are arriving in the public service familiar with all sorts of tools – especially online and social media driven tools – that they have become accustomed to using. Tools like wikis, SurveyMonkey, Doodle and instant messaging, or websites like Wikipedia and issue-specific blogs, enable them to be more productive, more efficient and more knowledgeable.

And yet, when they arrive in their office they are told: “You cannot use those tools here.”

In short, they are told: “Don’t be efficient.”

You can, of course, imagine the impact on morale of having a boss tell you that you must do your work in a manner that is slower and less effective than you might otherwise manage. Indeed, today, in the public service and even in many large organizations, we may be experiencing the first generation of a work force that is able to accomplish coordination and knowledge building tasks faster at home than at work.

Some, when confronted with this choice, simply resign themselves to the power of their organization’s rules and become less efficient. Others (and I suspect not an insignificant number) begin the process of looking for their next job. But what I find particularly interesting is a tinier segment who – as dedicated employees that love the public service and want to be as effective as possible – believe in their mission so strongly that they neither leave, nor adhere to the rules. They become public insurgents and do some of their work outside the government’s infrastructure.

Having spoken about government 2.0 and the future of the public service innumerable times now, I have, on several occasions, run into individuals, or even groups, of these public insurgents. Sometimes they’ve installed a wiki behind the firewall; sometimes they grab their laptop and head to a cafe so they can visit websites that are useful but blocked by their ministry; sometimes they simply send around a SurveyMonkey survey in contravention of an IT policy. The offenses range from the minor to the significant. But in each case these individuals are motivated by the fact that this is the most, and sometimes the only, way to do the task they’ve been handed in an effective way. More interesting is that sometimes their acts of rebellion create a debate that causes the organization to embrace the tools they secretly use; sometimes it doesn’t, and they continue to toil in secret.

I find this trend – one that I think may be growing – fascinating.

So my question to you is… are you a public insurgent? If you are I’d love to hear your story. Please post it in the comments (using an anonymous handle) or send me an email.