Tag Archives: technology

Articles I'm Digesting 15/12/2009

Here are some pieces I’ve been reading of late:

You Can’t Handle the Truth by Mark Pothier in the Boston Globe

A great piece about how the classification of drugs used by most Western countries is completely divorced from how much harm those drugs cause. This isn’t surprising, but as the evidence mounts regarding which drugs are actually harmful (read alcohol, cocaine or heroin) versus those which are significantly less harmful (read Ecstasy or LSD), the question will increasingly emerge: will science ever inform our policies around managing these types of substances? Indeed, it is disturbing (and, er… sobering) to once again see the only substance I use on the list – alcohol – put in such a stark and negative light.

At some point a real conversation about drugs is going to occur in the United States – I just hope it is sooner rather than later, as it will have a profound effect on how effectively we can deal with the tragic situation we have around substance abuse on this side of the border.

Fla. Court Tells Judges and Lawyers to “Unfriend” Each Other (the AP)

Always fascinating to see how different fields respond to social networking. In this case a Florida…

…committee ruled Nov. 17 that online “friendships” could create the impression that lawyers are in a special position to influence their judge friends.

This is a great example of how social networking can cause some professions to actually become less transparent and, I would argue, harms the long term credibility of the institution. Notice here that the committee isn’t ruling that judges and lawyers can’t be friends, they are ruling that it would be harmful if the public could see that they are friends. So, in essence, if being a friend compromises the judgment of a judge, we solve that by preventing the public from seeing that the conflict could exist, rather than dealing with the conflict. Weird.

The last line is priceless:

McGrady, who is sending a copy of the ruling to the 69 judges in his circuit, said this potential conflict of interest is why he doesn’t have a Facebook page.

“If somebody’s my friend, I’ll call them on the phone,” he said, chuckling.

Errr, right. Good to keep it all in the old boys’ network, where those on the inside know where the conflict may lie but there is no digital trail or map that might allow the public to be better informed… Oh, and you’re the last generation that will only “pick up the phone,” so this solution has, at best, a 20-year shelf life to it.

The Killer App of 1900 by Glenn Fleishman in Publicola

As some readers know, I’m a big fan of historical examples that show we are experiencing pressures, transformations and evolutions similar to those experienced in the past. Part of it is the historian in me; part of it is how it helps ease the minds of those concerned or intimidated by change. There are, occasionally, genuinely new things that appear under the sun – but often those of us interested in technology and social change are too quick to scream “This is new! It changes everything!” Moreover, it does a disservice to our efforts, often making people more skeptical, resistant and generally conservative towards the perceived change. Still more importantly, the past often sheds light on how the power and influence created by a new technology or system may diffuse itself – who will be the winners, losers and resisters.

In this context this article is a priceless example of the type of writing I wish I did more of.

The Score: Advice to Young Composers by Annie Gosfield in the New York Times

While written as sound advice for composers, this is (as the friend who sent it to me said) sound advice for policy wonks or, in my opinion, bloggers as well. (It’s actually just sound advice for life.)

A couple of credos in the piece that I hope my work, and this blog, live by:

Take your work seriously, but don’t take yourself too seriously: Hope that is evident in my writing style.

Be willing to put yourself and your music on the line: Try to do that every day here on the blog.

Don’t fear rejection: Something a blog is really good at teaching you.

A couple of credos in the piece I know I struggle with:

Don’t assume you know what’s accessible to the audience and what isn’t: Although counter to what the piece says, I occasionally run into a friend who says “I had NO idea what you were talking about in X blog post.” It is crushing to hear – but also really good. I do want to challenge readers but I also want to be accessible. Do let me know if I ever get to a place where a newbie is going to be totally lost.

Details count: So, er, anyone who reads my blog regularly knows that I have the occasional typo in a post, here or there… Blogging longish pieces four times a week is draining, and so I don’t proof as much as I could (plus it is hard to see one’s own errors). But I could do better.

Hope you enjoy these pieces as much as I did!

MuniForge: Creating municipalities that work like the web

Last month I published the following article in the Municipal Information Systems Association’s journal, Municipal Interface. The article was behind a firewall, so now that the month has gone by I’m throwing it up here. Basically, it makes the case for why, if governments applied open source licenses to the software they developed (or paid to develop), they could save hundreds of millions, or more likely billions, of dollars a year. I got a couple of emails about it from municipal IT professionals across the country.

MuniForge: Creating Municipalities that Work like the Web

Introduction

This past May the City of Vancouver passed what is now referred to as “Open 3”. This motion states that the City will use open standards for managing its information, treat open source and proprietary software equally during the procurement cycle, and apply open source licenses to software the city creates.

While a great deal of media attention has focused on the citizen engagement potential of open data, the implications of the second half of the motion – the part relating to open source software – have gone relatively unnoticed. This is all the more surprising since last year the Mayor of Toronto also promised his city would apply an open source license to software it creates. This means that two of Canada’s largest municipalities are set to apply open source licenses to software they create in house. Consequently, the source code and the software itself will be available for free under a license that permits users to use, change, improve and redistribute it in modified or unmodified forms.

If capitalized upon, these announcements could herald a revolution in how cities currently procure and develop software. Rather than having thousands of small municipalities collectively spend billions of dollars to each reinvent the wheel, the open sourcing of municipal software could weave Canada’s municipal IT departments into one giant network in which expertise and specialized talents drive up quality and security to the benefit of all, while simultaneously collapsing the costs of development and support. Most interestingly, while this shift will benefit larger cities, its benefit and impact could be most dramatic and positive among the country’s smaller cities (those with populations under 200K). What is needed to make it happen is a central platform where the source code and documentation for software that cities wish to share can be uploaded and collaborated on. In short, Canada needs a Sourceforge or, better, a GitHub for municipal software.

The cost

For the last two hundred years one feature has dominated the landscape for the majority of municipalities in Canada: isolation. In a country as vast and sparsely populated as ours, villages, towns and cities have often found themselves alone. For citizens, the railway, the telegraph, then the highway and the telecommunications system eroded that isolation, but if we look at the operations of cities this isolation remains a dominant feature. Most Canadian municipalities are highly effective, but ultimately self-contained, islands. Municipal IT departments are no different. One municipality’s IT department rarely talks to another’s, particularly if they are not neighbours.

The result of this process is that in many cities across Canada, IT solutions are frequently developed in one of two ways.

The first is the procurement model. Thankfully, when the product is off the shelf or easily customized, deployment can occur quickly; this, however, is rarely the case. More often, larger software and expensive consulting firms are needed to deploy such solutions, frequently putting them beyond the means of many smaller cities. Moreover, from an economic development perspective, the dollars spent on these deployments often flow out of the community to companies and consultants based elsewhere. On the flip side, local, smaller firms, if they exist at all, tend to be untested and frequently lack the expertise and competition necessary to provide a reliable and affordable product. Finally, regardless of the firm’s size, most solutions are proprietary and so lock a city into the solution in perpetuity. This not only holds the city hostage to the supplier, it eliminates future competition and, worse, should the provider go out of business, it saddles the city with an unsupported system that will be painful and expensive to upgrade out of.

The second option is to develop in-house. For smaller cities with limited IT departments this option can be challenging, but it is often still cheaper than hiring an external vendor. Here the challenge is that any solution is limited by the skills and talents of the city’s IT staff. A small city, even with a gifted IT staff of 2-5 people, will be challenged to effectively build and roll out all the IT infrastructure city staff and citizens need. Moreover, keeping pace with security concerns, new technologies and new services poses additional challenges.

In both cases the IT services a city can develop and support for staff and citizens are limited by either the skills and capacity of its team or the size of its procurement budget. In short, the collective purchasing power, development capacity and technical expertise of Canada’s municipal IT departments are lost because we remain isolated from one another. With each city IT department acting like an island, enormous constraints and waste are created. Software is frequently recreated hundreds of times over as each small city creates its own service or purchases its own license.

The opportunity

It need not be this way. Rather than a patchwork of isolated islands, Canada’s municipal IT departments could be a vast interconnected network.

If even two small communities in Canada applied an open source license to a piece of software they were producing, allowed anyone to download it and documented it well, the cost savings would be significant. Rather than having two entities create what is functionally the same piece of software, the cost would be shared. Once available, other cities could download the software and write patches that would allow it to integrate with their own hardware and software infrastructure. These patches would also be open source, making it easier for still more cities to use the software. The more cities participate in identifying bugs, supplying patches and writing documentation, the lower the costs to everyone become. This is how Linus Torvalds started a community whose operating system – Linux – would become world class. It is the same process by which Apache came to dominate web servers, and it is the same approach used by Mozilla to create Firefox, a web browser whose market share now rivals that of Internet Explorer. The opportunity to save municipalities millions, if not billions, in software licensing and/or development costs every year is real and tangible.

What would such a network look like and how hard would it be to create? I suspect that two pieces would need to be in place to begin growing a nascent network.

First, and foremost, there need to be a handful of small projects. Often the most successful open source projects are those that start collaboratively. This way the processes and culture are, from the get-go, geared towards collaboration and sharing. This is also why smaller cities are the perfect place to start collaborating on open source projects. The world’s large cities are happy to explore new models, but they are too rich, too big and too invested in their current systems to drive change. The big cities can afford Accenture. Small cities are not only more nimble, they have the most to gain. By working together and using open source they can provide a level of service comparable to that of the big cities, at a fraction of the cost. An even simpler first step would be to ensure that when contractors sign on to create new software for a city, they agree that the final product will be available under an open source license.

Second, MISA, or another body, should create a Sourceforge clone for hosting open sourced municipal software projects. Sourceforge is an American-based open source software development website which provides services that help people build and share cool software with coders around the world. It presently hosts more than 230,000 software projects and has over 2 million registered users. Sourceforge operates as a sort of marketplace for software initiatives, a place where one can locate software one is interested in and then both download it and/or become part of a community to improve it.

A Sourceforge clone – say Muniforge – would be a repository for software that municipalities across the country could download and use for free. It would also be the platform upon which collaboration around developing, patching and documenting would take place. Muniforge could also offer tips, tools and learning materials on how to effectively lead, participate and work within an open source community for those new to the open source space. This said, if MISA wanted to keep costs even lower, it wouldn’t even need to create a Sourceforge clone; it could simply use the actual Sourceforge website and lobby the company to create a new “municipal” category.

And herein lies the second great opportunity of such a platform. It could completely restructure the government software business in Canada. At the moment Canadian municipalities must choose between competing proprietary systems that lock them into a specific vendor. Worse still, they must pay for both the software development and ongoing support. A Muniforge would allow for a new type of vendor modeled after Red Hat – the company that offers support to users who adopt its version of the free, open source Linux operating system. While vendors couldn’t sell software found on Muniforge, they could offer support for it. Cities would now have the benefit of outsourcing support without having to pay for the development of a custom, proprietary software system. Moreover, if they are not happy with their support they can always bring it in house, or even ask a competing company to provide it. Since the software is open source nothing prevents several companies from supporting the same piece of software – enhancing service, increasing competition and driving down prices.

There is another, final, global benefit to this approach to software development. Over time, a Muniforge could begin to host all of the software necessary to run a modern-day municipality. This has dramatic implications for cities in the developing world. Today, thanks to rapid urbanization, many towns and villages in Asia and Africa will be tomorrow’s cities and megacities. With only a fraction of the resources, these cities will need to be able to offer the services that are today commonplace in Canada. With Muniforge they could potentially download all the software infrastructure they need for free – enabling precious resources to go towards other critical pieces of infrastructure such as sewers and drinking water. Moreover, a Muniforge would encourage small local IT support organizations to develop in those cities, providing jobs and fostering IT innovation where it is needed most. Better still, over time, patches and solutions would flow the other way, as more and more cities help improve the code base of projects found on Muniforge.

Conclusion

The internet has demonstrated that new, low-cost models of software development exist. Open source software development has shown how loosely connected networks of coders and users from across a country, or even around the world, can create world class software that rivals and even outperforms software created by the largest proprietary developers. This is the logic of the web – broader participation, better software and lower-cost development.

The question cities across Canada need to ask themselves is: do we want to remain isolated islands, or do we want to work like the web, collaborating to offer better services, more quickly and at a lower cost? If even only some cities choose the latter, an infrastructure to enable collaboration can be put in place at virtually no cost, while the potential benefits and the opportunity to restructure the government software industry would be significant. Island or network – which do we want to be?

The Three Rules of Open Data

[The following is a Japanese Translation of this post – I’ll be publishing a different language each day this week.]

Over the past few years I have become deeply involved in open government work. Specifically, I have advocated that government data and public information be opened up so that any citizen can make use of it. My interest grew out of my writing about how open information, technology and generational change are going to change government.
Earlier this year I began advising the Mayor and Council of Vancouver on introducing the Open Motion (known among staff as Open 3), and helped create the City of Vancouver’s open data portal, the first of its kind in Canada. More recently, the Australian Government asked me to sit on the International Reference Group for its Government 2.0 Taskforce.

Open government work covers a lot of ground, but my recent work on open data has pushed me to distill what, at its core, we actually need and are asking for. I shared the result of that effort with attendees during the Right to Know Week panel discussion on transparency in the digital age, hosted by the Office of the Information Commissioner of Canada:

The Three Rules of Open Government Data

  1. If it can’t be spidered or indexed, it can’t be used.
  2. If it isn’t in a machine-readable format, it can’t be put to use.
  3. If the legal framework doesn’t permit it, it can’t be re-used.

For example, (1) basically means that if Google (or any other search engine) can’t find it, most citizens will never be able to find it either. So the data should be published in a way that any search engine spider can crawl it.

Once the data has been found, (2) says it needs to be usable. I need to be able to pull it out or download it in a useful form (for example an API, a subscription feed, or a document) so I can mash it up with Google Maps or other data sets, analyze it in OpenOffice, or convert it into whatever standard format or program I need.
People who can’t freely use and play with the data are excluded from the debate.
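To make rule (2) a little more concrete, here is a minimal sketch of what “machine-readable” looks like in practice. The endpoint, file layout and field names below are all invented for illustration – this isn’t any real city’s API – but the pattern is the point: if the data sits at a stable URL in a simple format, a few lines of code can pull it down and start playing with it.

```typescript
// Minimal sketch: fetching and parsing a hypothetical open data file.
// The URL and column names are invented for illustration only.
const DATA_URL = "https://data.example-city.ca/parks.csv";

interface Park {
  name: string;
  latitude: number;
  longitude: number;
}

// Parse a simple CSV with a header row followed by name,lat,lon rows.
function parseCsv(text: string): Park[] {
  return text
    .trim()
    .split("\n")
    .slice(1) // skip the header row
    .map((row) => {
      const [name, lat, lon] = row.split(",");
      return { name, latitude: Number(lat), longitude: Number(lon) };
    });
}

async function main(): Promise<void> {
  const response = await fetch(DATA_URL);        // rule 1: a stable, crawlable URL
  const parks = parseCsv(await response.text()); // rule 2: a format a program can read
  console.log(`Loaded ${parks.length} parks; first is ${parks[0]?.name}`);
  // Rule 3 – whether you may mash this up and republish it – is decided by the license, not the code.
}

main().catch(console.error);
```

Swap in a JSON feed or a proper API and the sketch barely changes; the hard part is the license.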

Finally, even once the data can be found and used, (3) raises the copyright question: I need the legal permission to share what I have made, to mobilize others, and to offer new services or highlight interesting findings. The information should carry a license that allows free use; ideally there would be no license at all, and the best government data is data that isn’t encumbered by copyright.

Finding, using and sharing the data – that is what we want.

Of course, a quick search of the internet shows that others have been thinking along the same lines.
There is an excellent set of 8 principles of open government data that is more detailed and perhaps better suited to conversations at the CIO level and below, but when talking to politicians (or deputy ministers, secretaries and chief executives) I’ve found that the three rules above capture the essentials. They are what is needed, and they are easy to remember.

This Japanese translation was made possible thanks to the generous volunteer work of Tosh Nagashima at the Space-Time Research company in Australia. The team there was amazing in providing a number of translations – I am very much in their debt.

Three Rules for Open Government Data

The following is a Dutch Translation of this post – I’ll be publishing a different language each day this week.

Over the past few years I have become increasingly involved in the open government movement and, in particular, I have advocated for open data: making the information government collects and creates available so that citizens can analyze it, use it and re-use it for new purposes. My interest in the subject grew out of the writing and work I have done on how technology, open systems and generational change will transform government. Earlier this year I began advising the Mayor and Council of the City of Vancouver on adopting the Open Motion (also known as Open3) and on developing Vancouver’s open data portal, the first municipal open data portal in Canada. Recently the Australian Government asked me to join the International Reference Group for its Government 2.0 Taskforce.

The open government movement is, of course, quite broad, but in my more recent work I have tried to distill the core of open data out of that broader movement. What do we really need, and are we actually asking for it? During the Right to Know Week panel discussion at the Conference for Parliamentarians: Transparency in the Digital Era – organized by the Office of the Information Commissioner – I introduced three rules for open government data.

Three Rules for Open Government Data

  1. If data can’t be found or indexed, it doesn’t exist.
  2. If data isn’t available in an open, machine-readable form, it won’t invite citizens to engage with it.
  3. If there is no legal framework that allows the data to be re-used, it won’t empower citizens.

A quick explanation: (1) essentially means, can I find it? If Google (or any other search engine) can’t find the information, then for most citizens it simply doesn’t exist. So it is crucial to make sure the data is optimized to be indexed by all kinds of search engine spiders.

Once I have found the data, rule (2) is about making it usable. I need to be able to play with it. That means being able to download it in a simple, usable format (such as an API, an RSS feed or a documented file). Citizens need data they can mash up with Google Maps or other websites, analyze in OpenOffice, or convert into a file format or program of their own choosing. Citizens who can’t play with information are citizens who can’t take part in the debate.

Finally, even when I can find the data and play with it, rule (3) stresses that I need a legal framework that allows me to share what I have made, to invite and organize other citizens to participate, to build a new service around it, or simply to highlight interesting facts. This means the license attached to the information and data should impose as few restrictions on its use as possible; ideally, government data is released into the public domain. The best government data and information is that which is not locked up by copyright. Data sets licensed in a way that makes it impossible for citizens to share their work silence citizens and amount to censorship.

Search, play, and share. That is what we want.

A quick search of the internet shows that other people have been thinking about this subject as well. There is an excellent piece on the 8 Principles of Open Government Data that is more detailed, and perhaps even better, certainly for discussions at the CIO level. But when talking to politicians (or senior civil servants and CEOs), and much like the people at last month’s conference, I found the simplicity of three more important: they are three simple rules that anyone can easily remember.

This Dutch translation was made possible due to the generous work of Diederik van Lieree.

Twitter is my Newspaper: explaining twitter to newbies

I’m frequently asked about technology, the internet and web 2.0, and in the course of the discussion the subject of Twitter inevitably arises. People are frequently curious, often judgmental, and almost always bewildered by the service. This isn’t surprising: Twitter is poorly understood despite its relative success (and its success is still small in the grand scheme of internet-related things). Part of this is because people (even early adopters) are still figuring Twitter out, and part of this is because those frequently explaining it don’t understand it themselves (I’m looking at you, newspaper columnists).

Most often I sense people misunderstand Twitter because they compare it to a text-based internet tool they do know: e-mail. This reference point shapes their assumptions about Twitter’s purpose and leads them to ask questions like: how can you read all those tweets? Or how can you “follow” 250 people? If the unsaid assumption is that Twitter creates 2,000 new emails a day to read – well, no surprise they’re running for the hills! This is doubly true if you believe the myth that all people tweet about (and thus read) is their breakfast menu, or what they are doing in a given moment.

Twitter is not email and while it has many (evolving) uses I’ve found the best metaphor for explaining how my friends and I use it is a much simpler technology: the newspaper.

Few people read every (or even most) articles in a newspaper – indeed most of us just scan the headlines when we see one. Likewise, very few people read all their tweets. Indeed, many of us go days (guilt free) without ever looking at the newspaper – same with Twitter: many people go days without looking at their Twitter feed. The difference is that your newspaper’s headlines only change once a day and it is awkward to carry the thing around with you. Twitter’s headlines are always changing, and it’s located on your BlackBerry or iPhone.

Now the difference from email becomes more clear. There is no obligation – or even expectation – of readership with Twitter. Emails you have to (or are supposed to) read and, let’s face it, are often a chore to get through. Newspaper articles you choose to read, often for pleasure or consumption.

Indeed, as Taylor and I outlined in Missing the Link, Twitter is better than a newspaper in many ways since you get to choose the columnists whose headlines you’ll be scanning. Whereas a newspaper brings together articles, ideas and information someone else thinks you should care about, Twitter brings together the ideas, articles and information of people you care about. I follow a number of “thought leaders” – people like Clay Shirky, Jay Rosen, Andrew Potter, Tim O’Reilly, David Weinberger, etc. – along with some friends and colleagues. While they occasionally say short, pithy things, they are usually linking to articles that they find interesting. In short, some of the smartest people I know, and some I don’t, are essentially creating a vetted reading list for me. This virtual community is my news editor.

Better still, I don’t always have a newspaper on me. But an endless stream of articles vetted by smart people is always just a click away so whenever I have a spare moment – on the bus, in line at the grocery store, or waiting for a taxi – I can pull up some interesting reading material I would otherwise never have read.

I don’t claim this is Twitter’s only use, so this isn’t a complete, comprehensive explanation, but I’ve found it has helped a number of my non-web-savvy friends “get” Twitter.

If I could start with a blank sheet of paper… (part 2)

The other week Martin Stewart-Weeks posted this piece on the Australian Government’s Web 2.0 Taskforce blog. In it he asked:

“…imagine for a moment it was your job to create the guidelines that will help public servants engage online. Although you have the examples from other organisations, you are given the rare luxury to start with a blank sheet of paper (at least for this exercise). What would you write? What issues would you include? Where would you start? Who would you talk to?”

Last week I responded with this post which explained why my efforts would focus on internal change. This week I want to pick the thread back up and talk about what applications I would start with and why.

First, Social Networking Platform (this is essential!):

An inspired public service shouldn’t ban Facebook, it should hire it.

A government-run social networking platform, one that allowed public servants to list their interests, current area of work, past experiences, contact information and current status, would be indispensable. It would allow public servants across ministries to search out and engage counterparts with specialized knowledge, relevant interests or similar responsibilities. Moreover, it would allow public servants to set up networks, where people from different departments, but working on a similar issue, could keep one another abreast of their work.

In contrast, today’s public servants often find themselves unaware of, and unable to connect with, colleagues in other ministries or other levels of government who work on similar issues. This is not because their masters don’t want them to connect (although this is sometimes the case) but because they lack the technology to identify one another. As a result, public servants drafting policy on interconnected issues — such as the Environment Canada employee working on riverbed erosion and the Fisheries and Oceans employee working on spawning salmon — may not even know the other exists.

If I could start with a blank sheet of paper… then I’d create a social networking platform for government. I think it would be the definitive game changer. Public servants could finally find one another (saving millions of hours and dollars in external consultants, redundant searches and duplicated capacity). Moreover, if improving co-ordination and the flow of information within and across government ministries is a central challenge, then social networking isn’t a distraction, it’s an opportunity.
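To show why searchable profiles matter, here is a toy sketch of the riverbed-erosion example above. Everything in it – the names, the departments, the fields – is invented for illustration; it just shows how a simple interest match across departments surfaces a counterpart you didn’t know existed.

```typescript
// Toy sketch: finding counterparts across departments by shared interests.
// All names, departments and fields below are invented for illustration.
interface PublicServantProfile {
  name: string;
  department: string;
  interests: string[]; // topics, files, areas of expertise
  currentWork: string;
}

const directory: PublicServantProfile[] = [
  {
    name: "A. Rivera",
    department: "Environment Canada",
    interests: ["riverbed erosion", "watersheds"],
    currentWork: "Riverbed erosion policy",
  },
  {
    name: "B. Singh",
    department: "Fisheries and Oceans",
    interests: ["salmon spawning", "watersheds"],
    currentWork: "Spawning salmon habitat review",
  },
];

// Return colleagues in *other* departments who share at least one interest.
function findCounterparts(me: PublicServantProfile): PublicServantProfile[] {
  return directory.filter(
    (p) =>
      p.department !== me.department &&
      p.interests.some((interest) => me.interests.includes(interest))
  );
}

// The Environment Canada employee discovers the Fisheries and Oceans counterpart.
console.log(findCounterparts(directory[0]).map((p) => p.name)); // ["B. Singh"]
```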

Second, Encourage Internal Blogs

I blogged more about this here.

If public servants feel overwhelmed by information, one of the main reasons is that they have no filters. There are few, if any, bloggers within departments writing about what they think is important and what is going on around them. Since information is siloed, everybody has to rely either on informal networks to find out what is actually going on (all that wasted time having coffee and calling friends to find out gossip) or on formal networks, sitting in structured meetings with other departments or with one’s boss to find out what their boss’s boss’s boss is thinking. What a waste of time and energy.

I suspect that if you allowed public servants to blog, you could cut down on rumours (they would be dispelled more quickly) email traffic and, more importantly, meetings (which are a drain on everybody’s time) by at least 25%. Want to know what my team is up to? Don’t schedule a meeting. First, read my blog. Oh, and search the tags to find what is relevant to you. (You can do that on my blog too, if you are still reading this piece it probably means you are interested in this tag.)

Third, Create a Government Wide Wiki

The first reason to create a wiki is that it would give people a place to work collectively on documents, within their departments or across ministries. Poof, silos dissolved. (Yes, it really is that simple, and if you are middle management, that’s terrifying.)

The second reason is to provide some version control. Do you realize most governments don’t have version control software (or they do, but nobody uses it because it is terrible)? A wiki, if nothing else, offers version control. That’s reason enough to migrate.

The third reason though is the most interesting. It would change the information economics, and thus culture, of government. A wiki would slowly come to function as an information clearing house. This would reduce the benefits of hoarding information, as it would be increasingly difficult to leverage information into control over an agenda or resource. Instead the opposite incentive system would take over. Sharing information or your labour (as a gift) within the public service would increase your usefulness to, and reputation among, others within the system.

Fourth, Install an Instant Messaging App

It takes less time than a phone call. And you can cut and paste. Less email, faster turn-around, quicker conversations. It isn’t a cure all, but you’ve already got young employees who are aching for it. Do you really want to tell them to not be efficient?

Finally… Twitter

Similar reasons to blogs. Twitter is like a custom newspaper. You don’t read it every day, and most days you just scan it – you know – to keep an eye on what is going on. But occasionally it has a piece or two that you happen to catch that are absolutely critical… for your file, your department or your boss.

This is how Twitter works. It offers peripheral vision into what is going on in the areas or with the people that you care about or think are important. It allows us to handle the enormous flow of information around us. Denying public servants access to Twitter (or not implementing it, or blogs, internally) is essentially telling them that they must drink the entire firehose of information that is flowing through their daily life at work. They ain’t going to do it. Help them manage. Help them tweet.

Searching The Vancouver Public Library Catalog using Amazon

A few months ago I posted about a number of civic applications I’d love to see. These are computer, iPhone or BlackBerry applications, or websites, that leverage data and information shared by the government and would help make life in Vancouver a little nicer.

Recently I was interviewed on CBC’s Spark about some of these ideas that have come to fruition thanks to the hard work and civic-mindedness of some local hackers. Mostly I talked about VanTrash (which sends emails or tweets to remind people of their upcoming garbage day), but during the interview I also mentioned that Steve Tannock created a script that allows you to search the Vancouver Public Library (VPL) Catalog from the Amazon website.

Firstly – why would you want to use Amazon to search the VPL? Two reasons: First, it is WAY easier to find books on the Amazon site than on the library site, so you can leverage Amazon’s search engine to find books (or book recommendations) at the VPL. Second, it’s a great way to keep the book budget in check!

To use the Amazon website to search the VPL catalog you need to follow these instructions:

1. You need to be using the Firefox web browser. You can download and install it for free here. It’s my favourite browser and if you use it, I’m sure it will become yours too.

2. You will need to install the Greasemonkey add-on for Firefox. This is really easy to do as well! After you’ve installed Firefox, simply go here and click on install.

3. Finally, you need to download the VPL-Amazon search script from Steve Tannock’s blog here.

4. While you are at Steve’s blog, write something nice – maybe a thank you note!

5. Go to the Amazon website and search for a book. Under the book title will be a small piece of text letting you know if the VPL has the book in its catalog! (See example picture below) Update: I’m hearing from some users that the script works on the Amazon.ca site but not the Amazon.com site.

I hope this is helpful! And happy searching.

Also, for those who are more technically inclined, feel free to improve on the script – fix any bugs (I’m not sure there are any) or make it better!
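For the technically curious, here is roughly the shape such a user script takes. To be clear, this is my own simplified sketch, not Steve’s actual code: the catalogue URL, the page selector and the ISBN matching are all assumptions for illustration, it is written in TypeScript (it would be compiled to plain JavaScript before Greasemonkey could run it), and a real script would use Greasemonkey’s GM_xmlhttpRequest for the cross-site lookup rather than a plain fetch.

```typescript
// ==UserScript==
// @name     Library lookup on Amazon (illustrative sketch)
// @include  https://www.amazon.ca/*
// ==/UserScript==
// Simplified sketch only – not Steve Tannock's actual script.
// The catalogue endpoint and the page selector below are hypothetical.

const CATALOGUE_SEARCH = "https://library.example.org/search?isbn=";

// Amazon product URLs usually carry the 10-character ASIN/ISBN after "/dp/".
function getIsbnFromPage(): string | null {
  const match = window.location.pathname.match(/\/dp\/(\w{10})/);
  return match ? match[1] : null;
}

async function annotateTitle(): Promise<void> {
  const isbn = getIsbnFromPage();
  const title = document.querySelector("#productTitle"); // assumed selector
  if (!isbn || !title) return;

  // A real user script would use GM_xmlhttpRequest here to avoid cross-origin limits.
  const response = await fetch(CATALOGUE_SEARCH + isbn);

  const note = document.createElement("div");
  note.textContent = response.ok
    ? "This title appears to be in the library catalogue."
    : "No matching record found in the library catalogue.";
  title.insertAdjacentElement("afterend", note);
}

annotateTitle().catch(console.error);
```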

Amazon shot

Spark Interview on VanTrash – The Open Source Garbage Reminder Service

A couple of weeks ago I was interviewed by the CBC’s Nora Young for her show Spark:  a weekly audio blog of smart and unexpected trendwatching about the way technology affects our lives and world.

The interview (which was fun!) dives a little deeper into some of the cool ways citizens – in working to make their lives better – can make cool things happen (and improve their community) when governments make their data freely available. The interview focuses mostly on VanTrash, the free garbage reminder service created by Luke Closs and Kevin Jones based on a blog post I wrote. It’s been getting a lot of positive feedback and is helping make the lives of Vancouverites just a little less hectic.

You can read more about the episode here and listen to it on CBC radio at 1:05 local time in most parts of Canada and 4:05 on the west coast.

You can download a podcast of the Spark episode here or listen to it on the web here.

If you live in Vancouver – check out VanTrash.ca and sign up! (Or sign your parents or neighbour up!) Never forget to take the garbage out again. It works a whole lot better than this approach my friend’s mom uses for her:

Van trash reminder

My new mac – some thoughts for other PC users

As some of you know, I recently shifted from a PC to a Mac. It’s a big transition for me… I’ve used a PC all my life, so it is easy to say that I’m having a little (but not a ton) of culture shock.

I’ll be honest about the single best selling feature of the mac: Spotlight.

I do very few things on my computer. Mostly I write, I surf, and I email. A LOT of email. So first and foremost, having a computer where I can find my emails and documents easily is critical. When you’ve got over 70,000 emails you want to be able to search, well, neither Microsoft Outlook, nor any Windows desktop search engine I’ve ever seen, nor even Google Desktop (which essentially requires you to load a browser each time) is going to cut it.

I NEED to be able to find stuff quickly. Google has bred in me an expectation of instant results (not a slow, churning solution). Maybe Windows 7 will get there, but I’ve given up waiting. My 5-year-old ThinkPad wasn’t going to last long enough for me to find out.

Am I happy? Absolutely. One thing our Apple friends do well is design. I love the keyboard, the screen and pretty much everything physical about the machine. Moreover, the convergence of Mac & Windows software has made the transition relatively easy – I’d be frightened to think of how much time on my computer I spend on the browser, but it is a lot… so moving from Firefox to Firefox is pretty sweet.

That said, the transition hasn’t been perfect. There are several features on the Mac that have been frustrating, and even disappointing. For those thinking of making the leap, I thought I’d let you know about the rough parts; they shouldn’t dissuade you, just set some expectations that not everything in the Mac world is peaches and cream. Of course, if some veteran users have solutions to these issues, I’ll be eternally grateful.

So here’s my list of 4 things I’d change on the Mac – some of these are so petty I’m almost embarrassed…

1. No “send to” email client option. One thing Windows has that I’ve not found on the Mac is the “Send To” folder. Drop any application in the Send To folder and when you right-click on a document you have the option of opening the document with that application. What I loved was being able to right-click on a document, send it to my email client and bingo! A new email was created with the document attached. Very productive and easy. Alas, no such luck on the Mac.

2. No “open container” option in spotlight. Yes, I love Spotlight AND… when the drop down menu is showing me a list of found items, why can’t I right-click on it and open the containing folder? Sometimes, I don’t know what document I’m looking for, but I do know it is co-located with a document I do know the title of… Just saying.

3. In Mail, you can’t drag an email to iCal to create an event. The best feature Outlook (and I presume Entourage) has that Mail and iCal don’t is the ability to turn an email into an event. I know that Mail has the funky click-on-the-date-and-it-will-create-an-event feature, but it rarely brings in the relevant information. In Outlook I simply dragged an email to the calendar and presto! I had an event with the email contents in the notes. That way I could easily copy all the relevant details and had a ton of context I could quickly reference within my calendar.

4. Okay, so this one seems REALLY petty… but it strikes at something deeper, something important for PC users to know. I’m feeling a little annoyed that, in Mail, when I delete an email in my inbox the cursor always moves to the newer email, regardless of how the mail is sorted. In Outlook it always moved “down” (which I had arranged to mean that it went to an older email). Small, I know, but it is driving me crazy when I’m dealing with my email. Of course, this is all part of what I understand to be a larger philosophical problem with Macs (and why I’ve never been an owner before), which is that the company is centered on the idea that it knows how you should use your computer better than you do… so customizing is limited. This is the biggest culture shift for PC users. Owning a Mac is like being in a gated community… it’s pretty and manicured, but you have to adhere to the community bylaws, or else…! Yes, the Windows world has got serious medical issues (viruses), a generic corporate feel (Windows themes) and an approach to planning that seems modeled after Houston (I say this with some affection), but you also had a lot more freedom to create trouble or solve things your way. At the moment, I’m welcoming my new overlord because it’s like my computer has been taken over by the Swiss! It’s efficient, but if I try to complain… well, you get the point.

Pretty much everything else I’m wrestling with – the way Alt-Tab works on the Mac, or the fact that I can’t press “command+F” to open the “File” menu – are things that I know, in time, I’ll adjust to.

19th Century Net Neutrality (and what it means for the 21st Century)

So what do bits of data and coal locomotives have in common?

It turns out a lot.

In researching an article for a book I’ve discovered an interesting parallel between the two in regard to the issue of Net Neutrality. What is Net Neutrality? It is the idea that when you use the Internet, you do so free of restrictions. That any information you download gets treated the same as any other piece of information. This means that your Internet service provider (say Rogers, Shaw or Bell) can’t choose to provide you with certain content faster than other content (or worse, simply block you from accessing certain content altogether).

Normally the issue of Net Neutrality gets cast in precisely those terms – do bits of data flowing through fibre optic and copper cables get treated the same, regardless of whose computer they are coming from and whose computer they are going to? We often like to think these types of challenges are new and unique, but one thing I love about being a student of history is that there are almost always interesting earlier examples of any problem.

Take the late 19th and early 20th century. Although the term would have been foreign to them, Net Neutrality was a raging issue – but not in regard to the telegraph cables of the day. No, it was an issue in regard to railway networks.

In 1903 the United States Congress passed the Elkins Act. The Act forbade railway companies from offering, and railway customers from demanding, preferential rates for certain types of goods. Any “good” that moved over the (railway) network had to be priced and treated the same as any other “good.” In short, the (railway) network had to be neutral and price similar goods equally. What is interesting is that many railway companies welcomed the act because some trusts (corporations) paid the standard rail rate but would then demand that the railroad company give them rebates.

What’s interesting to me is that

a) Net Neutrality was a problem back in the late 19th and early 20th century; and

b) Government regulation was seen as an effective solution for ensuring a transparent and fair marketplace on these networks.

The question we have to ask ourselves is: do we want to ensure that the 21st century (fibre optic) networks will foster economic growth, create jobs and improve productivity in much the same way the 19th and 20th century (railway) networks did for their era? If the answer is yes, we’d be wise to look back and see where those networks were managed effectively and where they were managed poorly. The Elkins Act is an interesting starting point, as it represented progressives’ efforts to ensure transparency and equality of opportunity in the marketplace so that it could function as an effective platform for commerce.