Yearly Archives: 2010

Apps for Climate Action Update – Lessons and some new sexy data

Okay, so I’ll be the first to say that the Apps4Climate Action data catalog has not always been the easiest to navigate and some of the data sets have not been machine readable, or even data at all.

That, however, is starting to change.

Indeed, the good news is threefold.

First, the data catalog has been tweaked: it now has better search and an improved capacity to filter out non-machine-readable data sets. A great example of a government starting to think like the web, iterating and learning as the program progresses.

Second, and more importantly, new and better data sets are starting to be added to the catalog. Most recently, the Community Energy and Emissions Inventories were released in Excel format. This data shows carbon emissions for all sorts of activities and infrastructure at a very granular level. Want to compare the GHG emissions of a duplex in Vancouver versus a duplex in Prince George? Now you can.

Moreover, this is the first time any government has released this type of data at all, not to mention making it machine readable. So not only have the app possibilities (how green is your neighborhood, rate my city, calculate my GHG emissions) all become much more realizable, but any app using this data will be among the first in the world.
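To make this concrete, here is a quick sketch of the kind of comparison the new machine-readable inventory makes possible. The file name and column names below are hypothetical (the actual CEEI spreadsheet will differ), but the principle holds:

```python
# A minimal sketch, assuming a hypothetical spreadsheet layout: compare
# average duplex GHG emissions between two municipalities. The file name
# and column names are invented for illustration.
import pandas as pd

df = pd.read_excel("ceei_inventory.xlsx")  # hypothetical file name

duplexes = df[df["building_type"] == "duplex"]
by_city = duplexes.groupby("municipality")["ghg_tonnes_co2e"].mean()

print(by_city.loc[["Vancouver", "Prince George"]])
```

A few lines of code instead of an afternoon of retyping – that is the difference machine readability makes.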

Finally, probably one of the most positive outcomes of the app competition to date is largely hidden from the public. The fact that members of the public have been asking for better data or even for data sets at all(!) has made a number of public servants realize the value of making this information public.

Prior to the competition, making data public was a compliance problem: something you did, but you figured no one would ever look at or read it. Now, for a growing number of public servants, it is an innovation opportunity. Someone may take what the government produces and do something interesting with it. Even if they don’t, someone is nonetheless taking an interest in your work – something that has rewards in and of itself. This, of course, doesn’t mean that things will improve overnight, but it does help advance the goal of getting government to share more machine readable data.

Better still, the government is reaching out to stakeholders in the development community and soliciting advice on how to improve the site and the program, all in a cost-effective manner.

So even within the Apps4Climate Action project we see some of the changes the promise of Government 2.0 holds for us:

  • Feedback from community participants driving the project to adapt
  • Iterations of development conducted “on the fly” during a project or program
  • Successes and failures resulting in quick improvements (release of more data, better website)
  • Shifting culture around disclosure and cross-sector innovation
  • All on a timeline that can be measured in weeks

Once this project is over I’ll write more on it, but wanted to update people, especially given some of the new data sets that have become available.

And if you are a developer or someone who would like to do a cool visualization with the data, check out the Apps4Climate Action website or drop me an email – I’m happy to talk you through your idea.

How to Engage with Social Media: An Example

The other week I wrote a blog post titled Canadian Governments: How to waste millions online ($30M and counting) in which I argued that OpenID should be the cornerstone of the government’s online identification system. The post generated a lot of online discussion, much of it of very high quality and deeply thoughtful. On occasion, comments can enhance and even exceed a post’s initial value, and I’d argue this is one of those cases – something that is always a joy when it happens.

There was, however, one comment that struck me as particularly important, not only because it was thoughtful, but because this type of comment is so rare: it came from a government official. In this case, from Dave Nikolejsin, the CIO of the Government of British Columbia.

Everything about Mr. Nikolejsin’s comment deserves to be studied and understood by those in the public and private sector seeking to understand how to engage the public online. His comment is a perfect case of how and why governments should allow public servants to comment on blogs that tackle issues they are themselves addressing.

What makes Mr. Nikolejsin’s comment (which I’ve reprinted below) so effective? Let me break out the key components:

  1. It’s curious: Given the nature of my blog post, a respondent could easily have gone on the offensive and merely countered claims they disagreed with. Instead, Mr. Nikolejsin remains open and curious about the ideas in the post and its claims. This makes readers and other commentators less likely to attack and more likely to engage and seek to understand.
  2. It seeks to return to first principles: The comment is effective because it is concise and tackles the specific issues raised by the post. But part of what really makes it shine is how it seeks to identify first principles by talking about different approaches to online IDs. Rather than ending up arguing about solutions, the comment engages readers to identify what assumptions they may or may not have in common with one another. This won’t necessarily make people more likely to agree, but they’ll end up debating the right thing (goals, assumptions) rather than the wrong thing (specific solutions).
  3. It links to further readings: Rather than try to explain everything in his response, the comment instead links to relevant work. This keeps the comment shorter and more readable, while also providing those who care about this issue (like me) with resources to learn more.
  4. It solicits feedback: “I really encourage you to take a look at the education link and tell me what you think.” Frequently, comments simply rebut points in the original post they disagree with. This can reinforce the sense that the two parties are in opposition. Mr. Nikolejsin and I actually agree far more than we disagree: we both want a secure, cost-effective, and user-friendly online ID management system for government. By asking for feedback he implicitly recognizes this and is asking me to be a partner, not an antagonist.
  5. It is light: One thing about the web is that it is deeply human. Overly formal statements look canned and cause people to tune out. This comment is intelligent and serious in its content, but remains light and human in its style. I get the sense a human wrote this, not a communications department. People like engaging with humans. They don’t like engaging with communications departments.
  6. Community feedback: The comment has already sparked a number of responses containing supportive thoughts, suggestions and questions, including some from people working in municipalities, experts in the field, and citizen users. It’s actually a pretty decent group of people there – the kind a government would want to engage.

In short, this is a comment that sought to engage. And I can tell you, it has been deeply, deeply successful. I know that some of what I wrote might have been difficult to read, but after reading Mr. Nikolejsin’s comments, I’m much more likely to bend over backwards to help him out. Isn’t this what any government would want of its citizens?

Now, am I suggesting that governments should respond to every blog post out there? Definitely not. But there were a number of good comments on this post, and the calibre of the readers showing up made commenting on the post likely worthwhile.

I’ve a number of thoughts on the comment that I hope to post shortly. But first, I wanted to repost the comment, which you can also read in the original post’s thread here.

Dave Nikolejsin <dave.nikolejsin@gov.bc.ca> (unregistered) wrote: Thanks for this post David – I think it’s excellent that this debate is happening, but I do need to set the record straight on what we here in BC are doing (and not doing).

First and foremost, you certainly got my attention with the title of your post! I was reading with interest to see who in Canada was wasting $30M – imagine my surprise when I saw it was me! Since I know that we’ve only spent about 1% of that so far I asked Ian what exactly it was he presented at the MISA conference you mentioned (Ian works for me). While we would certainly like someone to give us $30M, we are not sure where you got the idea we currently have such plans.

That said I would like to tell you what we are up to and really encourage the debate that your post started. I personally think that figuring out how we will get some sort of Identity layer on the Internet is one of the most important (and vexing) issues of our day. First, just to be clear, we have absolutely nothing against OpenID. I think it has a place in the solution set we need, but as others have noted we do have some issues using foreign authentication services to access government services here in BC simply because we have legislation against any personal info related to gov services crossing the border. I do like Jeff’s thinking about whom in Canada can/will issue OpenID’s here. It is worth thinking about a key difference we see emerging between us and the USA. In Canada it seems that Government’s will issue on line identity claims just like we issue the paper/plastic documents we all use to prove our Identities (driver’s licenses, birth certificates, passports, SIN’s, etc.). In the USA it seems that claims will be issued by the private sector (PayPal, Google, Equifax, banks, etc.). I’m not sure why this is, but perhaps it speaks to some combination of culture, role of government, trust, and the debacle that REALID has become.

Another issue I see with OpenID relates to the level of assurance you get with an OpenID. As you will know if you look at the pilots that are underway in US Gov, or look at what you can access with an OpenID right now, they are all pretty safe. In other words “good enough” assurance of who you are is ok, and if someone (either the OpenID site or the relying site) makes a mistake it’s no big deal. For much of what government does this is actually an acceptable level of assurance. We just need a “good enough” sense of who you are, and we need to know it’s the same person who was on the site before. However, we ALSO need to solve the MUCH harder problem of HIGH ASSURANCE on-line transactions. All Government’s want to put very high-value services on-line like allowing people access to their personal health information, their kids report cards, driver’s license renewals, even voting some day, and to do these things we have to REALLY be sure who’s on the other end of the Internet. In order to do that someone (we think government) needs to vouch (on-line) that you are really you. The key to our ability to do so is not technology, or picking one solution over the other, the key is the ID proofing experience that happens BEFORE the tech is applied. It’s worth noting that even the OpenID guys are starting to think about OpenID v.Next (http://self-issued.info/?p=256) because they agree with the assurance level limitation of the current implementation of OpenID. And OpenID v.Next will not be backward compatible with OpenID.

Think about it – why is the Driver’s License the best, most accepted form of ID in the “paper” world. It’s because they have the best ID proofing practices. They bring you to a counter, check your foundation documents (birth cert., Care Card, etc.), take your picture and digitally compare it to all the other pictures in the database to make sure you don’t have another DL under another name, etc. Here in BC we have a similar set of processes (minus the picture) under our Personal BCeID service (https://www.bceid.ca/register/personal/). We are now working on “claims enabling” BCeID and doing all the architecture and standards work necessary to make this work for our services. Take a look at this work here (http://www.cio.gov.bc.ca/cio/idim/index.page?).

I really encourage you to take a look at the education link and tell me what you think. Also, the standards package is getting very strong feedback from vendors and standards groups like the ICF, OIX, OASIS and Kantara folks. This is really early days and we are really trying to make sure we get it right – and spend the minimum by tracking to Internet standards and solutions wherever possible.

Sorry for the long post, but like I said – this is important stuff (at least to me!) Keep the fires burning!

Thanks – Dave.

Saving Millions: Why Cities should Fork the Kuali Foundation

For those interested in my writing on open source, municipal issues and technology, I want to be blunt: I consider this to be one of the most important posts I’ll write this year.

A few months ago I wrote an article and blog post about “Muniforge,” an idea based on a speech I’d given at a conference in 2009 in which I advocated that cities with common needs should band together and co-develop software to reduce procurement costs and better meet requirements. I continued to believe in the idea, but recognized that cultural barriers would likely make it difficult to realize.

Last month that all changed. While at Northern Voice I ended up talking to Jens Haeusser, an IT strategist at the University of British Columbia, and confirmed something I’d long suspected: that some people much smarter than me had already had the same idea and had made it a reality… not among cities but among academic institutions.

The result? The Kuali Foundation: “…a growing community of universities, colleges, businesses, and other organizations that have partnered to build and sustain open-source administrative software for higher education, by higher education.”

In other words, for the past 5 years over 35 universities in the United States, Canada, Australia and South Africa have been successfully co-developing software.

For cities everywhere interested in controlling spending or reducing costs, this should be an earth-shattering revelation – a wake-up call – for several reasons:

  • First, a viable working model for Muniforge has existed for 5 years and has been a demonstrable success, both in creating high-quality software and in saving the participating institutions significant money. Devising a methodology to calculate how much a city could save by co-developing software under an open source license is probably very, very easy.
  • Second, what is also great about universities is that they suffer from many of the same challenges as cities. Both have conservative bureaucracies, limited budgets, and significant legacy systems. In addition, neither has IT as a core competency, and both are frequently concerned with licensing, liability and the ownership of intellectual property.
  • Third, and this leads to possibly the best part: the Kuali Foundation has already addressed all the critical obstacles to such an endeavour, developing the licensing agreements, policies, decision-making structures, and workflow processes necessary for success. Moreover, all of this legal, policy and process infrastructure is itself available to be copied. For free. Right now.
  • Fourth, the Kuali Foundation is not a bunch of free-software hippies who depend on the kindness of strangers to patch their software (a stereotype that really must end). Quite the opposite. The Kuali Foundation has helped spawn 10 different companies that specialize in implementing and supporting (through SLAs) the software the foundation develops. In other words, the universities have created a group of competing firms dedicated to serving their niche market. Think about that. Rather than deal with vendors who specialize in serving large multinationals and who’ve tweaked their software to (somewhat) work for cities, the foundation has fostered competing service providers (to say it again) within the higher education niche.

As a result, I believe a group of forward-thinking cities – perhaps starting with those in North America – should fork the Kuali Foundation. That is, they should copy Kuali’s bylaws, its structure, its licenses and pretty much everything else – possibly even the source code for some of its projects – and create a Kuali for cities. Call it Muniforge, or Communiforge or CivicHub or whatever… but create it.

We can radically reduce the cost of software to cities, improve support by creating the right market incentive to help foster companies whose interests are directly aligned with cities, and create better software that meets cities’ unique needs. The question is… will we? All that is required is for CIOs to begin networking and for a few to discover some common needs. One idea I have immediately is for the City of Nanaimo to apply the Kuali modified Apache license to the council monitoring software package it developed in house, and to upload it to GitHub. That would be a great start – one that could collectively save cities millions.

If you are a city CIO/CTO/Technology Director and are interested in this idea, please check out these links:

The Kuali Foundation homepage

Open Source Collaboration in Higher Education: Guidelines and Report of the Licensing and Policy Framework Summit for Software Sharing in Higher Education by Brad Wheeler and Daniel Greenstein (key architects behind Kuali)

Open Source 2010: Reflections on 2007 by Brad Wheeler (a must read, lots of great tips in here)

Heck, I suggest looking at all of Brad Wheeler’s articles and presentations.

Another overview article on Kuali by University Business

Philip Ashlock of Open Plans has an overview article of where some cities are heading re open source.

And again, my original article on Muniforge.

If you aren’t already, consider reading the OpenSF blog – these guys are leaders and one way or another will be part of the mix.

Also, if you’re on twitter, consider following Jay Nath and Philip Ashlock.

Half victory in making BC local elections more transparent

Over the past few months the British Columbia government (my home province – or for my American friends – state) has had a taskforce looking at reforming local (municipal) election rules.

During the process I submitted a suggestion to the taskforce outlining why campaign finance data should be made available online and in machine-readable format (i.e. so you can open it in Microsoft Excel or Google Docs, for example).
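To make “machine readable” concrete: with structured data, totalling contributions by donor takes a few lines of code; with a scanned ledger, it takes days of retyping. The file and column names in this sketch are invented, since no such data set exists yet:

```python
# A sketch of analyzing machine-readable campaign finance data.
# The CSV file and its column names are hypothetical.
import csv
from collections import defaultdict

totals = defaultdict(float)
with open("campaign_contributions.csv", newline="") as f:
    for row in csv.DictReader(f):
        totals[row["donor_name"]] += float(row["amount"])

# Print donors from largest to smallest total contribution
for donor, amount in sorted(totals.items(), key=lambda kv: -kv[1]):
    print(f"{donor}: ${amount:,.2f}")
```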

Yesterday the taskforce published their conclusions and… they kind of got it right.

At first blush, things look great… The press release and taskforce homepage list, as one of the core recommendations:

Establish a central role for Elections BC in enforcement of campaign finance rules and in making campaign finance disclosure statements electronically accessible

Looks promising… yes? Right. But note the actual report (which, ironically, is only available as a PDF, so I can’t link to the specific recommendations… sigh). The recommendation around disclosure reads:

Require campaign finance disclosure information to be published online and made centrally accessible through Elections BC

and the explanatory text reads:

Many submissions suggested that 120 days is too long to wait for disclosure reports, and that the public should be able to access disclosure information sooner and more easily. Given the Task Force’s related recommendations on Elections BC’s role in overseeing local campaign finance rules, it is suggested that Elections BC act as a central repository of campaign finance disclosure statements. Standardizing disclosure statement forms is of practical importance if the statements are to be published online and centrally available, and would help members of the public, media and academia analyze the information. [my italics]

My take? The spirit of the recommendation is that campaign finance data be machine readable – that you should be able to download it, open it, and play with it on your own computer. However, a literal reading of the text suggests that simply scanning account ledgers and sharing them as image files or unstructured PDFs might suffice.

This would be essentially the same as what generally happens presently, and so would not mark a step forward. Another equally bad outcome? That the information gets shared in a manner similar to the way federal MP campaign data is shared on the Elections Canada website, where it cannot be easily downloaded and you can only look at one candidate’s financial data at a time. (The Elections Canada site is almost designed to prevent you from effectively analyzing campaign finance data.)

So in short, the Taskforce members are to be congratulated as I think their intentions were bang on: they want the public to be able to access and analyze campaign finance data. But we will need to continue to monitor this issue carefully as the language is vague enough that the recommendation may not produce the desired outcome.

Canadian Governments: How to waste millions online ($30M and counting)

Back from DC and Toronto, I’m feeling recharged and reinvigorated. The Gov 2.0 Expo was fantastic; it was great to meet colleagues from around the world in person. The FCM AGM was equally exciting, with a great turnout for our session on Government 2.0 and lots of engagement from the attendees.

So, now that I’m in a good mood, it’s only natural that I’m suddenly burning up about some awesomely poor decisions being made at the provincial level – decisions that may also be in the process of being made at the federal level.

Last year at the BC Chapter of the Municipal Information Systems Association conference I stumbled, by chance, into a session run by the British Columbia government about a single login system it was creating for government websites. I get that this sounds mundane, but check this out: it would mean that if you live in BC you’d have a single login name and password when accessing any provincial government service. Convenient! Better still, the government was telling the municipalities that this system (still in development) could work for their websites too. So only one user name and password to access any government service in BC! It all sounds like $30 million (the number I think they quoted) well spent.

So what could be wrong with this…?

How about the fact that such a system already exists? For free.

Yes, OpenID is a system created to do just this. It’s free and licensed for use by anyone. Better still, it’s been adopted by a number of small institutions such as Google, Yahoo, AOL, PayPal, and Verisign and… none other than the US government, which recently began a pilot adoption of it.
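For the curious, the basic mechanics are simple. Here is a rough sketch (not production code – real sites use a maintained OpenID library) of the first step of the OpenID 2.0 flow: the relying party, say a government website, redirects the user to their identity provider to authenticate. The hostnames below are hypothetical:

```python
# A rough sketch of how a relying party begins OpenID 2.0 authentication:
# it redirects the user's browser to their identity provider with a
# "checkid_setup" request. Verifying the signed response that comes back
# is omitted. The endpoint and site URLs below are hypothetical.
from urllib.parse import urlencode

OP_ENDPOINT = "https://openid.example.com/auth"  # found via discovery on the user's identifier

params = {
    "openid.ns": "http://specs.openid.net/auth/2.0",
    "openid.mode": "checkid_setup",
    "openid.claimed_id": "http://specs.openid.net/auth/2.0/identifier_select",
    "openid.identity": "http://specs.openid.net/auth/2.0/identifier_select",
    "openid.return_to": "https://services.example.gov.bc.ca/login/return",  # hypothetical
    "openid.realm": "https://services.example.gov.bc.ca/",                  # hypothetical
}

redirect_url = OP_ENDPOINT + "?" + urlencode(params)
print(redirect_url)  # send the user's browser here to authenticate
```

The provider authenticates the user and redirects back to the return_to address with a signed assertion, which the site then verifies. The user’s password never touches the government’s servers – which is precisely why rebuilding all of this from scratch is so puzzling.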

So let me ask you: do you think the login system designed by the BC government is going to be more, or less, secure than an open source system that enjoys the support of Google, Yahoo, AOL, PayPal, Verisign and the US Government? Moreover, do we think that the security concerns these organizations have regarding their clients and citizens are less strict than those of the BC government?

I suspect not.

But that isn’t going to prevent us from sinking millions into a system that will be less secure and will cost millions more to sustain over the coming decades (since we’ll be the only ones using it… we’ll have to have uniquely trained people to sustain it!).

Of course, it gets worse. While the BC government is designing its own system, rumour has it that the Federal Government is looking into replacing Epass, its own aging website login system which, by the by, does not work with Firefox, a web browser used by a quarter of all Canadians. Of course, I’m willing to bet almost anything that no one is even contemplating using OpenID. Instead, we will sink tens of millions of dollars (if not more…) into a system. Of course, what’s $100 million of taxpayer dollars…

Oh, and today’s my birthday! And despite the tone of this post I’m actually in a really good mood and have had a great time with friends, family and loved ones. And where will I be today…? At 30,000 ft flying to Ottawa for GovCamp Canada. Isn’t that appropriate? :)

Gov 2.0 Expo Ignite talk on Open Data as an old idea

First, sorry for the scant blogging this week. Being 6 weeks into a 10-week travel marathon that sees me crossing the continent 9 times, the long days have finally caught up with me. Also, I’ve been at the O’Reilly Media Gov 2.0 Expo in Washington DC, which has been fantastic. Really, a sense of coming home. It has also been great to meet so many people that I’ve corresponded with, admired and/or whose work I’ve simply followed closely. A great moment was spending an hour with Tim Berners-Lee in the speakers’ lounge talking about open data, and meeting up with the Sunlight Foundation team at their offices in DC. Real inspirations, all of them.

Tomorrow I’m giving a talk at the Federation of Canadian Municipalities on open data and the opportunity of open source software, so I’ve been busy there too, getting my thoughts together. I consider this one of the most important audiences I’ll talk to this year; this is a group that could transform how cities work in Canada – so I’m looking forward to it.

In the meantime, for those who were not at the Gov 2.0 Expo, I’ve pasted a clip of my talk below. It’s an Ignite talk, which means it lasts 5 minutes: I had to give them 20 slides in advance, and the slides move forward every 15 seconds (I’m not controlling them!).

But then… forget my talk! There were a bunch of other (more) fantastic talks that can be found on the Gov 2.0 YouTube page here. I strongly encourage you to check them out!

Articles I'm digesting – 25/5/2010

It’s been a while since I’ve done one of these. A couple of good ones ranging over the last few months. Big thank-yous to those who sent me these pieces. Always enjoy.

The Meaning of Open by Jonathan Rosenberg, Senior Vice President of Product Management at Google

Went back and re-read this. Every company makes mistakes and Google is no exception (the privacy settings on Buzz being everyone’s favourite), but this statement, along with Google’s DataLiberation.org (which, unlike Facebook, is designed to ensure you can extract your information from Google’s services), shows why Google enjoys greater confidence than Facebook, Apple or any number of its competitors. If you’re in government, the private sector or the non-profit sector, read this post. This is how successful 21st-century organizations think.

Local Governments Offer Data to Software Tinkerers by Claire Cain Miller (via David Nauman & Andrew Medd)

Another oldie (December 2009 is old?) but a goodie. Describes a little of the emerging ecosystem for open local government data, along with some of the tensions it is creating. Best head-in-the-sand line:

Paul J. Browne, a deputy commissioner of the New York City Police Department, said it releases information about individual accidents to journalists and others who request it, but would not provide software developers with a regularly updated feed. “We provide public information, not data flow for entrepreneurs,” he said.

So… if I understand correctly, the NYPD will only give data to people who ask, and prefers to tie up valuable resources filling individual requests rather than just provide a constant feed that anyone can use. Got it. Uh, and just for the record, those “entrepreneurs” are the next generation of journalists and the people who will make the public information useful. The NYPD’s “public information” is effectively useless, much like what my hometown police department offers. Does anyone actually look at PDFs and pictures of crimes? That you can only get on a weekly basis? Really? In an era of spreadsheets and Google Maps… no.
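To see what a “regularly updated feed” would actually mean, consider what any developer could do with one. The URL and field names below are invented – no such feed exists, which is exactly the point:

```python
# A sketch of consuming a hypothetical structured accident feed.
# The URL and field names are invented; the NYPD publishes no such feed.
import requests

incidents = requests.get("https://data.example.nyc/accidents.json").json()

for incident in incidents:
    print(incident["date"], incident["intersection"], incident["severity"])
```

Publish the feed once and every journalist, researcher and entrepreneur can serve themselves – no deputy commissioner required.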

Didacticism in Game Design by Clint Hocking (via Lauren Bacon)

eaves.ca readers, meet Clint Hocking. My main sadness in introducing you is that you’ll discover how a truly fantastic, smart blog reads. The only good news for me is that you are hopefully more interested in public policy, open source and the things I dwell on than video games, so Clint won’t steal you all away. Just many of you.

A dash of a long post that is worth reading:

As McLuhan says: the medium is the message. When canned, discrete moral choices are rendered in games with such simplicity and lack of humanity, the message we are sending is not the message specific to the content in question (the message in the canned content might be quite beautiful – but it’s not a ludic message) – it is the message inherent in the form in which we’ve presented it: it effectively says that ‘being moral is easy and only takes a moment out of each hour’. To me, this is almost the opposite of the deeper appreciation of humanity we might aim to engender in our audience.

Clint takes video games seriously. And so should you.

The Analytic Mode by David Brooks (via David Brock)

These four lines alone make this piece worth reading. Great lessons for students of policy and politics:

  • The first fiction was that government is a contest between truth and error. In reality, government is usually a contest between competing, unequal truths.
  • The second fiction was that to support a policy is to make it happen. In fact, in government power is exercised through other people. It is only by coaxing, prodding and compromise that presidents [or anyone!] actually get anything done.
  • The third fiction was that we can begin the world anew. In fact, all problems and policies have already been worked by a thousand hands and the clay is mostly dry. Presidents are compelled to work with the material they have before them.
  • The fourth fiction was that leaders know the path ahead. In fact, they have general goals, but the way ahead is pathless and everything is shrouded by uncertainty.

The case against non-profit news sites by Bill Wyman (via Andrew Potter)

Yes, much better that news organizations be beholden to a rich elite than to paying readers… Finally someone takes on the idea that a bunch of enlightened rich people or, better, rich corporate donors are going to save “the news.” Sometimes it feels like media organizations are willing to do anything they can to avoid actually having to deal with paying customers. Be it relying on advertisers or on rich people to subsidize them, anything appears to be better than actually fighting for customers.

That’s what I love about Demand Media. Some people decry them for creating tons of cheap content, but at least they looked at the marketplace and said: this is a business model that will work. Moreover, they are responding to a real customer demand – searches in Google.

Wyman’s piece also serves as a good counterpoint to the recent Walrus advertising campaign, which essentially boiled down to: Canada needs the Walrus and so you should support it. The danger here is that people at the Walrus believe this line: that they are of value and essential to Canada even if no one (or very few people) bought them or read them. I think people should buy The Walrus not because it would be good for the country but because it is engaging, informative and interesting to Canadians (or citizens of any country). I think the Walrus can have great stories (Gary Stephen Ross’s piece A Tale of Two Cities is a case in point), but if you have a one-year lead time for an article, that’s going to be hard to pull off in the internet era, foundation or no foundation. I hope the Walrus stays with us, but Wyman’s article serves up some arguments worth contemplating.

Open Data: An Example of the Long Tail of Public Policy at Work

As many readers know, Vancouver passed what has locally been termed the Open3 motion a year ago and has had an open data portal up and running for several months.

Around the world, much of the focus of open data initiatives has been on the development of applications like Vancouver’s Vantrash, Washington DC’s Stumble Safely or Toronto’s childcare locator. But the other use of data portals is to actually better understand and analyze phenomena in a city – all of which can potentially lead to a broader diversity of perspectives, better public policy and a more informed public and/or decision makers.

I was thus pleased to find another example of what I’ve been calling the Long Tail of Public Policy when I received an email from Victor Ngo, a student at the University of British Columbia who just completed his 2nd year in the Human Geography program with an Urban Studies focus. (He’s also a co-op student looking for a summer job – nudge to the City of Vancouver.)

It turns out that last month, he and two classmates did a project on graffiti occurrence and its relationship to land use, crime rates, and socio-economic variables. As Victor shared with me:

It was a group project I did with two other members in March/April. It was for an introductory GIS class and given our knowledge, our analysis was certainly not as robust and refined as it could have been. But having been responsible for GIS analysis part of the project, I’m proud of what we accomplished.

The “Graffiti sites” shapefile was very instrumental to my project. I’m a big fan of the site and I’ll be using it more in the future as I continue my studies.

So here we have university students in Vancouver using real city data to work on projects that could provide some insights, all while learning. This is another small example of why open data matters. This is the future of public policy development. Today Victor may be a student, less certain about the quality of his work (don’t underestimate yourself, Victor), but tomorrow he could be working for a government, a think tank, a consulting firm, an insurance company or a citizen advocacy group. But wherever he is, the open data portal will be a resource he will want to turn to.

With Victor’s permission I’ve uploaded his report, Graffiti in the Urban Everyday – Comparing Graffiti Occurrence with Crime Rates, Land Use, and Socio-Economic Indicators in Vancouver, to my site so anyone can download it. Victor has said he’d love to get people’s feedback on it.

And what was the main drawback of using the open data? There wasn’t enough of it.

…one thing I would have liked was better crime statistics, in particular, the data for the actual location of crime occurrence. It would have certainly made our analysis more refined. The weekly Crime Maps that the VPD publishes is an example of what I mean:

http://vancouver.ca/police/CrimeMaps/index.htm

You’re able to see the actual location where the crime was committed. We had to tabulate data from summary tables found at:

http://vancouver.ca/police/organization/planning-research-audit/neighbourhood-statistics.html

To translate: essentially, the city releases this information in a non-machine-readable format, meaning that citizens, public servants at other levels of government and (I’m willing to wager) City of Vancouver public servants outside the police department have to recreate the data in a digital format. What a colossal waste of time and energy. Why not just share the data in a structured digital way? The city already makes it public; why not make it useful as well? This is what Washington DC (search crime) and San Francisco have done.
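For a sense of what the structured alternative buys you, here is a sketch of what Victor’s tabulation step could have looked like if point-level crime data were released as a simple CSV. The files, column names and the crude distance math are all hypothetical:

```python
# A sketch assuming hypothetical point-level CSVs of crime and graffiti
# locations: count crimes within roughly 200 m of each graffiti site.
import csv
import math

def load_points(path):
    with open(path, newline="") as f:
        return [(float(r["lat"]), float(r["lon"])) for r in csv.DictReader(f)]

crimes = load_points("crime_locations.csv")   # hypothetical file
graffiti = load_points("graffiti_sites.csv")  # hypothetical file

# ~0.0018 degrees of latitude is roughly 200 m; this flat-earth distance
# is crude but fine for a sketch (real analysis would project coordinates).
RADIUS = 0.0018
for g_lat, g_lon in graffiti:
    nearby = sum(1 for c_lat, c_lon in crimes
                 if math.hypot(c_lat - g_lat, c_lon - g_lon) < RADIUS)
    print(f"site ({g_lat:.4f}, {g_lon:.4f}): {nearby} crimes nearby")
```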

I hope that more apps get created in Vancouver, but as a public policy geek, I’m also hoping that more reports like these (and the one Bing Thom Architects published on the future of Vancouver, also using data from the open data catalog) get published. Ultimately, more people learning, thinking, writing and seeking solutions to our challenges will create a smarter, more vibrant and more successful city. Isn’t that what you’d want your city government (or any government, really…) to do?

Parliament: invite me to audit your books (I'm not really asking)

“Following careful consideration, the auditor general will not be invited to conduct a performance audit of the House of Commons”

– A parliamentary statement released last Thursday afternoon

There was a time when the above statement made sense. It was a time when accounts were kept on paper, when copying and shipping such papers was expensive and time consuming, and when the number of people who would have gone through a giant binder of accounts would have been quite small.

In such an era, auditors had a unique role – they represented the interests of the public, since the public could not review the books themselves. Thus, picking the auditor mattered. Since this person would be one of the few people with the time, resources, and access to review MPs’ accounts, it became a powerful and politically sensitive position. The public demanded someone they could trust; the politicians – justifiably – wanted to ensure that this person would not abuse their role by shedding light on certain members or applying standards unevenly. Hence, choosing who would see the books mattered, since few people, if any, would review the work of the auditor.

So should Parliament acquiesce to the auditor general and hand over its books to her? She meets all the standards set out above, so the answer seems like it should be yes. But it isn’t. The auditor general does not oversee Parliament, and she should not receive special access, nor should we begin to establish a precedent that she does.

However.

We don’t live in the era described above. Today, the accounts are kept in a digital format. It should be easy to convert them to Microsoft Excel or another common format. They could be posted online, where anyone could download and look at them at no cost. And, as the Guardian newspaper proved last year, thousands and thousands of people would be interested in using their computers to analyze and write about them.

What Parliament should do is hand their expense accounts over to everyone. Indeed, I am today making that formal request: I would like Parliament to invite Canadian taxpayers – the people who vote for them, who pay their salaries, and who cover their expenses – to review their books. Please take all the expenses and post them online. Today.

As in the United Kingdom, I am confident that many Canadians will take an interest in the accounts. The Guardian asked people to help review the accounts, and ordinary citizens from across the UK found a number of unusual claims. Others took the information about expenses and visualized it in interesting ways, ways that allowed citizens to better understand how their money was being spent.
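Nor is the analysis itself exotic. Here is a sketch of what any citizen could do the day the data is posted – the file and column names are hypothetical, since no such file has been released:

```python
# A sketch summarizing MP expenses from a hypothetical published CSV.
import csv
from collections import defaultdict

totals = defaultdict(float)
with open("mp_expenses.csv", newline="") as f:
    for row in csv.DictReader(f):
        totals[(row["mp_name"], row["category"])] += float(row["amount"])

# The ten largest expense lines, by MP and category
top = sorted(totals.items(), key=lambda kv: -kv[1])[:10]
for (mp, category), amount in top:
    print(f"{mp} – {category}: ${amount:,.2f}")
```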

Would the process be painful for MPs? Possibly for some. Would it lead to a clampdown on MPs’ expenses? I have my doubts. I think most people recognize that MPs engage in a tremendous amount of travel and, more importantly, want their MPs to use these funds to educate themselves, conduct research, think independently and, of course, better represent their constituents. But there will be little or no money for these important activities if people feel that expense accounts get used up on other activities.

More importantly, posting MPs’ accounts could reduce the likelihood of misspending in the future – a truly good outcome. Our goal shouldn’t be to catch problems after the fact, but to prevent them in the first place. Knowing that constituents will be able to see one’s expenses can be a more effective constraint than any ethics or spending guideline. Indeed, this is the same argument I made around why publicly accessible charitable receipts should be downloadable, as such a move might have saved taxpayers $3.2 billion. Here the stakes are smaller, but no less important.

In the end, as this is our government, this is also our money, and these are our documents. Parliament, we would like you to invite us to see what is already ours, so that we can collectively do our own analysis. If the auditor general wants to do hers as well… power to her. But we agree that you are not accountable to her. You are accountable to us.

Mick Jagger & why copyright doesn't always help artists

I recently read this wonderful interview with Mick Jagger on the BBC website, which had this fantastic extract about the impact of the internet on the music industry. What I love about this interview is that Mick Jagger is, of course, about as old a legend as you can find in the music industry.

…I’m talking about the internet.

But that’s just one facet of the technology of music. Music has been aligned with technology for a long time. The model of records and record selling is a very complex subject and quite boring, to be honest.

But your view is valid because you have a huge catalogue, which is worth a lot of money, and you’ve been in the business a long time, so you have perspective.

Well, it’s all changed in the last couple of years. We’ve gone through a period where everyone downloaded everything for nothing, and we’ve gone into a grey period where it’s much easier to pay for things – assuming you’ve got any money.

Are you quite relaxed about it?

I am quite relaxed about it. But, you know, it is a massive change and it does alter the fact that people don’t make as much money out of records.

But I have a take on that – people only made money out of records for a very, very small time. When The Rolling Stones started out, we didn’t make any money out of records because record companies wouldn’t pay you! They didn’t pay anyone!

Then, there was a small period from 1970 to 1997, where people did get paid, and they got paid very handsomely and everyone made money. But now that period has gone.

So if you look at the history of recorded music from 1900 to now, there was a 25 year period where artists did very well, but the rest of the time they didn’t.

So what does this have to do with copyright? Well, remember, the record labels and other content distributors (not creators!) keep saying that artists will starve unless there is copyright. But understand that for the entire 110-year period Mick Jagger references there was copyright… and yet artists were paid to record LPs and records for only a small fraction (less than a quarter) of that period. For the rest of the time, the way they made money was by performing. There is nothing about a stronger copyright regime that ensures artists (the creators!) will receive more money or compensation.

So when the record labels say that without stricter copyright legislation artists will suffer, what they really mean is that one specific business model – one that requires distributors, and that they happen to do well by – will suffer. Artists, who traditionally never received much from the labels (and even during this 25-year period only a tiny few profited handsomely), have no guarantee that with stricter copyright they will see more revenue. No, rather, the distributors will simply own their content for longer and have greater control over its use.

This country is about to go into a dark, dark place with the new copyright legislation. I suspect we will end up stalled for 30 years and cultural innovation will shift to other parts of the world where creativity, remix culture and forms of artistic expression are kept more free.

Again, as Lessig says:

  • Creativity and innovation always builds on the past.
  • The past always tries to control the creativity that builds upon it.
  • Free societies enable the future by limiting this power of the past.
  • Ours is less and less a free society.

Welcome to copyright reform. A Canada where the past controls the creativity that gets built upon it.