Monthly Archives: July 2012

What do I think of the Canadian Senate?

Read Jennifer Ditchburn in the Globe and Mail – Senate stubborn on making information about chamber more accessible.

It is laughable how hard the Canadian Senate makes it to access information about itself. The lower house – which has made good progress on this front in the last few years – shares tons of information online. But the Senate? Attendance records, voting records and, well, pretty much any record are nigh impossible to get online. Indeed, as Jennifer points out, for many requests you have to make an appointment and go, in person, to Ottawa(!!!) to get them.

What year is it? 1823? It’s not like we haven’t had the mail, the telephone, the fax machine, and of course, the internet come along to make accessing all this information a little easier. I love that if you want to get certain documents about the operation of the Senate you have to go to Ottawa.

Given the Senate is not even elected in Canada and has, shall we say, a poor reputation for accountability and accessibility, you’d think this would be a priority. Sadly, it is not. Having spoken with some of the relevant parties I can say: Senators are not interested in letting you see or know anything.

I understand the desire of the Senate to be above the political fray, to not be bent by the fickle swings of electoral politics, to be a true house of “sober second thought.” And yet I see no reason why it can’t still be all that while making all the information it must make public about itself available online in a machine-readable format. It is hard to see how voting records or attendance records will sway how the Senate operates, other than maybe prompting some Senators to show up for work more often.

But let’s not hold our breath for change. Consider my favourite part of the article:

“A spokeswoman for government Senate leader Marjory LeBreton said she was unavailable and her office had no comment. Ms. LeBreton has asked a Senate committee to review the rules around Senate attendance, but it’s unclear if the review includes the accessibility of the register.”

No comment? For a story on the Senate’s lack of accessibility? Oy vey! File it under: #youredoingitwrong


Is Civic Hacking Becoming 'Our Pieces, Loosely Joined'?

I’ve got a piece up over on the WeGov blog at TechPresident – Is Civic Hacking Becoming ‘Our Pieces, Loosely Joined’?

Juicy bit:

There is, however, a larger issue that this press release raises. So far, it appears that the spirit of re-use among the big players, like MySociety and the Sunlight Foundation*, only goes so deep. Indeed often it seems they are limited to believing others should re-use their code. There are few examples where the bigger players dedicate resources to support other people’s components. Again, it is fine if this is all about creating competing platforms and competing to get players in smaller jurisdictions who cannot finance creating whole websites on their own to adopt it. But if this is about reducing duplication then I’ll expect to see some of the big players throw resources behind components they see built elsewhere. So far it isn’t clear to me that we are truly moving to a world of “small pieces loosely joined” instead of a world of “our pieces, loosely joined.”

You can read the rest over there.

Living in the Future: My Car isn’t Flying, but it is Cheap and Gamified

I remember in the early ’80s, when I was about 8 years old, walking up to my dad and saying “you know, the year 2000 really isn’t that far away, and unless something changes we aren’t going to get jetpacks and flying cars.” Even then I could see the innovation curve wasn’t going to meet the expectations movies and books had set for me.

Mostly I’m glad I reset my expectations at such an early age – since it is fairly rare that something comes along that makes me feel like I live in the future: things like the iPad and smartphones. Interestingly, however, what feels most like magic to me these days – the thing I do regularly but that feels like the future – isn’t something I own, but a service that is all about what I don’t have to own: a car.

A Car So Cheap, I don’t Own It

I actually belong to two car-sharing services: Modo (like Zipcar) and Car2Go. While I love both, it is the latter that really has me feeling like I live in the future.

What makes Car2Go so amazing is that unlike Zipcar and Modo, which have fixed parking spots from which cars must be picked up and to which they must be returned, Car2Go has essentially scattered 600 cars around Vancouver. There are no dedicated parking spots to which you must return the car. You can leave it anywhere in the city that isn’t private parking or metered.

So, to lay it out, here’s the process of using one of these cars:

  1. Fire up your smartphone and locate the nearest car – generally within 10-400 meters.
  2. Walk to said car.
  3. Once you arrive, hold your Car2Go card against the windshield to “sign in” to the car.
  4. Get in and start driving to your destination.
  5. Upon arrival, “sign out” by holding your card against the windshield.

I then automatically get billed at $0.35 a minute for use, so basically a 10-minute car ride (about the average length of a trip in Vancouver) comes to $3.50, about $1 more than bus fare. I can’t even begin to describe the convenience of this service. What it saves me over having to own a car… mind-boggling. Between walking, transit, Car2Go, Modo, taxis and Uber… living without owning a car has never been easier, even with a 7-month-old.
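For fun, here is the back-of-the-envelope math as a runnable sketch. The per-minute rate and trip length come from above; the bus fare figure is my assumption, derived from the “about $1 more” comparison:

```python
# Back-of-the-envelope Car2Go trip math (2012 rates, as described above).
# The bus fare is an assumption derived from the "about $1 more" comparison.

PER_MINUTE_RATE = 0.35  # dollars per minute of use
BUS_FARE = 2.50         # assumed approximate Vancouver bus fare

def trip_cost(minutes: float) -> float:
    """Cost of a Car2Go trip billed per minute."""
    return minutes * PER_MINUTE_RATE

average_trip = 10  # minutes; about the average trip length in Vancouver
print(f"{average_trip} minute trip: ${trip_cost(average_trip):.2f} "
      f"(vs ~${BUS_FARE:.2f} bus fare)")
# -> 10 minute trip: $3.50 (vs ~$2.50 bus fare)
```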

But again, the magic is that, wherever I am in the city, I can pull up my phone, generally find a car within a 2-3 minute walk and drive anywhere I want without having to plan it all beforehand. There are, quite literally, cars scattered around Vancouver, at my disposal. Brilliant.

Gamifying my Car to help the environment

Once you get into a Car2Go (they are all two-seater Smart cars built by Mercedes-Benz) there is a little screen that lets you control a map and the radio. The other day, however, I noticed a little green leaf icon with a number in it, so I pushed that “button” to see what would happen.

Turns out Car2Go has gamified driving. The onboard computer measures your acceleration, cruising and deceleration and provides you with a score. I took the photo below at the end of a ride the other day.

[Image: green dashboard eco score]

What determines your score – from what I can tell – is your efficiency when accelerating, cruising and decelerating. So the more smoothly you brake and the gentler your acceleration (not an easy feat in a Smart Car), the better your score. Cruising appears to be about smoothness as well.
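Car2Go doesn’t publish how the score is computed, so purely as a speculative sketch, here is one way a “smoothness” score like this could work. Every name and threshold below is my invention, not Car2Go’s actual algorithm:

```python
# Purely speculative sketch of a "smoothness" eco-score. Car2Go's real
# algorithm is not public; this only illustrates the idea of rewarding
# gentle acceleration and braking. All names and thresholds are invented.

def smoothness_score(speeds, interval_s=1.0, comfort_accel=3.0):
    """Score a trip from 0 (jerky) to 100 (smooth).

    speeds: speed samples in m/s, one per interval_s seconds.
    comfort_accel: acceleration (m/s^2) considered perfectly comfortable.
    """
    if len(speeds) < 2:
        return 100.0
    # Absolute acceleration between consecutive samples.
    accels = [abs(b - a) / interval_s for a, b in zip(speeds, speeds[1:])]
    # Penalize each sample by how harsh it is relative to the comfort threshold.
    harshness = sum(min(a / comfort_accel, 1.0) for a in accels) / len(accels)
    return round(100 * (1 - harshness), 1)

gentle = [0, 1, 2, 3, 4, 5, 5, 5, 4, 3, 2, 1, 0]  # gradual speed changes
jerky = [0, 6, 2, 9, 3, 10, 1, 8, 0, 7, 2, 9, 0]  # stop-and-go driving
print(smoothness_score(gentle))  # 72.2 - reasonably smooth
print(smoothness_score(jerky))   # 0.0 - every change is harsh
```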

Now that I am aware of this scoreboard – even when I am not looking at it and it is only in the back of my mind – I find myself trying to drive more smoothly.

In the subtitle, I point out that all this probably helps the environment by reducing the amount of gas consumed by Car2Go drivers. But I bet it also helps reduce maintenance costs on Car2Go’s cars. I’m sure many people are rougher on shared cars than they are on their own and that this gamification helps dampen that effect.

Either way, if you’d told my six-year-old self that I wouldn’t own a car when I’m 35, and that I’d actually be sharing a car with hundreds of other residents – one I could swipe into using a simple card, with a game inside it urging me to drive more sustainably… I suspect I’d just have looked confused. But using Car2Go is probably the closest I get, on a regular basis, to doing something I really couldn’t have imagined when I was 6. And that’s kind of fun.

OSCON Community Management Keynote Video, Slides and some Bonus Material

I want to thank everyone who came to my session and who sent me wonderful feedback on both the keynote and the session. I was thrilled to see ZDNet wrote a piece about the keynote, and to have practitioners, such as Sonya Barry, the Community Manager for Java, write things like this about the longer session:

Wednesday at OSCON we kicked off the morning with the opening plenaries. David Eaves’ talk inspired me to attend his longer session later in the day – Open Source 2.0 – The Science of Community Management. It was packed – in fact the most crowded session I’ve ever seen here. People sharing chairs, sitting on every available spot on the floor, leaning up against the back wall and the doors. Tori did a great writeup of the session, so I won’t rehash, but if you haven’t, you should read it - What does this have to do with the Java Community? Everything. Java’s strength is the community just as much as the technology, and individual project communities are so important to making a project successful and robust.

That post pretty much made my day. It’s why we come to OSCON, to hopefully pass on something helpful, so this conference really felt meaningful to me.

So, to be helpful, I wanted to lay out in a single place a bunch of the content for those who were and were not there, plus a fun photo of my little guy – Alec – hanging out at #OSCON.

A YouTube video of the keynote is now up – and I’ve posted my slides here.

In addition, I did an interview in the O’Reilly booth; if it goes up on YouTube, I’ll post it.

There is no video of my longer session, formally titled Open Source 2.0 – The Science of Community Management but informally titled Three Myths of Open Source Communities. Jeff Longland helpfully took these notes, however, and I’ll try to rewrite the material as a series of blog posts in the near future.

Finally, earlier I linked to some blog posts I’ve written about open source communities and open source community management, as these are a deeper dive on some of the ideas I shared.

Some other notes about OSCON…

If you didn’t catch Robert “r0ml” Lefkowitz’s talk – How The App Store Killed Free Software, And Why We’re OK With That – do try to see if an audio copy can be tracked down. Contrary to some predictions it was neither trolling nor link bait, but a very thoughtful talk that I did not entirely agree with and that has left me with many, many things to think about (the sign of a great talk).

Jono Bacon, Brian Fitzpatrick and Ben Collins-Sussman are all mensches of the finest type – I’m grateful for their engagement and support given I’m late arriving at a party they all started. While you are at it, consider buying Brian and Ben’s new book – Team Geek: A Software Developer’s Guide to Working Well with Others.

Also, if you haven’t watched Tim O’Reilly’s opening keynote, The Clothesline Paradox and the Sharing Economy, take a look. My favourite part is him discussing how we break down the energy sector and claim “solar” provides only a tiny fraction of our energy mix (around the 9-minute mark). Of course, pretty much all energy is solar, from the stuff we count (oil, hydroelectric, etc. – it’s all made possible by solar) to the stuff we don’t count, like growing our food. Loved that.

Oh, and this ignite talk on Cryptic Crosswords by Dan Bentley from OSCON last year remains one of my favourites. I didn’t get to catch his talk this year on why the metric system sucks – but am looking forward to seeing it once it is up on YouTube.

Finally, ’cause I’m a sucker dad, here’s an early attempt to teach my 7-month-old chess while hitting the OSCON booth hall. As his tweet says, “Today I may be a mere pawn, but tomorrow I will be the grandmaster.”

[Image: Alec at the chess board]

Posts on Open Source Community Management

For those stopping by my page because of my OSCON keynote here are some links on community management that might be of interest:

Structurelessness, feminism and open: what open advocates can learn from second wave feminists (this is in part about why open source communities are not pure meritocracies).

Developing Community Management Metrics and Tools for Mozilla (using data to better understand participation)

Community Management as Open Source’s Core Competency (an oldie, but still some good stuff in here)

Wiki’s and Open Source: Collaborative or Cooperative? (on why open source works, in part because we are anti-collaborative)

Remixing Angie Byron to create the next Million Mozillians

How GitHub saved Open Source (okay, maybe not saved – but I do think it turbo-charged innovation and breathed new life into OS by teaching us that the fork is not a four-letter word)

Design Matters: Looking at a Re-themed Bugzilla

Okay, I’ll stop there. I tried to include a little something for everyone – hopefully some stuff here will at times challenge readers, at times confirm their thinking and always be helpful.


Containers, Facebook, Baseball & the Dark Matter around Open Data (#IOGDC keynote)

Below is an extended blog post that summarizes the keynote address I gave at the World Bank/Data.gov International Open Government Data Conference in Washington, DC on Wednesday, July 11th. This piece is cross-posted over at the WeGov blog on TechPresident, where I also write on transparency, technology and politics.

Yesterday, after spending the day at the International Open Government Data Conference at the World Bank (co-hosted by Data.gov), I left both upbeat and concerned. Upbeat because of the breadth of countries participating and the progress being made.

I was worried, however, because of the type of conversation we are having and how it might limit both the growth of our community and the impact open data could have. Indeed, as we talk about technology and how to do open data, we risk missing the real point of the whole exercise – which is use and impact.

To drive this point home I want to share three stories that highlight the challenges I believe we should be talking about.

Challenge 1: Scale Open Data

[Image: the Ideal-X]

In 1956 the Ideal-X, the ship pictured above, sailed from Newark to Houston and changed the world.

Confused? Let me explain.

As Marc Levinson chronicles in his excellent book The Box, the world of 1956 was very different from our world today. Global trade was relatively low. China was a long way off from becoming the world’s factory floor. And it was relatively unusual for people to buy goods made elsewhere. Indeed, as Levinson puts it, the cost of shipping goods was “so expensive that it did not pay to ship many things halfway across the country, much less halfway around the world.” I’m a child of the second era of globalization; I grew up in a world of global transport and shipping. The world before all of that, which Levinson is describing, is actually foreign to me. What is amazing is how much of it has just become a basic assumption of life.

And this is why the Ideal-X, the aforementioned ship, is so important. It was the first cargo container ship (in the modern sense of containers). Its trip from Newark to Houston marked the beginning of a revolution, because containers slashed the cost of shipping goods. Before the Ideal-X the cost of loading cargo onto a medium-sized cargo ship was $5.83 per ton; with containers, the cost dropped to 15.8 cents. Yes, the word you are looking for is: “wow.”

You have to understand that, before containers, loading a ship was a lot more like packing a minivan for a family vacation to the beach than the orderly process of stacking very large Lego blocks on a boat. Before containers, literally everything had to be hand-packed, stored and tied down in the hull. (See picture to the right.)

This is a little bit what our open data world looks like today. The people who are consuming open data are like digital longshoremen. They have to look at each open data set differently, unpack it accordingly and figure out where to put it, how to treat it and what to do with it. Worse, when looking at data from across multiple jurisdictions it is often much like cargo going around the world before 1956: a very slow and painful process. (See man on the right.)

Of course, the real revolution in container shipping happened in 1966, when the size of containers was standardized. Within a few years containers could move pretty much anywhere in the world, from truck to train to boat and back again. In the following decades global shipping trade increased at 2.5 times the rate of economic output. In other words… it exploded.

[Image: shipping containers]

Geek sidebar: For techies, think of shipping containers as the TCP/IP packet of globalization. TCP/IP standardized the packet of information that flowed over the network so that data could move from anywhere to anywhere. Interestingly, like containers, what was in the packet was not actually relevant and didn’t need to be known by the person transporting it. But the fact that it could move anywhere created scale and allowed for exponential growth.

What I’m trying to drive at is that, when it comes to open data, the number of data sets that get published is no longer the critical metric. Nor is the number of open data portals. We’ve won. There are more and more of both. The marginal political and/or persuasive benefit of adding another open data portal or data set won’t change the context anymore. I want to be clear – this is not to say that more open data sets and portals are not important or valuable – from a policy and programmatic perspective more is much, much better. What I am saying is that having more isn’t going to shift the conversation about open data any more. This is especially true if the data continues to require large amounts of work and time for people to unpack and understand, over and over again, across every portal.

In other words, what IS going to count is how many standardized open data sets get created. This is what we SHOULD be measuring. The General Transit Feed Specification revolutionized how people engaged with public transit because the standard made it so easy to build applications and do analysis around it. What we need to do is create similar standards for dozens, hundreds, thousands of other data sets so that we can drive new forms of use and engagement. More importantly, we need to figure out how to do this without relying on a standards process that takes 8 to 15 to infinite years to settle on said standard. That model is too slow to serve us, and so re-imagining/reinventing that process is where the innovation is going to shift next.
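To make the point concrete: because GTFS is a shared standard, a few lines of code written once work against any transit agency’s feed. A minimal sketch (the file path here is illustrative – substitute any agency’s published GTFS feed):

```python
# Minimal sketch of why standards matter: a GTFS feed is just a set of CSV
# files with agreed-upon columns, so this works against any agency's feed.
# The path below is illustrative - download a real feed from your agency.
import csv

with open("gtfs/stops.txt", newline="", encoding="utf-8") as f:
    stops = list(csv.DictReader(f))

# stop_id, stop_name, stop_lat and stop_lon are core GTFS fields, so a tool
# written against them works in Vancouver, Portland, or anywhere else.
for stop in stops[:5]:
    print(stop["stop_id"], stop["stop_name"], stop["stop_lat"], stop["stop_lon"])
```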

So let’s stop counting the number of open data portals and data sets, and start counting the number of common standards – because that number is really low. More critically, if we want the kind of explosive growth in use that global trade and shipping experienced after the rise of the standardized container, then our biggest challenge is clear: we need to containerize open data.

Challenge 2: Learn from Facebook

One of the things I find most interesting about Facebook is that everyone I’ve talked to about it notes that the core technology that made it possible was not particularly new. It wasn’t that Zuckerberg leveraged some new code or invented a new, better programming language. Rather, he accomplished a brilliant social hack.

Part of this was luck: the public had come a long way and was much more willing to do social things online in 2004 than even two years earlier, with sites like Friendster. Or, more specifically, young people who’d grown up with internet access were willing to do things and imagine using online tools in ways those who had not grown up with those tools wouldn’t or couldn’t. Zuckerberg, and his users, had grown up digital and so could take the same tools everyone else had and do something others hadn’t imagined, because their assumptions were just totally different.

My point here is that, while it is still early, I’m hoping we’ll soon have the beginnings of a cohort of public servants who’ve “grown up data” – who, despite their short careers in the public service, have matured in a period where open data has been an assumption, not a novelty. My hope and suspicion is that this generation of public servants is going to think about open data very differently than many of us do. Most importantly, I’m hoping they’ll spur a discussion about how to use open data – not just to share information with the public – but to drive policy objectives. The canonical opportunity for me here remains restaurant inspection data, but I know there are many, many more.

What I’m trying to say is that the conferences we organize have to talk less and less about how to get data open and more and more about how we use data to drive public policy objectives. I’m hoping the next International Open Government Data Conference will have an increasing number of presentations on how citizens, non-profits and other outsiders are using open data to drive their agendas, and how public servants are using open data strategically to drive toward an outcome.

I think we have to start fostering that conversation by next year at the latest, and that this conversation about use has to become core to everything we talk about within two years, or we risk losing steam. This is why I think the containerization of open data is so important, and why I think the White House’s digital government strategy is so important, since it makes internal use core to the government’s open data strategy.

Challenge 3: The Culture and Innovation Challenge

In May 2010 I gave this talk on open data, baseball and government at the Gov 2.0 Summit in Washington, DC. It centered on the story outlined in Michael Lewis’s fantastic book Moneyball: how a baseball team – the Oakland A’s – used a new analysis of player stats to ferret out undervalued players. This enabled them to win a large number of games on a relatively small payroll. Consider the numbers to the right.

I mean, if you are the owner of the Texas Rangers, you should be pissed! You are paying 250% in salary for 25% fewer wins than Oakland. If this were a government chart, where “wins” were potholes found and repaired and “payroll” was costs… everyone in the World Bank would be freaking out right now.

For those curious, the analytical “hack” was recognizing that the most valuable thing a player can do on offense is get on base. This is because it gives them an opportunity to score (+) and it means you don’t burn one of your three “outs,” which would end the inning and the chance for other players to score. The problem was that, to measure the offensive power of a player, most teams were looking at batting averages (along with a lot of other weird, totally non-quantitative stuff), which ignore the possibility of getting walked – a walk lets you get on base without hitting the ball!
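To make the arithmetic concrete, here is a toy comparison using the standard formulas – batting average is hits over at-bats, while on-base percentage also credits walks and hit-by-pitches. The stat lines below are invented for contrast:

```python
# The Moneyball insight in two formulas. Batting average ignores walks;
# on-base percentage (OBP) counts them. Player stat lines below are invented.

def batting_average(hits, at_bats):
    return hits / at_bats

def on_base_percentage(hits, walks, hbp, at_bats, sac_flies):
    # OBP = (H + BB + HBP) / (AB + BB + HBP + SF)
    return (hits + walks + hbp) / (at_bats + walks + hbp + sac_flies)

# A free-swinger: shiny batting average, almost never walks.
print(batting_average(150, 500))               # 0.300
print(on_base_percentage(150, 20, 2, 500, 4))  # ~0.327

# A patient hitter: worse average, but on base far more often.
print(batting_average(130, 500))               # 0.260
print(on_base_percentage(130, 90, 2, 500, 4))  # ~0.372
```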

What’s interesting, however, is that the original insight – that people were using the wrong metrics to assess baseball players – came decades before the Oakland A’s started to use it. Indeed, it was a nighttime security guard with a strong mathematics background and an obsession with baseball who first began pointing this stuff out.

The point I’m making is that it took 20 years for a manager in baseball to recognize that there was better evidence and data he could be using to make decisions. TWENTY YEARS. And that manager was hated by all the other managers, who believed he was ruining the game. Today this approach to assessing baseball is commonplace – everyone is doing it – but notice that the problem of using baseball’s “open data” to create better outcomes was never just an accessibility issue. Once that was resolved, the bigger challenge centered on culture and power. Those with the power had created a culture in which new ideas – ideas grounded in evidence, but disruptive – couldn’t find an audience. Of course, there were structural issues as well; many people had jobs that depended on not using the data and on relying instead on their “instincts.” But I think the cultural issue is a significant one.

So we can’t expect to go from an open portal today to better decisions tomorrow. There is a good chance that some of the ideas the data prompts will be so radical and challenging that either the ideas, the people who champion them, or both could get marginalized. On the upside, I feel like I’ve seen some evidence to the contrary in cities like New York and Chicago, but the risk is still there.

So what are we going to do to ensure that the culture of government is one that embraces challenges to our thinking and assumptions, so that it doesn’t require 20 years for us to make progress? This is a critical challenge for us – and it is much, much bigger than open data.

Conclusion: Focus on the Dark Matter

I’m deeply indebted to my friend – the brilliant Gordon Ross – who put me on to this idea the other day over tea.

[Image: the MacGuffin briefcase]

Do you remember the briefcase in Pulp Fiction? The one that glowed when opened? That the characters were all excited about, but whose contents you never learned? It’s called a MacGuffin. I don’t mean the briefcase per se; rather, a MacGuffin is the object in a story that all the characters are obsessed about, but that you – the audience – never find out much about and that, frankly, really isn’t that important to you. I remember reading that the briefcase in Pulp Fiction is allegedly Marsellus Wallace’s soul. But ultimately, it doesn’t matter. What matters is that Vincent Vega, Jules Winnfield and a ton of other characters think it is important, and that drives the action and the plot forward.

Again – let me be clear – open data portals are our MacGuffin device. We seem to care A LOT about them. But trust me, what really matters is everything that happens around them. What makes open data important is not a data portal. A portal is a necessary prerequisite, but it’s not the end, just the means. We’re here because we believe that the things open data can let us and others do matter. The open data portal was only ever a MacGuffin device – something that focused our attention and helped drive action so that we could do the other things – the dark matter that lies all around the MacGuffin.

And that is what brings me back to our three challenges. Right now, the debate around open data risks becoming too much like a Pulp Fiction conference in which all the panels talk about the briefcase. Instead we should be talking more and more about all the action – the dark matter – taking place around the briefcase. Because that is what really matters. For me, the three things that matter most are what I’ve mentioned in this talk:

  • standards – which will let us scale; I believe strongly that the conversation is going to shift from portals to standards;
  • strategic use – starting us down the path of learning how open data can drive policy outcomes; and
  • culture and power – recognizing that open data is going to surface a lot of reasons why governments don’t want to engage in data-driven decision making.

In other words, I want to be talking about how open data can make the world a better place, not about how we do open data. That conversation still matters, open data portals still matter, but the path forward around them feels straightforward, and if they remain the focus we’ll be obsessing about the wrong thing.

So here’s what I’d like to see in the future from our open data conferences: we’ve got to stop talking about how to do open data. All of our efforts here, everything we are trying to accomplish… it has nothing to do with the data. What we want to be talking about is how open data can be a tool to make the world a better place. So let’s make sure that is the conversation we have.

Reviewing Access to Information Legislation

Just got informed – via the CivicAccess mailing list – that Canada’s Access to Information Commissioner is planning to review Canada’s Access to Information legislation (full story here at the Vancouver Sun).

This is great news. Canada has long trumpeted its access to information legislation as world-leading. This was true… in 1985. It was plausible in 1995. Today, it is anything but true. The process is slow, requests are frequently denied, and requests must be paid for by cheque. Indeed, if a document you are looking for might be held by the US government, it is well known among Canadian journalists that you are better off asking the Americans for it. Even though you are a foreigner, they are both much faster and much more likely to provide it. It is, frankly, embarrassing.

So we are no longer global leaders. Which is why I think it is great the commissioner might look abroad for best practices. The article suggests she will look at Britain, the United States, Mexico, New Zealand and Australia.

These are all fine choices. But if I had my pick, I’d add Brazil to the mix. Its new transparency law is exceedingly interesting and aggressive in its approach. Greg Michener – a Canadian who lives in Brazil – covered the new law in Brazil’s Open Government Shock Therapy for TechPresident (where I’m an editor). The disclosure requirements in Brazil set a bar that, in some ways, is much higher than in Canada.

There are also some Eastern European countries with very progressive transparency laws – enacted in reaction to both previously authoritarian regimes and to corruption problems – that make them worth examining. In other words, I’d love to see a mix that includes more countries that have altered their laws more recently. This is probably where we are going to find some of the newer, more exciting innovations.

Regardless of what countries are looked at though – I’m glad the commissioner is doing this and wish her good luck.

The Long Tail of Ushahidi

I’ve got a post up over at TechPresident on Ushahidi. I’m basically responding to a new site called DeadUshahidi, which points out that a lot of Ushahidi maps never generate many reports:

What has the people at DeadUshahidi concerned is that long tail of “dead” projects. I mean, look at all those unsuccessful maps! What a waste! In part I agree. There are lots of people running around believing that rolling out a map will solve a problem when, without at least a plan, it probably won’t.

There is, of course, another way to look at that power-law distribution. One could also say it is a sign of enormous success. Perhaps Ushahidi has made the cost of deploying a mapping platform so low that it is worth risking what is now a negligible investment of time and money to spin one up. I mean, isn’t this the effective application of Eric Ries’ “lean startup” methodology from Silicon Valley?

You can read it all over here.

Oh, I’m also keynoting the International Open Government Data Conference at the World Bank in Washington on Wednesday morning – in case you are around.


If you are away from your cell phone… Awesome South African Online Ad

Was doing some research for a story I am writing over at TechPresident which had me visiting the site of Mxit, a social network built largely for mobile phones and used by urban youths in South Africa.

Check out the landing page for the site (note the red circle):

[Image: Mxit landing page]


So, where I grew up, “Never Let the Conversation End” meant porting the “conversation” over to a mobile device. In South Africa, for these young people, it is the opposite. Here’s a zoom-in, for those who couldn’t read it…

It’s not that we needed more evidence that access to the internet would come to emerging economies not through one laptop per child or telecentres, but via cellphones. All that said, it is still very striking when you see a manifestation of that logical conclusion spread out right in front of you.

Today there are over 500 million cellphones in Africa and, according to the Guardian, that number is growing fast. As the percentage of smartphones increases as well, I expect more of these moments that seem foreign to me, someone who started out on the non-mobile internet.