Category Archives: open source

Is the Internet bringing us together or is it tearing us apart?

The other day the Vancouver Sun – via Simon Fraser University’s Public Square program – asked me to pen a piece answering the question: Is the Internet bringing us together or is it tearing us apart?

Yesterday, they published the piece.

My short answer?

Trying to unravel whether the Internet is bringing us together or tearing us apart is impossible. It does both. What really matters is how we build generative communities, online and off.

My main point?

That community organizing is both growing and democratizing. On MeetUp alone there are 423 upcoming events in Vancouver. That’s 423 emergent community leaders, all learning how to mobilize people, whether it is for a party, to teach people how to knit, to grow a business or to learn how to speak Spanish.

This is pretty exciting.

A secondary point?

That it is not all good news. There are lots of communities, online and off, that are not generative. So if we are creating more communities, many of them will be ones we don’t agree with, and some will even be destructive.

Check it

It remains exciting to me what you can squeeze into 500 words. Yesterday, the Sun published the piece here; if you’re interested, please do consider checking it out.

Community Managers: Expectations, Experience and Culture Matter

Here’s an awesome link that drives home my point from my OSCON keynote on Community Management, particularly the part where I spoke about the importance of managing wait times – the period between when a volunteer/contributor takes an action and when they get feedback on that action.

In my talk I referenced code review wait times. For non-developers: in open source projects, a volunteer (contributor) will often write a patch, which must be reviewed by someone who oversees the project before it gets incorporated into the software’s code base. This is akin to a quality assurance process – say, like if you are baking brownies for the church charity event, the organizer probably wants to see the brownies first, just to make sure they aren’t a disaster. The period between when you write the patch (or make the brownies) and when the project manager reviews it and says it is ok/not ok – that’s the wait time.

The thing is, if you never tell people how long they are going to have to wait, expect them to get unhappy. More importantly, if, while they’re waiting, other contributors come and make negative comments about their contributions, don’t be surprised if they get even more unhappy and become less and less inclined to submit patches (or brownies, or whatever makes your community go round).

In other words, your code base may be important, but expectations, experience and culture matter probably more. I don’t think anyone believes Drupal is the best CMS ever invented, but its community has pretty good expectations, a great experience and a fantastic culture, so I suspect it kicks the ass of many “technically” better CMSs run by less well-managed communities.

Because hey, if I’ve come to expect that I have to wait an indefinite or undetermined amount of time, if the experience I have interacting with others sucks and if the culture of the community I’m trying to volunteer with is not positive… guess what. I’m probably going to stop contributing.

This is not rocket science.

And you can see evidence of people who experience this frustration in places around the net. Edd Dumbill sent me this link via Hacker News of a frustrated contributor tired of enduring crappy expectations, experience and culture.

Here’s what happens to pull requests in my experience:

  • you first find something that needs fixing
  • you write a test to reproduce the problem
  • you pass the test
  • you push the code to github and wait
  • then you keep waiting
  • then you wait a lot longer (it’s been months now)
  • then some ivory tower asshole (not part of the core team) sitting in a basement finds a reason to comment in a negative way.
  • you respond to the comment
  • more people jump on the negative train and bury your honestly helpful idea in sad faces and unrelated negativity
  • the pull dies because you just don’t give a fuck any more

If this is what your volunteer community – be it software-driven, an anti-poverty group, a religious org, or whatever – is like, you will bleed volunteers.

This is why I keep saying things like code review dashboards matter. I bet if this user could at least see what the average wait time is for code review he’d have been much, much happier. Even if that wait time were a month… at least he’d have known what to expect. Of course, improving the experience and community culture are harder problems to solve… but those improvements clearly would have helped as well.

Most open source projects have the data to set up such a dashboard; it is just a question of whether we will.
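
For what it’s worth, the data needed for such a dashboard is usually sitting right there. Below is a minimal sketch – mine, not any project’s actual tooling – that uses the GitHub API to estimate the average wait between when a pull request is opened and when it is merged. The repository name is a placeholder, and measuring the wait to merge (rather than to first review) is a simplifying assumption.

```python
# Minimal sketch: estimate average code-review wait time for a repository.
# Assumptions: the project is hosted on GitHub, "wait time" is measured from
# PR creation to merge, and "owner/repo" is a hypothetical placeholder.
from datetime import datetime

import requests

REPO = "owner/repo"  # placeholder repository
URL = f"https://api.github.com/repos/{REPO}/pulls"


def parse(ts):
    """Parse GitHub's ISO 8601 timestamps, e.g. 2012-08-24T17:32:01Z."""
    return datetime.strptime(ts, "%Y-%m-%dT%H:%M:%SZ")


def average_wait_days(pulls):
    """Average days between opening and merging, counting merged PRs only."""
    waits = [
        (parse(p["merged_at"]) - parse(p["created_at"])).days
        for p in pulls
        if p.get("merged_at")
    ]
    return sum(waits) / len(waits) if waits else None


if __name__ == "__main__":
    resp = requests.get(URL, params={"state": "closed", "per_page": 100})
    resp.raise_for_status()
    print("Average review wait (days):", average_wait_days(resp.json()))
```

Even a crude number like this, published somewhere contributors can see it, would go a long way toward setting expectations.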

Okay, I’m late for an appointment, but really wanted to share that link and write something about it.

NB: Apologies if you’ve already seen this. I accidentally published this as a page, not a post, on August 24th, so it escaped most people’s view.

Transparency Case Study: There are Good and Bad Ways Your Organization can be made "Open"

If you have not had the chance, I strongly encourage you to check out a fantastic piece of journalism in this week’s Economist on the state of the Catholic Church in America. It’s a wonderful example of investigative and data-driven journalism made possible (sadly) by the recent spate of sexual-abuse and bankruptcy cases. As a result, some of the normally secret financial records of the Church have been made public, enabling the Economist to reconstruct the otherwise opaque general finances of the Catholic Church in America. It is a fascinating, and at times disturbing, read.

The article also suggests – I believe – broader lessons for non-profits, governments and companies. Disclosure and transparency are essential to the effective running of an organization. As you read the piece it is clear that more disclosure would probably have compelled the Church to manage its finances in a more fiscally sound manner. It probably would have also made acts that are, at best, negligent and, at worst, corrupt, difficult or impossible. This is, indeed, why many charities, organizations and public companies must conduct audits and publish the results.

But conforming to legal requirements will not shield you from an angry public. My sense is that many organizations built on member contributions – from public companies to clubs to governments – are going to feel enormous pressure from their “contributors” to disclose more about how funds are collected, managed, disbursed and used. In a post-financial-collapse, post-Enron era it’s unclear to me that people trust auditors the way they once did. In addition, as technology makes it easier to track money in real time, contributors are going to want more than just an annual audit. Even if they look at it rarely, they are going to want to know there is a dashboard or system they can look at and understand that shows them where the money goes.

I’m open to being wrong about this – and I’m not suggesting transparency is a panacea that solves all problems – but I nonetheless suspect that many organizations are going to feel pressure to become more transparent. There will be good ways in which that takes place… and bad ways. The Catholic Church story in the Economist is probably an example of the worst possible way: transparency forced upon an organization through the release of documents in a court case.

For anyone running a non-profit, community group, public agency or government department, this makes the article doubly worth reading. It is a case study in the worst possible scenario for your organization. The kind of disaster you never want to have to deal with.

The problem – and I’m going to go out on a limb here – is that, at some point in the next 10-20 years, there is a non-trivial risk that any organization (including yours, reader) will face a publicity or legitimacy crisis because of a real or imagined problem. Trust me when I tell you: that moment will not be the easiest or most desirable one, from a cost, political and cultural perspective, in which to make your organization more transparent. So better to think about how you’d like to shift policies, culture and norms to make your organization more transparent and accountable today, when things are okay, than in the middle of a crisis.

Consider again the Catholic Church. There are some fascinating and disturbing facts shared in the story that provide some interesting context. On the fascinating side, I had no idea of the scope and size of the Catholic Church. Consider that, according to the article:

“Almost 100m Americans, a third of the nation, have been baptised into the faith and 74m identify themselves as Catholic.”

and

“there are now over 6,800 Catholic schools (5% of the national total); 630 hospitals (11%) plus a similar number of smaller health facilities; and 244 colleges and universities.”

We are talking about a major non-profit that is providing significant services to numerous communities. It also means that the Catholic Church does a lot of things that many other non-profits do. Whatever you are doing, they are probably doing it too.

Now consider some of the terrible financial practices the Church tried/tries to get away with because it thinks no one will be able to see them:

Lying about assets: “In a particularly striking example, the diocese of San Diego listed the value of a whole city block in downtown San Diego at $40,000, the price at which it had been acquired in the 1940s, rather than trying to estimate the current market value, as required. Worse, it altered the forms in which assets had to be listed. The judge in the case, Louise Adler, was so vexed by this and other shenanigans on the part of the diocese that she ordered a special investigation into church finances which was led by Todd Neilson, a former FBI agent and renowned forensic accountant. The diocese ended up settling its sexual-abuse cases for almost $200m.”

Playing fast and loose with finances: “Some dioceses have, in effect, raided priests’ pension funds to cover settlements and other losses. The church regularly collects money in the name of priests’ retirement. But in the dioceses that have gone bust lawyers and judges confirm that those funds are commingled with other investments, which makes them easily diverted to other uses.”

Misleading contributors about the destination of funds: “Under Cardinal Bernard Law, the archdiocese of Boston contributed nothing to its clergy retirement fund between 1986 and 2002, despite receiving an estimated $70m-90m in Easter and Christmas offerings that many parishioners believed would benefit retired priests.”

Using Public Subsidies to Indirectly Fund Unpopular Activities: “Muni bonds are generally tax-free for investors, so the cost of borrowing is lower than it would be for a taxable investment. In other words, the church enjoys a subsidy more commonly associated with local governments and public-sector projects. If the church has issued more debt in part to meet the financial strains caused by the scandals, then the American taxpayer has indirectly helped mitigate the church’s losses from its settlements. Taxpayers may end up on the hook for other costs, too. For example, settlement of the hundreds of possible abuse cases in New York might cause the closure of Catholic schools across the city.”

Of course all of this pales in comparison to the most disturbing part of the article: in several jurisdictions, the church is spending money to lobby governments not to extend the statute of limitations around sexual-abuse cases. This is so that it, and its priests, cannot be charged by authorities, diminishing the likelihood that they get sued. The prospect that an organization that is supposed to both model the highest ideals of behaviour and protect the most marginalized is trying to limit the statutes on such a heinous crime is so profoundly disgusting it is hard to put words to it. The Economist gives it a shot:

Various sources say that Cardinal Dolan and other New York bishops are spending a substantial amount—estimates range from $100,000 a year to well over $1m—on lobbying the state assembly to keep the current statute of limitations in place. His office will not comment on these estimates. This is in addition to the soft lobbying of lawmakers by those with pulpits at their disposal. The USCCB, the highest Catholic body in America, also lobbies the federal government on the issue. In April the California Catholic Conference, an organisation that brings the state’s bishops together, sent a letter to California’s Assembly opposing a bill that would extend the statute and require more rigorous background checks on church workers.

This disgusting discovery aside, most organizations are probably not going to have the same profound problems found in the Catholic Church. But in almost every organization, no matter the controls, some form of mismanagement is probably taking place. The question is, will you already have in place policies and a culture that support transparency and disclosure before the problem is discovered – or will the crisis become the moment where you have to try to implement them, probably under less than ideal circumstances?

As they said after Watergate, “it’s not the crime that kills you, but the cover-up.” Good transparency and disclosure can’t stop a crime, but they might help prevent one. Moreover, they can also make a cover-up harder and, potentially, make it easier to ease the concerns of contributors and rebuild trust. One could imagine that if the Church had been more transparent about its finances it might have better protected itself against bankruptcy from some of these cases. More importantly, its transparency might have made it easier to rebuild trust, whereas any change now will just seem like a reaction to the crisis, not a genuine desire to operate differently.

Again, I think the pressure on many orgs to be more transparent is going to grow. And managers should recognize there are good and bad conditions under which such transparency can take place. Read this Economist story. In addition to being fascinating, it is a great case study in the worst-case scenario for opaque institutions.

Using Metrics to Measure Interest in an Open Source Project

David Boswell has a couple of interesting posts (here and here) about how he is using metrics to measure how effective Mozilla is at attracting and engaging people who express an interest in helping contribute to the Mozilla mission.

Some of the metrics being used can be seen at Mozilla’s Are We Growing Yet website. What I think is important is how the team is trying to look at metrics from the site to see if tweaks have an impact on attracting more people. Obviously, this is something commercial websites have been doing for a long time, and it is about time these lessons were applied to open source community management.

What I’m particularly excited about (and have helped with) is the creation of contribution paths for different parts of Mozilla. This way we can begin to measure how many people who express an interest in contributing to Mozilla start to contribute code (or engage in another activity, like marketing) and how far along the code contribution path they travel. Obviously we’d love for contributors to travel as far along that path as possible and so, with these metrics, we can ideally begin to see how changes in policy, tools or websites might encourage or discourage people’s decision to contribute.
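
To make the idea of a contribution path concrete, here is a small sketch of the kind of funnel calculation these metrics enable. The stage names and counts are invented purely for illustration; Mozilla’s actual paths and tooling will differ.

```python
# Sketch of a contribution-path funnel: how many people reach each stage,
# and what fraction of the previous stage they represent.
# Stage names and counts are hypothetical, purely for illustration.

stages = [
    ("expressed interest", 1000),
    ("set up build environment", 420),
    ("submitted first patch", 180),
    ("patch accepted", 95),
    ("made a second contribution", 40),
]


def funnel(stage_counts):
    """Yield (stage, count, conversion-from-previous-stage) tuples."""
    previous = None
    for name, count in stage_counts:
        conversion = count / previous if previous else 1.0
        yield name, count, conversion
        previous = count


for name, count, conversion in funnel(stages):
    print(f"{name:28s} {count:5d}  ({conversion:.0%} of previous stage)")
```

Watching how these conversion rates move after a change to a policy, tool or website is exactly the kind of measurement commercial sites have been doing for years.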

Interesting stuff.

Is Civic Hacking Becoming 'Our Pieces, Loosely Joined?'

I’ve got a piece up over on the WeGov blog at TechPresident – Is Civic Hacking Becoming ‘Our Pieces, Loosely Joined’?

Juicy bit:

There is however, a larger issue that this press release raises. So far, it appears that the spirit of re-use among the big players, like MySociety and the Sunlight Foundation*, only goes so deep. Indeed often it seems they are limited to believing others should re-use their code. There are few examples where the bigger players dedicate resources to support other people’s components. Again, it is fine if this is all about creating competing platforms and competing to get players in smaller jurisdictions who cannot finance creating whole websites on their own to adopt it. But if this is about reducing duplication then I’ll expect to see some of the big players throw resources behind components they see built elsewhere. So far it isn’t clear to me that we are truly moving to a world of “small pieces loosely joined” instead of a world of “our pieces, loosely joined.”

You can read the rest over there.

OSCON Community Management Keynote Video, Slides and some Bonus Material

I want to thank everyone who came to my session and who sent me wonderful feedback on both the keynote and the session. I was thrilled to see ZDNet wrote a piece about the keynote, as well as to have practitioners, such as Sonya Barry, the Community Manager for Java, write things like this about the longer session:

Wednesday at OSCON we kicked off the morning with the opening plenaries. David Eaves’ talk inspired me to attend his longer session later in the day – Open Source 2.0 – The Science of Community Management. It was packed – in fact the most crowded session I’ve ever seen here. People sharing chairs, sitting on every available spot on the floor, leaning up against the back wall and the doors. Tori did a great writeup of the session, so I won’t rehash, but if you haven’t, you should read it - What does this have to do with the Java Community? Everything. Java’s strength is the community just as much as the technology, and individual project communities are so important to making a project successful and robust.

That post pretty much made my day. It’s why we come to OSCON, to hopefully pass on something helpful, so this conference really felt meaningful to me.

So, to be helpful, I wanted to lay out a bunch of the content in a single place for those who were and were not there, plus a fun photo of my little guy – Alec – hanging out at #OSCON.

A Youtube video of the keynote is now up – and I’ve posted my slides here.

In addition, I did an interview in the O’Reilly booth – if it goes up on YouTube, I’ll post it.

There is no video of my longer session, formally titled Open Source 2.0 – The Science of Community Management, but informally titled Three Myths of Open Source Communities. However, Jeff Longland helpfully took these notes, and I’ll try to rewrite them as a series of blog posts in the near future.

Finally, I have previously linked to some blog posts I’ve written about open source communities and about open source community management, as these are a deeper dive into some of the ideas I shared.

Some other notes about OSCON…

If you didn’t catch Robert “r0ml” Lefkowitz’s talk – How The App Store Killed Free Software, And Why We’re OK With That – do try to see if an audio copy can be tracked down. Contrary to some predictions it was neither trolling nor link bait, but a very thoughtful talk that I did not entirely agree with and that has left me with many, many things to think about (a sign of a great talk).

Jono Bacon, Brian Fitzpatrick and Ben Collins-Sussman are all mensches of the finest type – I’m grateful for their engagement and support given I’m late arriving at a party they all started. While you are at it, consider buying Brian and Ben’s new book – Team Geek: A Software Developer’s Guide to Working Well with Others.

Also, if you haven’t watched Tim O’Reilly’s opening keynote, The Clothesline Paradox and the Sharing Economy, take a look. My favourite part is him discussing how we break down the energy sector and claim “solar” only provides us with a tiny fraction of our energy mix (around the 9 minute mark). Of course, pretty much all energy is solar, from the stuff we count (oil, hydroelectric, etc. – it’s all made possible by solar) to the stuff we don’t count, like growing our food. Loved that.

Oh, and this Ignite talk on Cryptic Crosswords by Dan Bentley from OSCON last year remains one of my favourites. I didn’t get to catch his talk this year on why the metric system sucks – but I am looking forward to seeing it once it is up on YouTube.

Finally, because I’m a sucker dad, here’s an early attempt to teach my 7-month-old chess as he hits the OSCON booth hall. As his tweet says, “Today I may be a mere pawn, but tomorrow I will be the grandmaster.”

Alec-Chess

Containers, Facebook, Baseball & the Dark Matter around Open Data (#IOGDC keynote)

Below is an extended blog post that summarizes the keynote address I gave at the World Bank/Data.gov International Open Government Data Conference in Washington, DC, on Wednesday, July 11th. This piece is cross-posted over at the WeGov blog on TechPresident, where I also write on transparency, technology and politics.

Yesterday, after spending the day at the International Open Government Data Conference at the World Bank (co-hosted by data.gov), I left both upbeat and concerned. Upbeat because of the breadth of countries participating and the progress being made.

I was worried, however, because of the type of conversation we are having and how it might limit both the growth of our community and the impact open data could have. Indeed, as we talk about technology and how to do open data, we risk missing the real point of the whole exercise – which is use and impact.

To drive this point home I want to share three stories that highlight the challenges I believe we should be talking about.

Challenge 1: Scale Open Data

In 1956 Ideal-X, the ship pictured to the left, sailed from Newark to Houston and changed the world.

Confused? Let me explain.

As Marc Levinson chronicles in his excellent book The Box, the world in 1956 was very different from our world today. Global trade was relatively low. China was a long way off from becoming the world’s factory floor. And it was relatively unusual for people to buy goods made elsewhere. Indeed, as Levinson puts it, the cost of shipping goods was “so expensive that it did not pay to ship many things halfway across the country, much less halfway around the world.” I’m a child of the second era of globalization. I grew up in a world of global transport and shipping. The world before all of that, the one Levinson is describing, is actually foreign to me. What is amazing is how much of it has just become a basic assumption of life.

And this is why Ideal-X, the aforementioned ship, is so important. It was the first cargo container ship (in the sense we understand containers today). Its trip from Newark to Houston marked the beginning of a revolution because containers slashed the cost of shipping goods. Before Ideal-X the cost of loading cargo onto a medium-sized cargo ship was $5.83 per ton; with containers, the cost dropped to 15.8 cents. Yes, the word you are looking for is: “wow.”

You have to understand that before containers, loading a ship was a lot more like packing a mini-van for a family vacation to the beach than the orderly process of stacking very large Lego blocks on a boat. Before containers, literally everything had to be hand packed, stored and tied down in the hull (see picture to the right).

This is a little bit like what our open data world looks like right now. The people who are consuming open data are like digital longshoremen. They have to look at each open data set differently, unpack it accordingly and figure out where to put it, how to treat it and what to do with it. Worse, when looking at data from across multiple jurisdictions it is often much like cargo going around the world before 1956: a very slow and painful process (see man on the right).

Of course, the real revolution in container shipping happened in 1966, when the size of containers was standardized. Within a few years containers could move pretty much anywhere in the world, from truck to train to boat and back again. In the following decades global shipping trade grew at 2.5 times the rate of economic output. In other words… it exploded.

Geek sidebar: For techies, think of shipping containers as the TCP/IP packet of globalization. TCP/IP standardized the packet of information that flowed over the network so that data could move from anywhere to anywhere. Interestingly, like containers, what was in the packet was actually not relevant and didn’t need to be known by the person transporting it. But the fact that it could move anywhere created scale and allowed for exponential growth.

What I’m trying to drive at is that, when it comes to open data, the number of open data sets that get published is no longer the critical metric. Nor is the number of open data portals. We’ve won. There are more and more. The marginal political and/or persuasive benefit of adding another open data portal or data set won’t change the context anymore. I want to be clear – this is not to say that more open data sets and more open data portals are not important or valuable – from a policy and programmatic perspective more is much, much better. What I am saying is that having more isn’t going to shift the conversation about open data any more. This is especially true if data continues to require large amounts of work and time for people to unpack and understand, over and over again, across every portal.

In other words, what IS going to count is how many standardized open data sets get created. This is what we SHOULD be measuring. The General Transit Feed Specification revolutionized how people engaged with public transit because the standard made it so easy to build applications and do analysis around it. What we need to do is create similar standards for dozens, hundreds, thousands of other data sets so that we can drive new forms of use and engagement. More importantly, we need to figure out how to do this without relying on a standards process that takes 8 to 15 to infinite years to decide on said standard. That model is too slow to serve us, and so re-imagining/reinventing that process is where the innovation is going to shift next.
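
To make the GTFS point concrete, here is a small sketch of why standards pay off: because every agency’s feed includes a stops.txt file with the same core columns, the same few lines of code work on any feed from any city. The local file path is a placeholder; this is an illustration, not part of any official GTFS tooling.

```python
# Sketch: the payoff of a standardized format. Every GTFS feed's stops.txt
# shares the same core columns (stop_id, stop_name, stop_lat, stop_lon), so
# this one function works unchanged for any transit agency's data.
# Assumes a feed has already been downloaded and unzipped locally.
import csv


def load_stops(path):
    """Return a list of (stop_name, lat, lon) tuples from a GTFS stops.txt."""
    with open(path, newline="", encoding="utf-8-sig") as f:
        return [
            (row["stop_name"], float(row["stop_lat"]), float(row["stop_lon"]))
            for row in csv.DictReader(f)
        ]


# The same call works on feeds from Vancouver, Portland or anywhere else
# that publishes GTFS; no per-city "unpacking" required.
stops = load_stops("gtfs_feed/stops.txt")  # hypothetical local path
print(f"{len(stops)} stops loaded")
```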

So let’s stop counting the number of open data portals and data sets, and start counting the number of common standards – because that number is really low. More critically, if we want to experience the kind of explosive growth in use that global trade and shipping experienced after the rise of the standardized container, then our biggest challenge is clear: we need to containerize open data.

Challenge 2: Learn from Facebook

One of the things I find most interesting about Facebook is that everyone I’ve talked to about it notes how the core technology that made it possible was not particularly new. It wasn’t that Zuckerberg leveraged some new code or invented a new, better coding language. Rather, it was that he accomplished a brilliant social hack.

Part of this was luck: the public had come a long way and was much more willing to do social things online in 2004 than it had been even two years earlier, with sites like Friendster. Or, more specifically, young people who’d grown up with internet access were willing to do things and imagine using online tools in ways those who had not grown up with those tools wouldn’t or couldn’t. Zuckerberg, and his users, had grown up digital and so could take the same tools everyone else had and do something others hadn’t imagined, because their assumptions were just totally different.

My point here is that, while it is still early, I’m hoping we’ll soon have the beginnings of a cohort of public servants who’ve “grown up data.” Public servants who, despite their short careers in the public service, have matured in a period where open data has been an assumption, not a novelty. My hope and suspicion is that this generation of public servants is going to think about open data very differently than many of us do. Most importantly, I’m hoping they’ll spur a discussion about how to use open data – not just to share information with the public – but to drive policy objectives. The canonical opportunity for me around this remains restaurant inspection data, but I know there are many, many more.

What I’m trying to say is that the conferences we organize have to talk less and less about how to get data open and start talking more about how we use data to drive public policy objectives. I’m hoping the next International Open Government Data Conference will have an increasing number of presentations about how citizens, non-profits and other outsiders are using open data to drive their agendas, and how public servants are using open data strategically to drive outcomes.

I think we have to start fostering that conversation by next year at the latest, and this conversation about use has to become core to everything we talk about within two years, or we risk losing steam. This is why I think the containerization of open data is so important, and why I think the White House’s digital government strategy is so important, since it makes internal use core to the government’s open data strategy.

Challenge 3: The Culture and Innovation Challenge.

In May 2010 I gave this talk on Open Data, Baseball and Government at the Gov 2.0 Summit in Washington DC. It centered around the story outlined in Michael Lewis’s fantastic book Moneyball, which traces how a baseball team – the Oakland A’s – used a new analysis of player stats to ferret out undervalued players. This enabled them to win a large number of games on a relatively small payroll. Consider the numbers to the right.

I mean, if you are the owner of the Texas Rangers, you should be pissed! You are paying 250% of Oakland’s payroll for 25% fewer wins. If this were a government chart, where “wins” were potholes found and repaired, and “payroll” was costs… everyone in the World Bank would be freaking out right now.

For those curious, the analytical “hack” was recognizing that the most valuable thing a player can do on offense is get on base. This is because it gives them an opportunity to score, but it also means you don’t burn one of your three “outs” that would end the inning and the chance for other players to score. The problem was that, to measure the offensive power of a player, most teams were looking at batting averages (along with a lot of other weird, totally non-quantitative stuff), which ignore the possibility of getting walked – a way of getting on base without hitting the ball!
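
To make the difference between the metrics concrete, here is a toy comparison. The numbers are invented for illustration and the on-base formula is simplified (it ignores hit-by-pitch and sacrifice flies): batting average ignores walks entirely, while on-base percentage counts them.

```python
# Toy illustration of the Moneyball point: batting average ignores walks,
# while on-base percentage does not. The two hypothetical players below post
# the same batting average but offer very different value on offense.

def batting_average(hits, at_bats):
    return hits / at_bats


def on_base_pct(hits, walks, at_bats):
    # Simplified OBP: ignores hit-by-pitch and sacrifice flies.
    return (hits + walks) / (at_bats + walks)


players = {
    "Player A (rarely walks)": dict(hits=140, walks=10, at_bats=500),
    "Player B (walks a lot)": dict(hits=140, walks=90, at_bats=500),
}

for name, s in players.items():
    avg = batting_average(s["hits"], s["at_bats"])
    obp = on_base_pct(s["hits"], s["walks"], s["at_bats"])
    print(f"{name}: AVG={avg:.3f}  OBP={obp:.3f}")
```

Player A and Player B have identical batting averages, but Player B gets on base far more often – exactly the kind of undervalued contribution the A’s were buying cheaply.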

What’s interesting, however, is that the original insight – that people were using the wrong metrics to assess baseball players – came decades before the Oakland A’s started to use it. Indeed, it was a nighttime security guard with a strong mathematics background and an obsession with baseball who first began pointing this stuff out.

The point I’m making is that it took 20 years for a manager in baseball to recognize that there was better evidence and data they could be using to make decisions. TWENTY YEARS. And that manager was hated by all the other managers, who believed he was ruining the game. Today this approach to assessing baseball is commonplace – everyone is doing it – but notice how the problem of using baseball’s “open data” to create better outcomes was never just an accessibility issue. Once that was resolved, the bigger challenge centered around culture and power. Those with the power had created a culture in which new ideas – ideas grounded in evidence but that were disruptive – couldn’t find an audience. Of course, there were structural issues as well: many people had jobs that depended on not using the data and on relying instead on their “instincts”. But I think the cultural issue is a significant one.

So we can’t expect that we are going to go from open portal today to better decisions tomorrow. There is a good chance that some of the ideas the data causes us to think will be so radical and challenging that either the ideas, the people who champion them, or both, could get marginalized. On the upside, I feel like I’ve seen some evidence to the contrary in cities like New York and Chicago, but the risk is still there.

So what are we going to do to ensure that the culture of government is one that embraces challenges to our thinking and assumptions, so that it doesn’t take 20 years for us to make progress? This is a critical challenge for us – and it is much, much bigger than open data.

Conclusion: Focus on the Dark Matter

I’m deeply indebted to my friend – the brilliant Gordon Ross – who put me on to this idea the other day over tea.

Macguffin

Do you remember the briefcase in Pulp Fiction? The one that glowed when opened? That the characters were all excited about, but whose contents you never learned? It’s called a MacGuffin. I’m not talking about the briefcase per se. Rather, I mean the object in a story that all the characters are obsessed with, but that you – the audience – never find out much about and that, frankly, really isn’t that important to you. In Pulp Fiction I remember reading that the briefcase allegedly holds Marsellus Wallace’s soul. But ultimately, it doesn’t matter. What matters is that Vincent Vega, Jules Winnfield and a ton of other characters think it is important, and that drives the action and the plot forward.

Again – let me be clear – open data portals are our MacGuffin device. We seem to care A LOT about them. But trust me, what really matters is everything that happens around them. What makes open data important is not a data portal. It is a necessary prerequisite, but it’s not the end, it is just the means. We’re here because we believe that the things open data can let us and others do matter. The open data portal was only ever a MacGuffin device – something that focused our attention and helped drive action so that we could do the other things – the dark matter that lies all around the MacGuffin device.

And that is what brings me back to our three challenges. Right now, the debate around open data risks becoming too much like a Pulp Fiction conference in which all the panels talk about the briefcase. Instead we should be talking more and more about all the action – the dark matter – taking place around the briefcase. Because that is what really matters. For me, the three things that matter most are the ones I’ve mentioned in this talk:

  • standards – which will let us scale; I believe strongly that the conversation is going to shift from portals to standards;
  • strategic use – starting us down the path of learning how open data can drive policy outcomes; and
  • culture and power – recognizing that open data is going to surface a lot of reasons why governments don’t want to engage in data-driven decision making.

In other words, I want to be talking about how open data can make the world a better place, not about how we do open data. That conversation still matters, open data portals still matter, but the path forward around them feels straightforward, and if they remain the focus we’ll be obsessing about the wrong thing.

So here’s what I’d like to see in the future from our open data conferences: we have got to stop talking about how to do open data. This is because all of our efforts here, everything we are trying to accomplish… it has nothing to do with the data. What I think we want to be talking about is how open data can be a tool to make the world a better place. So let’s make sure that is the conversation we are having.