
Government, Digital Services & IT Procurement Reform

Next semester I’ll be teaching a course on why healthcare.gov initially turned into a disaster. Why? Because sadly, the failure of healthcare.gov was not special. Conservative estimates suggest over half of all IT projects are not completed on time or on budget. Others suggest the numbers are higher still. What did make healthcare.gov special was that its failure was so spectacularly public. As a result, unlike so many IT failures that stay hidden from public view, healthcare.gov is, somewhat paradoxically, safe to talk about. This makes it a wonderful case study to explore why so many large IT projects end in disaster.

Thinking about the course has me diving deeper into government IT procurement. Many of my colleagues in the civic tech space are convinced that procurement reform is critical to improving government IT services and fostering a better ecosystem of IT vendors. I agree there is a great deal of good work that could be done to make procurement simpler and more effective. However, procurement is not a silver bullet. Indeed, I suspect that procurement reform, on its own, will have a limited impact, because it is not the central problem.

What procurement thinks it is trying to solve

There are two broad goals of government IT procurement policies that I think often get conflated.

One set of rules tries to constrain and/or force people to make good technical decisions. Is the solution accessible? Is it secure? Does it protect privacy? etc…

The second goal is to constrain and/or force people to make good process decisions. This is about ensuring fairness and accountability and, more broadly, preventing corruption. Did you get three bids? Have you laid out the specifications? Are you related to any of the vendors? Have the vendors donated money to any politicians? etc…

Both sets of rules have unintended consequences that make procurement slower and more difficult (although for many governments this can be a feature, not a bug, and making spending more difficult can be a net positive at a system level even if frustrating at the unit level).

The underlying assumption

Unpack these two goals — and particularly the first one — and you discover two underlying assumptions:

  1. IT implementations are treated like technical, not adaptive, challenges

The first assumption is that IT should be commoditized. While some IT purchases may be similar to buying a pencil, most are not. And if you are talking about an IT purchase that is custom built or costs north of $1 million, this is almost definitely not the case. Large IT implementations are complex, with significant unknowns. It is impossible (despite organizations’ efforts) to spec out how all the problems will be solved in advance. Moreover, this work takes place in dynamic environments where assumptions about the technology it will operate on, how users will interface with it and even what the product will need to do keep shifting. We treat IT projects like they are technical challenges — as if we know exactly what must be done and it can all be mapped out in advance — but they are almost always adaptive problems, where we don’t even know what all the problems are at the beginning of the process.

  2. Effective process can force good decision-making (or at least constrain bad decisions)

Think of all the questions an enterprise — government or for-profit — needs to assess: Will a solution work? Does it have a good user experience for our users? Can that UX evolve? Can the vendor adapt to unknown problems? Can they integrate with our current environment? With our future environment? Plus hundreds of other issues, all of which require nuance and knowledge.

But any set of procurement rules is about standardizing process — so that the process can be evaluated, not the outcome. And this makes it harder to bring that nuanced decision-making and knowledge to bear, because nuance, by definition, is hard to standardize. I fear that implicit in procurement reform is the belief that a good set of policies can design a process that, regardless of who runs it, will prevent disaster and possibly even ensure an optimal outcome. If we assume procurement problems are technical problems for which the appropriate solution must merely be identified, then with the right “code” the machinery of procurement — regardless of who is manning it — should be able to select the optimal solution.

Both assumptions are, of course, patently ridiculous.

This is why I’m not confident that tinkering with the rules of procurement in the IT sector will generate much change. I certainly don’t think it will foster a new ecosystem of service providers whose solutions don’t fail 50% of the time. All the tinkering in the world won’t change the underlying issue — which, more than rules, oversight or the size of the ecosystem, is the capacity of the people running the procurement to evaluate technology and vendors.

I’ve seen terrible, terrible IT choices made by organizations, and inspired decisions (like California’s Child Welfare Services RFP, on which I’m writing a case study) produced under virtually identical procurement rules and policies. The current rule set can allow a determined actor in government to make good choices. Could we do a lot better? Absolutely. But it is not the defining barrier.

This is, again, why I think USDS and similar digital service groups that try to attract talent that has worked on mass-market consumer technology matter. Recruiting top technology talent into government is the single best strategy for ensuring better procurement. Deeply experienced people stuck with an okay rule set will do better than less experienced people stuck with the same okay rule set.

And vendors generally agree. As a vendor, what you want more than anything is a capable, intelligent and knowledgeable buyer, not a set of rules that must be adhered to no matter the context.

And, to be clear, this is how the private sector does it. Ask Peter Thiel, Mark Zuckerberg, Elon Musk or Paul Graham (especially Paul Graham). None of them would outsource the technology choices for their start-up or $1B unicorn to a group of procurement officers equipped with even the most perfect set of procurement rules. Quite the contrary. They spend millions in stock options and cash hiring amazing talent to make highly informed decisions that meet needs in a fluid and evolving environment.

So we should do Procurement Reform?

We should. We definitely should. But let’s recognize the limits. We should have rules that prevent corruption, nepotism and anti-competitive practices. But unshackling people from rules may equip them to make as many bad decisions as good ones. Let’s make rules that help people buy software and don’t make the process a burden, but our defense against poor outcomes shouldn’t be more rules; it should be an investment in pulling the brightest and best minds into government.

This is a problem I think digital service groups are trying to solve — creating a pipeline of talent that has worked with the latest tech and delivered solutions that have served hundreds of millions of users, flowing through government. So let’s just ensure that we invest in both. Procurement reform without the right people won’t get us there.

The Empire Strikes Back: How the death of GDS puts all government innovators at risk

The UK Government Digital Service (GDS) is dead. I’m sure it will continue to exist in some form, but from what I’ve read it appears to have been gutted of its culture, power and mandate. As an innovator and force for pulling the UK government into the 21st century, it’s over.

The UK government comms people, and the new head of GDS himself, will say otherwise, but read between the lines of the stories, as well as what former GDS people are saying, and it’s clear: GDS is dead.

These narratives are also aligned as to why. GDS was killed by what the Brits call “Whitehall.” This is shorthand for the inner and elite circles of the UK public service somewhat cruelly depicted in “Yes, Minister.” Their motive? GDS represented a centralization of IT control, and the departments — for whom GDS frequently found savings of tens of millions of pounds and provided effective online services — didn’t like seeing their autonomy and budget sucked away by a group of IT geeks.

Losing GDS is a tragedy for the UK. But the downside isn’t limited to just the UK. It represents a blow to efforts to rethink and digitize governments around the world.

Why?

Because the debates over how governments work are not domestic (they haven’t been for a long time). GDS inspired innovators and politicians everywhere who were hungry for models of how governments could function in a digital age. Copycats sprang up around the world, such as the Digital Transformation Office in Australia and the United States Digital Service in the US at the national level, as well as others at the regional and local level. (For example, the Province of Ontario, which has a fantastic digital team, is currently hiring a Chief Digital Officer to lead a similar type of org.)

While working independently, these organizations share a common sense of mission and so trade best practices and lessons. More importantly, a few years ago they benefited from a massive asymmetry in information. They understood just how disruptive the innovation they were seeking to implement could be. And not from a technology perspective, but to the organizational structure and distribution of power within the public service.

As it became clearer to the rest of the public service in the UK just how far-reaching GDS’s vision was, and how much thinking digitally challenges the way senior public servants in places like Whitehall think… that asymmetry in knowledge and capacity didn’t diminish, but the asymmetry in awareness of the implications did. And so the center fought back. And it has far more power than any GDS might.

But it isn’t just in the UK. I suspect the existence of GDS and its global network has activated a global counter-movement among senior mandarins. Public servants are connected across international borders by shared alma maters, common international organizations and professional groups, cross-border projects, etc.… Don’t think for a second that that global network hasn’t been alerted (particularly among Treasury Board officials) to what some see as a new threat.

To be clear, this isn’t a GDS-good, Whitehall-bad piece. For anything to fail there is probably some shared responsibility, and assigning blame doesn’t really matter. What I care about is preserving what I and many others believe are important efforts to modernize governments so that the effectiveness of government, and trust in its institutions, are maintained.

So what does matter is this: if you work for a digital service outside the UK, it is time to double down on political cover and/or reach out to the senior public servants around you and build some alliances. Because your Whitehall equivalent definitely has relationships with the UK Whitehall, and they are asking them what they think and know about GDS, and the story they hear probably isn’t a good one.

The empire of traditional siloed bureaucracy is fighting back. You probably won’t beat it. So how are you going to engage it?

Canada’s Draft Open Government Plan — The Promise and Problems Reviewed

Backdrop

On Friday the Canadian Government released its draft national action plan. Although not mentioned overtly in the document, these plans are mandated by the Open Government Partnership (OGP), which requires member countries to draft National Action Plans every two years laying out tangible goals.

I’ve both written reviews about these plans before and offered suggestions about what should be in them. So this is not my first rodeo, nor is it for the people drafting them.

Purpose of this piece

In the very niche world of open government there are basically two types of people: those who know about the OGP and Canada’s participation in it (hello 0.01%!), and those who don’t (hello, overwhelming majority of people — excited you are reading this).

If you are a journalist, parliamentarian, hill staffer, academic, public servant, consultant, vendor, or everyday interested citizen already following this topic, here are thoughts and ideas to help shape your criticisms and/or focus your support and advice to government. If you are new to this world, this post can provide context about the work the Canadian Government is doing around transparency to help facilitate your entrance into this world. I’ll be succinct — as the plan is long, and your time is finite.

That said, if you want more details about my thoughts, please email me — happy to share more.

The Good

First off, there is lots of good in the plan. The level of ambition is quite high, a number of notable past problems have been engaged, and the goals are tangible.

While there are worries about wording, there are nonetheless a ton of things prioritized in the document that both I and many people in the community have sought to have included in past plans. Please note that “prioritized” is distinct from “this is the right approach/answer.” Among these are:

  • Opening up the Access to Information Act so we can update it for the 21st century. (Commitment 1)
  • Providing stronger guarantees that Government scientists — and the content they produce — are made available to the public, including access for reporters (Commitment 14)
  • Finding ways to bake transparency and openness more firmly into the culture and processes of the public service (Commitment 6 and Commitment 7)
  • Ensuring better access to budget and fiscal data (Commitment 9, Commitment 10 and Commitment 11)
  • Coordinating different levels of government around common data standards to enable Canadians to better compare information across jurisdictions (Commitment 16)
  • In addition, the publishing of Mandate Letters (something that was part of the Ontario Open by Default report I helped co-author) is a great step. If nothing else, it helps public servants understand how to better steer their work. And the establishment of Cabinet Committee on Open Government is worth watching.

Lots of people, including myself, will find things to nitpick about the above. And it is always nice to remember:

a) It is great to have a plan we can hold the government accountable to; it is better than the alternative of no plan

b) I don’t envy the people working on this plan. There is a great deal to do, and not a lot of time. We should find ways to be constructive, even when being critical

Three Big Ideas the Plan Gets Right

Encouragingly, there are three ideas that run across several commitments in the plan that feel thematically right.

Changing Norms and Rules

For many of the commitments, the plan seeks not simply to get tactical wins but to find ways to bake changes into the fabric of how things get done. Unlike previous plans, one reads a real intent to shift culture and make changes that are permanent and sustainable.

Executing on this is exceedingly difficult. Changing culture is hard both to achieve and to measure. And implementing reforms that are difficult or impossible to reverse is no cakewalk either, but the document suggests the intent is there. I hope we can all find ways to support that.

User Centric

While I’m not a fan of all the metrics of success, there is a clear focus on making life easier for citizens and users. Many of the goals have an underlying interest in creating simplicity for users (e.g. a single place to find “x” or “y”). This matters. An effective government is one that meets the needs of its citizens. Figuring out how to make things accessible and desirable to use, particularly with information technology, has not been a strength of governments in the past. This emphasis is encouraging.

There is also intriguing talk of a “Client-First” service strategy… More on that below.

Data Standards

There is lots of focus on data standards. Data standards matter because it is hard to use data — particularly across government, or from different governments and organizations — if everyone uses different standards. Imagine if every airline used a different standard for its tickets: booking a trip involving more than one airline would be impossible, as their computers wouldn’t be able to share information with one another, or with you. That challenging scenario is what government looks like today. So finding common standards can help make government more legible.
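To make this concrete, here is a minimal sketch in Python (with entirely made-up records and field names, not the actual schema of any real standard or government) of what the absence of a shared standard costs: every comparison across jurisdictions needs its own bespoke mapping.

```python
# Illustrative only: two jurisdictions publish the same kind of record with
# different field names and formats (all fields here are made up, not OCDS).
city_record = {"vendor": "Acme Ltd.", "amount_cad": "12,500", "date": "03/04/2016"}
prov_record = {"supplier_name": "Acme Ltd.", "value": 12500.00, "award_date": "2016-04-03"}

def normalize_city(record):
    """Map the city's ad hoc fields onto a shared, hypothetical schema."""
    day, month, year = record["date"].split("/")
    return {"supplier": record["vendor"],
            "amount": float(record["amount_cad"].replace(",", "")),
            "date": f"{year}-{month}-{day}"}

def normalize_prov(record):
    """The province's export is closer, but still needs its own mapping."""
    return {"supplier": record["supplier_name"],
            "amount": record["value"],
            "date": record["award_date"]}

# Only once both map onto one schema can the records be compared across jurisdictions.
for rec in (normalize_city(city_record), normalize_prov(prov_record)):
    print(rec)
```

Multiply that bespoke mapping by hundreds of publishers and thousands of data sets and you get the airline scenario above; a common standard removes the mapping step entirely.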

So seeing efforts like piloting the Open Contracting Data Standard in Commitment 9, getting provincial and local governments to harmonize data with the feds in Commitment 16 and “Support Openness and Transparency Initiatives around the World” in Commitment 18 is nice…

… and it also makes it glaring when it is not there. Commitment 17 — Implement the Extractives Sector Transparency Measures Act — is still silent about implementing the Extractive Industries Transparency Initiative standard and so feels inconsistent with the other commitments. More on this below.

The Big Concerns

While there are some good themes, there are also some concerning ones. Three in particular stand out:

Right Goal, Wrong Strategy

At the risk of alienating some colleagues, I’m worried about the number of goals that are about transparency for transparency’s sake. While I’m broadly in favour of such goals… they often risk leading nowhere.

I’d much rather see specific problems that the government wants to focus its open data and scientific-material-sharing resources on. When no focus is defined and the goal is just to “make things transparent,” what tends to get made transparent is what’s easy, not what’s important.

So several commitments, like numbers 3, 13, 14 and 15, essentially say “we are going to put more data sets or more information on our portals.” I’m not opposed to this… but I’m not sure it will build support and alignment, because the releases won’t be shaped to help people solve today’s problems.

A worse recommendation in this vein is to “establish a national network of open data users within industry to collaborate on the development of standards and practices in support of data commercialization.” There is nothing that will hold these people together. People don’t come together to create open data standards; they come together to solve a problem. So don’t congregate open data users — they have nothing in common — this is like congregating “readers.” They may all read, but their expertise will span a wide variety of interests. Bring people together around a problem and then get focused on the data that will help them solve it.

To get really tangible, on Friday the Prime Minister had a round table with local housing experts in Vancouver. One outcome of that meeting might have been the Prime Minister stating, “this is a big priority, I’m going to task someone with finding all the data the federal government has that is relevant to this area so that all involved have the most up to date and relevant information we can provide to improve all our analyses.”

Now maybe the feds have no interesting data on this topic. Maybe they do. Maybe this is one of the PMO’s top six priorities, maybe it isn’t. Regardless, the government should pick 3–8 policy areas it cares deeply about and focus its data and information sharing on those. Not to the exclusion of others, but to provide some focus: both to public servants internally, so they can concentrate their efforts, and to the public. That way experts, and any member of the public who is able, can grab the data and help contribute to the public discourse on the topic.

There is one place where the plan comes close to taking this approach: Commitment 22, Engage Canadians to Improve Key Canada Revenue Agency Services. It talks a lot about public consultations on charitable giving, tax data, and improving access to benefits. This section identifies some relatively specific problems the government wants to solve. If the government said it was going to disclose all the data it could around these problems and work with stakeholders to help develop new research and analysis… then it would have nailed it.

Approaches like those suggested above might result in new and innovative policy solutions from both traditional and non-traditional sources. But equally important, such an approach to a “transparency” effort would have more weight from Ministers and the PMO behind it, rather than just the OGP plan. It might also show governments how transparency — while always a source of challenges — is also a tool that can help advance their agenda by promoting conversations and providing feedback. Above all it would create some models, initially well supported politically, that could then be scaled to other areas.

Funding

I’m not sure if I read this correctly, but the funding, with $11.5M in additional funds over five years (or is that the total funding?), doesn’t feel like a lot to work with given all the commitments. I suspect many of the commitments have their own funding in various departments… but that makes it hard to assess how much support there is for everything in the plan.

Architecture

This is pretty nerdy… but there are several points in the plan where it talks about “a single online window” or a “single, common search tool.” This is a real grey area. There are times when a single access point is truly transformative… it creates a user experience that is intuitive and easy. And then there are times when a single online window is a Yahoo! portal when what you really want is to just go to Google and do your search.

Underlying this work is the assumption that the main barrier to access is that things can’t be found. So far, however, that is just an assumption, and I’d prefer the government test it before making it a plan. Why? Because a LOT of resources can be expended creating “single online windows.”

I mean, if all these recommendations are just about creating common schemas that allow multiple data sources to be accessed by a single search tool then… good? (And please provide the schema and API as well so others can create their own search engines.) But if this involves merging databases and doing lots of backend work… ugh, we are in for a heap of pain. And if we are in for a heap of pain… it had better be because we are solving a real need, not an imaginary one.
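For what it’s worth, the good version of this can be thin. Here is a hedged sketch (the departments, fields and URLs are all hypothetical) of a common-schema, federated search: each source keeps its own data store, everything is exposed in one agreed-upon shape, and anyone could write their own search tool against the same sources.

```python
# A minimal sketch of the "thin" version of a single window: every source keeps
# its own database but returns results in one agreed-upon schema, and a simple
# search tool fans out across them. Departments, fields and URLs are hypothetical.

def search_department_a(term):
    datasets = [{"title": "Housing starts by city", "publisher": "Dept A",
                 "url": "https://example.gc.ca/a/1"}]
    return [d for d in datasets if term.lower() in d["title"].lower()]

def search_department_b(term):
    datasets = [{"title": "Social housing wait lists", "publisher": "Dept B",
                 "url": "https://example.gc.ca/b/7"}]
    return [d for d in datasets if term.lower() in d["title"].lower()]

def federated_search(term, sources):
    """Query every source and merge the results; no databases get merged."""
    results = []
    for source in sources:
        results.extend(source(term))
    return results

print(federated_search("housing", [search_department_a, search_department_b]))
```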

The Bad

There are a few things that I think have many people nervous.

Access to Information Act Review

While there is lots of good in that section, there is also some troubling language. Such as:

  • A lot of people will be worried about the $5 fee for access to information requests. If you are requesting a number of documents, this could represent a real barrier. I also think it creates the wrong incentives. Governments should be procuring systems and designing processes that make sharing documents frictionless — a fee implies that costs are okay and that there should be friction in performing this task.
  • The fact that government institutions can determine that an access to information request is “frivolous” or “vexatious” and thus deny it. Very worried about the language here.
  • I’m definitely worried about having mandatory legislative review of the Access to Information Act every five years. I’d rather get it right once every 15–30 years and have it locked in stone than give governments a regular opportunity to tinker with and dilute it.

Commitment 12: Improve Public Information on Canadian Corporations

Having a common place for looking up information about Canadian Corporations is good… However, there is nothing in this about making available the trustees or… more importantly… the beneficial owners. The Economist still has the best article about why this matters.

Commitment 14: Increase Openness of Federal Science Activities (Open Science)

Please don’t call it “open” science. Science, by definition, is open. If others can’t see the results or have enough information to replicate the experiment, then it isn’t science. Thus, there is no such thing as “open” vs. “closed” science. There is just science, and something else. Maybe it’s called alchemy or bullshit. I don’t know. But don’t succumb to the open science wording because we lose a much bigger battle when you do.

It’s a small thing. But it matters.

Commitment 15: Stimulate Innovation through Canada’s Open Data Exchange (ODX)

I already talked about how I think bringing together “open data” users is a big mistake. Again, I’d focus on problems, and congregate people around those.

I also suspect that incubating 15 new data-driven companies by June 2018 is not a good idea. I’m not persuaded that there are open data businesses, just businesses that, by chance, use open data.

Commitment 17: Implement the Extractives Sector Transparency Measures Act

If the Extractives Sector Transparency Measures Act is the same as it was before then… this whole section is a gong show. Again, no EITI standard in this. Worse, the act doesn’t require extractive industries to publish payments to foreign governments in a common standard (so it will be a nightmare to do analysis across companies or industry-wide). Nor does it require that companies submit their information to a central repository, so aggregating data about the industry will be nigh on impossible (you’ll have to search across hundreds of websites).

So this recommendation, “Establish processes for reporting entities to publish their reports and create means for the public to access the reports,” is fairly infuriating, as it is a terrible non-solution to a problem in the legislation.

Maybe the legislation has been fixed. But I don’t think so.

The Missing

Not in the plan is any reference to the use of open source software or to sharing that software across jurisdictions. I’ve heard rumours of some very interesting efforts to share software between Ontario and the federal government that could potentially save taxpayers millions of dollars. In addition, by making the software code open, the government could employ security bug bounties to try to make it more secure. Lots of opportunity here.

The Intriguing

The one thing that really caught my eye, however, was this (I mentioned it earlier):

The government is developing a Service Strategy that will transform service design and delivery across the public service, putting clients at the centre.

Now that is SUPER interesting. A “Service Strategy”? Does this mean something like the Government Digital Service in the UK? Because that would require some real resources. Done right it wouldn’t just be about improving how people get services, but a rethink of how government organizes services and service data. Very exciting. Watch that space.

On Journalism, Government and the Cost of Digital Illiteracy

Earlier today the CBC published a piece by Alison Crawford about Canadian public servants editing Wikipedia. It draws from a clever Twitter bot — @gccaedits — that tracks edits to Wikipedia made from government IP addresses. I love the Twitter account — fun!
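I don’t know how @gccaedits is actually built, but the underlying trick is simple: anonymous Wikipedia edits record the editor’s IP address as the username, so a bot can poll the public MediaWiki recent-changes feed and test each IP against known government ranges. A rough sketch (the IP range below is a placeholder, not a real government block):

```python
# A rough sketch of a @gccaedits-style check, not the bot's actual code.
import ipaddress
import requests

GOV_RANGES = [ipaddress.ip_network("192.0.2.0/24")]  # placeholder, not a real government block

resp = requests.get(
    "https://en.wikipedia.org/w/api.php",
    params={
        "action": "query",
        "list": "recentchanges",
        "rcshow": "anon",                # anonymous edits only; these expose the editor's IP
        "rcprop": "user|title|timestamp",
        "rclimit": "50",
        "format": "json",
    },
    timeout=10,
)

for change in resp.json()["query"]["recentchanges"]:
    ip = ipaddress.ip_address(change["user"])
    if any(ip in network for network in GOV_RANGES):
        print(f"{change['timestamp']}  {change['user']}  edited  {change['title']}")
```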

The article, not so much. How sad that the digital literacy of the CBC is such that this is deemed newsworthy. How equally sad that government comms people feel they need to respond to things like this.

The article is pretty formulaic. It pulls out the more sensational topics that got edited on Wikipedia, such as one on hockey (aren’t we so Canadian!) and one on sexual positions (obligatory link bait).

It then has an ominous warning letting you know this is a serious issue:

“It’s yet another edit in a string of embarrassing online encyclopedia changes made by federal employees during the work day.”

It then lists other well-known scandals of public servants editing Wikipedia that you are almost certainly familiar with, such as the “poopitch” and the “Rush” incidents.

See the problem? The waste!

I do. I see the colossal problem of a media institution that does not understand digital or how to deploy its power and privilege. And of a government unable to summon a response appropriate in scale to these types of stories in the past.

Let’s break it down:

1. This is Not a Problem

Look at @gccaedits. There are on average maybe 7 edits per day. There are 257,034 public servants in Canada, not counting the RCMP or the military. Assume each edit comes from a unique individual: that means roughly 0.0027% of public servants are spending, say, 15 minutes editing Wikipedia each day.
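The back-of-envelope math, spelled out:

```python
# The arithmetic from the paragraph above.
edits_per_day = 7
public_servants = 257_034

share = edits_per_day / public_servants
print(f"{share:.4%} of public servants")  # roughly 0.0027%, assuming one edit per person per day
```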

But how do we know employees weren’t doing this during their break? Can any one of us say that we’ve never looked at a sports score, sent a tweet, conducted an elaborate prank, called a friend or read an article unrelated to our work while at the office? How is this any different? In what world is a handful of people making an edit to Wikipedia an indication of a problem?

I’ll bet the percentage of CBC employees making edits to Wikipedia is equal to or greater than 0.0027%. Will they share their IP addresses so we can check?

2. Articles Like This Create Waste

Did you know that because this article was written there are probably 1–10 people in government who spent hours, if not their whole day, on it: investigating what the government’s policies are regarding Wikipedia editing; calling IT and asking them to trace the IP addresses that made the changes; bringing in HR to call in the person responsible or figure out consequences. Maybe the union rep got pulled from their normal job to defend the poor “offender” from this overbearing response. It is possible tens of thousands of dollars in time was spent “managing” this issue. That, right there, is the true waste.

You know what those public servants were not doing? They were NOT finding ways to help with the fire in Fort McMurray or addressing suicides in First Nations communities or the millions of other PRESSING AND URGENT PROBLEMS WE NEED GOVERNMENT FOCUSED ON.

3. Articles Like These Diminish Journalism

It isn’t just about the cost to government of having to deal with this. What other issues could have been written about today? What about something actually important that held the government to account?

Remember the other week when Amanda Pfeffer of the CBC wrote about how IBM won a $32M Ontario government contract to fix software IBM itself had created? More of that please. I don’t mind attacking public servants’ judgement, but let’s do it on shit that matters.

4. Journalists and Managers Cooperate for Terrible Outcomes

This isn’t just the fault of the CBC. Governments need to learn the consequences of reacting to — as opposed to the opportunity of simply ignoring — these types of articles.

Management’s reflexive response to articles like these is to grope for a “solution.” The end game is almost always expensive and Orwellian network monitoring software sold by companies like Blue Coat and a heap of other “security” tools. As a result, public servants are blocked from accessing Wikipedia and a range of other deeply useful tools on the web, all while their computers become slower and more painful to use.

This is not a path to more effective government. Nor a path to valuable journalism. We can do better.

Okay, rant mostly done.

I feel bad for public servants who had a crappy day today because of this.

I feel bad for Alison Crawford — who has lots of important stories to her credit, and wasn’t even the first to write about this, but whose article is the focus of this piece. Please keep reading her stuff — especially on the judicial file.

Ultimately, this CBC piece could have been an amazing article about how the ‘poopitch’ and ‘Rush’ incidents were crazy overreactions and how we need to figure out how to manage the public service in a digital age.

But it wasn’t. It was a cheap thrill. Less of that please CBC. More of the real journalism we desperately need.

Government Procurement Failure: BC Ministry of Education Case Study

Apologies for the lack of posts. I’ve been in business mode – both helping a number of organizations I’m proud of and working on my own business.

For those interested in a frightening tale of inept procurement, poor judgement and downright dirty tactics when it comes to software procurement and government, there is a wonderfully sad and disturbing case study emerging in British Columbia. It shows the lengths a government is willing to go to in order to shut out open source alternatives and ensure that large, expensive suppliers win the day.

The story revolves around a pickle that the province of British Columbia found itself in after a previous procurement disaster. The province had bought a student record management system – software that records elementary and secondary students’ grades and other records. Sadly, the system never worked well. For example, student records generally all get entered at the end of the term, so any system must be prepared to manage significant episodic spikes in usage. The original British Columbia Electronic Student Information System (BCeSIS) was not up to the task and frequently crashed and/or locked out teachers.
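To see why that end-of-term pattern is so punishing, here is a back-of-envelope sketch with purely hypothetical numbers (not BC’s actual figures): even a modest system can face crunch-day load more than an order of magnitude above its average.

```python
# Purely hypothetical numbers to illustrate the shape of the load, not BC's actual figures.
teachers = 30_000
entries_per_teacher_per_term = 150      # grades, comments, attendance summaries
term_weeks = 13
crunch_days = 3                         # most entry happens in the last few days of term

total_entries = teachers * entries_per_teacher_per_term
average_per_day = total_entries / (term_weeks * 5)
crunch_per_day = total_entries * 0.8 / crunch_days   # assume 80% lands in the crunch

print(f"average day: {average_per_day:,.0f} entries; "
      f"crunch day: {crunch_per_day:,.0f} entries "
      f"({crunch_per_day / average_per_day:.0f}x the average)")
```

A system provisioned for the average will fall over in exactly the week everyone needs it.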

To make matters worse, after spending $86M over 6 years it was ultimately determined that BCeSIS was unrecoverably flawed and, as the vendor was ending support, a new system needed to be created.

Interestingly, one of the province’s school districts – the District of Saanich – decided it would self-fund an open source project to create an alternative to BCeSIS. Called OpenStudent, the system would have an open source license, would be created by locally paid open source developers, could be implemented in a decentralized way while still meeting the requirements of the province and… would cost a fraction of what large government vendors proposed. The Times Colonist has a simple article that covers the launch of OpenStudent here.

Rather than engage Saanich, the province decided to take another swing at hiring a multinational for an IT mega-project. An RFP was issued to which only companies with $100M in sales could apply. Fujitsu was awarded a 12-year contract with costs of up to $9.4M a year.

And here are the kickers:

So, in other words, the province sprang some surprise requirements on the District of Saanich that forced it to kill an open source solution that could have saved taxpayers millions and employed British Columbians, all while exempting a multinational from meeting the same requirements. It would appear that the province was essentially engaged in a strategy to kill OpenStudent, likely because any success it enjoyed would have created an ongoing PR challenge for the province and threatened its contract with Fujitsu.

While I don’t believe that any BC government official personally profited from this outcome, it is hard – very hard indeed – not to feel like the procurement system is deeply suspect or, at worst, corrupted. I have no idea if it is possible, but I do hope that these documents can serve as the basis for legal action by the District of Saanich against the Province of British Columbia to recapture some of its lost expenses. The province has clearly used its purchasing power to alter the marketplace and destroy competitors; whether this is in violation of a law, I don’t know. I do know, however, that it is in violation of good governance, effective procurement and general ethics. As a result, all BC taxpayers have suffered.

Addendum: It has been suggested to me that one reason the BC government may be so keen to support Fujitsu and destroy competing suppliers is that it needs to generate a certain amount of business for the company in order for it to maintain headcount in the province. Had OpenStudent proved viable and cheaper (it was estimated to cost $7–10 per student versus $20 for Fujitsu’s service), Fujitsu might have threatened to scale back operations, which might have hurt service levels for other contracts. It is unclear to me if this is true or not. To be clear, I don’t hold Fujitsu responsible for anything here – they are just a company trying to sell their product and offer the best service they can. The disaster described above has nothing to do with them (they may or may not offer amazing products, I don’t know); rather, it has everything to do with the province using its power to eliminate competition and choice.

Open Data Day 2014 – Five Fun Events Around the World

With over 110 events happening worldwide it is impossible to talk about every Open Data Day event. But looking at almost every event on the wiki, I’ve been deeply moved and inspired by the various efforts, goals and aspirations of the people who have organized these events.

In order to help others understand why Open Data Day matters, as well as what can happen on it, here are five Open Data Day events I’ve stumbled across that are doing something particularly fun or interesting.

1. Cape Town & Johannesburg, South Africa

Their Description:

Coders, data wranglers and data investigators will pair up to look at one of three openly available datasets we have on hand, and work out the most interesting questions they can ask of it in less than half an hour. After 30 minutes, it’s all change – another desk, another partnership, another exciting data set to turn into a story.

What we’re hoping is that you’ll learn tips and tricks for getting data, querying it, creating quick visualizations and turning it to stories that people want to know about. You’ll learn from different people with a variety of skills, hopefully that you wouldn’t normally work with. And we’re also hoping it will be four hours of fun.

Why I love it: I love the focus on learning. With the participation of hacks and hackers, the goal is clearly to help journalists and citizens learn new skills, not so they can do something with the data sets available on Open Data Day, but so they can better play with data sets in the future to pursue stories or help a community. The point of speed data dating is thus not to build a product; the product is the skills and networks developed and, with luck, the future stories and analyses that will be told by those who participated.

2. Buenos Aires, Argentina

Their description:

On February 22nd we will go out to the street and play with local data and some street artists to create beautiful visualizations.

Why I love it: Street art and open data? What a great way to try to raise awareness of the importance of data literacy and transparency. In addition, how awesome is it to move outside the digital realm and use data to create artifacts that are not necessarily digital. And if there are artists involved? Jer would be so happy to read about this.

3. Greenfield, MA, United States

Their description:

We’re convening a small group to work with the Franklin Regional Council of Governments on a user-friendly way to map private wells in Western Massachusetts…

…Why is it important to map wells?

  • Only about 5% of private wells in Massachusetts are geolocated.
  • Many towns in Western Mass rely 100% on private wells.

Not knowing where our wells are can (and does) lead to water contaminated by nearby septic systems, dumping, and pollutant storage. Aside from the obvious health concerns, there are also financial implications from remediation costs and lowered property values.

Why I love it: Wow, WOW, WOW!!! This is maybe one of the coolest Open Data Day events I’ve ever seen. Here you have a small community focusing on a problem that is real and tangible to them. Moreover, open data could have a direct and meaningful impact on the issue. I love the focus. I love that rallying point. I love the high impact with low resources (their building has minimal heat – so they are advising people to layer up). I wish this crew all the best success and hope to see an update.

4. Nagoya, Japan

Their Description:

Now, highlight of this year is “data of Nagoya Castle!”

The nearly 300 maps and survey drawings of the Nagoya Castle will be made open prior to Open Data Day. The Nagoya Castle office is cooperating with us and has decided that we can use their data for “International Open Data Day.” The references to the image are here.

Why I love it: Well – full disclosure, my understanding of this event is through the prism of Google Translate. But if I understood correctly… there are a few open data events in Japan that have a strong focus on local history, which I find totally fascinating. At this event in Nagoya they are bringing in a professor who is an expert in open data as well as an expert on Nagoya Castle to talk about the data that is being made open. In addition, they are organizing an actual physical tour of the castle. Open data meets local history buffs!

5. Cairo, Egypt

Their Description:

We will be organizing an online and decentralized event in Cairo, Egypt for the Open Data Day. There are numerous suggested tracks depending on the participants set of expertise:

For translators (المترجمون)

  • Open Data Handbook: The handbook discusses the legal, social and technical aspects of open data. It can be used by anyone but is especially designed for those seeking to open up data. It discusses the why, what and how of open data – why to go open, what open is, and the how to ‘open’ data. Translate it into Arabic here
  • Translate any of the School of Data short tutorials: for example, What is Data?, Telling a Story with Data, Finding Data, or any other course/module

For bloggers (المدونون)

Write blog posts about Open Data related topics and case studies, and don’t forget to use the following hashtag, #ODD2014. Possible ideas for blog posts:

  • Write about the concept of Frictionless Data
  • Case studies how you searched for, extracted and used governmental data
  • Listing of local organizations working or promoting Open Data or advocating for more Transparent and Open Governments

For Developers (مطوري البرامج)

  • Scrape data from capmas and put it into Open Format
  • Scrape data (the Budget or the Financial Monthly Bulletin) from the Ministry of Finance and upload it to OpenSpending.org
  • Create a tool to scrape the traffic data from bey2ollak and put it in an open format.

For Data Wranglers (هواة جمع البيانات)

Why I love it: I love that there are calls to action for a variety of people – including those who have no coding skills at all. How genius is it to organize an event to localize/translate the Open Data handbook? This is something a large number of people could do – and better still can help make open data accessible to a still larger pool of people.

And for the other roles, the suggested projects – particularly the focus on national budget and government operations data (capmas) – suggest there is a strong civil society presence within the open data community. I’ll be super interested to see what progress they make and whether there is broader interest in their work.