Tag Archives: cdnpoli

On Journalism, Government and the cost of Digital Illiteracy

Earlier today the CBC published a piece by Alison Crawford about Canadian public servants editing wikipedia. It draws from a clever twitter bot, @gccaedits, that tracks edits to wikipedia from government IP addresses. I love the twitter account. Fun!

The article, not so much. How sad that the digital literacy of the CBC is such that this is deemed newsworthy. How equally sad that government comms people feel they need to respond to things like this.

The article is pretty formulaic. It pulls out the more sensational topics that got edited on wikipedia, such as one on hockey (aren’t we so Canadian!) and one on sexual positions (obligatory link bait).

It then has an ominous warning letting you know this is a serious issue:

“It’s yet another edit in a string of embarrassing online encyclopedia changes made by federal employees during the work day.”

It then lists other well known scandals of public servants editing wikipedia you are almost certainly familiar with, such as the “poopitch” and the “Rush incident.”

See the problem? The waste!

I do. I see the colossal problem of a media institution that does not understand digital or how to deploy its power and privilege. And of a government that has been unable to summon responses of appropriate scale to these types of stories.

Let’s break it down:

1. This is Not a Problem

Look at @gccaedits. There are, on average, maybe 7 edits per day. There are 257,034 public servants in Canada, not counting the RCMP or the military. Assume each edit comes from a unique individual: that means 0.0027% of public servants are spending, say, 15 minutes editing wikipedia each day.
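For the skeptics, the back-of-envelope math is easy to check. A minimal sketch (the 7-edits-a-day figure is my rough read of the @gccaedits feed, not an official count):

```python
# Back-of-envelope check: what share of the public service edits wikipedia daily?
EDITS_PER_DAY = 7          # rough average from watching @gccaedits
PUBLIC_SERVANTS = 257_034  # federal public service, excluding RCMP and military

share = EDITS_PER_DAY / PUBLIC_SERVANTS
print(f"{share:.4%} of public servants per day")  # -> 0.0027% of public servants per day
```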

But how do we know employees weren’t doing this during their break? Can any one of us say that we’ve never looked at a sports score, sent a tweet, conducted an elaborate prank, called a friend or read an article unrelated to our work while at the office? How is this any different? In what world is a handful of people making an edit to wikipedia an indication of a problem?

I’ll bet the percentage of CBC employees making edits to wikipedia is equal to or greater than 0.0027%. Will they share their IP addresses so we can check?

2. Articles Like This Create Waste

Did you know that because this article was written, there are probably 1–10 people in government who spent hours, if not their whole day: investigating what the government’s policies are regarding wikipedia editing; calling IT and asking them to trace the IP addresses that made the changes; bringing in HR to call in the person responsible or figure out consequences. Maybe the union rep got pulled from their normal job to defend the poor “offender” from this overbearing response. It is possible tens of thousands of dollars in time was spent “managing” this issue. That, right there, is the true waste.

You know what those public servants were not doing? They were NOT finding ways to help with the fire in Fort McMurray or addressing suicides in First Nations communities or the millions of other PRESSING AND URGENT PROBLEMS WE NEED GOVERNMENT FOCUSED ON.

3. Articles Like These Diminish Journalism

It isn’t just about the cost to government of having to deal with this. What other issues could have been written about today? What about something actually important that held the government to account?

Remember the other week when Amanda Pfeffer of the CBC wrote about how IBM won a $32M Ontario government contract to fix software IBM itself had created? More of that please. I don’t mind attacking public servants’ judgement, but let’s do it on shit that matters.

4. Journalists and Managers Cooperate for Terrible Outcomes

This isn’t just the fault of the CBC. Governments need to learn the consequences of reacting to, as opposed to the opportunity of simply ignoring, these types of articles.

The reflexive response of management to articles like these is to grope for a “solution.” The end game is almost always expensive and Orwellian network monitoring software sold by companies like Blue Coat, plus a heap of other “security” tools. As a result public servants are blocked from accessing wikipedia and a range of other deeply useful tools on the web, all while their computers become slower and more painful to use.

This is not a path to more effective government. Nor a path to valuable journalism. We can do better.

Okay, rant mostly done.

I feel bad for public servants who had a crappy day today because of this.

I feel bad for Alison Crawford — who has lots of important stories to her credit, and wasn’t even the first to write about this, but whose article is the focus of this piece. Please keep reading her stuff — especially on the judicial file.

Ultimately, this CBC piece could have been an amazing article about how the ‘poopitch’ and ‘Rush’ incidents were crazy overreactions and about how we need to figure out how to manage the public service in a digital age.

But it wasn’t. It was a cheap thrill. Less of that, please, CBC. More of the real journalism we desperately need.

The Uncertain Future of Open Data in the Government of Canada

It is fair to say that, presently, open data is at its high-water mark in the Government of Canada. Data.gc.ca has been refreshed; more importantly, the government has signed the Open Data Charter, committing it to making data “open” by default, and a rash of new data sets has been made available.

In other words there is a lot of momentum in the right direction. So what could go wrong?

The answer…? Everything.

The reason is the upcoming cabinet shuffle.

I confess that Minister Clement and I have not agreed on all things. I believe, as the evidence shows us, that supervised injection sites such as Insite make communities safer, save lives and make it easier for drug users to get help. As Health Minister, Clement did not. I argued strongly against the dismantling of the mandatory long-form census, noting its demise would make our government dumber and, ultimately, more expensive. As Industry Minister, Minister Clement was responsible for the end of a reliable long-form census.

However, when it comes to open data, Minister Clement has been a powerful voice in a government that has, on many occasions, looked for ways to make access to information harder, not easier. Indeed, open data advocates have been lucky to have had two deeply supportive ministers, Clement and, prior to him, Stockwell Day (who also felt strongly about this issue and was incredibly responsive to many of my concerns when I shared them). This run, though, may be ending.

With the Government in trouble there is widespread acceptance that a major cabinet re-shuffle is in order. While Minister Clement has been laying a lot of groundwork for the upcoming negotiations with the public sector unions and for a rethink of how the public service could be more effective and accountable, he may not be sticking around to see this work (which I’m sure the government sees as essential) through to the end. Nor may he want to. Treasury Board remains a relatively inward-facing ministry and it would not surprise me if both the Minister and the PMO were interested in moving him to a portfolio that was more outward and public facing. Only a notable few politicians dream of wrestling with public servants and figuring out how to reform the public service. (Indeed, Reg Alcock is the only one I can think of.)

If the Minister is moved it will be a real test for the sustainability of open data at the federal level. Between the Open Data charter, the expertise and team built up within Treasury Board and hopefully some educational work Minister Clement has done within his own caucus, ideally there is enough momentum and infrastructure in place that the open data file will carry on. This is very much what I hope to be the case.

But much may depend on who is made President of the Treasury Board. If that role changes, open data advocates may find themselves busy not doing new things, but rather safeguarding gains already made.

 

Some thoughts on the relaunched data.gc.ca

Yesterday, I talked about what I thought was the real story that got missed in the fanfare surrounding the relaunch of data.gc.ca. Today I’ll talk about the new data.gc.ca itself.

Before I begin, there is an important disclaimer to share (to be open!). Earlier this year Treasury Board asked me to chair five public consultations across Canada to gather feedback on both its open data program and data.gc.ca in particular. As such, I solicited people’s suggestions on how data.gc.ca could be improved, and shared my own, but I was not involved in the creation of data.gc.ca. Indeed, the first time I saw the site was on Tuesday when it launched. My role was merely to gather feedback. For those curious, you can read the report I wrote here.

There is, I’m happy to say, much to commend about the new open data portal. Of course, aesthetically, it is much easier on the eye, but this is really trivial compared to a number of other changes.

The most important shift relates to the desire of the site to foster community. Users can now register with the site as well as rate and comment on data sets. There are also places like the Developers’ Corner, which contains documentation that potential users might find helpful, and a sort of app store where government agencies and citizens can post applications they have created. This shift mirrors the evolution of data.gov, data.gov.uk and DataBC, which started out as data repositories but then sought to foster and nurture a community of data users. The critical piece here is that simply creating the functionality will probably not be sufficient; in the US, UK and BC it has required dedicated community managers/engagers to help foster such a community. At present it is unclear if that exists behind the website at data.gc.ca.

The other two noteworthy improvements to the site are an improved search and the availability of APIs. While not perfect, the improved search is nonetheless helpful, as previously it was basically impossible to find anything on the site. Today a search for “border time” returns a border wait time data set as the top result. However, search for “border wait times” and “Biogeochemical exploration using Douglas-fir tree tops in the Mabel Lake area, southern British Columbia (NTS 82L09 and 10)” becomes the top hit, with the actual border wait time data set pushed down to fifth. That said, the search is still a vast improvement, and this alone could be a boon to policy wonks, researchers and developers who elect to make use of the site.

The introduction of APIs is another interesting development. For the uninitiated, an API (application programming interface) provides continuous access to updated data; rather than downloading a file, it is as if you are plugging into a socket that delivers data rather than electricity. The aforementioned border wait time data set is a fantastic example. It is less a “data set” than a “data stream,” providing the most recent border wait times, like what you would see on the big signs across the highway as you approach the border. By providing it through the open data site it becomes possible, for example, for Google Maps to scan this data set daily, understand how border wait times fluctuate and incorporate these delays into its predicted travel times. Indeed, it could even query the API in real time and tell you how long it will take to drive from Vancouver to Seattle, with border delays taken into account. The opportunity for developers and, equally intriguing, government employees and contractors, to build applications atop these APIs is, in my mind, quite exciting. It is a much, much cheaper and more flexible approach than how a lot of government software is currently built.
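To make the “data stream” idea concrete, here is a minimal sketch of what polling such an API might look like. To be clear, the endpoint URL and response fields below are hypothetical placeholders, not the actual data.gc.ca API, whose documentation you would consult for the real schema:

```python
# A sketch of polling a border wait time API. The URL and JSON fields are
# hypothetical placeholders; the real API on data.gc.ca will differ.
import requests

API_URL = "https://example.gc.ca/api/border-wait-times"  # hypothetical endpoint

def current_wait(crossing_name):
    """Return the posted wait (in minutes) for a named crossing, if listed."""
    response = requests.get(API_URL, timeout=10)
    response.raise_for_status()
    for crossing in response.json()["crossings"]:  # hypothetical schema
        if crossing["name"] == crossing_name:
            return crossing["wait_minutes"]
    return None

if __name__ == "__main__":
    print(current_wait("Pacific Highway"))
```

An application like a trip planner could call something like this every few minutes and fold the result into its travel time estimate, which is precisely the sort of reuse a static downloadable file makes difficult.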

I also welcome the addition of the ability to search Access to Information (ATIP) request summaries. That said, I’d like there to be more than just the summaries; the actual responses would be nice, particularly given that ATIP requests likely represent information people have identified as important. In addition, the tool for exploring government expenditures is interesting, but it is notable mostly because, as far as I can tell, none of the data displayed in the tool can be downloaded, meaning it is not very open.

Finally, I will briefly note that the license is another welcome change. For more on that I recommend checking out Teresa Scassa’s blog post on it. Contrary to my above disclaimer I have been more active on this side of things, and hope to have more to share on that another time.

I’m sure, as I and others explore the site in the coming days we will discover more to like and dislike about it, but it is a helpful step forward and another signal that open data is, slowly, being baked into the public service as a core service.

 

The Real News Story about the Relaunch of data.gc.ca

As many of my open data friends know, yesterday the government launched its new open data portal to great fanfare. While there is much to talk about there – something I will dive into tomorrow – that was not the only thing that happened yesterday.

Indeed, I did a lot of media yesterday between flights and only after it was over did I notice that virtually all the questions focused on the relaunch of data.gc.ca. Yet it is increasingly clear to me that the much, much bigger story was the Prime Minister announcing that Canada would adopt the Open Data Charter.

In other words, Canada just announced that it is moving towards making all government data open by default. Moreover, it even made commitments to make specific “high value” data sets open in the next couple of years.

As an aside, I don’t think the Prime Minister’s office has ever mentioned open data before, as far as I can remember, so that was interesting in and of itself. But what is still more interesting is what the Prime Minister committed Canada to. The open data charter commits the government to making data open by default, as well as to four other principles:

  • Quality and Quantity
  • Useable by All
  • Releasing Data for Improved Governance
  • Releasing Data for Innovation

In some ways Canada has effectively agreed to implement the equivalent of the Presidential Executive Order on Open Data the White House announced last month (and that I analyzed in this blog post). Indeed, the charter is more aggressive than the executive order since it goes on to lay out the need to open up not just future data, but also current “high value” data sets. Included among these are data sets the Open Knowledge Foundation has been seeking to get opened via its open data census, as well as some data sets I and many others have argued should be made open, such as the company/business register. Other suggested high value data sets include data on crime, school performance, energy and environment pollution levels, energy consumption, government contracts, national budgets, health prescription data and many, many others. Also included on the list… postcodes, something we are presently struggling with here in Canada.

But the charter wasn’t all the government committed to. The final G8 communique contained many interesting tidbits that, again, highlighted commitments to open up data and adhere to international data schemas.

Among these were:

  • Corporate Registry Data: There was a very interesting section on “Transparency of companies and legal arrangements,” which is essentially about sharing data about who owns companies. As an advisory board member of OpenCorporates, this was music to my ears. However, the federal government already does this; the much, much bigger problem is with the provinces, like BC and Quebec, that make it difficult or expensive to access this data.
  • Extractive Industries Transparency Initiative: A commitment that “Canada will launch consultations with stakeholders across Canada with a view to developing an equivalent mandatory reporting regime for extractive companies within the next two years.” This is something I fought to get included in our OGP commitment two years ago but failed. Again, I’m thrilled to see this appear in the communique and look forward to the government’s action.
  • International Aid Transparency Initiative (IATI) and Busan Common Standard on Aid Transparency: A commitment to make aid data more transparent and downloadable by 2015. Indeed, with all the G8 countries agreeing to take this step, it may be possible to get greater transparency around who is spending what money, and where, on aid. This could help identify duplication as well as aid in assessments of effectiveness. Given how precious aid dollars are, this is a very welcome development. (h/t Michael Roberts of Acclar.org)

So lots of commitments, some on the vaguer side (the open data charter) but some very explicit and precise. And that is the real story of yesterday: not that the country has a new open data portal, but that a lot more data is likely going to get put into that portal over the next 2-5 years. And a tsunami of data could end up in it over the next 10-25 years. Indeed, so much data that I suspect a portal will no longer be a logical way to share it all.

And therein lies the deeper business and government story in all this. As I mentioned in my analysis of the White House Executive Order that made open data the default, the big change here is in procurement. If implemented, this could have a dramatic impact on vendors and suppliers of equipment and computers that collect and store data for the government. Many vendors try to find ways to make their data difficult to export and share so as to lock the government into their solution. Again, if (and this is a big if) the charter is implemented, it will hopefully require a lot of companies to rethink what they offer to government. This is a potentially huge story, as it could disrupt incumbents and lead to either big reductions in the costs of procurement (if done right) or big increases and the establishment of the same, or new, impossible-to-work-with incumbents (if done incorrectly).

There is potentially a tremendous amount at stake in how the government handles the procurement side of all this, because whether it realizes it or not, it may have just completely shaken up the IT industry that serves it.

 

Postscript: One thing I found interesting about the G8 communique was how many times commitments about open data and open data sets occurred in sections that had nothing to do with open data. It will be interesting to see if that trend continues at the next G8 meeting. Indeed, I wouldn’t be surprised if a specific open data section disappears and these references instead just become part of various issue-related commitments.


Duffy, the Government and the problem with “no-notes” meetings

So, for my non-Canadian readers: there is a significant scandal brewing up here in Canadaland regarding a senator who claimed certain expenses he was not allowed to (to the tune of $90,000) and then had that debt paid off by the Prime Minister’s chief of staff (who has now resigned).

This short paragraph in an article by the Star captures the dark nature of the exchange:

…once the repayment was made, Duffy stopped cooperating with independent auditors examining his expense claims. When a Senate committee met behind closed doors in early May to write the final report on the results of the audit, Conservatives used their majority to soften the conclusions about Duffy’s misuse of taxpayers’ money.

So, it was money designed to make a potential minor scandal go away.

Now the opposition party’s ethics critic, Charlie Angus, is calling for an RCMP investigation. From the same story:

“Where is the paper trail? Who was involved,” Angus said.

“If there is a signed agreement between Mr. Duffy and Mr. Wright or Mr. Duffy and the Prime Minister’s Office, we need to see that documentation,” he said.

And herein lies an interesting rub. This government is infamous for holding “no-notes” meetings. I’ve been told about such meetings on numerous occasions by public servants. Perhaps they are making it up. But I don’t think so. The accusations have surfaced too many times.

So maybe there is a paper trail between the Prime Minister’s Office and Senator Duffy. There were lawyers involved… so one suspects there was. But who knows. Maybe there isn’t. And the lack of a paper trail won’t give many people confidence. Indeed, with Duffy no longer in the Conservative Caucus and the Chief of Staff resigned, everyone is now looking at the Prime Minister. What did he know?

And therein lies the bigger rub. In an environment where there is no paper trail one cannot audit who knew what or who is responsible. This means everyone is responsible, all the way to the top. So a “no-notes” meeting can be good for keeping the public from knowing about certain decisions, and who made them, but the approach fails once the public finds out about one of those decisions and starts to care. If there is no smoking email in which someone else claims responsibility, it is going to be hard for this not to reach the Prime Minister’s desk.

Politicians and political staff will sometimes forget that the rules and processes around meetings – which appear to be designed to simply promote a degree of (annoying to them) transparency and accountability – are as much about creating a record for the public as they are about protecting the people involved.


Canada Post and the War on Open Data, Innovation & Common Sense (continued, sadly)

Almost exactly a year ago I wrote a blog post on Canada Post’s War on the 21st Century, Innovation & Productivity. In it I highlighted how Canada Post launched a lawsuit against a company, Geocoder.ca, that recreates the postal code database via crowdsourcing. Canada Post’s case was never strong, but then, that was not their goal. As a large, taxpayer-backed company, the point wasn’t to be right; it was to use the law as a way to financially bankrupt a small innovator.

This case matters, especially to small startups and non-profits. Open North, a non-profit on whose board of directors I sit, recently explored what it would cost to use Canada Post’s postal code database on represent.opennorth.ca, a website that helps identify the elected officials who serve a given address. The cost? $9,000 a year, nowhere near what it could afford.

But that’s not all. There are several non-profits that use Represent to help inform donors and other users of their websites about which elected officials represent the geographies where they advocate for change. The licensing cost if you include all of these non-profits and academic groups? $50,000 a year.

This is not a trivial sum, and it is very significant for non-profits and academics. It is also a window into why Canada Post is trying to sue Geocoder.ca, which offers a version of its database for… free. That a private company can offer a similar service at a fraction of the cost (or for nothing) is, of course, a threat.
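To see why affordable postal code data matters to a service like Represent, here is a minimal sketch of the kind of lookup it powers: postal code in, elected officials out. The endpoint path and response fields here are my assumptions for illustration; consult the Represent API documentation for the real schema.

```python
# A sketch of a postal-code-to-representatives lookup, in the style of the
# Represent API (represent.opennorth.ca). The endpoint path and JSON fields
# are assumptions for illustration; consult the actual API docs.
import requests

def representatives_for_postcode(postcode):
    """Return the names of elected officials serving a postal code."""
    url = f"https://represent.opennorth.ca/postcodes/{postcode}/"  # assumed path
    response = requests.get(url, timeout=10)
    response.raise_for_status()
    data = response.json()
    return [rep["name"] for rep in data.get("representatives_centroid", [])]

if __name__ == "__main__":
    print(representatives_for_postcode("K1A0A6"))
```

None of this works without a database mapping postal codes to geographies, which is exactly the data Canada Post wants $9,000-$50,000 a year for and Geocoder.ca crowdsources for free.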

Sadly, I wish I could report good news on the one-year anniversary of the case. Indeed, I should be able to!

This is because what should have been the most important development was the Federal Court of Appeal making it even clearer that data cannot be copyrighted. This probably made it clear to Canada Post’s lawyers that they were not going to win, and made it even more obvious to those of us in the public that the lawsuit against geocoder.ca, which has not been dropped, was completely frivolous.

Sadly, Canada Post’s reaction to this erosion of its position was not to back off, but to double down. Recognizing that they likely won’t win a copyright case over postal code data, they have decided:

a) to assert that they hold a trademark on the words “postal code”

b) to name Ervin Ruci, the operator of Geocoder.ca, as a defendant in the case, as opposed to just his company.

The second part shows just how vindictive Canada Post’s lawyers are, and reveals the true nature of this lawsuit. This is not about protecting trademark. This is about sending a message about legal costs and fees. This is a predatory lawsuit, funded by you, the taxpayer.

But part a) is also sad. Having seen the writing on the wall around its capacity to win the case around data, Canada Post has suddenly decided, 88 years after it first started using “Postal Zones” and 43 years after it started using “Postal Codes,” to assert a trademark on the term? (You can read more on the history of postal codes in Canada here.)

Moreover, the legal implications if Canada Post actually won the case would be fascinating. It is unclear that anyone would be allowed to solicit anybody’s postal code, at least if they mentioned the term “postal code,” on any form or website without Canada Post’s express permission. It leads one to ask: does the federal government have Canada Post’s express permission to solicit postal code information on tax forms? On passport renewal forms? On any form it has ever published? Because if not, then, if I understand Canada Post’s claim correctly, it is in violation of Canada Post’s trademark.

Given the current government’s goal to increase the use of government data and spur innovation, will it finally intervene in what is an absurd case that Canada Post cannot win, that is using taxpayer dollars to snuff out innovators, that increases the costs for academics doing geospatially oriented social research, and that creates a great deal of uncertainty about how anyone online, be they non-profits, companies, academics or governments, can use postal codes?

I know of no other country in the world that has to deal with this kind of behaviour from its postal service. The United Kingdom compelled its postal service to make postal code information public years ago. In Canada, we handle the same situation by letting a taxpayer-subsidized monopoly hire expensive lawyers to launch frivolous lawsuits against innovators who are not breaking the law.

That is pretty telling.

You can read more about this, and see the legal documents, on Ervin Ruci’s blog. The story has also received good coverage at canada.com.

How not to sell the Oil Sands

If you haven’t read Tzeporah Berman’s Daily Kos piece – My Government Doesn’t Believe in Climate Change – go check it out. It’s amazing to see how out of sync, and behind the ball, the government has gotten on this issue.

Indeed, the current government really is becoming the best weapon opponents of the pipelines have against their construction and the further development of the oil sands.

First, the Natural Resources Minister called environmentalists who opposed Northern Gateway, the pipeline that would take oil sands crude from Alberta to British Columbia’s west coast, “radicals.” In short order it turned out that, well, the vast majority of British Columbians were radicals, since opposition to the pipeline has been, and remains, strong.

Worse, by declaring war on environmentalists and those concerned about climate change, the Minister set a table that would become distinctly awkward when it came to trying to get the White House and State Department to approve Keystone XL, the pipeline that would carry oil sands oil down to the US. Consider that the US President mentioned his concerns about climate change in his State of the Union address, and the US ambassador to Canada has hinted that Canada’s commitment to addressing climate change may factor into decisions around Keystone. So now, at the very moment the Canadian Government needs help persuading the Americans it actually is serious about climate change, it has just come out of a five-year patch in which it labeled everyone who disagreed with it a radical, sicced auditors on them and tried to shut many of them down. It is suddenly in the difficult position of having few, if any, allies in the environmental movement it can turn to to help it gain credibility around any of the actions it wishes to take. It’s a communications and policy disaster.

Of course, it doesn’t help that even in the midst of trying to convince the United States it actually is serious about climate change, the Natural Resources Minister has speaking notes that feature strong climate change denier comments. Elizabeth May, the Green Party MP in the House of Commons, summed up Minister Oliver’s quotes, made to the editorial board of La Presse, quite nicely:

This is what Oliver told the editorial board of La Presse: “I think that people aren’t as worried as they were before about global warming of two degrees… Scientists have recently told us that our fears (on climate change) are exaggerated.”

Thank goodness the editorial board at La Presse knows how to ask questions. They pressed him to name any scientist who thinks our fears are exaggerated. He couldn’t.

So basically, we have a minister who is willing to go on record to make a claim about the future of the planet without being able to reference a single source. I’m sure the White House and State Department are deeply comforted.

It is all doubly amazing, as this government has been masterful at handling communications around virtually every issue except for the one that matters most to it, where it has suddenly become a bumbling idiot.