Category Archives: commentary

Algorithmic Regulation Spreading Across Government?

I was very, very excited to learn that the City of Vancouver is exploring implementing a program started in San Francisco in which “smart” parking meters adjust their price to reflect supply and demand (story is here in the Vancouver Sun).

For those unfamiliar with the program, here is a breakdown. In San Francisco, the city has the goal of ensuring at least one free parking spot is available on every block in the downtown core. As I learned during San Francisco’s presentation at the Code for America summit, such a goal has several important consequences. Specifically, it reduces the likelihood of people double parking, and it reduces smog and greenhouse gas emissions because people don’t troll for parking as long. And because trolling time is reduced, people searching for parking don’t slow down other traffic and buses by driving around slowly looking for a spot. In short, it has a very helpful impact on traffic more broadly.

So how does it work? The city’s smart parking meters are networked together and constantly assess how many spots on a given block are free. If, at the end of the week, it turns out that all the spaces are frequently in use, the cost of parking on that block is increased by 25 cents. Conversely if many of the spots were free, the price is reduced by 25 cents. Generally, each block finds an equilibrium point where the cost meets the demand but is also able to adjust in reaction to changing trends.
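The weekly adjustment described above is, in essence, a simple feedback loop. Here is a minimal sketch in Python; the occupancy thresholds, price step, and price bounds are illustrative assumptions, not SFpark’s actual parameters:

```python
# Hypothetical sketch of the demand-responsive pricing rule described above.
# Thresholds, step size, and bounds are illustrative assumptions.

def adjust_block_price(current_price, occupancy_rate,
                       high=0.85, low=0.60, step=0.25,
                       floor=0.25, ceiling=6.00):
    """Return next week's hourly rate for a block given its average occupancy."""
    if occupancy_rate >= high:        # spots almost always full: raise the price
        return min(current_price + step, ceiling)
    if occupancy_rate <= low:         # many spots sitting empty: lower the price
        return max(current_price - step, floor)
    return current_price              # near equilibrium: leave the rate alone

# A busy block drifts up week by week until demand balances, then holds steady.
price = 2.00
for weekly_occupancy in [0.95, 0.92, 0.88, 0.80]:
    price = adjust_block_price(price, weekly_occupancy)
print(price)  # 2.75: three 25-cent increases, then stable
```

The equilibrium the post describes emerges naturally from this rule: once a block’s occupancy settles between the two thresholds, the price stops moving.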

Technologist Tim O’Reilly has referred to these types of automated systems in the government context as “algorithmic regulation” – a phrase I think could become more popular over the coming decade. As software is deployed into more and more systems, algorithms will be creating marketplaces and resource allocation systems – in effect, regulating us. A little over a year ago I argued that, contrary to what many open data advocates believe, open data will make data political – i.e. that open data wasn’t going to depoliticize public policy and make it purely evidence-based. Quite the opposite: it will make the choices around what data we collect more contested (Canadians, think long-form census). The same is also – and already – true of the algorithms, the code, that will increasingly regulate our lives. Code is political.

Personally, I think the smart parking meter plan is exciting and I hope the city will consider it seriously. But be prepared: I’m confident that, much like with smart electrical meters, an army of naysayers will emerge who simply don’t want a public resource (roads and parking spaces) to be used efficiently.

It’s like the Spirit of the West said: Everything is so political.

Public Servants Self-Organizing for Efficiency (and sanity) – Collaborative Management Day

Most of the time, when I engage with or speak to federal public servants, they are among the most eager to find ways to work around the bureaucracy in which they find themselves. They want to make stuff happen, and ideally, to make it happen right and more quickly. This is particularly true of younger public servants and those below middle management in general (I also find it is often the case of those at the senior levels, who often can’t pierce the fog of middle management to see what is actually happening).

I’m sure this dynamic is not new. In large bureaucracies around the world, the self-organizing capacity of public servants has forever been in a low-level guerrilla conflict against the hierarchies that both protect and restrain them. What makes all this more interesting today, however, is that never before have public servants had more independent capacity to self-organize, and never before have the tools at their disposal been more powerful.

So, for those who live or work in Ottawa and would like to learn some of the tools public servants are using to better network and get work done across groups and ministries, let me point you to “Collaborative Management Day 2012.” (For those of us who aren’t public servants, that link, which directs into GCPEDIA, won’t work – but I’m confident it will work for insiders.) To be clear, it’s the ideas that are batted around at events like this that I believe will shape how the government will work in the coming decades. Much like the boomers created the public service of today in the 1960s, millennials are starting to figure out how to remake it in a world of networks and diminished resources.

Good luck guys. We are counting on you.

Details:

When: Wednesday, January 25, 2012 from 8 a.m. to 4 p.m.

Where: Canada Aviation and Space Museum, 11 Aviation Parkway, Ottawa, ON or via Webcast

Cost: Free! Seats are limited; registration is required for attendance.

The GCPedia community defines collaboration as being “a recursive process where two or more people or organizations work together in an intersection of common goals—for example, an intellectual endeavour that is creative in nature—by sharing knowledge, learning and building consensus.” And this is exactly what the Collaborative Culture Camp (GOC3) will teach you to achieve at the next Collaborative Management Day on January 25, 2012.

This free event will offer you a day of workshops and learning sessions that will help you:

  • Expand your knowledge and use of collaborative tools and culture
  • Develop an awareness of alternative processes that deliver results
  • Understand how to foster an environment of openness and transparency
  • Develop networks to support the application of new tools

At the end of the day you will be able to bring a collaborative toolkit back to your organization to share with your employees and colleagues!

Keep up to date on the event by keeping an eye on our GCPedia pages and by following us on Twitter (@GOC_3) and watching the #goc3 conversation (no account needed to check out the conversation!).

Questions? Concerns? Feedback? Feel free to email the event organizers or leave a message on our Discussion page on GCPedia.

My Canadian Open Government Consultation Submission

Attached below is my submission to the Open Government Consultation conducted by Treasury Board over the last couple of weeks. There appear to be a remarkable number of submissions made by citizens, which you can explore on the Treasury Board website. In addition, Tracey Lauriault has tracked some of the submissions on her website.

I actually wish the submissions on the Government website were both searchable and could be downloaded in their entirety. That way we could re-organize them, visualize them, search and parse them, and play with the submissions so as to make the enormous number of answers easier to navigate and read. I can imagine a lot of creative ways people could re-format all that text and make it much more accessible and fun.

Finally, for reference, in addition to my submission I wrote this blog post a couple months ago suggesting goals the government set for itself as part of its Open Government Partnership commitments. Happily, since writing that post, the government has moved on a number of those recommendations.

So, below is my response to the government’s questions (in bold):

What could be done to make it easier for you to find and use government data provided online?

First, I want to recognize that a tremendous amount of work has been done to get the present website and number of data sets up online.

FINDING DATA:

My advice on making data easier to find: engage Socrata to create the front end. Socrata has an enormous amount of experience in how to share government data effectively. Consider http://data.oregon.gov – here is a site that is clean, easy to navigate, and offers a number of ways to access and engage the government’s data.

More specifically, what works includes:

1. Effective search: a simple search mechanism returns all results.
2. Good filters: Because the data is categorized by type (internal vs. external, charts, maps, calendars, etc.) it is much easier to filter. One thing not seen on Socrata that would be helpful is the ability to sort by ministry.
3. Preview: Once I choose a data set I’m given a preview of what it looks like; this enables me to assess whether or not it is useful.
4. Social: Here there is a ton on offer.
– I’m able to sort data sets by popularity – being able to see what others find interesting is, in and of itself, interesting.
– Being able to easily share data sets via email, Twitter, or Facebook means I’m more likely to find something interesting because friends will tell me about it.
– Data sets can also be commented upon, so I can see what others think of the data, whether they think it is useful or not, and for what purposes.
– Finally, it would be nice if citizens could add metadata, to make it easier for others to do keyword searches. If the government was worried about the wrong metadata being added, it could always offer a search with crowd-sourced metadata included or excluded.
5. Tools: Finally, there are a large number of tools that make it easier to quickly play with and make use of the data, regardless of one’s skills as a developer. This makes the data much more accessible to the general public.
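The crowd-sourced metadata idea in point 4 is easy to sketch: a keyword search over data-set records that can include or exclude citizen-added tags. All field names and records below are invented for illustration:

```python
# Sketch of keyword search with optional crowd-sourced tags, as suggested
# in point 4 above. The catalogue entries and field names are hypothetical.

datasets = [
    {"title": "NPRI Facility Emissions", "official_tags": ["pollution"],
     "crowd_tags": ["smokestacks", "air quality"]},
    {"title": "Border Wait Times", "official_tags": ["travel"],
     "crowd_tags": ["crossing delays"]},
]

def search(keyword, include_crowd_tags=True):
    """Return titles of data sets whose title or tags match the keyword."""
    hits = []
    for ds in datasets:
        haystack = [ds["title"]] + ds["official_tags"]
        if include_crowd_tags:               # citizen metadata is opt-in
            haystack += ds["crowd_tags"]
        if any(keyword.lower() in field.lower() for field in haystack):
            hits.append(ds["title"])
    return hits

print(search("air quality"))                            # found via a crowd tag
print(search("air quality", include_crowd_tags=False))  # no official match
```

The point of the toggle is exactly the safeguard suggested above: users who don’t trust citizen-added metadata can simply exclude it from their search.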

USING DATA

Finding data is part of the problem, being able to USE the data is a much bigger issue.

Here the single most useful thing would be to offer APIs into government data. My own personal hope is that one day there will be a large number of systems, both within and outside of government, that will integrate government data right into their applications. For example, as I blogged about here – https://eaves.ca/2011/02/18/sharing-critical-information-with-public-lessons-for-governments/ – product recall data would be fantastic to have as an API so that major retailers could simply query the API every time they scan inventory in a warehouse or at the point of sale; any product that appears on the list could then be automatically removed. Internally, Borders and Customs could also query the API when scanning exports to ensure that nothing exported is recalled.
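The point-of-sale check described above would be a few lines of code for a retailer. Here is a minimal sketch; the endpoint URL and response format are entirely hypothetical, since no such official recall API exists:

```python
# Sketch of a point-of-sale recall check against a recall API. The endpoint
# URL and JSON response shape are hypothetical assumptions for illustration.
import json
from urllib.request import urlopen

RECALL_API = "https://api.example.gc.ca/recalls?upc={upc}"  # hypothetical URL

def fetch_recall(upc):
    """Query the (hypothetical) recall API for a scanned product code."""
    with urlopen(RECALL_API.format(upc=upc)) as resp:
        return json.load(resp)

def should_remove(recall_record):
    """Decide whether a scanned item must be pulled from inventory."""
    return bool(recall_record) and recall_record.get("status") == "recalled"

# At scan time: any item whose record flags a recall is removed automatically.
scanned = {"upc": "064200115896", "status": "recalled"}   # simulated response
print(should_remove(scanned))  # True
```

The value of the API model is that the retailer never maintains its own copy of the recall list; every scan checks against the government’s live data.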

Second, if companies and non-profits are going to invest in using open data, they need assurances both that they are legally allowed to use the data and that the data isn’t going to suddenly disappear on them. This means a robust license that is clear about reuse. The government would be wise to adopt the OGL or even improve on it. Better still, helping establish a standardized open data license for Canada, and ideally internationally, could help reduce some legal uncertainty for more conservative actors.

More importantly, and missing from Socrata’s sites, would be a way of identifying how secure a data set’s longevity is. For example, data sets that are required by legislation – such as the NPRI – are the least likely to disappear, whereas data sets like the long-form census, which have no legal protection, could be seen as at higher risk.

 

How would you use or manipulate this data?

I’m already involved in a number of projects that use and share government data. Among those are Emitter.ca, which maps and shares NPRI pollution data, and Recollect.net, which shares garbage calendar information.

While I’ve seen dramatically different uses of data, personally I’m interested mostly in using data for thinking and writing about public policy issues. Indeed, much has been made of the use of data in “apps,” but I think it is worth noting that the single biggest use of data will be in analysis – government officials, citizens, academics and others using the data to better understand the world around them and lobby for change.

This all said, there are some data sets that are of particular usefulness to people, these include:

1. Data sets on sensitive issues: this includes health, inspection and performance data (say, surgery outcomes for specific hospitals, or restaurant inspection data; crime and procurement data are also often in great demand).
2. Dynamic real-time data: data that is frequently updated (such as border, passport renewal or emergency room wait times). This data, shared in the right way, can often help people adjust schedules and plans or reallocate resources more effectively. Obviously this requires an API.
3. Geodata: Because GIS standards are very mature it is easy to “mashup” geodata to create new maps or offer new services. These common standards mean that geodata from different sources will work together or can be easily compared. This is in sharp contrast to, say, budget data, where there are few common standards around naming and organizing the data, making it harder to share and compare.

What could be done to make it easier for you to find government information online?

It is absolutely essential that all government records be machine readable.

Some of the most deplorable moments in open government occur when the government shares documents with the press, citizens or parliamentary officers in paper form. The first and most important thing to make government information easier to find online is to ensure that it is machine readable and searchable by words. If it does not meet this criterion, I increasingly question whether or not it can be declared open.

As part of the Open Government Partnership commitments, it would be great for the government to commit to guaranteeing that every request for information made of it will be answered with a digital version of the document that can be searched.

Second, the government should commit that every document it publishes be available online. For example, I remember in 2009 being told that if I wanted a copy of the Health Canada report “Human Health in a Changing Climate: A Canadian Assessment of Vulnerabilities and Adaptive Capacity,” I had to request a CD, which had a PDF copy of the report on it and was then mailed to me. Why was the report not simply available for download? Because the Minister had ordered it not to appear on the website. Instead, I as a taxpayer had to see more of my tax dollars wasted for someone to receive my mail, process it, then mail me a custom-burned CD. Enabling ministers to create barriers to accessing government information, simply because they do not like the contents, is an affront to the use of taxpayer dollars and our right to access information.

Finally, allow government scientists to speak directly to the media about their research.

It has become a recurring embarrassment. Scientists who work for Canada publish an internationally recognized, groundbreaking paper that provides some insight about the environment or geography of Canada, and journalists must talk to government scientists from other countries in order to get the details. Why? Because the Canadian government blocks access. Canadians have a right to hear the perspectives of the scientists their tax dollars paid for – and enjoy the opportunity to become as well informed as the government on these issues.

Thus, lift the ban that blocks government scientists from speaking with the media.

 

Do you have suggestions on how the Government of Canada could improve how it consults with Canadians?

1. Honour Consultation Processes that have started

The process of public consultation is insulted when the government itself intervenes to bring the process into disrepute. The first thing the government could do to improve how it consults is not sabotage processes that are already ongoing. The recent letter from Natural Resources Minister Joe Oliver regarding the public consultation on the Northern Gateway Pipelines has damaged Canadians’ confidence in the government’s willingness to engage in and make effective use of public consultations.

2. Focus on collecting and sharing relevant data

It would be excellent if the government shared relevant data from its data portal on the public consultation webpage. For example, in the United States, the government shares a data set with the number and location of spills generated by Enbridge pipelines, similar data for Canada would be ideal to share on a consultation. Also useful would be economic figures, job figures for the impacted regions, perhaps also data from nearby parks (visitations, acres of land, kml/shape boundary files). Indeed, data about the pipeline route itself that could be downloaded and viewed in Google earth would be interesting. In short, there are all sorts of ways in which open data could help power public consultations.

3. Consultations should be ongoing

It would be great to see a 311-like application for the federal government: something that, when loaded, would use GPS to identify the federally operated services, infrastructure or other resources near the user and allow the user to give feedback right then and there. Such “ongoing” public feedback could then be used as data when a formal public consultation process is kicked off.
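The core of such an app is just a nearest-facility lookup from the user’s GPS position. A minimal sketch, with invented facility names and coordinates:

```python
# Sketch of the 311-style idea above: given a user's GPS position, find the
# nearest federally operated facility to attach their feedback to.
# Facility names and coordinates are invented for illustration.
import math

FACILITIES = [
    ("Passport Office", 45.4215, -75.6972),
    ("Canada Post Depot", 45.4111, -75.6981),
]

def distance_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points via the haversine formula."""
    r = 6371.0  # Earth's mean radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def nearest_facility(lat, lon):
    """Return the closest facility name, to route the user's feedback."""
    return min(FACILITIES, key=lambda f: distance_km(lat, lon, f[1], f[2]))[0]

print(nearest_facility(45.4208, -75.6970))  # Passport Office
```

With the feedback tagged to a facility and timestamped, it becomes exactly the kind of structured data a later formal consultation could draw on.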

 

Are there approaches used by other governments that you believe the Government of Canada could/should model?

1. The UK government’s expense disclosure, and its release of the COINS database more generally, is probably the most radical act of government transparency to date. Given the government’s interest in budget cuts, this is one area that might be of great interest to pursue.

2. For critical data sets – those that are either required by legislation or essential to the operation of a ministry or the government generally – it would be best to model the approach of Chicago or Washington, DC and foster the creation of a data warehouse where this data could be easily shared both internally and externally (as privacy and security permit). These cities are leading governments in this space because they have tackled both the technical challenges (getting the data on a platform where it can be shared easily) and the governance challenges (managing data sets from various departments on a shared piece of infrastructure).

 

Are there any other comments or suggestions you would like to make pertaining to the Government of Canada’s Open Government initiative?

Some additional ideas:

Redefine Public as Digital: Pass an Online Information Act

a) Any document the government produces should be available digitally, in a machine-readable format. The sham that the government can produce 3,000-10,000 printed pages about Afghan detainees or the F-35 and claim it is publicly disclosing information must end.

b) Any data collected for legislative reasons must be made available – in machine-readable formats – via a government open data portal.

c) Any information that is ATIPable must be made available in a digital format, and any excess costs of generating that information can be borne by the requester until a certain date (say, 2015), at which point the excess costs will be borne by the ministry responsible. There is no reason why, in a digital world, there should be any cost to extracting information – indeed, I fear a world where the government can’t cheaply locate and copy its own information for an ATIP request, as it would suggest it can’t get that information for its own operations.

Use Open Data to drive efficiency in Government Services: Require the provinces to share health data – particularly hospital performance – as part of its next funding agreement within the Canada Health Act.

Comparing hospitals to one another is always a difficult task, and open data is not a panacea. However, more data about hospitals is rarely harmful, and there are a number of issues on which it would be downright beneficial. The most obvious of these is deaths caused by infection. The number of deaths that occur due to infections in Canadian hospitals is a growing problem (sigh, if only open data could help ban the antibacterial wipes that are helping propagate them). Having open data that allows for league tables showing the scope and location of the problem will likely cause many hospitals to rethink processes and, I suspect, save lives.

Open data can supply some of the competitive pressure that is often lacking in a public healthcare system. It could also better educate Canadians about their options within that system, as well as make them more aware of its benefits.

Reduce Fraud: Creating a Death List

In an era where online identity fraud is a problem, it is surprising to me that I’m unable to locate a database of expired social insurance numbers. Being able to query a list of social insurance numbers that belong to dead people might be a simple way to prevent fraud. Interestingly, the United States has just such a list available for free online. (Side fact: known as the Social Security Death Index, this database is also beloved by genealogists, who use it to trace ancestry.)
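The fraud check itself would be trivial to implement against such a list. A minimal sketch; Canada publishes no such database, so the entries below are invented:

```python
# Sketch of the fraud check described above: flag applications that use a
# social insurance number registered as belonging to a deceased person.
# No such Canadian list exists; these entries are invented for illustration.

DECEASED_SINS = {"046-454-286", "130-692-544"}  # illustrative entries only

def flag_possible_fraud(sin):
    """Return True if an application uses a SIN from the deceased list."""
    return sin in DECEASED_SINS

print(flag_possible_fraud("046-454-286"))  # True: SIN belongs to the list
print(flag_possible_fraud("123-456-782"))  # False: not on the list
```

In practice a bank or agency would query a government API rather than hold the list locally, but the lookup logic is this simple either way.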

Open Budget and Actual Spending Data

For almost a year the UK government has published all spending data, month by month, for each government ministry (down to £500 in some ministries, £25,000 in others). Moreover, as an increasing number of local governments are required to share their spending data, it has led to savings, as governments begin to learn what other ministries and governments are paying for similar services.

Create a steering group of leading Provincial and Municipal CIOs to create common schema for core data about the country.

While open data is good, open data organized the same way across different departments and provinces is even better. When data is organized the same way it is easier for citizens to compare one jurisdiction against another, and for software solutions and online services to emerge that use that data to enhance the lives of Canadians. The Federal Government should use its convening authority to bring together some of the country’s leading government CIOs to establish common data schemas for things like crime, healthcare, procurement, and budget data. The list of what could be worked on is virtually endless, but those four areas all represent data sets that are frequently requested, so they might make for a good starting point.

Statistics Canada Data to become OpenData – Background, Winners and Next Steps

As some of you learned last night, Embassy Magazine broke the story that all of Statistics Canada’s online data will not only be made free, but released under the Government of Canada’s Open Data License Agreement (updated and reviewed earlier this week) that allows for commercial re-use.

This decision has been in the works for months, and while it does not appear to have been formally announced, Embassy Magazine does appear to have managed to get a Statistics Canada spokesperson to confirm it is true. I have a few thoughts about this story: Some background, who wins from this decision, and most importantly, some hope for what it will, and won’t lead to next.

Background

In the Embassy article, the spokesperson claimed this decision had been in the works for years, something that is probably technically true. Such a decision – or something akin to it – has likely been contemplated a number of times. And there have been a number of trials and projects that have allowed some data to be made accessible, albeit under fairly restrictive licenses.

But it is less clear that the culture of open data has arrived at StatsCan, and less clear to me that this decision was internally driven. I’ve met many a StatsCan employee who encountered enormous resistance while advocating for open data. I remember pressing the issue during a talk at one of the department’s middle managers’ conferences in November of 2008 and seeing half the room nod vigorously in agreement while the other half crossed its arms in strong disapproval.

Consequently, with the federal government increasingly interested in open data, coupled with a desire for a good news story coming out of StatsCan after last summer’s census debacle, and with many decisions in Ottawa happening centrally, I suspect this decision occurred outside the department. This does not diminish its positive impact, but it does mean that a number of the next steps, many of which will require StatsCan to adapt its role, may not happen as quickly as some will hope, as the organization may take some time to come to terms with the new reality and the culture shift it will entail.

This may be compounded by the fact that there may be tougher news on the horizon for StatsCan. With every department required to have submitted proposals to cut their budgets by either 5% or 10%, and with StatsCan having already seen a number of its programs cut, there may be fewer resources in the organization to take advantage of the opportunity making its data open creates, or even just to adjust to what has happened.

Winners (briefly)

The winners from this decision are, of course, consumers of StatsCan’s data. Indirectly, this includes all of us, since provincial and local governments are big consumers of StatsCan data and so now – assuming it is structured in such a manner – they will have easier (and cheaper) access to it. This is also true of large companies and non-profits, which have used StatsCan data to locate stores, target services and generally allocate resources more efficiently. The opportunity now opens for smaller players to benefit as well.

Indeed, this is the real hope: that a whole new category of winners emerges; that the barrier to use for software developers, entrepreneurs, students, academics, smaller companies and non-profits will be lowered in a manner that will enable a larger community to make use of the data and therefore create economic or social goods.

Such a community, however, will take time to evolve, and will benefit from support.

And finally, I think StatsCan is a winner. This decision brings it more profoundly into the digital age. It opens up new possibilities and, frankly, pushes a culture change that I believe is long overdue. I suspect times are tough at StatsCan – although not as a result of this decision – and this decision creates room to rethink how the department works and thinks.

Next Steps

The first thing everybody will be waiting for is to see exactly what data gets shared, in what structure and at what level of detail. Indeed, this question arose a number of times on Twitter, with people posting tweets such as “Cool. This is all sorts of awesome. Are geo boundary files included too, like Census Tracts and postcodes?” We shall see. My hope is yes, and I think the odds are good. But I could be wrong, at which point all this could turn into the most over-hyped data story of the year. (Which actually matters now that data analyst is one of the fastest growing job categories in North America.)

Second, open data creates an opportunity for a new and more relevant role for StatsCan with a broader set of Canadians. Someone from StatsCan should talk to the data group at the World Bank about their transformation after they launched their open data portal (I’d be happy to make the introduction). That data portal now accounts for a significant portion of all the Bank’s web traffic, and the group is going through a dramatic transformation, realizing they are no longer curators of data for bank staff and a small elite group of clients around the world but curators of economic data for the world. I’m told that, while the change has not been easy, a broader set of users has brought a new sense of purpose and identity. The same could be true of StatsCan. Rather than just an organization that serves the government of Canada and a select group of clients, StatsCan could become the curator of data for all Canadians. This is a much more ambitious, but I’d argue more democratic and important, goal.

And it is here that I hope other next steps will unfold. In the United States, (which has had free census data for as long as anyone I talked to can remember) whenever new data is released the census bureau runs workshops around the country, educating people on how to use and work with its data. StatsCan and a number of other partners already do some of this, but my hope is that there will be much, much more of it. We need a society that is significantly more data literate, and StatsCan along with the universities, colleges and schools could have a powerful role in cultivating this. Tracey Lauriault over at the DataLibre blog has been a fantastic advocate of such an approach.

I also hope that StatsCan will take its role as data curator for the country very seriously and think of new ways that its products can foster economic and social development. Offering APIs into its data sets would be a logical next step, something that would allow developers to embed census data right into their applications and ensure the data was always up to date. No one is expecting this to happen right away, but it was another question that arose on twitter after the story broke, so one can see that new types of users will be interested in new, and more efficient ways, of accessing the data.

But I think most importantly, the next step will need to come from us citizens. This announcement marks a major change in how StatsCan works. We need to be supportive, particularly at a time of budget cuts. While we are grateful for open data, it would be a shame if the institution that makes it all possible were reduced to a shell of its former self. Good quality data – and analysis to inform public policy – is essential to a modern economy, society, and government. Now that we will have free access to what our tax dollars have already paid for, let’s make sure that it stays that way, by ensuring both that it continues to be available and that there continues to be a quality institution capable of collecting and analyzing it.


The Canadian Government's New Web 2.0 Guidelines: the Good, the Bad & the Ugly

Yesterday, the government of Canada released its new Guidelines for External Use of Web 2.0. For the 99.99% of you unfamiliar with what this is: it’s the guidelines (rules) that govern how, and when, public servants may use Web 2.0 tools such as Twitter and Facebook.

You, of course, likely work in an organization that survives without such documents. Congratulations. You work in a place where the general rule is “don’t be an idiot” and your bosses trust your sense of judgement. That said, you probably also don’t work somewhere where disgruntled former employees and the CBC are trolling the essentially personal online statements of your summer interns so they can turn them into a scandal. (Yes, summer student border guards have political opinions, don’t like guns and enjoy partying. Shocker.) All this to say, there are good and rational reasons why the public service creates guidelines: to protect not just the government, but public servants.

So for those uninterested in reading the 31-page, 12,055-word guidelines document, here’s a review:

The Good

Sending the right message

First off, the document, for all its faults, does get one overarching piece right. Almost right off the bat (top of section 3.2) it states that ministries should be using Web 2.0 tools:

Government of Canada departments are encouraged to use Web 2.0 tools and services as an efficient and effective additional channel to interact with the public. A large number of Canadians are now regularly using Web 2.0 tools and services to find information about, and interact with, individuals and organizations.

Given the paucity of Web 2.0 use in the federal government, internally or externally, this clear message from Treasury Board, and from a government minister, is the type of encouragement needed to bring government communications into 2008 (the British government, with its amazing Power of Information Taskforce, has been there for years).

Note: there is a very, very, ugly counterpart to this point. See below.

Good stuff for the little guy

Second, the rules for professional networking and personal use are fairly reasonable. There are some challenges (notes below), but if any public servant ever finds them or has the energy to read the document, they are completely workable.

The medium is the message

Finally, the document acknowledges that the web 2.0 world is constantly evolving and references a web 2.0 tool by which public servants can find ways to adapt. THIS IS EXACTLY THE RIGHT APPROACH. You don’t deal with fast evolving social media environment by handing out decrees in stone tablets, you manage it by offering people communities of practice where they can get the latest and best information. Hence this line:

Additional guidance on the use of Web 2.0 tools and services is in various stages of development by communities of expertise and Web 2.0 practitioners within the Government of Canada. Many of these resources are available to public servants on the Government of Canada’s internal wiki, GCpedia. While these resources are not official Government of Canada policies or guidelines, they are valuable sources of information in this rapidly evolving environment.

This represents a truly exciting development in the glacially paced evolution of government procedures: the use of social media (GCPEDIA) to manage social media.

Indeed, still more exciting for me is that this was the first time I’ve seen an official government document reference GCPEDIA as a canonical source of information. And it did it twice: once, above, pointing to a community of practice; the second time pointing to the GCPEDIA “Social media procurement process” page. Getting government to use social media internally is, I think, the biggest challenge at the moment, and this document does it.

The Bad

Too big to succeed

The biggest problem with the document is its structure. It is so long, and so filled with various forms of compliance, that only the most dedicated public servant (read: a communications officer tasked with social media) will ever read it. Indeed, for a document that is supposed to encourage public servants to use social media, I suspect it will do just the opposite. Its density and list of controls will cause many who were on the fence to stay there – if not retreat further. While the directions for departments are clearer, for the little guy… (see next piece)

Sledgehammers for nails

The document’s main problem is that it tries to address all uses of social media. Helpfully, it acknowledges there are broadly two types of uses: “Departmental Web 2.0 initiatives” (e.g. a Facebook group for an employment insurance program) and “personal/professional use” (e.g. an individual public servant’s use of Twitter or LinkedIn to do their job). Unhelpfully, it addresses both of them.

In my mind 95% of the document relates to departmental uses… this is about ensuring that someone claiming to represent the government in an official capacity does not screw up. The problem is, all those policies aren’t as relevant to Joe/Jane public servant in their cubicle trying to find an old colleague on LinkedIn (assuming they can access LinkedIn). It’s overkill. These should be separate documents; that way the personal use document could be smaller, more accessible and far less intimidating. Indeed, as the guidelines suggest, all it should really have to do is reference the Values and Ethics Code for the Public Service (essentially the “idiot’s guide to how not to be an idiot on the job” for public servants) and that would be sufficient. Happily, most public servants are already familiar with this document, so simply understanding that those guidelines apply online as much as offline gets us 90% of the way there.

In summary, despite a worthy effort, it seems unlikely this document will encourage public servants to use Web 2.0 tools in their jobs. For a (Canadian) comparison, consider the BC Government’s guidelines document, the dryly named “Policy No. 33: Use of Social Media in the B.C. Public Service.” Despite engaging both use cases, it manages to cover all the bases, is straightforward and encouraging, and treats the employee with an enormous amount of respect. All this in a nifty 2 pages and 1,394 words. Pretty much exactly what a public servant is looking for.

The Ugly

Sadly, there is some ugliness.

Suggestions, not change

In the good section I mentioned that the government is encouraging ministries to use social media… this is true. But it is not mandating it. Nor do these guidelines say anything to ministerial IT staff, most of whom are blocking public servants’ access to sites like Facebook, Twitter and, in many cases, my blog. The sad fact is, there may now be guidelines that allow public servants to use these tools, but in most cases they’d have to go home, or to a local coffee shop (many do), in order to actually make use of them. For most public servants, much of the internet remains beyond their reach, causing them to fall further and further behind in understanding how technology will affect their jobs and their department/program’s function in society.

It’s not about communication, it’s about control

In his speech at PSEngage yesterday, the Treasury Board Minister talked about how technology can help the public service reinvent the way it collaborates:

The Government encourages the use of new Web 2.0 tools and technologies such as blogs, wikis, Facebook, Twitter and YouTube. These tools help create a more modern, open and collaborative workplace and lead to more “just-in-time” communications with the public.

This is great news. And I believe the Minister believes it too. He’s definitely a fan of technology in all the right ways. However, the guidelines are mostly about control. Consider this paragraph:

Departments should designate a senior official accountable and responsible for the coordination of all Web 2.0 activities as well as an appropriate governance structure. It is recommended that the Head of Communications be the designated official. This designate should collaborate with departmental personnel who have expertise in using and executing Web 2.0 initiatives, as well as with representatives from the following fields in their governance structure: information management, information technology, communications, official languages, the Federal Identity Program, legal services, access to information and privacy, security, values and ethics, programs and services, human resources, the user community, as well as the Senior Departmental Official as established by the Standard on Web Accessibility. A multidisciplinary team is particularly important so that policy interpretations are appropriately made and followed when managing information resources through Web 2.0 tools and services.

You get all that? That’s at least 11 variables that need to be managed. Or, put another way, 11 different manuals you need to have at your desk when using social media for departmental purposes. That makes for a pretty constricted hole for information to get out through, and I suspect it pretty much kills the spontaneity, rapid response time and personal voice that make social media effective. Moreover, with one person accountable, and this area of communications still relatively new, I suspect that the person in charge, given all these requirements, is going to have a fairly low tolerance for risk. Even I might conclude it is safer to just post an ad in the newspaper and let the phone operators at Service Canada deal with the public.

Conclusion

So it ain’t all bad. Indeed, there is much that is commendable and could be worked with. I think, in the end, 80% of the problems with the document could be resolved if the government simply created two versions, one for official departmental uses, the other for individual public servants. If it could then restrain the lawyers from repeating everything in the Values and Ethics code all over again, you’d have something that social media activists in the public service could seize upon.

My sense is that the Minister is genuinely interested in enabling public servants to use technology to do their jobs better – he knows from personal experience how helpful social media can be. This is great news for those who care about these issues, and it means that pressing for a better revised version might yield a positive outcome. Better to try now, with a true ally in the president’s office than with someone who probably won’t care.

 

How Architecture Made SFU Vancouver’s University

For those unfamiliar with Vancouver, it is a city that enjoys a healthy one-way rivalry between two universities: the University of British Columbia (UBC) and Simon Fraser University (SFU).

Growing up here I didn’t think much of Simon Fraser. I don’t mean that in a disparaging way; I mean it literally. SFU was simply never on my radar. UBC I knew. As high school students we would sneak out to its libraries to study for finals and pretend we were more mature than we were. But SFU? It was far away. Too remote. Too inaccessible by public transit. Heck, too inaccessible by car(!).

And yet today, when I think of Vancouver’s two universities, UBC is the one that is never on my radar. After noticing that several friends will be on a panel tonight on How Social Media is Changing Politics at UBC’s downtown campus, I was reminded that UBC has a downtown campus. It may be the most underutilized and unloved space in the university, despite the fact it sits in the heart of Vancouver and under some of the most prime real estate in the city. In fact, I don’t think I’ve ever actually been to UBC’s downtown campus.

In contrast, I can’t count the number of times I’ve been to SFU’s downtown campus. And the reason is simple: architecture. It’s not just that SFU invests in its downtown campus, making it part of the university; it’s that it invested in Vancouver by building one of the most remarkable buildings in the city. If you are in Vancouver, go visit the Wosk Centre for Dialogue. It is amazing. Indeed, I feel so strongly about it that I included it in my top ten favourite places when Google Maps added me to their list of Lat/Long experts for the 2010 Winter Olympics.

Photo by Bryan Hughes

What makes the Wosk Centre so fantastic? It seats 180 or so people in concentric circles, each seat with its own mic. It may be the only place where I’ve felt a genuine conversation can take place with such a large group. I’ve seen professors lecture, drug addicts share stories, environmentalists argue among themselves and friends debate one another, and it has always been eye opening. Here, in the heart of the city, is a disarming space where stakeholders, experts, citizens, or anyone else, can gather to share ideas and explore their differences in a respectful manner. Moreover, the place just looks beautiful.

Centre-for-dialogue-300x300

The building is a testament to how architecture and design can fundamentally alter the relationship between an institution and the city within which it resides. Without the Wosk Centre, I’m confident SFU’s downtown presence would have meant much less to me. Moreover, I’m fully willing to agree that UBC is the better university. It ranks better in virtually every survey, it has global ambitions that are even achievable, and it likely does not want to be involved in the city. That’s a strategic choice I can, on one level, respect. But on a basic level, the Wosk Centre makes SFU relevant to Vancouverites and, in doing so, allows the university to punch above its weight, at least locally. And that has real impact, at least for the city’s residents. But I think for the university as well.

Reading the always excellent Steven Johnson’s Where Good Ideas Come From, I can’t help but think that UBC is missing out on something larger. As Johnson observes, good ideas arise from tensions, from the remixing of other ideas, particularly those from disparate places. They rarely come from the deep thinker isolated out in the woods (UBC lies at the edge of Vancouver beyond a large park) or meditating on a mountain top (SFU’s core campus is atop a small mountain) but out of dense networks where ideas, hunches and thoughts can find one another. Quiet meditation is important. But so too is engagement. Being in the heart of a bustling city is perhaps a distraction, but that may be the point. Those distractions create opportunities: new avenues for exploration and, for universities concerned with raising money from their intellectual capital, ways to find problems in search of solutions. So raising a structure that is explicitly designed to allow tensions and conflicts to play out… I can’t help but feel that is a real commitment to growth and innovation, in a manner that not only gives back to its host community but positions the university to innovate at the pace the 21st century demands.

As such, the Wosk Centre, while maybe a shade formal, is a feat of architecture and design: a building that I hope enables a university to rethink itself, and one that has definitely become a core part of the social infrastructure of the city and redrawn at least my own relationship with SFU.

Weaving Foreign Ministries into the Digital Era: Three ideas

Last week I was in Ottawa giving a talk at the Department of Foreign Affairs about how technology, new media and open innovation will impact the department’s work internally, across Ottawa and around the world.

While there is lots to share, here are three ideas I’ve been stewing on:

Keep more citizens safe when abroad – better danger zone notification

Some people believe that open data isn’t relevant to departments like Foreign Affairs or the State Department. Nothing could be further from the truth.

One challenge the department has is getting Canadians to register with them when they visit or live in a country its travel reports label as problematic to travel in (sample here). As you might suspect, few Canadians register with the embassy: they are likely not aware of the program, or they travel a lot and simply don’t get around to it.

There are other ways of tackling this problem that might yield broader participation.

Why not turn the Travel Report system into an open data set with an API? I’d tackle this by approaching a company like TripIt. Every time I book an airplane ticket or a hotel, I simply forward TripIt the reservation, which they scan and turn into events that then automatically appear in my calendar. Since they scan my travel plans, they also know which country, city and hotel I’m staying in… they also know where I live and could easily ask me for my citizenship. Working with companies like TripIt (or Travelocity, Expedia, etc.), DFAIT could co-design an API into the department’s travel report data that would be useful to them. Specifically, I could imagine that if TripIt could query all my trips against those reports, then any time it noticed I was traveling somewhere the Foreign Ministry has labelled “exercise a high degree of caution” or worse, TripIt could ask me if I’d be willing to let it forward my itinerary to the department. That way I could register my travel automatically, making the service more convenient for me and getting the department more of the information it believes to be critical.

Of course, it might be wise to work with the State Department so that their travel advisories used a similarly structured API (since I can assume TripIt will be more interested in the larger US market than the Canadian one). But facilitating that conversation would be nothing but wins for the department.
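To make the idea above concrete, here is a minimal sketch of the matching logic a TripIt-style service could run against an open travel-report data set. Everything here is an assumption for illustration: no such API exists today, and the advisory strings and record format are invented stand-ins, not DFAIT’s actual schema.

```python
# Stand-in for what an open Travel Report API might return per country.
# These advisory phrases mirror the sort of language travel reports use,
# but the data structure itself is hypothetical.
ADVISORIES = {
    "Japan": "exercise normal security precautions",
    "Yemen": "avoid all travel",
    "Kenya": "exercise a high degree of caution",
}

# Advisory levels at which a booking service might prompt the traveller
# to forward their itinerary to the department.
PROMPT_LEVELS = {
    "exercise a high degree of caution",
    "avoid non-essential travel",
    "avoid all travel",
}

def should_prompt_registration(itinerary):
    """Return the stops whose advisory level warrants asking the
    traveller whether to forward the itinerary to the department."""
    return [
        stop for stop in itinerary
        if ADVISORIES.get(stop["country"]) in PROMPT_LEVELS
    ]

trip = [
    {"country": "Japan", "city": "Tokyo"},
    {"country": "Kenya", "city": "Nairobi"},
]
flagged = should_prompt_registration(trip)
print([stop["country"] for stop in flagged])  # -> ['Kenya']
```

The point of the sketch is how little the consuming service needs: a country-level lookup and an agreed list of advisory levels. Everything else (consent, forwarding the itinerary) is product design on top.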

More bang for buck in election monitoring

One question that arose during my talk came from an official interested in election monitoring. In my mind, one thing the department should be considering is a fund to help local democracy groups spin up installations of Ushahidi in countries with fragile democracies that are gearing up for elections. For those unfamiliar with Ushahidi, it is a platform developed after the disputed 2007 presidential election in Kenya that plotted eyewitness reports of violence, sent in by email and text message, on a Google map.

Today it is used to track a number of issues, but problems with elections remain one of its core purposes. The department should think about grants that would help spin up a Ushahidi install to enable citizens of the country to register concerns and allegations around fraud, violence, intimidation, etc. It could then verify and inspect issues that are flagged by the country’s citizens. This would allow the department to deploy its resources more effectively and ensure that its work was speaking to concerns raised by citizens.

A Developer version of DART?

One of the most popular programs the Canadian government has around international issues is the Disaster Assistance Response Team (DART). In particular, Canadians have often been big fans of DART’s work purifying water after the Boxing Day tsunami in Asia, as well as its work in Haiti. Maybe the department could have a digital DART team: a group of developers that, in an emergency, could help spin up Ushahidi, FixMyStreet or OpenMRS installations to provide some quick but critical shared infrastructure for Canadians, other countries’ response teams and non-profits. During periods of non-crisis the team could work on these projects or support groups like CrisisCommons or OpenStreetMap, helping contribute to open source projects that can be instrumental in a humanitarian crisis.

 

Gov 2.0: Network Analysis for Income Inequality?

I’ve been thinking a lot about two types of graphs at the moment. The first is a single chart that shows income growth for various segments of the US population, broken down by wealth.

The second is a group of graphs that show pageviews and visits to various websites on the internet.

[Charts: Top 10 Social Networking Sites by Market Share of Visits, June 2011; July Search Engine Market Share]

What is fascinating about the internet stats is that they are broadly talking about distribution among the top websites – forget about everyone else, where the pageviews become infinitesimally small. So even the top websites alone show a power law distribution, which must be even stronger once one starts talking about all websites.

And this is what I’m frequently told. That the distribution of pageviews, visits and links on the internet looks a lot like the first graph, although possibly even more radically skewed.

In other words, while the after-tax income chart isn’t a clean curve, the trends of the two are likely very similar – except that the top 1% of websites do even better than the top 1% of after-tax income earners. So both charts look like power law distributions.
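A quick numerical sketch makes the point about how lopsided these curves are. The model below is a textbook Zipf-style power law, not a fit to any real traffic or income data, and the exponent and site count are illustrative assumptions only:

```python
def top_share(n_sites, exponent, top_fraction):
    """Share of the total held by the top fraction of sites, assuming
    site k receives 'views' proportional to 1 / k**exponent (Zipf)."""
    views = [1 / (k ** exponent) for k in range(1, n_sites + 1)]
    total = sum(views)
    top_n = max(1, int(n_sites * top_fraction))
    return sum(views[:top_n]) / total

# With 10,000 sites and a Zipf exponent of 1.0, the top 1% of sites
# capture over half of all views.
share = top_share(10_000, 1.0, 0.01)
print(f"top 1% share: {share:.0%}")  # -> top 1% share: 53%
```

Nothing about this toy model is specific to websites; the question raised in this post is whether the income distribution is starting to follow the same mechanics.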

Does this matter? I’m not sure, but I’m playing with some thoughts. While I’m confident that the income chart as a power law distribution has replicated itself several times in history (such as during the lead-up to the Great Depression), what is less clear to me is whether the exponential growth has ever happened so fast (it would be fascinating to know if others have written on this). The rich have often gotten richer – but have they gotten richer this quickly before?

And is this what happens in a faster, more networked economy? Maybe the traits of the online network and its power law distribution are beginning to impact the socioeconomic network of our society at large?

Could this also mean that we need some new ways to ensure social and economic mobility in our economy and society? Network effects are obviously powerful online, but have also, historically, been important offline. In society, your location on that curve creates advantages; it likely gives you access to peers and capital which position you to maintain your status in the network. Perhaps the internet, rather than making the network that is our society more fluid, is actually doing the opposite. It is increasingly a power law distribution, meaning the network effects are getting stronger, further reinforcing advantages and disadvantages. This might have important implications for social and economic mobility.

Either way, applying some network analysis to income inequality and social mobility, as well as to the social programs we put in place to ensure equality of opportunity, might be a good frame for these problems. I’d love to read anything anyone has written on this – very much open to suggestions.

Using Open Data to drive good policy outcomes – Vancouver’s Rental Database

One of the best signs for open data is when governments are starting to grasp its potential to achieve policy objectives. Rather than just being about compliance, it is seen as a tool that can support the growth and management of a jurisdiction.

This is why I was excited to see Vision Vancouver (with which I’m involved generally, though I was not involved in the development of this policy proposal) announce the other day that, if elected, it intends to create a comprehensive online registry that will track work orders and property violations in Vancouver apartments, highlighting negligent landlords and giving renters a new tool.

As the press release goes on to state, the database is “Modeled after a successful on-line watchlist created by New York City’s Public Advocate, the database will allow Vancouver residents to search out landlords and identify any building or safety violations issued by the City of Vancouver to specific rental buildings.”

Much like the pieces I’ve written around restaurant inspection and product recall data, this is a great example of a data set that, when shared the right way, can empower citizens to make better choices and foster better behaviour from landlords.

My main hope is that, in implementing this proposal, the city does the right thing and doesn’t just create a searchable database on its own website, but actually creates an API that software developers and others can tap into. If it does, someone may develop a mobile app for renters that shows you the repair record of the building you are standing in front of, or in. This could be very helpful for renters; one could even imagine an app where you SMS the postal code of a rental building and it sends you back some basic information. Also exciting to me is the possibility that a university student might look for trends in the data over time; maybe there is an analysis that would yield an insight that could help landlords mitigate against problems and reduce the number of repairs they have to make (and so help reduce their costs).

But if Vancouver and New York actually structured the data in the same way, it might create an incentive for other cities to do the same. That might entice some of the better known services to use the data to augment their offerings as well. Imagine if PadMapper, in addition to allowing a prospective renter to search for apartments based on rent and number of rooms, could also search based on number of infractions.
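The payoff of a shared schema can be sketched in a few lines. Everything here is hypothetical: neither Vancouver nor New York publishes this API, and the field names are invented for illustration. The point is simply that one client could serve both cities if the records were structured identically.

```python
# Hypothetical records, in a common schema, from two cities' registries.
VIOLATIONS = [
    {"city": "Vancouver", "postal_code": "V5K0A1", "open_violations": 3},
    {"city": "Vancouver", "postal_code": "V6B1A1", "open_violations": 0},
    {"city": "New York", "postal_code": "10001", "open_violations": 7},
]

def infractions_for(postal_code):
    """Open-violation count for a building, regardless of which city's
    registry holds the record -- the benefit of a shared schema."""
    for record in VIOLATIONS:
        if record["postal_code"] == postal_code:
            return record["open_violations"]
    return None  # not in any registry

# What an SMS-lookup front end might do with an incoming postal code:
print(infractions_for("V5K0A1"))  # -> 3
```

A PadMapper-style service could use exactly the same lookup to add an “infractions” filter, without writing city-specific code for each registry it ingests.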

[Screenshot: PadMapper rental search]

That might have a salutary effect on some (but sadly not all) landlords. All in all, an exciting step forward from my friends at Vision, who brought open data to Canada.

And Now… Another Message on Open Innovation for Realtors

Over the past few months I’ve given a number of talks on open data and open innovation to groups of realtors around the country. During these talks I have cautioned that the more the real estate industry tries to protect (e.g. not share) its data, the more it risks making access to the data (control) the basis of competition, as opposed to what can be built with the data (allowing others to create value-added services).

Consequently, recreating already existing data sets will become the goal of competitors if working with the real estate industry’s data to innovate new services is not possible or is prohibitively expensive. Competition on this axis has, I believe, two possible outcomes. One is a winner-take-all world where the player with the biggest data set wins; put another way, either the current monopoly maintains its grip or a new monopoly takes over. The other is that there ceases to be, in any meaningful way, a single data repository on real estate, and marketplace data gets broken into lots of different silos. The implications of this outcome are less clear, but there is a risk it could be bad for consumers as, essentially, market information would be fragmented. Either way, both outcomes carry significant risks for organized real estate.

Despite this, many realtors don’t believe it is likely to happen, because they don’t believe their data can be duplicated, no matter how much I try to tell them otherwise.

But…. whoops! Look what happened!

Today, as if to hammer home what I believe is the inevitable, the Globe and Mail has an article on a new service that allows one to get an estimate of the assessed value of one’s home by accessing an alternative data set (one essentially created by the banks). It is a data set that sits outside anything owned by organized real estate (i.e. the data found in MLS). Look! Someone has recreated what was previously seen as a data set that could not be replicated!

Of course the counter is: “It isn’t as good.” Well, two things here. First, it may not have to be. If it offers 80% of the accuracy at 20% of the cost, then it will probably be good enough for at least part of the market. And once it is established, I’m confident the owners of the website will find ways to make their service better and the data more accurate.

The real estate industry has an opportunity to shape its future or be shaped by the future. The market (and the Competition Bureau) isn’t going to give it forever to make up its mind.