Category Archives: public policy

Neo-Progressive Watch: Rahm Emanuel vs. Teachers Union

Anyone who read Obama’s book The Audacity of Hope will have been struck by the amount of time the then-aspiring presidential candidate spent writing about public education policy. More notably, he seemed to acknowledge that any effort at education reform was, at some point, going to butt heads with teachers unions, and that new approaches were either going to have to be negotiated or imposed. It was a point of tension that wasn’t much talked about in the reviews I read. But it always struck me as interesting that here was Obama, a next-generation progressive, railing against the conservatism of what is possibly the original pillar of the progressive movement: public education.

All of this has, of course, been largely forgotten, given both the bigger problems the president has faced and the fact that he’s been basically uninterested in monkeying around in public education policy since taking office. That’s why it is all the more fascinating to see what his disciples are doing as they get involved in levels of government that are in more direct contact with this policy area. Here, none is more interesting to watch than Rahm Emanuel.

This Saturday my friend Amy L. pointed me to a New York Times article outlining the most recent battle between Rahm Emanuel and the teachers union. My own take is that the specifics of the article are irrelevant; what matters is the broad theme. In short, Rahm Emanuel is on a short timeline. He needs to produce results immediately, since local elections happen more frequently and municipal government is much, much closer to the citizen. That said, he doesn’t have to deliver uniform results; progress, in and of itself, may be sufficient. Indeed, a little experimentation is profoundly good, given it can tease out faster and cheaper ways to deliver said results.

In contrast, the teachers union faces few of the pressures experienced by Rahm. It can afford to move at a slower pace and, more importantly, wants a uniform level of treatment across the entire system. Indeed, its entire structure is built around the guarantee of uniform treatment for its members. This uniformity is a value that evolved parallel to, but not out of, progressive thinking. It is an artifact of industrial production that gets confused with progressive thought because of their common temporal lineage.

This skirmish offers a window into the major battle that is going to dominate our politics in about a decade. I increasingly suspect we are moving into a world where the possibilities for education, thanks to the web and social networks, are going to be completely altered. What we deem possible, what parents demand, and the skills that are seen as essential are all going to shift. Our educational system – its schools, its school boards and, of course, its unions – is still bound in a world of mass production, shifting students from room to room to prepare them for the labour and production jobs of the 20th century. No matter how gifted the teachers (and there are many who are exceedingly gifted), they remain bound by the structure that the education system, the school boards, and the unions have built and enforce.

Of course, what is going to be in demand are students who can thrive in the world of mass collaboration and peer production of the 21st century – behaviours that are generally viewed as “cheating” in the current model. And parents who are successful in 21st-century jobs are going to be the first to ensure their children get the “right” kind of education. Which is going to put them at odds with the current education system.

All this is to say that the real question is: how quickly will educational systems be able to adapt? Here both the school boards and the unions play an enormous role, but it is the unions that, it would appear, may be the constraining factor. If having Rahm engage schools directly feels like a threat, I suspect they are going to find the next 20 years a rough, rough ride – something akin to how the newspapers have felt about the arrival of the internet and craigslist.

What terrifies me most is that unless we can devise a system where teachers are measured – so that good results can be both rewarded and shared – and where parents and students have more choices around education, then families (those that can afford to) are going to vote with their feet. In fact, you can already see it in my home town.

The myth in Vancouver is that high property values are driving families – and thus children – out of the city. But this is patently not true. The fantastic guys over at Bing Thom Architects wrote a report on student populations in Vancouver. According to their research, in the last 10 years the estimated number of elementary- and secondary-aged children in Vancouver has risen by 3% (around 2,513 new students). And yet the number of students enrolled in public education facilities has declined by 5.46% (around 3,092 students). In fact, the Vancouver School Board’s numbers seem to indicate the decline may be more pronounced.

In the meantime the number of private/independent schools has exploded, going from 39 to 68, with enrollment increases of 13.8%. (Yes, that does leave a surplus of students unaccounted for; I suspect they are also in private/independent schools, but outside of the City of Vancouver’s boundaries.) As a public school graduate myself – one who had truly fantastic teachers but who also benefited from enormous choice (IB, French Immersion) – I find the numbers of the past decade very interesting to immerse oneself in.

Correct or incorrect, it would seem parents are opting for schools that offer a range of choices around education. Of course, it is only the parents who can afford to do this that are doing it. But that makes the outcome worse, not better. With or without the unions, education is going to get radically rethought. It would be nice if it were the public sector that led that revolution, or at least was in the vanguard of it. But if our public sector managers and teachers are caught arguing over how to adjust the status quo by increments, it is hard to see how our education policy is going to make a quantum leap into the 21st century.

Interview with Charles Leadbeater – Monday September 19th

I’m excited to share that I’ll be interviewing British public policy and open innovation expert Charles Leadbeater on September 19th as part of SIG’s webinar series. For readers not familiar with Charles Leadbeater, he is the author of We-Think and numerous other chapters, pamphlets and articles, ranging in focus from social innovation to entrepreneurship to public sector reform. He served as an adviser to Tony Blair and has a long-standing relationship with the British think tank Demos.

Our conversation will initially focus on open innovation, but I’m sure it will range all over, touching on the impact of open source methodologies on the private, non-profit and public sectors, the future of government services and, of course, the challenges and opportunities around open data.

If you are interested in participating in the webinar you can register here. I’m told a small fee is being charged to recover some of the costs of running the event.

If you are participating and have a question you’d like to see asked, or a theme or topic you’d like to see covered, please feel free to comment below or, if you prefer more discretion, send me an email.

The Economics of Open Data – Mini-Case, Transit Data & TransLink

TransLink, the company that runs public transit in the region where I live (Vancouver/Lower Mainland), has launched a real-time bus tracking app that uses GPS data to figure out how far away the next bus you are waiting for really is. This is great news for everyone.

Of course for those interested in government innovation and public policy it also leads to another question. Will this GPS data be open data?

Presently TransLink does make its transit schedule “open” under a non-commercial license (you can download it here). I can imagine a number of senior TransLink officials (and the board) scratching their heads asking: “Why, when we are short of money, would we make our data freely available?”

The answer is that TransLink should make its current data, as well as its upcoming GPS data, open and available under a license that allows for both non-commercial and commercial re-use, not just because it is the right thing to do, but because the economics of it make WAY MORE SENSE FOR TRANSLINK.

Let me explain.

First, there are not a lot of obvious ways TransLink could generate wealth directly from its data. But let’s take two possible opportunities: the first involves selling a transit app to the public (or advertising in such an app); the second involves selling a “next bus” service to companies (say coffee shops) or organizations that believe showing this information might be a convenience to their employees or customers.

TransLink has already abandoned doing paid apps – instead it maintains a mobile website at m.translink.ca – but even if it created an app and charged $1 per download, the revenue would be pitiful. Assuming a very generous customer base of 100,000 users, TransLink would generate maybe $85,000 (once Apple takes its cut from the iPhone downloads, assuming zero cut for Androids). But remember, this is not a yearly revenue stream; it is one time. Maybe 10-20,000 people upgrade their phone or arrive in Vancouver and decide to download every year, so your year-on-year revenue is maybe $15K. Over a 5-year period, then, TransLink ends up with an extra, say, $145,000. Nothing to sneeze at, but hardly notable.
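The paid-app numbers above can be reproduced with some back-of-envelope arithmetic. Note the 50/50 iPhone/Android split below is my assumption, chosen because it reproduces the $85,000 figure; the post only states that Apple takes a cut on iPhone downloads.

```python
# Back-of-envelope sketch of the paid-app scenario. The 50/50 platform
# split is an assumption; Apple's 30% store cut applies to iPhone sales only.
price = 1.00
initial_downloads = 100_000
apple_cut = 0.30

iphone_rev = (initial_downloads / 2) * price * (1 - apple_cut)  # $35,000
android_rev = (initial_downloads / 2) * price                   # $50,000
one_time = iphone_rev + android_rev                             # $85,000

yearly = 15_000  # rough revenue from upgrades and newcomers each year
five_year = one_time + 4 * yearly                               # $145,000
print(one_time, five_year)
```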

In contrast, a free application encourages use, so there is also a cost to not giving it away. Having transit data more readily available might cause some people to choose transit over, say, walking, taking a taxi or driving. Last year TransLink handled 211.3 million trips. Let’s assume that wider access to the data meant a 0.1% increase in the number of trips – an infinitesimally small increase, but it means 211,300 more trips. Assuming each rider pays a one-zone $2.50 fare, that would translate into additional revenue of $528,250. Over the same five-year period cited above, that’s revenue of $2.641M – much better than $145,000. And this is just counting money, to say nothing of less congested roads, less smog and a lower carbon footprint for the region…
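The free-data side of the comparison works out the same way, using the trip count, uptake and fare figures stated above (the 0.1% bump is, of course, an illustrative assumption):

```python
# Revenue from an assumed 0.1% ridership bump, using the post's figures.
annual_trips = 211_300_000   # trips handled last year
uptick = 0.001               # assumed 0.1% increase from better data access
fare = 2.50                  # one-zone fare

extra_trips = annual_trips * uptick    # 211,300 additional trips
extra_revenue = extra_trips * fare     # $528,250 per year
five_year = extra_revenue * 5          # $2,641,250 over five years
print(extra_trips, extra_revenue, five_year)
```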

When this analysis is applied to licensing data it produces the same result. Will UBC pay to have TransLink’s real-time data on terminals in the Student Union building? I doubt it. Would some strategically placed coffee shops? Possibly. Obviously organizations would have to pay for the signs, but adding an annual “data license fee” to a display’s cost would cause some to opt out. And once you take into account managing the signs, legal fees, dealing with the contracts and going through the sales process, it is almost inconceivable that TransLink would make more money from these agreements than it would from simply having more signs everywhere, created by other people, generating more customers for its actual core business: moving people from A to B for a fee. Just to show you the numbers: if shops that weren’t willing to pay for the data put up “next bus” screens that generated a mere 1,000 new regular bus users who did only 40 one-way trips a year (or 40,000 new trips), this would equal revenue of $100,000 every year at no cost to TransLink. Someone else would install and maintain the signs; no contracts or licenses would need to be managed.
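The third-party signs scenario is the smallest sum of the three, but worth spelling out:

```python
# Fare revenue from "next bus" screens that other people install and
# maintain, using the post's illustrative figures.
new_riders = 1_000        # regular riders attracted by the screens
trips_per_rider = 40      # one-way trips per rider per year
fare = 2.50               # one-zone fare

new_trips = new_riders * trips_per_rider   # 40,000 new trips a year
revenue = new_trips * fare                 # $100,000 a year, at no cost to TransLink
print(new_trips, revenue)
```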

From a cost recovery perspective it is almost impossible to imagine a scenario where TransLink is better off not allowing commercial re-use of its data.

My point is that TransLink should not be focused on making a few bucks from licensing its data (which it doesn’t do right now anyway). It should be focused on shifting the competitive value in the marketplace from access to accessibility.

Being the monopoly holder of transit data does not benefit TransLink. All it means is that fewer people see and engage with its data. When it makes the data open and available, “access” is no longer the defining advantage. When anybody (e.g. TransLink, Google, independent developers) can access the data, the marketplace shifts from competing on access to competing on accessibility. Consumers don’t turn to whoever has the data; they turn to whoever makes the data easiest to use.

For example, TransLink has noted that in 2011 it will have a record number of trips. Part of me wonders to what degree the increase in trips over the past few years is a result of making transit data accessible in Google Maps. (Has anyone done a study on this in any jurisdiction?) The simple fact is that Google Maps is radically easier to use for planning transit journeys than TransLink’s own website AND THAT IS A GOOD THING FOR TRANSLINK. Now imagine if lots of organizations were sharing TransLink’s data: the local Starbucks and Blenz Coffee, colleges and universities, busy buildings downtown. Indeed, the real crime right now is that TransLink has handed Google a de facto monopoly. Google is allowed to use the data for commercial re-use. Local tax-paying developers? Not so, according to the license they have to click through.

TransLink, you want a world where everyone is competing (including against you) on accessibility. In the end, you win with greater use and revenue.

But let me go further. There are other benefits to having TransLink share its data for commercial re-use.

Procurement

Some riders will note that there are already bus stops in Vancouver that display “next bus” data (e.g. how many minutes away the next bus is). If TransLink made its next-bus data freely available via an API, it could conceivably alter the procurement process for buying and maintaining these signs. Any vendor could see how the data is structured and so take over the management of the signs, and/or experiment with creating more innovative or cheaper ways of manufacturing them.

The same is true of creating the RFP for TransLink’s website. With the data publicly available, TransLink could simply ask developers to mock up what they think is the most effective way of displaying the data. More development houses might be enticed to respond to the RFP, increasing the likelihood of innovation and putting downward pressure on fees.

Analysis

Of course, making GPS data free could have an additional benefit. Local news companies might be able to use the buses’ GPS data to calculate traffic flow rates and so predict traffic jams. Might they be willing to pay TransLink for the data? Maybe, but again probably not enough to justify the legal and sales overhead. Moreover, TransLink would benefit from this analysis, as it could use the reports to adjust its schedule and notify its drivers of problems beforehand. Everyone else would benefit as well, as better-informed commuters might change their behaviour (including taking transit!), reducing congestion, smog, carbon footprint, etc.

Indeed, the analysis opportunities using GPS data are potentially endless – much of the work might be done by bloggers and university students. One could imagine correlating actual bus/subway times with any number of other data sets (crime, commute times, weather) to yield interesting information that could help TransLink with its planning. There is no world in which TransLink has the resources to do all this analysis itself, so enabling others to do it can only benefit it.

Conclusion

So if you are at TransLink/Coast Mountain Bus Company (or any transit authority in the world), this post is for you. Here’s what I suggest as next steps:

1) Add a GPS bus-tracking API to your open data portal.

2) Change your license. Drop the non-commercial clause. It hurts your business more than you realize and is anti-competitive (why can Google use the data for a commercial application while residents of the Lower Mainland cannot?). My suggestion: adopt the BC Government Open Government License or the PDDL.

3) Add an RSS feed to your GTFS data. Like Google, we’d all like to know when you update your data. Given we live here and are users, it would be nice to extend the same service to us as you do to them.

4) Maybe hold a Transit Data Camp where you could invite local developers and entrepreneurs to meet your staff and encourage people to find ways to get transit data into the hands of more Lower Mainlanders and drive up ridership!


Smarter Ways to Have School Boards Update Parents

Earlier this month the Vancouver School Board (VSB) released an iPhone app that – helpfully – will use push notifications to inform parents about school holidays, parent interviews, and scheduling disruptions such as snow days. The app is okay, if a little clunky to use, and a lot of the data – such as professional days – while helpful in an app, would be even more helpful as an iCal feed parents could subscribe to in their calendars.

That said, the VSB deserves credit for having the vision to develop an app. Positively, the VSB app team hopes to add new features, such as letting parents know about after-school activities like concerts, plays and sporting events.

This is a great innovation and, without a doubt, other school boards will want apps of their own. The problem is that this is very likely to lead to an enormous amount of waste and duplication. The last thing citizens want is for every school board to spend $15-50K developing iPhone apps.

Which leads to a broader opportunity for the Minister of Education.

Were I the Education Minister, I’d have my technology team recreate the specs of the VSB app and put out an RFP for it, but under an open source license and using PhoneGap so it would work on both iPhone and Android. In addition, I’d ensure it could offer reminders – like we do at recollect.net – so that people could get email or text messages without a smartphone at all.

I would then propose the ministry cover 60% of the development and yearly upkeep costs. The other 40% would be covered by the school boards interested in joining the project. Thus, assuming the app had a development cost of $40K and a yearly upkeep of $5K, if only one school board signed up it would have to pay $16K for the app (a pretty good deal) and $2K a year in upkeep. But if 5 school districts signed up, each would only pay $3.2K in development costs and $400 a year in upkeep. Better still, the more that sign up, the cheaper it gets for each of them. I’d also propose a governance model in which those who contribute money for development would have the right to elect a sub-group to oversee the feature roadmap.
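The cost-sharing arithmetic above reduces to a simple formula. The $40K development and $5K upkeep figures are the post’s illustrative assumptions, as is the ~30-district, $30K-per-app comparison at the end:

```python
# Per-board cost under the proposed 60/40 split: the ministry covers 60%,
# participating boards divide the remaining 40% evenly among themselves.
DEV_COST = 40_000      # illustrative development cost
UPKEEP = 5_000         # illustrative yearly upkeep
MINISTRY_SHARE = 0.60

def per_board(n):
    """Each board's share of development cost and yearly upkeep."""
    dev = round(DEV_COST * (1 - MINISTRY_SHARE) / n)
    upkeep = round(UPKEEP * (1 - MINISTRY_SHARE) / n)
    return dev, upkeep

print(per_board(1))   # (16000, 2000): one board pays $16K + $2K/yr
print(per_board(5))   # (3200, 400): five boards pay $3.2K + $400/yr each

# Province-wide comparison: ~30 districts each building a $30K app,
# versus the ministry's 60% share of one shared app.
print(30 * 30_000, round(MINISTRY_SHARE * DEV_COST))   # 900000 24000
```

The more boards join, the cheaper each board’s share gets, which is the whole point of the pooled model.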

Since the code would be open source other provinces, school districts and private schools could also use the app (although not participate in the development roadmap), and any improvements they made to the code base would be shared back to the benefit of BC school districts.

Of course, by signing up to the app project school boards would be committing to ensuring their schools shared up-to-date notifications about the relevant information – probably a best practice they should be following anyway. This process work is where the real effort lies. However, a simple webform (included in the price) would cover much of the technical side of that problem. Better still, the Ministry of Education could offer its infrastructure for hosting and managing any data the school boards wish to collect and share, further reducing costs and, equally important, ensuring the data was standardized across the participating school boards.

So why should the Ministry of Education care?

First, creating new ways to update parents about important events – like when report cards are issued, so that parents know to ask for them – helps improve education outcomes. That should probably be reason enough, but there are other reasons as well.

Second, it would allow the ministry, and the school boards, to collect some new data: professional day dates, average number of snow days, frequency of emergency disruptions, number of parents in a district interested in these types of notifications. Over time, this data could reveal important information about educational outcomes and be helpful.

But the real benefit would be in both cost savings and in enabling less well-resourced school districts to benefit from the technological innovation wealthier districts will likely pursue if left to their own devices. Given there are 59 English school districts in BC, if even half of them spent $30K developing their own iPhone apps, almost $1M would be collectively spent on software development. By spending $24K, the ministry ensures that this $1M instead gets spent on teachers, resources and schools. Equally important, less tech-savvy or well-equipped school districts would be able to participate and benefit.

Of course, if the Vancouver school district were smart, it would open source its app, approach the Ministry of Education and offer it as the basis of such a venture. Doing that wouldn’t just make it head of the class; it would be helping everyone get smarter, faster.

Open Data and New Public Management

This morning I got an email thread pointing to an article by Justin Longo on #Opendata: Digital-Era Governance Thoroughbred or New Public Management Trojan Horse? I’m still digesting it all but wanted to share some initial thoughts.

The article begins with a discussion of the benefits of open data, but its real goal is to argue that open data is a pawn in a game to revive the New Public Management reform agenda:

My hypothesis, based on a small but growing number of examples highlighting political support for open data, is that some advocates—particularly politicians, but not exclusively—are motivated by beliefs (both explicit and unconscious) forged in the New Public Management (NPM) reform agenda.

From this perspective, support for more open data aims at building coalitions of citizen consumers who are encouraged to use open data to expose public service decisions, highlight perceived performance issues, increase competition within the public sector, and strengthen the hand of the citizen as customer.

What I found disappointing is the article’s one-dimensional approach to the problem: open data may support a theory/approach to public management disliked by the author; consequently (inferring from the article’s title and tone) it must be bad. This is akin to saying any technology that could be used to advance an approach I don’t support must be opposed.

In addition, I’d say that the idea of exposing public service decisions, highlighting perceived performance issues, increasing competition within the public sector, and strengthening the hand of the citizen as customer are goals I don’t necessarily oppose, certainly not categorically. Moreover, I would hope such goals are not exclusively the domain of NPM. Do we want a society where government’s performance issues are not highlighted? Or where public service decisions are kept secret?

These are not binary choices. You can support the outcomes highlighted above and simultaneously believe in other approaches to public sector management and/or be agnostic about the size of government. Could open data be used to advance NPM? Possibly (although I’m doubtful). But it definitely can also be used to accomplish a lot of other good and potentially advance other approaches as well. Let’s not conflate a small subset of ways open data can be used, or a small subset of its supporters, with the entire project, and then lump them all into a single school of thought about public service management.

Moreover, I’ve always argued that the biggest users and beneficiaries of open data would be government – and in particular the public service. While open data could be used to build “coalitions of citizen consumers who are encouraged to use open data to expose public service decisions,” it will also be used by public servants to better understand citizens’ needs, be more responsive and allocate resources more effectively. Moreover, those “citizen consumers” will probably be effective in helping them achieve this task. The alternative is to have better-shared data internally (which will eventually happen), an outcome that might allow the government to achieve these efficiencies but will also radically increase the asymmetry in the relationship between the government and its citizens and, worse, between the elites that do have privileged access to this data and the citizenry (see Taggart below).

So ignoring tangible benefits because of a potential fear feels very problematic. It takes me back to Kevin Kelly and What Technology Wants… this is an attempt to hold back an incredibly powerful technology because of a threat it poses to how the public sector works. It presumes that a) you can prevent the technology and b) that not acting will allow the status quo, or some other preferred approach, to prevail. Again, there are outcomes much, much worse than NPM that are possible – and, I would argue, likely, given evolving public expectations, demographics, and fiscal constraints. (And again, I don’t believe open data leads directly to NPM.)

In this regard, the article sets up a false choice. Open data is going to reshape all theories of public management. To claim it supports or biases in favour of one outcome is, I think, beyond premature. More importantly, it is to miss the forest for the trees and the much bigger fish we need to fry. The always thoughtful Chris Taggart summed much of this up beautifully in an email thread:

I think the title — making it out to be a choice between a thoroughbred or Trojan Horse — says it all. It’s a false dichotomy, as neither of those are what the open data advocates are suggesting it is, nor do most of us believe that open data is solution to all our problems (far from it — see some of my presentations[1]).

It also seems to offer a choice between New Public Management (which I think Emer Coleman does a fairly good job of illuminating in her paper[2]) and the brave new world of Digital Era Governance, which is also to misunderstand the changes being brought about in society, with or without open government data.
The point is not that open data is the answer to our problem but society’s chance to stay in the game (and even then, the odds are arguably against it). We already have ever increasing numbers of huge closed databases, many made up of largely government data, available to small number of people and companies.
This leads to an asymmetry of power and friction that completely undermines democracy; open data is not a sufficiency to counteract that, but I think it is a requirement.

It’s possible I’ve misunderstood Longo’s article – and he is just across the strait at the University of Victoria, so hopefully we can grab a beer and talk it through. But my sense is this article is much more about a political battle between New Public Management and Digital Era Governance, in which open data is being used as a pawn. As an advocate, I’m not wholly comfortable with that, as I think it risks misrepresenting the project.

Edmonton Heads for the Cloud

I’m confident that somewhere in Canada some resource-strapped, innovative small town has abandoned desktop software and uses a cloud-based service, but so far no city of any real size has even publicly said it was considering the possibility.

That is, until today.

Looks like Edmonton’s IT group – not just one of the most forward-looking in the country but one that continues to make the rubber hit the road – is moving its email and office suite to the cloud. (I’ve posted the entire doc below since it isn’t easy to link to.)

They aren’t the first city in the world to do this – Washington D.C., Orlando and Los Angeles have all moved to Google Apps (in each case displacing Microsoft Office) – but they are the first in Canada, a country not known for its risk-taking IT departments.

I can imagine that a lot of government IT people will be watching closely. And that’s too bad. There is far too much watching in Canada when there could be a lot of innovating and saving. While some will cite LA’s bumpy transition, Orlando’s and DC’s were relatively smooth, and both are cities far larger than most of their Canadian counterparts. LA is more akin to transitioning a province (or Toronto). Nobody else gets that pass.

Two things:

1) I’ve highlighted some of what I think are the interesting points in the document being presented to council.

2) A lot of IT staff in other cities will claim that it is “too early” to know if this is going to work.

People. Wake up. It is really hard to imagine you won’t be moving to the cloud at some point in the VERY near future. I frankly don’t care which cloud solution you choose (Google vs. Microsoft); that choice is less important than actually making the move. Is Edmonton taking some risks? Yes. But it is also going to be the first city to learn the lessons, change its job descriptions, work flows, processes and the zillion other things that will come out of this. This means they’ll have a cost and productivity advantage over other cities as they play catch-up. And I suspect there will never be a catch-up, as Edmonton will already be doing the next obvious thing.

If you’re an IT person in a city, the question is no longer whether you lead or follow. It is merely: how far behind are you going to be comfortable being?


Workspace Edmonton

Sole Source Agreement

Recommendation:

That, subject to the necessary funding being made available, Administration enter into a sole source agreement, in an amount not exceeding $5 million, and a period not exceeding five years, with Google Inc., for the provision of computing productivity tools, and that the contract be in form and content acceptable to the City Manager.

Report Summary

The IT Branch undertook a technical assessment of seven options for the delivery of desktop productivity tools. Software as a Service (‘cloud computing’) was identified as the preferred direction as it allows the corporation to work from anytime, place or device. Google Mail and Google Apps were determined to provide the best solution. The change will ensure ongoing sustainability of the services, provides opportunities for service and productivity gains, and align IT services with key principles in The Way We Green, The Way We Live and The Way We Move.

Report

The City Administration Bylaw 12005 requires approval from Executive Committee for Single Source Contracts (contracts to be awarded without tendering) in excess of $500,000, and those contracts that may exceed ten years in duration.

The Workspace Edmonton Program consists of two initiatives, which will allow the delivery of information technology software and services to employees, contractors and third party partners anytime and place, and on any device. In order to accomplish this the administration is proposing moving away from a model where software is installed on every computer to a solution where the software is housed on the internet (‘cloud computing’).

Administration is recommending the implementation of Google Apps Premier Edition as the primary computing productivity tool, with targeted use of Microsoft Office and SharePoint. The recommended direction will allow the City to move to Google Mail as the corporate messaging tool and Google Apps as the primary office productivity tools. It will also allow the corporation access to other applications offered by Google Inc. and partners to Google Inc. Microsoft Office and SharePoint will remain as the secondary office productivity tools for business areas that require these applications for specific business needs. Use of the Microsoft tools will require completion of the appropriate use case and approval by the Chief Information Officer.

Administration is requesting approval to proceed to negotiation of a contract with Google Inc. The sole source agreement is required at this time to allow the program to be developed in 2011. This is foundational work that will allow the program to proceed to implementation in 2012. The contract is also required in order to complete the Privacy Impact Assessment and develop implementation plans.

Benefits

Workspace Edmonton creates the opportunity for the City of Edmonton to significantly change the way we work. Administration will have increased options for delivering services to citizens, including enhanced mobile field services and new opportunities for community consultation and collaboration. The consumer version of Google Apps is free to private citizens and not-for-profit groups and would allow additional options for collaboration with organizations such as community leagues at no net cost to the corporation or organization.

The move to G-Mail will allow the corporation to extend email access to all city employees, improving access to information and communications. It will also allow for implementation of a number of services without additional licensing costs, including:

  • audio and video chat
  • group sites to allow improved collaboration with external
    partners and community groups
  • internal YouTube for training and information sharing
  • increased collaboration through document sharing and simultaneous authoring capabilities

The program presents the opportunity for the City to better address the expectations of the next generation of workers by providing bring-your-own-device options and software many already use. Both Edmonton Public Schools and the University of Alberta have implemented Google Apps.

In addition, the implementation of Google Apps will include an e-records solution for documents stored in Google Apps. This will be implemented in partnership with the Office of the City Clerk. The benefit is alignment with legislated and corporate requirements for records retention, retrieval, and disposal.

Moving to the Software as a Service Model (‘cloud computing’) through the internet will avoid additional hardware and support costs associated with increased service demands due to growth. This solution provides a more sustainable business model, reducing demands on resources for regular product upgrades and services support. Finally, the relocation of software and data to multiple secure data centres facilitates continuation of services during emergencies such as natural disasters and pandemics. City employees will be able to access email and documents through the internet from any office or home computer.

Solution Assessments

The IT Branch undertook a technical assessment of seven office productivity software and service delivery options. A financial assessment of the top three options was subsequently completed and the recommended direction to move to Google Inc. as the service provider was based on these assessments. Following this, the IT Branch undertook a security assessment to ensure the option chosen met security requirements and industry standards. A Privacy Impact Assessment has been initiated and will be completed upon negotiation of an agreement. Precedent in Alberta has been set with both the Edmonton Public School Board and the University of Alberta entering into agreements with Google Inc.

Strategic Direction

The Workspace Edmonton Program supports Council’s strategic direction for innovation and a well managed city, as well as key principles in The Way We Green, The Way We Move, and The Way We Live.

Budget/Financial Implications

Google Messaging and Apps will replace the existing Microsoft Exchange service and the majority of Office licenses. The funding currently in place for Microsoft license maintenance will be sufficient to fund the annual Google services.

Funding for the 2011 implementation of the overall Workspace Edmonton Program is available within current IT budgets. Funding for 2012 will be included in the 2012 budget request. A business case for this initiative was completed and is available for review.

The Workspace Edmonton model aligns with and complements the corporate initiative of Transforming Edmonton. The administration will look for opportunities to integrate the programs and utilize a portion of the funding for Transforming Edmonton to fund Workspace Edmonton change and transition requirements.

Risks

If the recommendation is not supported, Workspace Edmonton will stop and the corporation will be required to either go to Request For Proposal or remain on the existing platform. Remaining on the existing platform will require additional funding in future years to support continued maintenance costs and future growth. (Extending email only to city staff who do not currently have email accounts would cost the corporation approximately $900,000 per year with the existing solution.) Delaying the implementation to 2012 would result in delays to return on investment and achievement of the benefits.

Justification of Recommendation
Technical, financial and security assessments have been completed. The recommended solution meets business requirements, provides opportunities to increase and improve service delivery and is projected to garner a return on investment within 18 to 24 months of implementation. Approval of this recommendation will allow Administration to proceed to negotiation of a contract.

Others Reviewing this Report
• L. Rosen, Chief Financial Officer and Treasurer

WRITTEN BY – D. Kronewitt-Martin | August 24, 2011 – Corporate Services 2011COT006

Why Social Media behind the Government Firewall Matters

This comment, posted four months ago to my blog by Jesse G. in response to this post on GCPEDIA, remains one of my favourite comments ever posted to my blog. This is a public servant who understands the future and is trying to live it. I’ve literally had this comment sitting in my inbox this whole time because I didn’t want to forget about it.

For those opposed to the use of wikis and social media behind the government firewall, this is a must read (of course I’d say it is a must read for those in favour as well). It’s just a small example of how tiny transaction costs are killing government, and how social media can flatten them.

I wish more elements of the Canadian government got it, but despite the success of GCPEDIA and its endorsement by the Clerk, there are still a ton of forces pitted against it, from the procurement officers in Public Works who’d rather pay for a bulky, expensive alternative that no one will use, to middle managers who forbid their staff from using it out of some misdirected fear.

Is GCPEDIA the solution to everything? No. But it is a cheap solution to a lot of problems; indeed, I’ll bet it has solved more problems per dollar than any other IT solution put forward by the government.

So for the (efficient) future, read on:

Here’s a really untimely comment – GCPEDIA now has over 22,000 registered users and around 11,000 pages of content. Something like 6.5 million pageviews and around .5 million edits. It has ~2,000 visitors a week and around 15,000 pageviews a week. On average, people are using the wiki for around 5.5 minutes per visit. I’m an admin for GCPEDIA and its sister tools – GCCONNEX (a professional networking platform built using elgg) and GCForums (a forum built using YAF). Collectively the tools are known as GC2.0.

Anyways, I’m only piping up because I love GCPEDIA so much. For me and for thousands of public servants, it is something we use every day and I cannot emphasize strongly enough how friggin’ awesome it is to have so much knowledge in one place. It’s a great platform for connecting people and knowledge. And it’s changing the way the public service works.

A couple of examples are probably in order. I know one group of around 40 public servants from 20 departments who are collaborating on GCPEDIA to develop a new set of standards for IT. Every step of the project has taken place on GCPEDIA (though I don’t want to imply that the wiki is everything – face-to-face can’t be replaced by wiki), from the initial project planning, through producing deliverables. I’ve watched their pages transform since the day they were first created and I attest that they are really doing some innovative work on the wiki to support their project.

Another example, which is really a thought experiment: Imagine you’re a coop student hired on a 4 month term. Your director has been hearing some buzz about this new thing called Twitter and wants an official account right away. She asks you to find out what other official Twitter accounts are being used across all the other departments and agencies. So you get on the internet, try to track down the contact details for the comms shops of all those departments and agencies, and send an email to ask what accounts they have. Anyone who knows government can imagine that a best case turnaround time for that kind of answer will take at least 24 hours, but probably more like a few days. So you keep making calls and maybe if everything goes perfectly you get 8 responses a day (good luck!). There are a couple hundred departments and agencies so you’re looking at about 100 business days to get a full inventory. But by the time you’ve finished, your research is out of date and no longer valid and your 4 month coop term is over. Now a first year coop student makes about $14.50/hour (sweet gig if you can get it students!), so over a 4 month term that’s about $10,000. Now repeat this process for every single department and agency that wants a twitter account and you can see it’s a staggering cost. Let’s be conservative and say only 25 departments care enough about twitter to do this sort of exercise – you’re talking about $250,000 of research. Realistically, there are many more departments that want to get on the twitter bandwagon, but the point still holds.

Anyways, did you know that on GCPEDIA there is a crowd-sourced page with hundreds of contributors that lists all of the official GC twitter accounts? One source is kept up to date through contributions of users that literally take a few seconds to make. The savings are enormous – and this is just one page.

Because I know GCPEDIA’s content so well, I can point anyone to almost any piece of information they want to know – or, because GCPEDIA is also a social platform, if I can’t find the info you’re looking for, I can at least find the person who is the expert. I am not an auditor, but I can tell you exactly where to go for the audit policies and frameworks, resources and tools, experts and communities of practice, and pictures of a bunch of internal auditors clowning around during National Public Service Week. There is tremendous value in this – my service as an information “wayfinder” has won me a few fans.

Final point before I stop – a couple of weeks ago, I was doing a presentation to a manager’s leadership network about unconferences. I made three pages – one on the topic of unconferences, one on the facilitation method for building the unconference agenda, and one that is a practical 12-step guide for anyone who wants to plan and organize their own (this last was a group effort with my co-organizers of Collaborative Culture Camp). Instead of preparing a powerpoint and handouts I brought the page up on the projector. I encouraged everyone to check the pages out and to contribute their thoughts and ideas about how they could apply them to their own work. I asked them to improve the pages if they could. But the real value is that instead of me showing up, doing my bit, and then vanishing into the ether I left a valuable information resource behind that other GCPEDIA users will find, use, and improve (maybe because they are searching for unconferences, or maybe it’s just serendipity). Either way, when public servants begin to change how they think of their role in government – not just as employees of x department, but as an integral person in the greater-whole; not in terms of “information is power”, but rather the power of sharing information; not as cogs in the machine, but as responsible change agents working to bring collaborative culture to government – there is a huge benefit for Canadian citizens, whether the wiki is behind a firewall or not.

p.s. To Stephane’s point about approval processes – I confront resistance frequently when I am presenting about GCPEDIA, but there is always someone who “gets” it. Some departments are indeed trying to prevent employees from posting to GCPEDIA – but it isn’t widespread. Even the most security-conscious departments are using the wiki. And Wayne Wouters, the Clerk of the Privy Council has been explicit in his support of the wiki, going so far as to say that no one requires manager’s approval to use the wiki. I hope that if anyone has a boss that says, “You can’t use GCPEDIA” that that employee plops down the latest PS Renewal Action Plan on his desk and says, “You’ve got a lot to learn”.
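For the quantitatively inclined, Jesse’s back-of-envelope math is easy to reproduce. Here is a quick sketch in Python; the hourly wage and the 25-department figure are his, while the hours per week and weeks per term are my own assumptions:

```python
# Rough reproduction of the duplicated-research cost estimate above.
# Hours per week and weeks per term are assumed figures, not from the comment.
HOURLY_WAGE = 14.50      # first-year co-op student, $/hour
HOURS_PER_WEEK = 37.5    # assumed standard public service work week
WEEKS_PER_TERM = 17      # roughly a 4-month term
DEPARTMENTS = 25         # conservative count of departments repeating the work

cost_per_term = HOURLY_WAGE * HOURS_PER_WEEK * WEEKS_PER_TERM
total = cost_per_term * DEPARTMENTS
print(f"per co-op term: ${cost_per_term:,.0f}")           # just under $10,000
print(f"duplicated across departments: ${total:,.0f}")    # low-to-mid six figures
```

Whatever the exact assumptions, the duplicated effort lands well into six figures, which is exactly the comment’s point about transaction costs.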

Shared IT Services across the Canadian Government – three opportunities

Earlier this week the Canadian Federal Government announced it will be creating Shared Services Canada which will absorb the resources and functions associated with the delivery of email, data centres and network services from 44 departments.

These types of shared services projects are always fraught with danger. While they are sometimes successful, they are often disasters: highly disruptive, with little to show in results, and eventually unwound. However, I suspect there are significant savings to be made and I remain optimistic. With luck, the analogy here is the work outgoing US CIO Vivek Kundra accomplished in seeking to close down and consolidate 800 data centres across the US, which is yielding some serious savings.

So here’s what I’m hoping Shared Services Canada will mean:

1) A bigger opportunity for Open Source

What I’m still more hopeful about – although not overly optimistic – is the role that open source solutions could play in what Shared Services Canada will implement. Over on the Drupal site, one contributor claims government officials have been told to hold off buying web content management systems as the government prepares to buy a single solution across all departments.

If the government is serious about lowering its costs it absolutely must rethink its procurement models so that open source solutions can at least be a viable option. If not, the government may still save money, but the savings will be of the ‘we moved from five expensive solutions to one expensive solution’ variety.

On the upside, some of that work has clearly taken place. Already several federal government websites run on Drupal, such as this Ministry of Public Works website and the NRCAN and DND intranets. Moreover, there are real efforts in the open source community to accommodate government. In the United States, OpenPublic has fostered a version of Drupal designed for government’s needs.

Open source solutions have the added bonus of allowing you the option of using more local talent, which, if stimulus is part of the goal, would be wise. Also, any open source solution fostered by the federal government could be picked up by the provinces, creating further savings for taxpayers. Finally, open source lets you fire incompetent implementers, something that needs to happen a little more often in government IT.

2) More accountability

Ministers Ambrose and Clement are laser focused on finding savings – pretty much every ministry needs to find 5 to 10% in savings across the board. I also know both speak passionately about managing taxpayers’ dollars: “Canadians work hard for their money and expect our Government to manage taxpayers dollars responsibly, Shared Services Canada will have a mandate to streamline IT, save money, and end waste and duplication.”

Great. I agree. So one of Shared Services Canada’s first acts should be to follow in the footsteps of another Vivek Kundra initiative and recreate his incredibly successful IT Dashboard. Indeed, it was by using the dashboard that Kundra was able to “cut the time in half to deliver meaningful [IT system] functionality and critical services, and reduced total budgeted [Federal government IT] costs by over $3 billion.” Now that’s some serious savings. It’s a great example of how transparency can drive effective organizational change.

And here’s the kicker. The White House open sourced the IT Dashboard (the code can be downloaded here). So while it will require some work to adapt it, the software is there and a lot of the heavy lifting has been done. Again, if we are serious about this, the path forward is straightforward.

3) More open data

Speaking of transparency… one place shared services could really come in handy is in creating data warehouses for hosting critical government data sets (ideally in the cloud). I suspect a number of important datasets are used by public servants across ministries, so getting them onto a robust, accessible platform would make a lot of sense. This, of course, would also be an ideal opportunity to engage in a massive open data project. It might be easier to create policy for making the data managed by Shared Services Canada “open.” Indeed, this blog post covers some of the reasons why now is the time to think about that issue.

So congratulations on the big move, everyone, and I hope these suggestions are helpful. Certainly we’ll be watching with interest – we can’t have a 21st century government unless we have 21st century infrastructure, and you’re now the group responsible for it.

Open Source Data Journalism – Happening now at Buzz Data

(there is a section on this topic focused on governments below)

A hint of how social data could change journalism

Anyone who’s heard me speak in the last 6 months knows I’m excited about BuzzData. This week, while still in limited access beta, the site is showing hints of its potential – and it still has only a few hundred users.

First, what is BuzzData? It’s a website that allows data to be easily uploaded and shared among any number of users. (For hackers – it’s essentially GitHub for data, but more social.) It makes it easy for people to copy data sets, tinker with them, share the results back with the original master, and mash them up with other data sets, all while engaging with those who care about that data set.

So, what happened? Why is any of this interesting? And what does it have to do with journalism?

Exactly a month ago Svetlana Kovalyova of Reuters had her article – Food prices to remain high, UN warns – re-published in the Globe and Mail.  The piece essentially outlined that food commodities were getting cheaper because of local conditions in a number of regions.

Someone at the Globe and Mail decided to go a step further and upload the data – the annual food price indices from 1990 to the present – onto the BuzzData site, presumably so they could play around with it. This is nothing complicated; it’s a pretty basic chart. Nonetheless, a dozen or so users started “following” the dataset and, about 11 days ago, one of them, David Joerg, asked:

The article focused on short-term price movements, but what really blew me away is: 1) how the price of all these agricultural commodities has doubled since 2003 and 2) how sugar has more than TRIPLED since 2003. I have to ask, can anyone explain WHY these prices have gone up so much faster than other prices? Is it all about the price of oil?

He then did a simple visualization of the data.

FoodPrices

In response, someone from the Globe and Mail named Mason answered:

Hi David… did you create your viz based on the data I posted? I can’t answer your question but clearly your visualization brought it to the forefront. Thanks!

But of course, in a process that mirrors what often happens in the open source community, another “follower” of the data shows up and refines the work of the original commentator. In this case, an Alexander Smith notes:

I added some oil price data to this visualization. As you can see the lines for everything except sugar seem to move more or less with the oil. It would be interesting to do a little regression on this and see how close the actual correlation is.

The first thing to note is that Smith has added data, “mashing in” the price of oil per barrel, so the data set has been made richer. In addition, his graph is quite nice, as it makes the correlation more visible than the graph by Joerg, which only referenced the Oil Price Index. It also becomes apparent, looking at this chart, how much of an outlier sugar really is.

oilandfood

Perhaps some regression is required, but Smith’s graph is pretty compelling. What’s more interesting is that the price of oil is not once mentioned in the article as a driver of food commodity prices. So maybe it’s not relevant. But maybe it deserves more investigation – and a significantly better piece, one that would provide better information to the public, could be written in the future. In either case, this discussion, conducted by non-experts simply looking at the data, helped surface some interesting leads.
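Smith’s proposed “little regression” is only a few lines of Python. The series below are illustrative stand-ins, not the actual FAO food index or oil figures from the BuzzData set:

```python
# A sketch of the correlation check Smith proposes. The data are
# hypothetical annual values: oil in $/barrel, food as an index level.
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

oil = [28, 31, 38, 54, 65, 72, 97, 62, 79, 104]          # hypothetical
food = [90, 98, 105, 118, 122, 127, 164, 132, 150, 188]  # hypothetical

print(f"correlation: {pearson(oil, food):.2f}")  # strongly positive here
```

A correlation near 1 would support the eyeball reading of Smith’s chart; on real data one would also want to check whether both series are simply trending together with inflation.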

And therein lies the power of social data.

With even only a handful of users, a deeper, better analysis of the story has taken place. Why? Because people are able to access the data and look at it directly. If you’re a follower of Julian Assange of WikiLeaks, you might call this scientific journalism. Maybe it is, maybe it isn’t, but it certainly is a much more transparent way of doing analysis and a potential audience builder – imagine if hundreds or thousands of readers were engaged in the data underlying a story. What would that do to the story? What would that do to journalism? With BuzzData it also becomes less difficult to imagine a data journalist who spends a significant amount of their time in BuzzData, working with a community of engaged pro-ams trying to find hidden meaning in the data they amass.

Obviously, this back and forth isn’t game changing. No smoking gun has been found. But I think it hints at a larger potential, one that it would be very interesting to see unlocked.

More than Journalism – I’m looking at you government

Of course, it isn’t just media companies that should be paying attention. For years I’ve argued that governments – and especially politicians – interested in open data have an unhealthy appetite for applications. They like the idea of sexy apps on smart phones enabling citizens to do cool things. To be clear, I think apps are cool too. I hope in cities and jurisdictions with open data we see more of them.

But open data isn’t just about apps. It’s about the analysis.

Imagine a city’s budget up on BuzzData. Imagine the flow rates of the water or sewage system. Or the inventory of trees. Think of how a community of interested and engaged “followers” could supplement that data, analyze it, and visualize it. Maybe they would be able to explain it to others better, find savings or potential problems, or develop new forms of risk assessment.

It would certainly make for an interesting discussion. If 100 or even just 5 new analyses were to emerge, maybe none of them would be helpful, or would provide any insights. But I have my doubts. I suspect it would enrich the public debate.

It could be that the analysis would become as sexy as the apps. And that’s an outcome that would warm this policy wonk’s soul.

How Dirty is Your Data? Greenpeace Wants the Cloud to be Greener

My friends over at Greenpeace recently published an interesting report entitled “How Dirty is Your Data? A Look at the Energy Choices That Power Cloud Computing.”

For those who think that cloud computing is an environmentally friendly business, let’s just say… it’s not without its problems.

What’s most interesting is the huge opportunity the cloud presents for changing the energy sector – especially in developing economies. Consider the following factoids from the report:

  • Data centres to house the explosion of virtual information currently consume 1.5-2% of all global electricity; this is growing at a rate of 12% a year.
  • The IT industry points to cloud computing as the new, green model for our IT infrastructure needs, but few companies provide data that would allow us to objectively evaluate these claims.
  • The technologies of the 21st century are still largely powered by the dirty coal power of the past, with over half of the companies rated herein relying on coal for between 50% and 80% of their energy needs.

The 12% growth rate is astounding. It essentially makes data centres the fastest growing segment of the energy business – so the choices these companies make about how they power their server farms will dictate what the energy industry invests in. If they are content with coal, we’ll burn more coal. If they demand renewables, we’ll end up investing in renewables and that’s what will end up powering not just server farms, but lots of things. It’s a powerful position that big data and the cloud hold in the energy marketplace.
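To put that 12% in perspective, here is a quick compound-growth check (the growth rate is the report’s; the rest is arithmetic):

```python
# At 12% annual growth, how long until data-centre electricity demand doubles?
growth_rate = 0.12
level, years = 1.0, 0
while level < 2.0:
    level *= 1 + growth_rate
    years += 1
print(years)  # -> 7 (demand crosses 2x during the seventh year)
```

That matches the rule-of-72 estimate of roughly 72 / 12 = 6 years per doubling, which is why the energy choices of this sector matter so much.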

And of course, the report notes that many companies say many of the right things:

“Our main goal at Facebook is to help make the world more open and transparent. We believe that if we want to lead the world in this direction, then we must set an example by running our service in this way.”

– Mark Zuckerberg

But then Facebook is patently not transparent about where its energy comes from, so it is not easy to assess how good or bad they are, or how they are trending.

Indeed it is worth looking at Greenpeace’s Clean Cloud report card to see – just how dirty is your data?

Report-card-cloud

I’d love to see a session at the upcoming (or next year’s) Strata Big Data Conference on, say, “How to use Big Data to make Big Data more Green.” Maybe even a competition to that effect, if there were some data that could be shared? Or maybe just a session where Greenpeace could present their research and engage the community.

Just a thought. Big data has got some big responsibilities on its shoulders when it comes to the environment. It would be great to see them engage on it.