Tag Archives: technology

Applications and Hardware Already Running On Open Data

Yesterday, Gerry T shared a photo he snapped at the University of Alberta in Edmonton of a “departure board” in the university’s Student Union building that uses open transportation data from the city’s website.

Essentially, the display board is a simple application shown on a large flat-screen TV turned vertically.

It’s exactly the kind of thing that I imagine university students in many cities around the world wish they had – especially if you are on a campus that is cold and/or wet. Wouldn’t it be nice to wait inside that warm student union building rather than at the bus stop?
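If you’re curious how little code a board like this actually needs, here’s a minimal sketch. It assumes the city publishes its schedule as a standard GTFS feed; the stop ID and file path are placeholders I’ve made up, not Edmonton’s actual data.

```python
# A minimal departure-board sketch against a downloaded GTFS schedule feed.
# Assumes the feed has been unzipped locally; the stop ID below is a
# hypothetical placeholder, not an actual Edmonton stop.
import csv
from datetime import datetime

STOP_ID = "1234"  # hypothetical stop outside the Student Union building

now = datetime.now().strftime("%H:%M:%S")

departures = []
with open("gtfs/stop_times.txt", newline="") as f:
    for row in csv.DictReader(f):
        if row["stop_id"] == STOP_ID and row["departure_time"] >= now:
            departures.append((row["departure_time"], row["trip_id"]))

# Show the next five departures, soonest first – the list a wall-mounted
# screen would cycle through.
for departure_time, trip_id in sorted(departures)[:5]:
    print(departure_time, trip_id)
```

A real board would join trips.txt and routes.txt to display route names and refresh itself on a timer, but the core really is just filtering the published schedule for one stop.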

Of course, in Boston they’ve gone further than just providing the schedule online. They provide real-time data on bus locations, which some students and engineers have used to create $350 LED signs in coffee houses that let users know when the next bus is coming.

It’s the kind of simple innovation you wish you’d see in more places: governments letting people help themselves at making their lives a little easier. Yes, this isn’t changing the world, but it’s a start, and an example of what more could happen.

Mostly it’s nice to see innovators in Canada playing with the technology. Hopefully governments will catch up and let the even bigger ideas students around the country have become more than just visions in their heads.

Not sure who at the University created this, but nice work.

City of Vancouver Wins Top Innovator Award from BC Business

To be clear, this is not top innovator among governments, this is top innovator among all organizations – for-profit, non-profit and government – in the province.

You can see the award write up here.

As the article states, Vancouver’s open data initiative “floored the [judging] panel.” Indeed, one panellist stated: “I have never seen a municipality open to new ideas in my life. When was the last time any level of government said, Here are our books; fill your boots?”

Back in October, BC Business asked me to write a think piece explaining open data, and I ended up penning this piece entitled “The Difference Data Makes”. It’s fantastic to see the business community recognizing the potential of open data and how it could transform both the way government works and the opportunities it presents for private and non-profit organizations as well as citizens.

It’s a great day for the City of Vancouver and for open data.

Calgary Launches Business Plan and Budget App

So this is interesting. The City of Calgary has launched a Business Plan & Budget app, available for free from iTunes.

It’s a smart move as it creates an easy, “one button” option for citizens to participate in and learn about the city’s financial planning process. You can read (a tiny bit) more at the City of Calgary’s blog.

Looking more closely at the app, it doesn’t offer a huge amount, but don’t dismiss it too quickly. Consolidating all the information into a single place and making it available to people on the go is a great starting point. Secondly, it is worth remembering that there is obviously lots to be learned about how to engage citizens online – especially using mobile technology. If this is done right, Calgary will be learning these lessons first, which means their 2nd and 3rd generation versions of the app and the process will be more sophisticated while others are left catching up (think of Apple and the iPad).

So while the app is fairly light on features today… I can imagine a future where it becomes significantly more engaging and comprehensive, using open data on the budget and city services to show maps of where and how money is spent, as well as to post reminders for in-person meetups, tours of facilities, and dial-in town hall meetings. The best way to get to these more advanced features is to experiment with getting the lighter features right today. The challenge for Calgary on this front is that it seems to have no plans for sharing much data with the public (that I’ve heard of): its open data portal has few offerings and its design is sorely lacking. Ultimately, if you want to consult citizens on planning and the budget it might be nice to go beyond surveys and share more raw data and information with them – it’s a piece of the puzzle I think will be essential. This is something no city seems to be tackling with any gusto and, along with crime data, is emerging as a serious litmus test of a city’s intention to be transparent.

The possibilities that Calgary’s consultation app presents are exciting – and again it is early days – so it will be interesting to see if developers in Calgary and elsewhere can begin figuring out how to easily extend and enhance this type of approach. Moreover, it’s nice to see a city venturing out and experimenting with this technology. I hope other cities will not just watch, but start experiments of their own – it’s the best way to learn.


Access to Information is Fatally Broken… You Just Don’t Know it Yet

I’ve been doing a lot of thinking about access to information, and am working on a longer analysis, but in the short term I wanted to share two graphs – graphs that outline why Access to Information (Freedom of Information in the United States) is unsustainable and will, eventually, need to be radically rethought.

First, this analysis is made possible by the enormous generosity of the Canadian federal Information Commissioner’s Office, which several weeks ago sent me a tremendous amount of useful data regarding access to information requests over the past 15 years at the Treasury Board Secretariat (TBS).

The first figure I created shows both the absolute number of Access to Information (ATIP) requests since 1996 and the running year-on-year percentage increase. The dotted line represents the average percentage increase over this time. As you can see, the number of ATIP requests has almost tripled in this time period. This is very significant growth – the kind you’d want to see in a well-run company. Alas, for those processing ATIP requests, I suspect it represents a significant headache.
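For readers who want to reproduce the first chart, the calculation behind it is simple. Here is a small sketch of the running year-on-year increase and its average; the request counts are hypothetical placeholders, not the Commissioner’s actual figures.

```python
# Running year-on-year percentage increase and its average, as plotted in
# the first chart. The request counts below are hypothetical placeholders.
requests_by_year = {1996: 10_000, 1997: 10_900, 1998: 11_800, 1999: 12_700}

years = sorted(requests_by_year)
yoy = [
    (requests_by_year[year] / requests_by_year[prev] - 1) * 100
    for prev, year in zip(years, years[1:])
]

for year, pct in zip(years[1:], yoy):
    print(f"{year}: {pct:+.1f}%")
print(f"Average year-on-year increase: {sum(yoy) / len(yoy):.1f}%")
```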

That’s because, of course, such growth is likely unmanageable. It might be manageable if, say, the cost of handling each request were dropping rapidly. If such efficiencies were being wrung out of the system of routing and sorting requests then we could simply ignore the chart above. Sadly, as the next chart I created demonstrates, this is not the case.

[Chart: ATIP costs at the Treasury Board Secretariat]

In fact, the cost of managing these transactions has not merely tripled – it has more than quadrupled. This means that not only is the number of transactions increasing at about 8% a year, the cost of fulfilling each of those transactions is itself rising at a rate above inflation.

Now remember, I’m not even talking about the effectiveness of ATIP. I’m not talking about how quickly requests are turned around (as the Information Commissioner has discussed, it is broadly getting worse), nor am I discussing whether less information is being restricted (it’s not – things are getting worse). These are important – and difficult to assess – metrics.

I am, instead, merely looking at the economics of ATIP and the situation looks grim. Basically two interrelated problems threaten the current system.

1) As the number of ATIP requests increases, the manpower required to answer them also appears to be increasing. At some point the hours required to fulfill all requests sent to a ministry will equal the total hours of manpower at that ministry’s disposal. Yes, that day may be far off, but the day when it hits some meaningful percentage – say 1%, 3% or 5% of total hours worked at Treasury Board – may not be that far off. That’s a significant drag on efficiency. I recall talking to a foreign service officer who mentioned that during the Afghan prisoner scandal an entire department of foreign service officers – some 60 people in all – were working full time on assessing access to information requests. That’s an enormous amount of time, energy and money.

2) Even more problematic than the number of work hours is the cost. According to the data I received, Access to Information requests cost the Treasury Board $47,196,030 last year. Yes, that’s 47 with a “million” behind it. And remember, this is just one ministry. Multiply that by 25 (let’s pretend that’s the number of ministries; there are actually many more, but I’m trying to be really conservative with my assumptions) and it means last year the government may have spent over $1.175 billion fulfilling ATIP requests. That is a staggering number. And it’s growing.
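To make the back-of-envelope arithmetic explicit, here is a sketch that reproduces the extrapolation above and projects it forward. The 8% growth rate and the 25-ministry multiplier come from the figures cited in this post; everything else is my own assumption for illustration, not official data.

```python
# Back-of-envelope projection using the figures cited above. These are my
# rough assumptions for illustration, not the Commissioner's data.
TBS_ANNUAL_COST = 47_196_030  # reported cost of fulfilling ATIP requests at TBS
MINISTRIES = 25               # deliberately conservative count of ministries
REQUEST_GROWTH = 0.08         # approximate year-on-year growth in requests

government_wide = TBS_ANNUAL_COST * MINISTRIES
print(f"Rough government-wide cost today: ${government_wide:,.0f}")  # ~$1.18 billion

# If volumes keep growing ~8% a year and the cost per request merely holds
# steady, the bill roughly doubles in about nine years.
for years in (5, 10):
    projected = government_wide * (1 + REQUEST_GROWTH) ** years
    print(f"Projected cost in {years} years: ${projected:,.0f}")
```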

Transparency, apparently, is very, very expensive. At some point, it risks becoming too expensive.

Indeed, ATIP reminds me of healthcare. It’s completely unsustainable, and absolutely necessary.

To be clear, I’m not saying we should get rid of ATIP. That, I believe, would be folly. It is and remains a powerful tool for holding government accountable. Nor do I believe that requesters should pay for ATIP requests as a way to offset costs (like BC Ferries does) – this creates a barrier that punishes the most marginalized and threatened, while enabling only the wealthy or well-financed to hold government accountable.

I do think it suggests that governments need to radically rethink how they manage ATIP. More importantly, I think it suggests that government needs to rethink how it manages information. Open data and digital documents are all part of a strategy that, I hope, can lighten the load. I’ve also felt that if/as governments move their work onto online platforms like GCPEDIA, we should simply make non-classified pages open to the public on something like a 5-year timeline. This could also help reduce requests.

I’ve more ideas, but at its core we need a system rethink. ATIP is broken. You may not know it yet, but it is. The question is, what are we going to do before it goes off the cliff? Can we invent something new and better in time?

Articles I'm Digesting: Feb 28th, 2011

Been a while since I’ve done one of these. A surprising amount of reading is getting done in my life despite a hectic schedule. In addition to the articles below, I recently finished Shirky’s Cognitive Surplus (solid read) and am almost done with Kevin Kelly’s What Technology Wants, which is blowing my mind. More on both soon, I hope.

Why Blogs (Still) Aren’t Dead…No Matter What You’ve Heard by Kimberly Turner

I got to this via Mathew Ingram of GigaOM. A few months ago there was some talk about the decline of blogs. You could almost hear the newspaper people rubbing their hands with glee. Turns out it was all bogus. This article outlines some great stats on the issue and lays out where things are at, and why the rumor got started. The sooner everyone – from the newspaper writer, to the professional blogger, to the amateur blogger, to the everyday twitterer – accepts/realizes they are on the same continuum and actually support one another, the happier I suspect we’re all going to be.

The Inside Story of How Facebook Responded to Tunisian Hacks by Alexis Madrigal

Totally fascinating and fairly self-explanatory:

By January 5, it was clear that an entire country’s worth of passwords were in the process of being stolen right in the midst of the greatest political upheaval in two decades. Sullivan and his team decided they needed a country-level solution — and fast…

…At Facebook, Sullivan’s team decided to take an apolitical approach to the problem. This was simply a hack that required a technical response. “At its core, from our standpoint, it’s a security issue around passwords and making sure that we protect the integrity of passwords and accounts,” he said. “It was very much a black and white security issue and less of a political issue.”

That’s pretty much the stand I’d like a software service to take.

Work on Stuff that Matters: First Principles by Tim O’Reilly

Basically, some good touchstones for work, and life, from someone I’ve got a ton of respect for.

Love and Hate on Twitter by Jeff Clark

Awesome visualizations of the use of the words love and hate on Twitter. It is amazing that Justin Bieber always turns up high. More interesting is how brands and politicians get ranked.

The Neoformix blog is just fantastic. For hockey fans, be sure to check out this post.

Saving Healthcare Billions: Let's fork the VA's Electronic Health Records System

Alternative title for this post: How our Government’s fear of Open Source Software is costing us Billions.

So, I’ve been meaning to blog this for several months now.

Back in November I remember coming across this great, but very short, interview in the Globe and Mail with Ken Kizer. Who, you might ask, is Ken Kizer? He’s a former Naval officer and emergency medicine physician who became the US Department of Veterans Affairs’ undersecretary for health in 1994.

While the list of changes he made is startling and impressive, what particularly caught my attention is that he accomplished what the Government of Ontario failed to do with $1 billion in spending: implementing an electronic medical record system that works. And, let’s be clear, it not only works, it is saving lives and controlling costs.

And while the VA has spent millions in time and energy developing that code, what is amazing is that it’s all been open sourced, so the cost of leveraging it is relatively low. Indeed, today, Ken Kizer heads up a company that implements the VA’s now open source solution – called VistA – in hospitals in the US. Consider this extract from his interview:

You have headed a company that promoted “open-source” software for EHR, instead of a pricier proprietary system. Why do you think open source is better?

I believe the solution to health-care information technology lies in the open-source world that basically gives away the code. That is then adapted to local circumstances. With the proprietary model, you are always going back to the vendor for changes, and they decide whether to do them and how much they will cost. In Europe, open source EHR software is zooming. It’s the most widely deployed EHR system in the world, but not here.

Sometimes I wonder, do any Canadian governments ever look at simply forking VistA and creating a Canadian version?

I wonder all the more after reading a Fortune Magazine article on the changes achieved in the VA during this period. The story is impressive, and VistA played a key role. Indeed, during Kizer’s tenure:

  • The VA saw the number of patients it treats almost double, from 2.9 million in 1996 to 5.4 million in 2006.
  • Customer satisfaction ratings within the VA system exceeded those of private health care providers during many of those years.
  • All this has been achieved as the cost per patient has held steady at roughly $5,000. In contrast, the rest of the US medical system saw costs rise 60 percent to $6,300.
  • And perhaps most importantly, in a time of crisis the new system proved critical: while Hurricane Katrina destroyed untold numbers of civilians’ (paper) healthcare records, VistA ensured that health records of veterans in the impacted areas could be called upon in a heartbeat.

This is a story that any Canadian province would be proud to tell its citizens. It would be fascinating to see some of the smaller provinces begin to jointly fund some e-health open source software initiatives, particularly one to create an electronic healthcare record system. Rather than relying on a single vendor with its coterie of expensive consultants, a variety of vendors, all serving the same platform, could emerge, helping keep costs down.

It’s the kind of solution that seems custom built for Canada’s healthcare system. Funny how it took a US government agency to show us how to make it a reality.

What I’m doing at Code for America

For the last two weeks – and for much of January – I’m in San Francisco helping out with Code for America. What’s Code for America? Think Teach for America, but rather than deploying people into classrooms to help provide positive experiences for students and teachers while attempting to shift the culture of school districts, Code for America has fellows work with cities to help develop reusable code to save cities money, make local government as accessible as your favorite website, and help shift the government’s culture around technology.

The whole affair is powered by a group of 20 amazing fellows and an equally awesome staff that has been working for months to make it all come together. My role – in comparison – is relatively minor: I head up the Code for America Institute – a month-long educational program the fellows go through when they first arrive. I wanted to write about what I’ve been trying to do both because of the openness ideals of Code for America and to share any lessons for others who might attempt a similar effort.

First, to understand what I’m doing, you have to understand the goal. On the surface, to an outsider, the Code for America change process might look something like this:

  1. Get together some crazy talented computer programmers (hackers, if you want to make the government folks nervous)
  2. Unleash them on a partner city with a specific need
  3. Take resulting output and share across cities

Which, of course, would mistakenly frame the problem as technical. However, Code for America is not about technology. It’s about culture change. The goal is about rethinking and reimagining government as better, faster, cheaper and adaptive. It’s about helping government think of the ways its culture can embrace government as a platform, as open and as highly responsive.

I’m helping (I think) because I’ve enjoyed some success in getting governments to think differently. I’m not a computer developer and, at their core, these successes were never technology problems. The challenge is understanding how the system works, identifying the leverage points for making change, developing partners and collaborating to engage those leverage points, and doing whatever it takes to ensure it all comes together.

So this is the message and the concept the speakers are trying to impart to the fellows. Or, in other words, my job is to help unleash the already vibrant change agents within the 20 awesome fellows and make them effective in the government context.

So what have we done so far?

We’ve focused on three areas:

1) Understand Government: Some of the fellows are new to government, so we’ve had presentations from local government experts like Jay Nath, Ed Reiskin and Peter Koht, as well as the Mayor of Tucson’s chief of staff (to give a political perspective). And of course, Tim O’Reilly has spoken about how he thinks government must evolve in the 21st century. The goal: understand the system, as well as understand and respect the actors within that system.

2) Initiate & Influence: Whether it is launching you own business (Eric Ries on startups), starting a project (Luke Closs on Vantrash) or understanding what happens when two cultures come together (Caterina Fake on Yahoo buying Flickr) or myself on negotiating, influence and collaboration, our main challenges will not be technical, they will be systems based and social. If we are to build projects and systems that are successful and sustainable we need to ask the right questions and engage with these systems respectfully as we try to shift them.

3) Plan & Focus: Finally, we’ve had experts in planning and organizing. People like Allen Gunn (Gunner) and the folks from Cooper Design, who’ve helped the fellows think about what they want, where they are going, and what they want to achieve. Know thyself, be prepared, have a plan.

The last two weeks will continue to pick up these themes but also give the fellows more time to (a) prepare for the work they will be doing with their partner cities; and (b) learn from one another. We’re halfway through the institute at this point and I’m hoping the experience has been a rich – if sometimes overwhelming – one. Hopefully I’ll have an update again at the end of the month.

Honourable Mention! The Mozilla Visualization Challenge Update

Really pleased to share that Diederik and I earned an honourable mention for our submission to the Mozilla Open Data Competition.

For those who missed it – and who find open data, open source and visualization interesting – you can read a description of, and see images from, our submission to the competition in this blog post I wrote a month ago.

Canada's Secret Open Data Strategy?

Be prepared for the most boring sentence to an intriguing blog post.

The other night, I was, as one is wont to do, reading through a random Organisation for Economic Co-operation and Development report entitled Towards Recovery and Partnership with Citizens: The Call for Innovative and Open Government. The report was, in fact, a summary of the recent Ministerial Meeting of the OECD’s Public Governance Committee.

Naturally, I flipped to the section authored by Canada and, imagine the interest with which I read the following:

The Government of Canada currently makes a significant amount of open data available through various departmental websites. Fall 2010 will see the launch of a new portal to provide one-stop access to federal data sets by providing a “single-window” to government data. In addition to providing a common “front door” to government data, a searchable catalogue of available data, and one-touch data downloading, it will also encourage users to develop applications that re-use and combine government data to make it useful in new and unanticipated ways, creating new value for Canadians. Canada is also exploring the development of open data policies to regularise the publication of open data across government. The Government of Canada is also working on a strategy, with engagement and input from across the public service, developing short and longer-term strategies to fully incorporate Web 2.0 across the government.

In addition, Canada’s proactive disclosure initiatives represent an ongoing contribution to open and transparent government. These initiatives include the posting of travel and hospitality expenses, government contracts, and grants and contribution funding exceeding pre-set thresholds. Subsequent phases will involve the alignment of proactive disclosure activities with those of the Access to Information Act, which gives citizens the right to access information in federal government records.

Lots of interesting things are packed into these two paragraphs – something I’m sure readers concerned with open data, open government and proactive disclosure would agree with. So let’s look at the good, the bad and the ugly of all of this, in that order.

The Good

So naturally the first sentence is debatable. I don’t think Canada makes a significant amount of its data available at all. Indeed, across every government website there are probably no more than 400 data sets available in machine-readable format. That’s less than the city of Washington DC. It’s about (less than) 1% of what Britain or the United States disclose. But, okay, let’s put that unfortunate fact aside.

The good and really interesting thing here is that the Government stated it was going to launch an open data portal. This means the government is thinking seriously about open data. This means – in all likelihood – policies are being written, people are being consulted (internally), and processes are being thought through. This is good news.

It is equally good news that the government is developing a strategy for deploying Web 2.0 technologies across the government. I hope this will happen quickly, as I’m hearing that in many departments this is still not embraced and, quite often, is banned outright. Of course, using social media tools to talk with the public is actually the wrong focus (since the communications groups will own it all and likely not get it right for quite a while); the real hope is being allowed to use the tools internally.

The Bad

On the open data front, the bad is that the portal has not launched. We are now definitely past the fall of 2010 and, for whatever reason, there is no Canadian federal open data portal. This may mean that the policy (despite being announced publicly in the above document) is in peril, or that it is simply delayed. Innumerable things can delay a project like this (especially on the open data front). Hopefully whatever the problem is, it can be overcome. More importantly, let us hope the government does something sensible around licensing and uses the PDDL and not some other license.

The Ugly

Possibly the heart-stopping moment in this brief comes in the last paragraph, where the government talks about posting travel and hospitality expenses. While these are often posted (such as here), they are almost never published in machine-readable format and so have to be scraped in order to be organized, mashed up or compared to other departments. Worse still, these files are scattered across literally hundreds of government websites and so are virtually impossible to track down. This guy has done just that, but of course now that he has the data, it is more easily navigable but no more open than before. In addition, it takes him weeks (if not months) to do it, something the government could fix rather simply.
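To give a sense of the work this pushes onto citizens, here is a rough sketch of the kind of scraper each of those pages requires. The URL and table layout are hypothetical placeholders; every department formats its disclosure pages differently, which is exactly the problem.

```python
# A rough scraper for a single (hypothetical) proactive disclosure page.
# In practice every department's HTML differs, so something like this has
# to be rewritten and babysat for each of the hundreds of sites involved.
import csv

import requests
from bs4 import BeautifulSoup

URL = "https://example.gc.ca/travel-expenses.html"  # hypothetical placeholder

html = requests.get(URL, timeout=30).text
soup = BeautifulSoup(html, "html.parser")

rows = []
for table in soup.find_all("table"):
    for tr in table.find_all("tr"):
        cells = [cell.get_text(strip=True) for cell in tr.find_all(["th", "td"])]
        if cells:
            rows.append(cells)

# Dump everything to CSV so the figures can finally be sorted, mashed up
# or compared across departments.
with open("expenses.csv", "w", newline="") as f:
    csv.writer(f).writerows(rows)
```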

The government should be lauded for trying to make this information public. But if this is their notion of proactive disclosure and open data, then we are in for a bumpy, ugly ride.

One Simple Step to Figure out if your Government "gets" Information Technology

Chris Moore has a good post up on his blog at the moment that asks “Will Canadian Cities ever be Strategic?” In it (and it is very much worth reading) he hits on a theme I’ve focused on in many of my talks to government, but that I think is also relevant to citizens who care about how technologically sophisticated their government is (which is a metric of how relevant you think your government is going to be in a few years).

If you want to know how serious your government or ministry is about technology there is a first simple step you can take: look at the org chart.

Locate the Chief Information Officer (CIO) or Chief Technology Officer (CTO). Simple question: Is he or she part of the senior executive team?

If yes – there is hope that your government (or the ministry you are looking at) may have some strategic vision for IT.

If no – and this would put you in the bucket with about 80% of local governments, and provincial/federal ministries – then your government almost certainly does not have a strong vision, and it isn’t going to be getting one any time soon.

As Chris also notes, and as I’ve been banging away at (such as during my keynote to MISA Ontario), in most governments in Canada the CIO/CTO does not sit at the executive table. At the federal level they are frequently Directors General (or lower), not Assistant Deputy Minister-level roles. At the local level, they often report to someone at the C-level.

This is insanity.

I’m trying to think of a Fortune 500 company – particularly one which operates in the knowledge economy – with this type of reporting structure. The business of government is about managing information… to better regulate, manage resources and/or deliver services. You can’t be talking about how to do that without having the CIO/CTO being part of that conversation.

But that’s what happens almost every single day in many government orgs.

Sadly, it gets worse.

In most organizations the CIO/CTO reports to the Chief Financial Officer (CFO). This really tells you what the organization thinks of technology: it is a cost centre that needs to be contained.

We aren’t going to reinvent government when the person in charge of the infrastructure – upon which most of the work is done, the services are delivered, and pretty much everything else the org does depends – isn’t even part of the management team or part of the organization’s strategic conversations.

It’s a sad state of affairs and indicative of why our governments are so slow to engage with new technology.