Tag Archives: technology

Why the Internet Will Shape Social Values (and not the other way around)

The biggest problem in predicting the future isn’t envisaging what technologies will emerge – it is forecasting how individuals and communities will respond to those technologies. In other words, I often find people treat technology as a variable but social values as a constant. Consequently, as they peer into tomorrow, they examine technology only in terms of how it will change (and make easier) tasks – not in terms of how it will cause social values and relationships to shift. By treating social values as a constant we assume that technology will conform to today’s values. In truth, the reverse is often the case – social values change and come to reflect the technology we use.

For example, people ask me if I’m nervous about blogging since, 20 years hence, someone may dig up a post and use it to demonstrate how my thinking or values were flawed. Similarly, a friend suggested that social networks will eventually “auto-delete” photos so that any embarrassing pictures that end up online will not be searchable. (Let’s put aside the fact that a truly embarrassing picture will likely get copied to several places.) In short, these friends cannot imagine a future where your past is accessible and visible to a wider group of people. In their view an archived personal history is anathema, as it violates the basic expectations of anonymity (not to be confused with privacy) they are accustomed to. In their minds our mistakes, misadventures or even poor fashion choices need to be forgotten (or hidden in the vast grayness of history) in order for us to be successful. If not, we will somehow become social pariahs, or certain doors may forever be closed to us.

To put it another way, it presumes that our future employers, social circles and even society in general will punish people who’ve ever had a thought others disagree with, or will refuse to hire someone who’s ever had an embarrassing photo of themselves posted to the internet.

Really? If this is the case then the jobs of tomorrow are going to be filled either by the most conservative and/or timid people or (more troubling, but less surprising) by those best able to cover their tracks. I’m not sure either of these traits is what I want in a prospective employee. Should I hire someone who is afraid to publicly share independent thoughts? Do I want to work with someone too risk-averse to push a boundary or have fun? Or worse, should I contract someone who is highly adept at covering up their mistakes? If the jobs of the future are going to require creativity, originality and integrity, why would I hire for the opposite traits?

Perhaps those whose lives are more visible online will be discriminated against. But the inverse could also be true. Those who have no online history have no discernible, verifiable track record, no narrative about how their values and thinking have evolved over time. While such a history will be filled with flaws and mistakes, it will at least be open and visible, whereas those who have lived offline will have a history that is opaque and verifiable only by their own handpicked references.

If anything, I suspect the internet is going to create a society that is more honest and forgiving. We will be returning to a world of thin anonymity – a world where it is difficult to escape from the choices you’ve made in the past. But the result won’t be a world where fewer people take risks, it will be a world that recognizes those risks were necessary and expected.

What would such a world look like? Naturally, it is going to be hard to imagine, because it is a world that would likely make you deeply uncomfortable (think of how hard it would have been 25 years ago to imagine a large swath of the population being comfortable with online dating). But there are perhaps microcosms we can look at. While dysfunctional in many ways, the culture of Silicon Valley – in how it treats failure – may be a good example. While I’ve not lived in the valley, everything I’ve read about it suggests that it is hard to be taken seriously unless you’ve taken risks and failed: it demonstrates your willingness to try and to learn. It is a community where it is easy to look into everyone else’s past – either by searching online or simply by asking around. In this regard Silicon Valley is deeply honest – people own their successes and their failures – and it is a place that, in regards to business, is forgiving. Compared to many places on the planet, past failures (depending of course on the nature and depth of the error) are forgivable and even seen as a necessary rite of passage.

All this isn’t to say that we should be limiting people’s ability for anonymity or privacy online. If someone wants their photos auto-deleted after 5 years, please let them do it. But let us at least always preserve choice – let us not architect our technology to solely conform to today’s social norms as we may discover we will be willing to make different choices in a few years.

5 Ways to get to the Next Million Mozillians

Mark Surman, Executive Director of the Mozilla Foundation, has been ruminating on:

“how Mozilla can actively encourage large numbers of people to participate on making the web more open and awesome.”

For a long time I’ve held that supporters of an Open Web are part of a social movement, and that mobilizing these supporters could be a helpful part of a strategy for preserving and promoting the openness of the web. More importantly, I think the rise of open source, and in particular the rise of Mozilla, tracks shockingly well against the structure of a social movement.

So if we are interested in increasing interest in the openness of the web and believe that recruiting the next million Mozillians can help us accomplish that, then I think there are three things any strategy must do:

1. Increase the range of stakeholders involved (this is part of why I write about women in open source so much), as this gives open web supporters more leverage when negotiating with those who threaten the web’s openness or who influence its development

2. Connect nebulous ideas like “security” and “openness” to tangible experiences people (users?) can relate to and to core values they believe in. (This is why Mark’s “seatbelt moment” narrative is awesome in this regard)

3. Outline actions that stakeholders and supporters can take.

So, with the (not always successful) intent of focusing on these three objectives, here are five ideas I think could help us:

Idea 1: Partner with Consumer Reports and help shape the criteria by which they evaluate ISPs

One key to ensuring an open web is ensuring that people’s connection to the web is itself open. ISPs are a critical component in this ecosystem. As some observers have noted, ISPs engage in all sorts of nefarious activities such as bandwidth shaping, throttling, etc. Ensuring that the net stays neutral feels like a critical part of ensuring it stays open.

One small way to address this would be to make the neutrality of a network part of the evaluation criteria for Consumer Reports reviews of ISPs. This would help make the openness of an ISP a competitive differentiator, would increase the profile of this problem and would engage a group of people (Consumer Reports users) who are probably not generally part of the Mozilla community.

Idea 2: Invest in an enterprise-level support company for Firefox & Thunderbird.

Having a million citizens supporting Firefox and Mozilla is great, but if each of those supporters looks and acts the same then their impact is limited. Successful movements are not just large; they are also diverse. This means having a range of stakeholders to help advocate for the open web. One powerful group of stakeholders is large enterprises and governments. They have money, they have clout and they have large user bases. They are also – as far as I can tell – one of the groups among which Firefox has had the hardest time achieving market share.

From my limited experience working with governments, adopting Firefox is difficult. There is no one to sign an SLA with, no dedicated support desk and no assurance that problems will be escalated to the right developer within a fixed time period. Many of these challenges are highlighted by Tauvix in the comment section of this post. We could spend our time arguing about whether these issues are legitimate or whether those large organizations simply need a culture shift. But such a shift will take a LONG time to materialize, if it ever does.

Finding a way to satisfy the concerns of large organizations – perhaps through a Red Hat-type model – might be a good way to invest Mozilla Foundation money. Not only could there be a solid return on this investment, but it could bring a number of large, powerful companies and governments into the Mozilla camp. These would be important allies in the quest for an open web.

Idea 3: Promote add-ons that increase security, privacy and control in the cloud.

One reason behind Mozilla’s enormous success is that the community has always provided innovative technical solutions to policy/privacy/openness problems. Don’t like the way Microsoft is trying to shape the internet? Here, use a better browser. Don’t want to receive targeted advertising on websites? Here, download this plug-in. Don’t want to see any advertising? Here, download this plug-in. Not sure if a website is safe? Here, use this plug-in. In short, Mozilla has allowed its software to serve as a laboratory for experimenting with new and interesting ways to let users control their browsing experience.

While not a complete solution, it might be interesting to push the community to explore how Greasemonkey scripts, Jetpack plug-ins, or ordinary plug-ins might provide users with greater control over the cloud. For example, could a plug-in create automatic local backups of Google Docs on your computer? Could a Thunderbird plugin scan Facebook messages and allow users a choice of mediums to respond with (say, email)? Fostering a “product line” of cloud-specific plug-ins that increase user control over their experience might be an interesting place to start (a rough sketch of the backup idea follows below).
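To make the Google Docs backup idea a little more concrete, here is a minimal sketch of what such a tool could do behind the scenes. Everything in it is an assumption for illustration: the document list, the export URLs and the file naming are placeholders, and a real plug-in would need to authenticate and use whatever export mechanism the service actually provides.

```typescript
import { mkdir, writeFile } from "node:fs/promises";

// Hypothetical list of documents to back up; a real plug-in would discover these
// through the service's own API and handle authentication.
const documents = [
  { name: "meeting-notes", url: "https://example.com/docs/meeting-notes/export?format=txt" },
  { name: "budget-draft", url: "https://example.com/docs/budget-draft/export?format=txt" },
];

async function backupOnce(dir: string): Promise<void> {
  await mkdir(dir, { recursive: true });
  for (const doc of documents) {
    const res = await fetch(doc.url); // assumes the export URL is reachable
    if (!res.ok) {
      console.error(`Skipping ${doc.name}: HTTP ${res.status}`);
      continue;
    }
    const text = await res.text();
    const stamp = new Date().toISOString().slice(0, 10); // e.g. 2009-07-24
    await writeFile(`${dir}/${doc.name}-${stamp}.txt`, text, "utf8");
    console.log(`Backed up ${doc.name}`);
  }
}

// Take one backup now, then repeat daily.
backupOnce("./backups").catch(console.error);
setInterval(() => backupOnce("./backups").catch(console.error), 24 * 60 * 60 * 1000);
```

The point is less the specific script than the pattern: a small piece of user-controlled code that quietly keeps a local copy of what would otherwise live only in the cloud.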

Idea 4: Create and brand the idea of an openness audit

As more and more personal data ends up on servers controlled by companies, governments and non-profits, there are real concerns about how secure and private this information is. Does anyone know that Google isn’t peeking at your Google Docs every once in a while? Do you know if you’ll ever be able to delete your personal information from Facebook?

These are legitimate questions. Outlining some guidelines around how companies manage privacy and security, and then creating an audit system, might be an interesting way to nudge companies towards adopting stronger standards and policies in the cloud. This might also increase public awareness and encourage an upward spiral among competing service providers. Working with companies like KPMG and Deloitte, Mozilla and others could help foster a new type of audit, one that would allow consumers to easily distinguish between cloud service providers that respect their rights and those that don’t.

Idea 5: Let’s use that Firefox launch screen to create the next million Mozillians

At the moment, when you download and install Firefox, the first website you see when you load the program congratulates you on downloading it, tells you that you are helping keep the internet open and outlines some of Firefox’s new features. We could do more. Why not prompt people to join a “Mozillians” club where they will be kept up to date on threats and opportunities around the open web? Or maybe we should list three actions (with hyperlinks) they can take to increase the openness of the web (say, upgrade a friend, send a form letter to their member of Congress, or read an intro article on internet security)?

With maybe 300+ million people likely to download Firefox 3.5, that’s a lot of people we could be mobilizing to be more active, technically, socially and politically, around an open web.

There’s a start… I’ll keep brainstorming more ideas but in the interim, please feel free to let me know if you think any of these have real problems and/or are bunk.

Creating the Open Data Bargain in Cities

Embedded below is the talk I’ve given both to community hackers (at Open Web Vancouver) and to City of Vancouver staff regarding the opportunities and challenges around open data and the open motion. (Here’s an update on where Vancouver is at, courtesy of some amazing work by city staff.)

For those willing to brave the presentation (or simply fast-forward to the end), the piece I feel is most important is the talk’s last section, which outlines what I term “The Bargain” – a reference to the informal contract Clay Shirky says exists between every Web 2.0 site and its users.

The bargain comes last, because it matters only if there is a promise (open and shared data) and a set of tools (applications languages) that are already working together. The bargain is also the most complex aspect of a functioning group, in part because it is the least explicit aspect and in part because it is the one the users have the biggest hand in creating, which means it can’t be completely determined in advance… A bargain helps clarify what you can expect of others and what they can expect of you.

Clay Shirky in Here Comes Everybody (my italics, page 270)

I believe that in an open city, a similar bargain exists between a government and its citizens. To make open data a success and to engage the community, a city must listen, engage, ask for help, and of course, fulfil its promise to open data as quickly as possible. But this bargain runs both ways. The city must do its part, but so, on the flip side, must the local tech community. They must participate, be patient (cities move slower than tech companies), offer help and, most importantly, make the data come alive for each other, policy makers and citizens through applications and shared analysis.

Articles I'm digesting 24/7/2009

Been a while since I’ve done one of these and I’ve got a lot of great pieces I’ve been reading. So let’s get to it.

Designs on Policy by Allison Arieff (via David B.) and TED Talk: Are we in Control of our own Decisions? by Dan Ariely

I keep hearing about the interaction between policy and design (most flatteringly, an architecture professor said I had “a designer’s mind” the other day) and so over the past few years I’ve tried (with some success) to read as much as I can about design. David B sent me the Arieff piece which, of course, weds my passion for public policy with design. One thing I like is the way the piece doesn’t try to boil the ocean – it doesn’t claim (as some do) that good design will solve every problem, just that it will help mitigate them. Most intriguing for me is this line:

“It feels weird to have to defend design’s importance, yet also completely necessary. The United Kingdom has had a policy in place since 1949; Japan since 1956. In countries like Finland, Sweden, South Korea and the Netherlands, design is a no-brainer, reflected by the impeccable elegance, usability and readability of everything in those countries from currency to airport signage.”

A design policy? How civilized. That’s something I could get behind – especially after listening to Dan Ariely’s TED talk, which is downright frightening at moments given how susceptible our decisions (and, most disconcertingly, the decisions of our doctors, dates and who knows who else) are to the layout and framing of the choice.

Lost in the Cloud by Jonathan Zittrain

A few months ago I was in Ottawa and – surprisingly and unplanned – ended up at a pub with Richard Stallman. I asked him what he thought of Cloud Computing (a term he believes is too vague to be helpful), which he is nonetheless viscerally opposed to. Many of the reasons he cites are covered by Zittrain in this thoughtful piece. The fact is, Cloud Computing (or whatever term you may wish to use) is very convenient, and it carries with it huge privacy, security and access challenges. This is potentially the next big challenge for those of us who support an Open Internet – the possibility of the internet being segmented into a series of walled gardens controlled by those who run the cloud servers is real and must be thought through. If you care about public policy and/or are a geek, read this.

Is it Time to Get Rid of the Foreign Service Designation?

Am I reading my own articles? No. I am, however, absorbed by the fascinating and constructive conversation taking place – mostly involving public servants – in the comments section underneath. Here are just some snippets:

  • “For 8 years I worked at DFAIT, observing and participating in the culture within the walls of a building named after a diplomat that Wikipedia states “is generally considered among the most influential Canadians of the 20th century.” Sadly, the elitism (whether earned or not) is only the cause of a bigger problem; lack of desire to collaborate, and almost no desire to change in an era where the only constant is change.”
  • “…as I left the issue of the FS classification was quietly but passionately part of the watercooler discussion. From my perspective, in spite of a nasty AG report on the dismal state of affairs of HR at DFAIT, the department has more pressing problems, such as credibility with central agencies, a coherent sense of mission and talent attraction and retention.”
  • “I am also a bit puzzled by people who saw your piece as an attack on DFAIT – you’re advocating for human resource reform to improve the department, after all. I’m still not sure why you think DFAIT is required though, or why Canadian foreign policy suffers when departments forumulate it without involving DFAIT.”
  • “It’s good to see that even Craig Weichel, President of PAFSO, is open to your suggestion that it might be good to have more foreign service officers circulate through other government departments…”

Putting the Cart Before the Horse by Peter Cowan

A great blog post about the lessons from implementing social media in a government agency. Peter Cowan – an Open Everything alum – is part of the team at Natural Resources Canada that has been doing amazing work (NRCan is one of the most forward-looking ministries in the world in this regard). Peter’s piece focuses on how misunderstanding the “business case” for social media often trips up large government bureaucracies. This abbreviated but still lengthy quote on why traditional IT business cases don’t work or aren’t necessary is filled with great thoughts and comments:

“They (Social Media tools) are simple and viral and they cost very little to implement so the traditional requirements for upfront business needs definition to control risk and guide investment are not as important. In fact it would take more time to write a proposal and business case than to just put something out there and see what happens.

More importantly though social media are fundamentally new technologies and the best way to understand their business value is to get them into the hands of the users and have them tell you. To a large degree this is what has happened with the NRCan Wiki. Most of the innovative uses of the wiki came from the employees experimenting. They have not come from a clearly articulated business needs analysis or business case done in advance.

In fact, determining business needs in advance of having a tool in hand may actually lead to status quo approaches and tools. There is the famous Henry Ford… quote goes something like “if I had asked people what they wanted in a car they would have said faster horses”. We social media folks usually deploy this quote to highlight the weakness of focusing too much on responding to people’s perceptions of their existing business needs as a determinant of technology solution since people invariably define their needs in terms of improving the way they are already doing things, not how things could be done in a fundamentally new way.

Genius.

Google’s Microsoft Moment by Anil Dash

A fantastic piece about how Google’s self-perception is causing it to make strategically unsound choices at the same time as its public perception may be radically shifting (from cute fuzzy Gizmo into mean nasty Stripe). A thoughtful critique and a great read on how the growth and maturation of a company’s culture needs to match its economic growth. I’ve added Anil Dash to my must-read blogs – he’s got lots of great content.

Open Cities – the Counter Reaction

The Washington Monthly has an interesting piece about how some bureaucracies are reacting (in an admittedly unsurprising way) against open data initiatives. The article focuses on one application, Stumble Safely, which “helps you find the best bars and a safe path to stumble home on” by mashing together DC Crime Data, DC Road Polygons, DC Liquor Licenses, DC Water, DC Parks, and DC Metro Stations.

However, arming citizens with precise knowledge doesn’t appear to make one group of people happy: The Washington, D.C. police department. As the article notes:

But a funny thing has happened since Eric Gundersen launched his application: Stumble Safely has become less useful, rather than more so. When you click on the gray and red crime-indicating dots that have appeared on the map in the past few months, you don’t get much information about what exactly happened—all you get is a terse, one-word description of the category of the incident (“assault,” or “theft”) and a time, with no details of whether it was a shootout or just a couple of kids punching each other in an alley.

This isn’t Gundersen’s fault—it’s the cops’. Because while Kundra and the open-data community were fans of opening up the city’s books, it turned out that the Metropolitan Police Department was not. Earlier this year, as apps like Stumble Safely grew in number and quality, the police stopped releasing the detailed incident reports—investigating officers’ write-ups of what happened—into the city’s data feed. The official reason for the change is concern over victims’ and suspects’ privacy. But considering that before the clampdown the reports were already being released with names and addresses redacted, it’s hard to believe that’s the real one. More likely, the idea of information traveling more or less unedited from cops’ keyboards to citizens’ computer screens made the brass skittish, and the department reacted the way bureaucracies usually do: it made public information harder to get. The imperatives of Government 2.0 were thwarted by the instincts of Government 1.0.

This is just one in a long list of ways that old-style government (1.0) is reacting against technology. The end result, sadly, is that the action taken by the police doesn’t reduce crime; it just reduces the public’s confidence in the police force. This is just a small example of the next big debate that will take place at all levels of government: Will your government try to control information and services, or will it develop trust by being both accountable and open to others building on its work? You can’t have it both ways, and I suspect citizens – particularly creatives – are going to strongly prefer the latter.

This is a crosspost from my Open Cities Blog at CreativeClass.com

Open Data at the City of Vancouver – An Update 16/7/2009


For those interested in the progress around the Open Motion (or Open3 as city staff have started to call it) I’ve a little update.

Last week during a visit to city hall to talk about the motion I was shown a preview of the website the city has created to distribute and share data sets. For those unsure what such a website would look like, the baseline and example I’m measuring the city against is, of course, the Washington DC website. At the moment the city’s prospective website looks more like the (also impressive) beta site the City of Nanaimo set up after ChangeCamp – a little simpler and with a lot fewer data sets, but it is the first step.

As an aside, kudos to the City of Nanaimo team, which has been pushing open data – and especially geo-data – for quite some time, as this must-read Time magazine piece (yes, a must-read Time magazine piece) will attest.

Anyway… back to Vancouver. The fact that the city has a beta website with a (very) modest amount of data ready to launch is a testament to the hard work and effort of the City’s IT staff. Rarely in my work with governments have I seen something move so quickly, so needless to say… I’m quite excited. At the moment, I don’t know when the beta data site will go live – there are still a few remaining issues being worked on – but as soon as it launches I will be writing a blog post.

In the interim, big kudos should also go to the City’s Archives, which posted a number of videos from the archives online and created its own YouTube channel. They received so much traffic over the videos that the servers almost ground to a halt. Awesome, I say. It just goes to show how much interest there is out there.

Also exciting is that my post on how open data makes garbage collection sexy has inspired two local hackers (Luke and Kevin) to scrape the city’s garbage calendar and hand-create digital versions of the city’s garbage maps in order to build the app I spec’ed out in that post. (I’ll have more on that, including links, in a few weeks.) Luke also suggested I start recording other app ideas that come to me, so over at the Vancouver Data Google group – which was created on the fly by local coders in the audience during my and Andrea’s presentation at Open Web Vancouver – I’ve asked people to share their ideas for applications (mobile or desktop) that they’d like to see created with open data.

Sooooo… if there is an app you’d like to see created, please post it to the Google group, send me an email or write it in the comments below. No guarantees that it will be created, but I’m hoping to help organize some hack-a-thons (or, as my city friends prefer… code sprints). Having some ideas for people to sink their teeth into is always helpful.

How Open Data even makes Garbage collection sexier, easier and cheaper

So presently the City of Vancouver only shares its garbage schedule (which it divides into north and south) as a PDF file. This is a pity, as it means that no one can build any apps around it. Imagine a website or iPhone app that mashed up Google Maps with a constantly up-to-date city garbage pickup schedule. With such an application one could:

  • Simply punch in your address and find out your garbage zone (no more guessing if you live on the border of a zone)
  • Get an email or SMS notification 15 minutes, 1 hour, 12 hours (whenever!) before pickup
  • Download the garbage schedule into your calendar using XML, HTML or iCal (a rough sketch of what such a feed could look like follows below)
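As an illustration of the calendar item above – my sketch, not anything the city provides – a small script could turn a machine-readable pickup schedule into an iCal file that residents subscribe to. The zone name and dates below are invented placeholders.

```typescript
import { writeFileSync } from "node:fs";

type Schedule = { zone: string; pickupDates: string[] }; // dates as YYYY-MM-DD

// Hypothetical sample data; a real version would read the city's published schedule.
const sample: Schedule = {
  zone: "North Zone A",
  pickupDates: ["2009-08-03", "2009-08-10", "2009-08-17"],
};

// Build a minimal iCal document with one all-day event per pickup date.
function toICal(schedule: Schedule): string {
  const events = schedule.pickupDates.map((date) => {
    const stamp = date.replace(/-/g, ""); // 2009-08-03 -> 20090803
    return [
      "BEGIN:VEVENT",
      `UID:${stamp}-${schedule.zone.replace(/\s+/g, "-")}@example.org`,
      `DTSTART;VALUE=DATE:${stamp}`,
      `SUMMARY:Garbage pickup (${schedule.zone})`,
      "END:VEVENT",
    ].join("\r\n");
  });
  return [
    "BEGIN:VCALENDAR",
    "VERSION:2.0",
    "PRODID:-//garbage-sketch//EN",
    ...events,
    "END:VCALENDAR",
  ].join("\r\n");
}

writeFileSync("garbage-schedule.ics", toICal(sample));
console.log("Wrote garbage-schedule.ics");
```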


Let’s face it, everyone could do with a garbage day reminder. I can imagine that there are thousands of Vancouverites who’d sign up for an email notification.

Maybe this all seems very simple – nice, but unimportant. But this is about more than just creating convenience. What are the implications of such an application?

For citizens:

  • Let’s face it, for many of us it would be convenient and nice to get a reminder of when the garbage is going to be picked up
  • It would improve citizens’ appreciation for city services
  • It might also increase the likelihood citizens will recycle, as the notification would enable them to better plan and prepare for garbage pickup

For the city & taxpayers:

  • Every day hundreds of Vancouverites forget to put their garbage out for pickup. If garbage isn’t picked up, it piles up. This creates several new risks and costs for the city, including an increased likelihood of rodents and health hazards and an increased risk that some residents will dispose of their garbage inappropriately/illegally
  • When garbage is not put out, expensive city assets (the garbage truck, fuel, garbage men) all pass by unused. This means that taxpayers’ money is not used as efficiently as it could be.
  • Moreover, when garbage isn’t put out there will be twice as much at that house the next week. Multiply that by hundreds of houses and very quickly the city must deploy extra garbage trucks to deal with this unusually large volume of garbage – costing city taxpayers.

What would be the total value/savings of such an application? Hard to gauge. But add up the time saved by citizens better able to plan around garbage pickup, a small reduction in illegal garbage disposal, a small increase in recycling and a slight increase in garbage pickup efficiency, and I’m sure the total value would be in the low millions, with the direct savings for the city in the low $100,000s per year. That is not insignificant – especially if all the city had to do was free the data and allow an intrepid hacker to code up the website.
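One way to sanity-check where figures like “low millions” and “low $100,000s” could come from is a quick back-of-the-envelope calculation. Every input below is my own assumption for illustration, not a number from the city:

```typescript
// Back-of-the-envelope estimate; every value here is an assumed input, not city data.
const subscribers = 30_000;             // households assumed to sign up for reminders
const minutesSavedPerWeek = 5;          // planning time assumed saved per household
const valueOfTimePerHour = 20;          // assumed dollar value of a resident's hour
const weeksPerYear = 52;

const valueToCitizens =
  subscribers * (minutesSavedPerWeek / 60) * valueOfTimePerHour * weeksPerYear;

const extraTruckRunsAvoidedPerWeek = 2; // assumed fewer catch-up truck deployments
const costPerTruckRun = 1_000;          // assumed cost of one extra truck run

const directCitySavings = extraTruckRunsAvoidedPerWeek * costPerTruckRun * weeksPerYear;

console.log(`Rough value to citizens: ~$${Math.round(valueToCitizens).toLocaleString()}/year`); // ~$2.6M
console.log(`Rough direct city savings: ~$${directCitySavings.toLocaleString()}/year`);         // ~$104,000
```

Change the assumptions and the totals move, of course, but the order of magnitude holds up under fairly modest inputs.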

Of course it doesn’t end there. A reliable source tells me the city collects far more data about garbage collection. For example when the driver can’t pick up the garbage can, they make an entry on their device as to why (e.g., it is under a tree). This entry is sent via a wireless connection back to a City database, and includes the highly precise coordinates of the truck at that moment. Then when a resident calls in to find out why the crew did not pick up the garbage can from their residence, the operator can bring up the address on a map and pinpoint the problem.

Such data could also be exposed on the website through an option like “Why didn’t my garbage get picked up?” By sharing this data the city could reduce its call volume (and thus its costs). With open data, the possibilities, savings and convenience are endless.
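To show how lightweight that lookup could be, here is a hedged sketch of the idea. The record shape, field names and sample entries are assumptions, not the city’s actual schema:

```typescript
// Hypothetical record shape for driver-reported exceptions; the real schema would
// live in the city's database described above.
type MissedPickup = {
  address: string;
  date: string;   // YYYY-MM-DD
  reason: string; // driver's note, e.g. "can blocked under a tree"
  lat: number;
  lon: number;
};

const records: MissedPickup[] = [
  { address: "123 Main St", date: "2009-07-14", reason: "can under a low tree branch", lat: 49.26, lon: -123.1 },
  { address: "456 Oak Ave", date: "2009-07-14", reason: "can not out at time of pickup", lat: 49.25, lon: -123.12 },
];

// Answer "Why didn't my garbage get picked up?" for a given address and date.
function whyNotPickedUp(address: string, date: string): string {
  const hit = records.find(
    (r) => r.address.toLowerCase() === address.toLowerCase() && r.date === date,
  );
  return hit
    ? `No pickup at ${hit.address} on ${hit.date}: ${hit.reason}`
    : `No exception recorded for ${address} on ${date} – it may simply have been missed.`;
}

console.log(whyNotPickedUp("123 Main St", "2009-07-14"));
```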

When good companies go bad – How Nokia Siemens helped Iran monitor its citizens

Last week my friend Diederik wrote a blog post titled “Twittering to End Dictatorship: Ensuring the Future of Web-based Social Movements” in which he expressed his concern that (Western) corporations might help oppressive regimes wiretap and spy on their citizens.

Now it appears that his concerns have turned out to be true. As he points out more recently on his blog:

  • The Wall Street Journal reports that Nokia and Siemens have supplied Iran with deep-inspection technologies to develop “one of the world’s most sophisticated mechanisms for controlling and censoring the Internet, allowing it to examine the content of individual online communications on a massive scale”. The  Washington Post also reported this.
  • Siemens has not just sold the “Intelligence Platform” to Iran, but to a total of 60 countries. Siemens calls it “lawful interception”, but in countries with oppressive regimes everything that the government does is lawful.
  • The New York Times reports that China is requiring Internet censor software to be installed on all computers starting from July 1st.

Of course, being Nordic, the Nokia Siemens joint venture that developed and sold the monitoring centre to Iran has a strict code of ethics on its website that addresses issues of human rights, censorship and torture. In theory this should have guided their decision about selling equipment to Iran – obviously it did not.

So Diederik and his friends have started a petition to enable people to voice their concern over the failure of Nokia Siemens to adhere to its own code of conduct by selling advanced technology that helps the government of Iran control its citizens. I hope it takes off…

The Rat Pack of Public Service Sector Renewal

As many of you know I spend a lot of time thinking about public service sector renewal – that’s a wonkish term for renewing the public service. I do it because I think the public service is one of the most important institutions in the country since it affects everything we do, pretty much every day.

Over the past few years I’ve met more and more people who are equally passionate about this issue. Some I’ve met in person, others I’ve just chatted with by email. But over the last 4 years I’ve watched a small group of bloggers – a rat pack of public service sector renewal – emerge. We’re scattered across the country and have come to the issue from different angles, but we all care about how our government is, how it should be, and how we can get from the former to the latter.

This is no easy task. I’m outside of government, so it’s easier for me to speak truth to power. That’s why I’m so impressed with the other rat packers: in pursuit of making government better, some have put their jobs on the line from time to time. I’d encourage you to go check out their blogs and give them a read.

The CPSR rat pack:

Me: as my readers know, my own thinking on public service sector renewal tends to focus on public policy development, and how it is going to be impacted by demographic change, technology, social media, networks and emergent systems.

Nick Charney’s blog CPSRenewal is one of the best blogs on public service sector renewal out there. Nick often does a weekly roundup of CPSR articles and blog posts, interviews with public servants and generally shares his thoughts.

Etienne Laliberte is one of the bravest public servants I know. A couple of years ago he wrote “An Inconvenient Renewal”, in which he shared his thoughts on renewal. Most important, his is probably the only document I’ve seen that treats renewal as a management problem, not a policy problem (something I’ve discussed in the past and intend to talk about again shortly). You can catch him at his blog as well.

A couple of other people I think of as being part of the Rat Pack include Peter Cowan – an OpenEverything alumnus – who’s part of a team doing very interesting work with social media tools at Natural Resources Canada.

Thomas Kearney, who doesn’t blog, but is amazing nonetheless, has been a big part of the work behind GCPEDIA.

There’s Laura Wesley, who’s got a great blog over at Results for Canadians: Measuring Success in Government. Nice to have someone concerned with how we measure success!

And finally there is the outspoken Douglas Bastien at Government of Canada 2.0, ready to tell it as it is and take no prisoners.

I know there are more people than just those I’ve mentioned, but these are the group I know and who’ve always been kind about letting a semi-outsider like me in. If you care about Canadian public service sector renewal (Twitter hashtag #cpsr) then I hope you’ll add their RSS feeds to your reader.

Will Firefox’s JetPack let us soar too high?

Recently Mozilla introduced Jetpack, a Firefox add-on that makes it possible to post-process webpages within the web browser. For the non-techies out there, this means that one can now create small software programs that, if installed, can alter a webpage’s content by changing, adding or removing parts of it before it is displayed on your computer screen.

For the more technically minded, this post-processing of web pages is made possible because JetPack plugins have access to the Document Object Model (DOM). Since the DOM describes the structure and content of a web page, the software can manipulate the webpage’s content after the page is received from the web server but before it is displayed to the user. As a result static web pages, even the ones you do not control, can become dynamic mashups.
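As a simple, hypothetical illustration of what access to the DOM allows – a sketch of the general technique, not actual JetPack code – the snippet below rewrites page content in the browser after it loads. A real JetPack script would use Mozilla’s own APIs, but the underlying DOM manipulation is the same idea.

```typescript
// Sketch of client-side page rewriting (compiled to plain JavaScript in a browser).
// This is not JetPack's actual API – it just shows what DOM access makes possible.
function annotateHeadlines(note: string): void {
  document.querySelectorAll<HTMLElement>("h1, h2").forEach((heading) => {
    const bubble = document.createElement("span");
    bubble.textContent = ` [${note}]`;           // e.g. a counter-spin comment
    bubble.style.backgroundColor = "#ffec99";
    heading.appendChild(bubble);
  });
}

// Run once the page's DOM has been parsed, before the user has finished reading it.
document.addEventListener("DOMContentLoaded", () => {
  annotateHeadlines("annotation added in the browser, after the server responded");
});
```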

This may seem insignificant but it has dramatic implications. For example, imagine a JetPack plugin that overlays a website – say BarackObama.com or FoxNews.com – with text bubbles that counter-spin a story when your mouse hovers over it. The next Republican nominee could encourage supporters to download such a hypothetical plugin and then direct them to Obama’s website, where each story could be re-spun and links for donating money to the Republican campaign could be proffered. They would, in short, dynamically use Obama’s webpage and content as a way to generate money and support. TPM could create a similar Jetpack plugin for the FoxNews website which would do something similar to the title and body text of articles that were false or misleading.

Such plugins would have a dramatic impact on the web experience. First, they would lower the cost of staying informed. Being informed would cease to be a matter of spending time searching for alternative sources and become a matter of installing the appropriate JetPack plugin. Second, every site would now be “hijackable” in that, with the right plugin, a community could evolve that alters its content without the permission of the site owner/author. On the flip side, it could also provide site owners with powerful community engagement tools: think open source editing of newspapers, open source editing of magazines, open source editing of television channels.

The ultimate conclusion, however, is that JetPack continues to tilt power away from website creators toward viewers. Webpage owners will have still less control over how their websites get viewed, used and understood. Effectively, anyone who can persuade people to download their JetPack plugin can reappropriate a website – be it BarackObama.com, FoxNews.com, eBay, or even little old eaves.ca – for their own purposes, without the permission of the website owner. How the web ecosystem, and website developers in particular, react to this loss of control will be interesting. Such speculation is difficult. Perhaps there will be no reaction. But one threat is that certain websites will place content within proprietary systems like Flash, where it would be more difficult for JetPack to alter their contents. More difficult to imagine, but worth discussing, is that some sites might simply not permit Firefox browsers to view their site.

In the interim, three obstacles need to be overcome before JetPack realizes its full potential. First, only a relatively small community of technically minded people can currently develop JetPack add-ons; once Jetpack becomes an integral part of the Firefox browser this community will grow. Second, at present installing a JetPack plugin triggers a stern security warning that will likely scare many casual users away. Third, users need a way to tell trustworthy plugins from untrustworthy ones; Mozilla has hinted at developing a trusted-friends system to help users determine whether a plug-in is safe, and such trust systems will probably be necessary to make JetPack a mainstream technology. If the developer community can be grown and a system for sorting out trusted and untrustworthy plugins can be developed, then Jetpack might redefine our web experience.

We are in for some interesting times with the launch of Firefox 3.5 and new technologies like JetPack around the corner!

Jetpack is available at jetpack.mozillalabs.com

Diederik van Liere helped write this post and likes to think the world is one big network.