Background – how we got here
Over the past few years I’ve spent a great deal of time thinking about how we can improve both the efficiency of open source communities and the contributor experience. Indeed, this was part of the focus of my talk at the Mozilla Summit last summer. For some years Diederik Van Liere – now with the Wikimedia Foundation’s metrics team – and I have played with Bugzilla data a great deal to see if we could extract useful information from it. This led us to engage closely with some members of the Mozilla Thunderbird team – in particular Dan Mosedale, who immediately saw its potential and became a collaborator. Then, in November, we connected with Daniel Einspanjer of Mozilla Metrics and began to imagine ways to share data that could create opportunities to improve the participation experience.
Yesterday, thanks to some amazing work by the Mozilla Metrics team (listed at the bottom of this post), we started sharing some of this work at the Mozilla all hands. Specifically, Daniel demoed the first of a group of dashboards that describe what is going on in the Mozilla community and that, we hope, can help enable better community management. While these dashboards deal with the Mozilla community in particular, I nonetheless hope they will be of interest to open source communities more generally. (Presently the link is only available to Mozilla staffers until the dashboard goes through security review – see more below, along with screenshots – and you can see a screencast here.)
Why – the contributor experience is a key driver for success of open source projects
My own feeling is that within the Mozilla community the products, like Firefox, evolve quickly, but the process by which people work together tends to evolve more slowly. This is a problem. If Mozilla cannot evolve and adopt new approaches with sufficient speed then potential and current contributors may go where the experience is better and, over time, the innovation and release cycle could itself cease to be competitive.
This task is made all the more complicated since Mozilla’s ability to fulfill its mission and compete against larger, better funded competitors depends on its capacity to tap into a large pool of social capital – a corps of paid and unpaid coders whose creativity can foster new features and ideas. Competing at this level requires Mozilla to provide processes and tools that can effectively harness and coordinate that energy at minimal cost to both contributors and the organization.
As I discussed in my Mozilla Summit talk on Community Management, processes that limit the size or potential of our community limit Mozilla. Conversely, making it easier for people to cooperate, collaborate, experiment and play enhances the community’s capacity. Consequently, open source projects should – in my opinion – constantly be looking to reduce or eliminate transaction costs and barriers to cooperation. A good example of this is how GitHub showed that forking can be a positive social contribution. Yes, it made managing the code base easier, but what it really did was empower people. It took something everyone thought would kill open source projects – forking – and made it a powerful tool of experimentation and play.
How – Using data to enable better contributor experience
Unfortunately, it is often hard to quantitatively assess how effectively an open source community manages itself. Our goal is to change that. The hope is that these dashboards – and the data that underlies them – will provide contributors with an enhanced situational awareness of the community so they can improve not just the code base, but the community and its processes. If we can help instigate a faster pace of innovation and change in the processes of Mozilla, then I think this will both make it easier to improve the contributor experience and increase the pace of innovation and change in the software. That’s the hope.
That said, this first effort is a relatively conservative one. We wanted to create a dashboard that would allow us to identify some broader trends in the Mozilla Community, as well as provide tangible, useful data to Module Owners – particularly around identifying contributors who may be participating less frequently.
This dashboard is primarily designed to serve two purposes. The first is to showcase what dashboards could be, with the hope of inspiring Mozilla community members to use it and, more importantly, to inspire them to build their own. The second is to provide module owners with a reliable tool with which to more effectively manage their part of the community. So what are some of the ways I hope this dashboard might be helpful? One important feature is the ability to sort contributors by staff or volunteer. An open source community’s volunteer contributors should be a treasured resource. One nice thing about this dashboard is that you can not only see just volunteers, but also get a quick sense of those who haven’t submitted a patch in a while.
In the picture below I de-selected all Mozilla employees so that we are only looking at volunteer contributors. Using this view we can see which volunteers are starting to participate less – note the red circle marked “everything okay?” A good community manager might send these people an email asking if everything is okay. Maybe they are moving on, or maybe they just had a baby (and so are busy with a totally different type of patch – diapers), but maybe they had a bad experience and are frustrated, or a bunch of code is stuck in review. These are things we would want to know, and know quickly, as losing these contributors would be bad. In addition, we can also see who the emerging power contributors are – they might be people we want to mentor, or connect with mentors, in order to solidify their positive association with our community and speed up their development. In my view, these should be core responsibilities of community managers, and this dashboard makes it much easier to execute on these opportunities.
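The decline-spotting idea above can be sketched in a few lines. This is a minimal illustration, not the dashboard’s actual implementation: the contributor names, dates and the `flag_declining` helper are all invented, and real Bugzilla extracts would of course be far larger.

```python
from collections import defaultdict
from datetime import date

# Hypothetical sample of (contributor, patch submission date) records,
# the kind of data one could extract from Bugzilla attachments.
patches = [
    ("alice", date(2011, 1, 5)), ("alice", date(2011, 1, 20)),
    ("alice", date(2011, 2, 10)),
    ("bob", date(2011, 1, 3)), ("bob", date(2011, 1, 15)),
    ("bob", date(2011, 2, 2)), ("bob", date(2011, 3, 28)),
]

def monthly_counts(records):
    """Count patches per contributor per (year, month)."""
    counts = defaultdict(lambda: defaultdict(int))
    for who, when in records:
        counts[who][(when.year, when.month)] += 1
    return counts

def flag_declining(records, recent, prior):
    """Return contributors active in `prior` month but silent in
    `recent` month -- the 'everything okay?' list."""
    counts = monthly_counts(records)
    flagged = []
    for who, by_month in counts.items():
        if by_month.get(prior, 0) > 0 and by_month.get(recent, 0) == 0:
            flagged.append(who)
    return sorted(flagged)

# Alice submitted patches in February but none in March:
print(flag_declining(patches, recent=(2011, 3), prior=(2011, 2)))
```

A real version would want a longer look-back window and some smoothing, so that a single quiet month doesn’t trigger an email.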
Below you can see how zooming in more closely allows you to see trends for individual contributors over time. Again, sometimes large changes or shifts happen for reasons we know of (they were working on features for a big release and it’s shipped), but where we don’t know the reasons, maybe we should pick up the phone or email this person to check that everything is okay.
Again, if this contributor had a negative experience and was drifting away from the community – wouldn’t we want to know before they silently disappeared and moved on? This is in part the goal.
Some of you may also like the fact that you can dive a little deeper by clicking on a user to see what specific patches that user has worked on (see below).
Again, these are early days. My hope is that other dashboards will provide still more windows into the community and its processes so as to show us where there are bottlenecks and high transaction costs.
Some of the features we’d like to add to this or other dashboards include:
- a code-review dashboard that would show how long contributors have been waiting for code-review, and how long before their patches get pushed. This could be a powerful way to identify how to streamline processes and make the experience of participating in open source communities better for users.
- a semantic analysis of Bugzilla discussion threads. This could allow us to flag threads that have become unwieldy, or where people are behaving inappropriately, so that module owners can better moderate or resolve them
- a dashboard that, based on your past bugs and some basic attributes (e.g. skillsets) informs newbies and experienced contributors which outstanding bugs could most use their expertise
- Ultimately I’d like to see at least three core dashboards emerge – one for contributors, one for module owners and one for overall projects – as well as user-generated dashboards developed using Mozilla Metrics data.
- Access to all the data in Bugzilla so the contributors, module owners, researchers and others can build their own dashboards – they know what they need better than we do
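As a rough illustration of the first item on that wish list – measuring how long patches sit waiting for review – here is a minimal sketch. The attachment IDs and timestamps are invented; real data would come from Bugzilla’s attachment and review-flag history.

```python
from datetime import datetime

# Invented records: (attachment id, submitted, reviewed) timestamps.
# `None` means the patch is still waiting for review.
reviews = [
    (101, datetime(2011, 3, 1, 9, 0), datetime(2011, 3, 2, 17, 0)),
    (102, datetime(2011, 3, 1, 9, 0), datetime(2011, 3, 15, 12, 0)),
    (103, datetime(2011, 3, 5, 8, 0), None),  # still in the queue
]

def wait_days(submitted, reviewed, now):
    """Days a patch waited (or has been waiting) for review."""
    end = reviewed if reviewed is not None else now
    return (end - submitted).total_seconds() / 86400.0

now = datetime(2011, 3, 20, 0, 0)
waits = {aid: wait_days(s, r, now) for aid, s, r in reviews}

# Longest waits first -- candidates for a "stuck in review" panel.
for aid in sorted(waits, key=waits.get, reverse=True):
    print(aid, round(waits[aid], 1))
```

Aggregating these waits per module or per reviewer is where the streamlining opportunities would start to show up.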
What’s Next – how do I get access to it and how can I contribute
At the moment the dashboard is going through security review which it must complete before being accessible. Our hope is that this will be complete by the end of Q2 (June).
More importantly, we’d love to hear from contributors, developers and other interested users. We have a standing call every other Friday at 9am PST where we discuss development issues with this and the forthcoming code-review dashboard, contributors’ needs and wanted features, as well as use cases. If you are interested in participating on these calls please either let me know, or join the Mozilla Community Metrics Google group.
Again, a huge shout out is deserved by Daniel Einspanjer and the Mozilla Metrics team. Here is a list of contributors, both so people know who they are and in case anyone has questions about specific aspects of the dashboard:
Pedro Alves – Team Lead
Paula Clemente – Dashboard implementor
Nuno Moreira – UX designer
Maria Roldan – Data Extraction
Nelson Sousa – Dashboard implementor
We would love to see you take a similar crack at the Eclipse Foundation’s Bugzilla database! I look forward to seeing the code being made available.
You may also want to take a look at the open metrics that we already publish at http://dash.eclipse.org/dash/commits/web-app/ – if you think there’s an opportunity to collaborate further on that, please let us know.
And finally, the next time you’re visiting Ottawa, you should drop by and visit Eclipse HQ. It would be good to compare notes on community metrics. You can reach me at mike at eclipse dot org.
Great post, David. Really fun / inspiring to see inside the tent. :)
Pingback: Better Open Source Communities Through Data | JetLib News
Pingback: Developing Community Management Metrics and Tools for Mozilla | eaves.ca « VisLives!
Really excited to see this stuff coming together, David.
I quite like how you’re mining productivity and performance data from your SCM and tracker. Most projects that use bug tracking software have a great source of project productivity data that’s just waiting to be used. What I’d like to see, though, is the addition of some statistical views of your data set. It’d be nice to be able to pull up a c-chart for the contributions-by-contributor data for better assessment of trends/changes from period to period. Also, it would be nice to provide a correlation metric between each contributor’s contributions and the overall project contributions. If overall project contributions are decreasing at the end of a release cycle, then most contributors’ contributions should also be decreasing. Anyway, I’ll be keeping my eye on this project.
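For readers unfamiliar with the c-chart this commenter mentions: it plots event counts per equal-sized period against control limits of c̄ ± 3√c̄, so points outside the limits signal a real shift rather than noise. A minimal sketch, with invented monthly contribution counts:

```python
import math

# Hypothetical per-month contribution counts for one contributor.
counts = [12, 15, 9, 14, 11, 13, 0, 12]

c_bar = sum(counts) / len(counts)               # center line
ucl = c_bar + 3 * math.sqrt(c_bar)              # upper control limit
lcl = max(0.0, c_bar - 3 * math.sqrt(c_bar))    # lower control limit

# Months outside the limits are worth a closer look (or an email).
out_of_control = [(i, c) for i, c in enumerate(counts) if c > ucl or c < lcl]
print(round(c_bar, 2), round(ucl, 2), round(lcl, 2), out_of_control)
```

Here the zero-contribution month falls below the lower limit, which is exactly the kind of signal the “everything okay?” view in the dashboard is after.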
Of the open-source communities I’ve participated in, outside those on GitHub, Drupal has been the most pleasurable experience. They are doing a lot of great community management. Angie Byron (webchick) is the Drupal 7 maintainer and now lives in Vancouver, in case you were interested in exchanging ideas on community management.
David – this is spot on and love seeing this. We should hopefully do the same with our Mifos contributor data in JIRA. Our Mifos community (open source MIS for microfinance) is really two-part – both contributor and implementers of our software. I’ve been trying to come up with better metrics and tools to analyze and track adoption and interaction amongst our ecosystem – specialists and users who are providing and giving support for our software.
Pingback: Assessing the familiar by means of the unfamiliar
I posted some snippets from your blog and thoughts about how this is relevant to our Mifos open source community – I welcome you all to check it out: http://bit.ly/gE6eyv
A metric that I would like to see is how many votes the bugs I fixed had, also how many dupes. I patch bugs to help others and it would be nice to see the number in black and white.
Is this dashboard itself open source? I’d love to be able to run this analysis on some of my projects.
concur. love to see this opened, if possible.
I’m interested and am asking to join the relevant Google Group.
Did you already know about what the MeeGo folks are doing along similar lines? I found out via Dawn Foster.
Sumana, super interesting. Am checking out the MeeGo stuff right now. Hope you join the Google group, we are making some good progress.
Pingback: OSCON 2011 « Ted Leung on the Air
Pingback: Study: Android is Least Open of Open Source Mobile Platforms
Pingback: New Work | colaboratorie mutopo
Pingback: New Work | mutopo
Pingback: OSCON 2011 | My Blog