Recently, there was a discussion thread on the com-prac list about the “death of a community,” along with a follow-up discussion about whether and how CoPs should capture discussion-produced knowledge.
I found these to be very interesting and thought-provoking discussions. In this post, I will cover two aspects of them – the retiring of a community, and a case study of how a community centered on a mailing list meets the challenge of knowledge capture.
Before getting into the details – I wanted to (re-)state that I recognize that a community is (much) more than a mailing list – community members interact in many ways, some online, some in “real space”. That being said, I also know that for many communities the tool of choice for group communication is a mailing list, so in this post, I will write about issues related to the use of mailing lists, though the ideas can be transferred to other means of electronic exchange. As John D. Smith notes in the second thread:
“All of the discussion about summarization so far assumes that a community almost exclusively lives on one platform. As Nancy alluded to, I think the reality is quite a bit more messy. Note the private emails between Eric and Miguel that were mentioned in this thread. We ourselves interact in LOTS of different locations.”
In other words, even if you could solve the knowledge capture challenge for one mode of discussion (mailing lists) you are still likely missing out on a lot of the learning and knowledge sharing going on in the community. Keep that in mind!
As I’ve written about before, within the context of my current employer’s community program, mailing lists and their related archives are an important part of our community of practice initiative (and, by extension, our KM program). We have not developed a formal means to retire (or “execute,” in the terms used in the first thread mentioned above) a community, but we do have a formal process for retiring mailing lists. While the following is about mailing lists, I think the concepts can scale up to any community – though doing so might require aggregating similar insights about the other channels the community uses.
Within our infrastructure, many of the existing mailing lists are associated with one (or more) communities, and we provide a simple means for anyone to request a new mailing list. There is a very light review process, primarily focused on ensuring that the requested list is different enough from existing lists and does not have such a narrow topic space that it will likely go under-utilized. This means that over time we can end up with a lot of mailing lists. Without some regular house-cleaning, this situation can have a very negative impact on a user’s discovery process – hundreds and hundreds of mailing lists mean a lot of confusion.
One way we grapple with this is to use the communities as a categorization of mailing lists. Instead of leaving a user with hundreds of mailing lists to wade through, we encourage them to look for a community in which they’re interested and, through that community, find the associated mailing lists. This normally reduces the number of mailing lists to consider to a small handful.
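To make the categorization idea concrete, here is a toy sketch – the community names, list names, and data model are hypothetical, not our actual directory:

```python
# Hypothetical sketch: narrowing list discovery by community.
# Community and list names here are invented for illustration.
communities = {
    "storage": ["san-users", "backup-ops", "archive-tools"],
    "java-dev": ["java-users", "jvm-tuning"],
}

def lists_for(community: str) -> list[str]:
    """Narrow hundreds of lists down to the handful tied to one community."""
    return communities.get(community, [])

print(lists_for("java-dev"))  # -> ['java-users', 'jvm-tuning']
```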
However, we still needed a house-cleaning process, so several years ago we set up the following:
In our environment, doing this once a year typically reduces the count of lists by about 10% – though the total number of lists has remained remarkably stable over time, which suggests we see roughly that same amount of growth over the following year. On the other hand, if we did not proactively review and retire lists like this, we would be facing an ever-growing roster of mailing lists, making it harder for everyone to find the lists that are engendering valuable discussions.
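For illustration, here is a minimal sketch of how such a yearly pass might flag retirement candidates – the archive format and the twelve-month threshold are my assumptions, not our actual tooling:

```python
from datetime import datetime, timedelta

# Hypothetical sketch of a yearly house-cleaning pass: flag lists with no
# traffic in the review window as candidates for retirement. The field
# names and the 12-month threshold are assumptions for illustration.
REVIEW_WINDOW = timedelta(days=365)

def retirement_candidates(lists, now=None):
    """Return lists whose most recent post falls outside the review window."""
    now = now or datetime.now()
    return [ml for ml in lists if now - ml["last_post"] > REVIEW_WINDOW]

archive = [
    {"name": "san-users", "last_post": datetime(2008, 5, 1)},
    {"name": "legacy-tools", "last_post": datetime(2006, 11, 12)},
]
for ml in retirement_candidates(archive, now=datetime(2008, 6, 1)):
    print(f"{ml['name']}: no posts in over a year - review for retirement")
```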
Or… how to lift knowledge out of a community’s ongoing discussion into a more reusable form.
If a community uses a tool like a mailing list to engender discussion and knowledge sharing, how does the community capture “nuggets” of knowledge from the discussion in a more easily digestible form? Does the community even need to (perhaps not, given a sophisticated enough means of finding information in the archives)?
I have no magic solution to this problem, but I did find another comment to be very illustrative of one aspect of the original discussion – who “owns” the archives of a community’s discussion, and what is the value of those archives? Even in their raw form, why do those archives have value? As Nancy White notes:
“I suspect that only a small percentage of the members (over time) would actually use the archives. But because they hold the words of members, there may be both individual and collective sense of ownership that have little to do with “utility.””
The rest of this post will be a brief description of a knowledge capture process I’m very familiar with – though I’m not sure how well it will transfer to other domains. For this description, I’m going completely outside the enterprise, to a community of which I’m a member that revolves around a table-top fantasy war game named Warhammer.
A bit of background: Warhammer is a rather complex game, with a rulebook that weighs in at several hundred pages and about a dozen additional books that provide details on the various types of armies players can use. All told, that is probably something like 1,000 pages describing the rules and background of the game. Given the complexity of the game, it is very common that during any given game the players will run into situations not covered well by the rules – usually areas involving interactions between the special rules of the armies being played. In the many online forums and mailing lists that exist, one of the most frequent types of discussion revolves around these situations and how to interpret the rules. Many of the same questions come up repeatedly – obvious fodder for an FAQ.
(As an aside, given that Warhammer is published and sold by a company – Games Workshop – one could argue that they should publish all of the relevant FAQs. They do publish FAQs and errata, but they do so at a sporadic pace at best and do not address many of the frequently asked questions.)
One particular Warhammer-related community of which I’m a member – the Direwolf (DW) community – has established a pretty well defined means to gather these FAQs and publish them back to the Warhammer community at large. A brief overview of the process:
Netting it out: a community-selected subset of the community monitors the discussion for questions in their areas of expertise, vets an answer with the rest of the FAQ council, and then the FAQ documentation is updated as appropriate.
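For those who like to see a process as code, here is a hypothetical sketch of that lifecycle – the states, fields, and names are my own illustration, not anything the Direwolf council actually runs:

```python
from dataclasses import dataclass

# Hypothetical sketch of the lifecycle just described: a question is raised
# on the list, a council member drafts an answer, the council vets it, and
# the keeper publishes it. States and fields are invented for illustration.
STATES = ("raised", "drafted", "vetted", "published")

@dataclass
class FaqEntry:
    question: str
    answer: str = ""
    state: str = "raised"

    def advance(self, answer: str = "") -> None:
        """Move the entry one step through the lifecycle."""
        if answer:
            self.answer = answer
        nxt = min(STATES.index(self.state) + 1, len(STATES) - 1)
        self.state = STATES[nxt]

entry = FaqEntry("Does a unit's special rule override the core movement rules?")
entry.advance("Yes - the army book's specific rule takes precedence.")  # drafted
entry.advance()  # vetted by the council
entry.advance()  # published by the keeper
print(entry.state)  # -> published
```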
The process is pretty straightforward, but the value of the effort is reflected in two facts: the game publisher now very commonly uses input from the Direwolf FAQ council when considering its own responses to FAQs, and many players from around the world use the Direwolf FAQ to ensure a consistent interpretation of those “fuzzy” areas of the game. A true value-add for the Warhammer community at large.
That being said, this process does take quite a bit of energy and commitment – especially on the part of the “keeper” of the documentation – to keep things up to date. In this case, I believe the value-add for members of the council is knowing that they are contributing to the Warhammer community at large, and that the work also pays off in their own play of the game.
My last several posts have focused on various aspects of community metrics – primarily those derived from the use of one particular tool (mailing lists) within our communities. While quite fruitful from an analysis perspective, these are not the only metrics we’ve looked at or reported on. In this post, I’ll provide some insights on other metrics we’ve used, in case they might be of interest.
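For context, here is a minimal sketch of the kind of mailing-list-derived metric those earlier posts covered – the (sender, timestamp) message format is an assumption for illustration:

```python
from collections import Counter, defaultdict
from datetime import datetime

# A minimal sketch of a mailing-list-derived metric: message volume and
# distinct posters per quarter. The message format is assumed.
messages = [
    ("alice@example.com", datetime(2008, 1, 15)),
    ("bob@example.com", datetime(2008, 2, 3)),
    ("alice@example.com", datetime(2008, 4, 20)),
]

def quarter(ts: datetime) -> str:
    return f"{ts.year}-Q{(ts.month - 1) // 3 + 1}"

volume = Counter(quarter(ts) for _, ts in messages)
posters = defaultdict(set)
for sender, ts in messages:
    posters[quarter(ts)].add(sender)

for q in sorted(volume):
    print(f"{q}: {volume[q]} posts from {len(posters[q])} distinct posters")
```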
Before going on, though, I also wanted to highlight what I’ve found to be an extremely thorough and useful guide covering KPIs for knowledge management from a far more general perspective than just communities – How to Use KPIs in Knowledge Management by Patrick Lambe. I would highly recommend that anyone interested in measuring and evaluating a knowledge management program (or a community of practice initiative specifically) read this document for an excellent overview of a variety of areas. Go ahead… I’ll wait.
OK – now that you’ve read that very thorough list, I will also direct you to the blog of Miguel Cornejo Castro, who has published on community metrics. I know I’ve seen his paper on this topic before, but in digging just now I could not come up with a link to it. Hopefully, someone can provide a pointer.
UPDATE: Miguel was kind enough to provide the link to the paper I was recalling in my mention above: The Macuarium Set of CoP Measurements. Thanks, Miguel!
If you can provide pointers to additional papers or writings on metrics, please comment here or on the com-prac list.
With that aside, here are some of the additional metrics we’ve used in the past (when we were reporting regularly on the entire program, we generally did so quarterly, to give you an idea of the span we looked at each time we assembled these):
This is my last planned post on community metrics for now, though I will likely return to the topic in the future. I hope these posts have been interesting and have provided food for thought for your own community programs or efforts.