"Proving return on investment is every bit as difficult for Enterprise 2.0 projects as it is for KM projects generally. Since we tend to get what we measure, what should we measure and how can we report the results in a fair and meaningful way? In this session, we’ll examine the basics of metrics, how to measure productivity rather than busy-ness, how to measure engagement, velocity and impact of information flows, and other ways to meaningfully mine data."
Clark R. Cordner - Orrick, Herrington & Sutcliffe LLP
Charlotte Herring - Chief, Information Technology Division and Deputy Chief Information Officer, The US Judge Advocate General's ("JAG") Corps
Moderated by V. Mary Abraham
Lisa Denissen and Steven Levy both helped out with the presentation.
What Are Metrics
Clark introduced metrics as a way to "measure progress and demonstrate how your project advances firm strategy."
Steven Levy appeared by video, with permission, to put metrics in formal project management terms. He used the example of the Soviet screw factory that makes giant screws no one wants because it is assessed by how much material it consumes. That is an input metric, not an output metric.
Substitute metrics don't measure exactly what you want, and the two can diverge significantly. A good client metric is the likelihood that the client will use your firm again: repurchase intent is the real measure, while client satisfaction is a substitute metric for it. The question is what the correlation is between the output you measure and the objective you care about.
Some things are very hard to quantify, and you may need to quantify using satisfaction surveys.
Mary said that too often KM practitioners consider metrics too squishy and fail to push the analysis and identify all the things that we could actually track.
Clark recommends the blog Adam Smith, Esq., which talks a lot about metrics. You have to be thoughtful about what you spend your time measuring and communicating. A metric is like a lens: it's like the optician who asks whether a given lens is better or worse (*flick*), better or worse (*flick*).
Have clear objectives and ensure your effort advances those objectives. Measure the factor most likely to strongly correlate with "success." If your success is shorter cycle time on a document, then you measure that.
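As a rough illustration (not from the session, with invented data), the document cycle-time example is simple to compute once you have opened/closed dates for each matter:

```python
from datetime import date
from statistics import mean

# Hypothetical document records: (opened, closed) dates. All data invented.
documents = [
    (date(2011, 3, 1), date(2011, 3, 8)),
    (date(2011, 3, 2), date(2011, 3, 5)),
    (date(2011, 3, 10), date(2011, 3, 21)),
]

# Cycle time in days for each document, then the average across documents.
cycle_days = [(closed - opened).days for opened, closed in documents]
average_cycle = mean(cycle_days)
```

The point is that the metric itself is trivial; the hard part, as the panel emphasized, is deciding that cycle time is in fact the output that correlates with your objective.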
Business Pain Points
LTC Charlotte Herring says that the only way to have a successful KM program is to talk to lots of people in your organization. What are the pain points? They will vary among the senior partner, the junior associate, the secretary, and the finance administrator. Come up with tools that address those pains, then implement them.
The JAG Corps has had 650 attorneys deployed as a result of the wars in Iraq and Afghanistan. Deployed attorneys sometimes lack internet access and can't always rely on technology. KM is not just technology; some metrics have nothing to do with numbers, and KM can consist of two people talking.
What denotes health and sickness? In a firm, it might be profit margin. The JAG Corps does not track time and doesn't care about money.
Consult with friendly people who can tell you if your metrics might make sense to your stakeholders.
Dangers with Metrics
You are only as credible as your metrics are reliable. Unconscious biases may lead you to emphasize the wrong thing. Be willing to acknowledge alternative interpretations. Numbers can be quite dangerous.
1. JAG's Automated Trial Process
For the JAG Corps, criminal work is a statutory obligation. JAGC Military Justice Online is a web-based enterprise application that handles military justice from investigation through post-trial.
She started developing this new application in 2007. Everyone moves every two or three years, and each move meant a different set of rules, since practices varied by location. The system was designed to establish one consistent process across the JAG Corps.
Stakeholders were very broad. The client is the institution of the US Army. A particular commander is not the client.
The commander needs information sufficient to deal with the soldier. Congress wants to know numbers of offenses and convictions. The JAG Corps wants to track how the system is working.
They succeeded in shortening the time it takes to process a charge. Efficient JAG procedures are more just ("justice delayed is justice denied"), and quicker processing reduces error.
[It was really great to hear the JAG Corps perspective, so different from a firm litigator's perspective yet still with the same client-service and zealous-advocacy orientation. I wonder if big-firm pro bono work could benefit from analysis of comparable metrics.]
2. Portal / Intranet Rollout Metrics
Most of the audience had some sense of how to analyze success of a portal. Typical metrics on the slide and/or raised by the audience included page visits, clicks on content, reduction in certain types of RFI emails, frequency of visits, number of unique visitors, and so forth.
Measurement of an Activity Stream (such as internal Twitter, Yammer, Google Reader, or an RSS feed system) is more exotic. One can measure Activity Streams by the number of users. You can also measure penetration among managers and other levels of the firm hierarchy, or look at whether the stream is "flattening" the hierarchy. You could track virality, or usage over time. Demographics of adoption also matter. One could also measure the pace of conversations and the type of communication (social or professional). How does it stack up to email?
I suggested that you might measure success by the number of links sent. One can also track time of usage. Often people will send links during commuting time as part of the transition to home life. (I've found that I often tweet during commutes).
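Two of the stream metrics mentioned above, penetration by level of the hierarchy and pace of conversation per user, can be sketched as follows (my own illustration; the roster, levels, and user names are all invented):

```python
from collections import Counter

# Hypothetical headcount by hierarchy level and activity-stream posts,
# recorded as (user, level). All data invented.
roster = {"partner": 10, "associate": 40, "staff": 25}
posts = [
    ("p1", "partner"), ("a1", "associate"), ("a2", "associate"),
    ("a1", "associate"), ("s1", "staff"),
]

# Penetration: the share of each level that has posted at least once.
active = {}
for user, level in posts:
    active.setdefault(level, set()).add(user)
penetration = {lvl: len(active.get(lvl, set())) / n for lvl, n in roster.items()}

# Pace of conversation: posts per active user.
posts_per_user = Counter(user for user, _ in posts)
```

Comparing penetration across levels is one concrete way to ask whether the stream is "flattening" the hierarchy or merely echoing it.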
Clark also spoke briefly about metrics for Orrick's Public Finance's "Online Closing System." I did not catch the substance of it. If someone would comment...
3. Rice Metrics
Finally, Mary raised a "bowl of rice" metaphor for thinking about metrics. It's easy to quantify a bowl or sack of rice by the number of grains or the weight. It's perhaps more useful to think about the value of someone getting a meal, or the value that person can add as a result of being fed for however long.