Assessing knowledge work is back on my menu for this year and I need to start somewhere simple(r): social web metrics.
Rather than focus on the high end of monitoring and evaluation (M&E) of knowledge work, I’d like to look into the web metrics currently in use and understand what they really capture, what they fail to capture, what problems they pose, how they link together and what we could do with them.
The social media analytics framework below proposes a good entry point to this exploration. I will come back at a later stage to such analytical frameworks.
The social web metrics we have at our disposal to assess knowledge work are related along a chain from attention to action (the famous [social] AIDA again) – or from content to collective intelligence as suggested above. First comes discovering a particular resource; last comes using it, appreciating its use and being transformed (at scale) by it. Each resource, each piece of content, hopes to tick as many of these boxes as possible. Each comes with a potentially useful insight, but also limitations… and mitigations.
Here is an overview of some objectives we might legitimately have with our content (which metrics try, mostly insufficiently, to capture).
Find me!
Sometimes we just bump into a site or resource on the web while looking for something else… A page view reflects that. Page views can thus be intentional (effectively linked to your content/focus) or accidental (someone ends up on a web page after looking for a term that is only vaguely connected with that content). An instance of this: I suspect quite a few visitors to my blog are actually looking for information about the gay podcast ‘the feast of fools’ but end up on this blog post – no connection other than the name. Then again, people do come across your content for good reasons too.
Limitations and mitigation: Page views are thus not entirely helpful. And in case you didn’t know by now, a hit is really not a useful metric, even though this stage is all about finding stuff online.
The remedy is to properly (meta-)tag your content, add descriptions to your photos and link to relevant related content. Linking is the currency of search engine optimisation: the more others link to you, the more likely people will find your content genuinely related to their focus, in relation to specific search terms.
Grab my attention!
Landing on a page or resource is the first step. Attracting our curiosity as online visitors is the second, and it is not straightforward given our eight-second attention span. (Intentional) page views are still the main metric here. But so are retweets on Twitter.
Limitations and mitigation:
This is where a good title comes in handy (one of the many useful tips Ian Thorpe shares from his blogging experience). But a retweet in particular doesn’t mean that the person re-tweeting the page/resource actually liked it… or even read it. These visitors just seemed to like your shop window’s look and feel! Make sure they like your content for the right reasons beyond that sweet first impression. All the ‘Find me’ advice applies here too!
Like my content!
Ok, now people have checked your content. And they enjoy it! They ‘like’ it. Or they +1 it, or they rate it… There are various ways to show appreciation for content. Perhaps the most valuable is to comment on it and show appreciation that way. It’s useful feedback, provided it’s genuine.
Limitations and mitigation: The danger is that some people ‘like’ just because the like button is easy to push, with or without checking the content in the first place (see the shop-window problem above). The other problem is that a like gives no indication as to why they like your content (perhaps the tone, the image you chose, or the serendipity that led them to it at the very moment they were looking for something similar). Most liking metrics are only partly useful, unless a certain volume of these signals is aggregated across various collections and starts indicating trends.
Focused comments, however, should be encouraged, as they help you find out why people liked your content and let you engage with your audience one step further…
Pass it on to others!
If people liked your content, perhaps they didn’t rate it (most people find giving feedback a daunting step) but they might have shared it with others. Metrics here include links to your content, social shares (re-tweets are a case in point; Facebook shares, Google+ shares and email shares are other examples), citations of your work, etc. People might be sharing a link to your content or the full content (re-blogging is an indirect metric of sharing here).
Limitations and mitigation: The same danger of people sharing without having checked your content still looms. But sharing is generally a better indication of appreciation for your content, especially when it is shared in quantity and with quality. Pay attention to who shares your content: trusted and valued sources are great indicators of its quality. I am not aware of tools that track the sharing of content with a specific breakdown of the popularity of sharing sources, but that would be useful.
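As a toy sketch of the kind of share-source breakdown I have in mind – all source names, share events and trust weights below are invented for illustration – one could tally who shares the content and weight those tallies by how much each source is trusted:

```python
from collections import Counter

# Hypothetical share events: (source, trust_weight). Both the sources and
# the weights are made up -- a real tool would pull these from share logs.
shares = [
    ("trusted-blog.example", 3),
    ("random-aggregator.example", 1),
    ("trusted-blog.example", 3),
    ("colleague-on-twitter", 2),
]

# Raw popularity: who shares your content most often?
counts = Counter(source for source, _ in shares)

# Weighted view: shares from trusted, valued sources count for more.
weighted = Counter()
for source, weight in shares:
    weighted[source] += weight

print(counts.most_common())    # most frequent sharers first
print(weighted.most_common())  # most *valued* sharers first
```

The two rankings can differ: a source that shares rarely but is highly trusted may matter more than a prolific aggregator.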
Keep me for later!
People may keep track of your content for different reasons:
- They haven’t read it yet but want to do so later when they find time for it;
- They want to share it with others but haven’t gotten around to it;
- They like it so much – or find it useful enough – that they want to collect and curate your content.
At any rate, they seem attracted to your content enough to keep it for later.
Metrics here include: Bookmarks, favourites, downloads etc. These are possibly good measures of some following for your content.
Limitations and mitigation: Two of the three reasons above do not point to any particular appreciation. Resources could be put aside and never used again. Even when downloaded, their effective use depends on the discipline and willingness of the bookmarker to actually return to his/her saved resources for another activity. Here again, large numbers of these metrics can plot useful trends, but individual measurements or isolated bookmarks remain marginally useful.
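To illustrate what aggregating these signals into a trend might look like – the weekly bookmark counts below are entirely made up – a simple moving average smooths out isolated bookmarks so that only the direction of travel shows:

```python
# Hypothetical weekly bookmark counts for one resource (invented numbers).
weekly_bookmarks = [2, 3, 1, 6, 8, 7, 12, 11]

def moving_average(values, window=3):
    """Average each run of `window` consecutive values to smooth noise."""
    return [sum(values[i:i + window]) / window
            for i in range(len(values) - window + 1)]

trend = moving_average(weekly_bookmarks)
# A rising trend suggests the content is building a following,
# even though any single week's count means little on its own.
```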
Use me!
The objective of your content is to be used – and re-used. Directly or indirectly, now or later, as intended or otherwise, as direct inspiration or diffuse source of innovation. But this is very difficult to track. Only direct references in someone else’s work are (usually) straightforward indications that content is being used.
Readily available metrics thus include: reblogs, citations, links in other important writings and works. Testimonies (e.g. stories of change and the like) are not a given in social media but are probably the best way to hear about the use of content. Indirectly, comments may play a similar role, if they mention how the content is being applied somewhere else (as opposed to just reacting to the content itself).
Limitations and mitigation: It is very difficult to get such references and accounts of use – but from this point on things become really interesting and relevant. Aiming to collect such testimonies and developing a culture of feedback and critical reflection (e.g. by means of comments, ratings etc.) all contribute to getting better at, and closer to, collecting interesting results about the use of content.
Let me make a better you!
One of the best results we can hope for from any resource we develop is that it contributes to changing behaviour. Using content does not equate to change. Change is elusive and difficult to assess because it is an intimate matter, which perhaps requires the person changing to realise that they are changing.
Among the metrics here, the most important are testimonies and, to a lesser extent, comments (provided these comments relate to the usefulness and effect of the resource itself – how it was used, not just its content). These are not available web metrics (yet?) and would more typically be part of process/outcome/impact monitoring efforts. But these results are worth tracking down.
Limitations and mitigation: As with the use of content, accounts of change brought about by resources are very diffuse and hard to collect, and even harder to attribute unless mentioned in the testimonies. The same approach as for use applies here; it just goes one level deeper in the exploration.
Become a movement thanks to me
The ultimate goal of any resource is to be so seminal that it is referred to over and over again and tends to provoke a domino effect on the behaviour of many. What the Bible, the Quran or the Little Red Book achieved. Tough job…
Limitations and mitigation: Frankly, if you are at that stage, you should be blogging about this instead of me 😉 I can only say that radical innovation, locally nested word-of-mouth conversion effects and tapping into the viral, disruptive potential of some technologies might offer shorter paths to this holy grail.
What makes this difficult is that there is no linear progression along these metrics. Furthermore, some of these metrics only become useful at a certain scale – or in combination with other metrics: e.g. only when various people have downloaded and favourited a resource can one tell that it probably has a transformative effect on people. The only sure way to get a credible account of evidence is through testimonies – provided they are truthful and only marginally biased.
The table below summarises some of the metrics available to suggest evidence of any impact of your content/resources.
| Objective | Direct metrics | Indirect metrics |
|---|---|---|
| Finding | Page views, hits | |
| Liking | Likes, +1’s, ratings, comments | Retweets and other social shares |
| Sharing | Links, social shares, citations | Downloads, comments, reblogs |
| Keeping | Downloads, bookmarks, favourites | Re-tweets, social shares, (some) social ratings |
| Using | Citations, links, testimonies, reblogs | Comments |
| Being transformed by it | N/A | Comments, testimonies |
All in all, what matters in these web metrics is a combination of: effective consumption of the content, appreciation of that content (its quality and relevance), intent to use it, effective use of it, the transformation brought about by that use, and the scale of that transformation.
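One way to picture that combination – purely as a sketch, with weights and counts I invented for the example – is a weighted score where actions further along the attention-to-action chain count for more than a passing page view:

```python
# Hypothetical weights: metrics further along the attention-to-action
# chain are worth more. The weights (and the sample counts below) are
# illustrative assumptions, not an established scoring scheme.
WEIGHTS = {
    "page_views": 1,
    "likes": 2,
    "shares": 4,
    "bookmarks": 4,
    "citations": 8,
    "testimonies": 16,
}

def engagement_score(counts):
    """Weighted sum of metric counts; unknown metrics are ignored."""
    return sum(WEIGHTS.get(metric, 0) * n for metric, n in counts.items())

post = {"page_views": 500, "likes": 20, "shares": 5, "testimonies": 1}
score = engagement_score(post)  # 500*1 + 20*2 + 5*4 + 1*16 = 576
```

A single number like this hides as much as it shows, of course; its main use is comparing the same resource over time rather than judging it in isolation.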
There are many tools to collect these metrics. But the tools only address the collection part (your demand, as a content provider wishing for feedback). What is more difficult is the supply of such evidence, and that comes only progressively with a culture of feedback and critical inquiry… Until that culture is there, we navigate between the cracks of evidence and personal confidence.
Related blog posts:
- M&E of KM: the phoenix of KM is showing its head again – how to tackle it?
- I WANT (YOU) TO CHANGE! Yes but how?
- What the *tweet* do we know (about monitoring/assessing KM)?