Friday, 25 October 2013

How not to do it ... Metrics

Metrics, measurements and KPIs are merely decision support tools. In order to make the correct decision you need the right information, at the right time, interpreted correctly and presented correctly. You can then make your decision based on accurate facts. Once you have acted upon the information, you can review what you collected, both in terms of what it told you and how useful it proved to be.

Seems really easy, doesn't it? But there seem to be innumerable ways to muck it up:
  • Failure to identify
    If you don't identify the right metrics to capture then how can you possibly make a decision based on the correct facts?
  • Failure to collect
    Even when you've identified the right metrics to capture, technical limitations often get in the way, or the metric is given the wrong priority or importance and simply isn't collected.
  • Failure to interpret
    I saw one brilliant illustration of this in which gun crime in the US was mapped against the take-up of Internet Explorer over the past 10 years, showing that as more people began using Internet Explorer, gun crime rates went up correspondingly. Whilst I'm no fan of that browser, the two facts are clearly unconnected, but it shows how disparate data can, if carelessly used, 'prove' a complete fiction (the first sketch after this list shows just how strongly two unrelated series can correlate). Equally, if you see Incidents increasing in step with the number of Changes, there could be a relationship, but not necessarily.
  • Failure to extrapolate
    Basing your decisions on the information as it stands now fails to recognise that things change. You often need to project the information forward and make predictions. Obviously this introduces a degree of uncertainty in itself, but that is better than relying on stale information (the second sketch after this list shows a simple projection of this kind).
  • Failure to interpolate
    Taking a macro view may give a good indication of the overall direction but can miss the shorter-term variability in the information. The trend might be upwards over the course of six months, yet there might be a sharp dip in the intervening months that points to a different course of action; the same sketch illustrates this too.
  • Failure to connect
    Almost the opposite of 'Failure to interpret': sometimes there are interconnections that we fail to spot (e.g. the apparent inability of local authorities to predict the need for school places based on changes in birth rates).
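
To make the interpretation point concrete, here is a minimal Python sketch using entirely made-up figures (they are not real browser or crime statistics): two series that merely happen to rise together produce a Pearson correlation close to +1, even though they have nothing to do with each other.

    # Minimal sketch with invented figures: two unrelated series that both rise
    # over ten years correlate almost perfectly. Correlation alone says nothing
    # about causation.
    from statistics import correlation  # requires Python 3.10+

    ie_take_up = [10, 18, 25, 31, 38, 44, 50, 55, 61, 66]       # hypothetical % per year
    gun_crime_index = [70, 72, 75, 79, 82, 86, 90, 95, 97, 100]  # hypothetical index

    r = correlation(ie_take_up, gun_crime_index)
    print(f"Pearson r = {r:.2f}")  # close to +1.0, yet the series are unrelated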
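
On the extrapolation and interpolation points, the second sketch (again with invented monthly figures) fits a straight-line trend over six months and projects it forward. The trend points firmly upwards and the forecast looks tidy, yet the raw monthly numbers contain a sharp dip that the trend line hides entirely.

    # Minimal sketch with invented figures: a six-month trend projected forward
    # looks healthy, but the month-by-month data contain a dip the trend hides.
    from statistics import linear_regression  # requires Python 3.10+

    months = [1, 2, 3, 4, 5, 6]
    incidents = [120, 135, 90, 150, 160, 175]  # hypothetical counts; note the dip in month 3

    slope, intercept = linear_regression(months, incidents)
    forecast = slope * 9 + intercept  # extrapolate three months ahead
    print(f"Trend: {slope:+.1f} incidents/month; month-9 forecast ~{forecast:.0f}")

    # Interpolating back into the detail shows what the macro view misses.
    for month, count in zip(months, incidents):
        print(f"Month {month}: {count}")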
Just remember: it isn't quite true that you can make statistics say whatever you want. If you're careful, you can get them to tell the right story, but they can just as easily be misconstrued or give a completely false message.
