Friday 25 October 2013

How not to do it ... Metrics

Metrics, measurements and KPIs are merely decision support tools. To make the correct decision you need the right information, at the right time, interpreted correctly and presented correctly. You can then base your decision on accurate facts. Once you have acted upon the information, you can review it both in terms of what it showed and how useful it proved.

Seems really easy, doesn't it? But there seem to be innumerable ways to muck it up:
  • Failure to identify
    If you don't identify the right metrics to capture then how can you possibly make a decision based on the correct facts?
  • Failure to collect
    Even when you've identified the right metrics to capture, technical limitations often get in the way, or the metric is wrongly judged to be low priority and simply isn't collected.
  • Failure to interpret
    I saw one brilliant illustration of this in which gun crime in the US was mapped against the take-up of Internet Explorer over the past 10 years, showing that as more people began using Internet Explorer, gun crime rates rose correspondingly. Whilst I'm no fan of that browser, the two facts are clearly unrelated, but it shows how disparate facts can, if carelessly used, 'prove' a complete fiction. Equally, if you see Incidents increase in step with the number of Changes, this could mean there is a relationship, but not necessarily.
  • Failure to extrapolate
    Basing your decisions on the information as it is now fails to recognise that things change. You often need to project the information forward and make predictions. Obviously this introduces a degree of uncertainty in itself, but that's better than using old information.
  • Failure to interpolate
    Taking a macro view may give a good indication of the overall direction but might miss the shorter-term variability of the information. The trend might be upwards over the course of 6 months but could hide a sharp dip in the intervening months, which might point to a different option.
  • Failure to connect
    Almost the opposite of 'Failure to interpret': sometimes there are interconnections that we do not spot (e.g. the apparent inability of local authorities to predict the need for school places based on changes in birth-rates).
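The 'Failure to interpret' trap is easy to demonstrate with a few lines of code. The figures below are entirely made up: any two quantities that happen to trend in the same direction over time will show a strong correlation coefficient, whether or not they have anything to do with each other.

```python
# Illustrative sketch with hypothetical data: two unrelated series that both
# happen to rise over time will correlate strongly -- the 'Internet Explorer
# vs gun crime' effect.
import random

random.seed(42)

years = range(2003, 2013)
# Series A: a steadily rising adoption figure (made up)
series_a = [50 + 4 * i + random.uniform(-2, 2) for i, _ in enumerate(years)]
# Series B: an unrelated count that also happens to rise (made up)
series_b = [300 + 25 * i + random.uniform(-10, 10) for i, _ in enumerate(years)]

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

r = pearson(series_a, series_b)
print(round(r, 2))  # very close to 1.0, despite there being no causal link
```

Swap in real data and the same caution applies: a high correlation coefficient tells you two series move together, not that one drives the other.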
Just remember, it's not quite true that you can make statistics say whatever you want. If you're careful, you can get them to tell the right story, but they can easily be misconstrued or give a completely false message.

Friday 18 October 2013

Believe that you can …

Main Climbing Wall Southampton
I like rock climbing. I won't say that I'm very good; I'm alright at it. But I discovered, last night, that I'm limiting myself by my own perception of my abilities.

At the indoor climbing wall I go to, there's usually a list of all the routes and the 'grade' (an indication of the complexity or difficulty) of each but, earlier this week, they changed the routes around and haven't yet listed the grades for each route.

Now, normally, I'd look at the list to decide which routes to attempt based on the grade I think I'm capable of but, obviously, I can't do that at the moment, so I just started picking routes … and haven't failed on one yet.

The point is that sometimes we limit what we'll have a go at based on how complex something is versus what we perceive about ourselves. Doing this means we're unlikely to fail but, it also means that we're stopping ourselves reaching our potential.

Sometimes you just need to believe you can.