The Four Fallacies of IT Metrics

snydeq writes "Advice Line's Bob Lewis discusses an all-too-familiar IT mistake: the use of incidents resolved per analyst per week as a metric for assessing help-desk performance. 'If you managed the help desk in question or worked on it as an analyst, would you resist the temptation to ask every friend you had in the business to call in on a regular basis with easy-to-fix problems? Maybe you would. I'm guessing that if you resisted the temptation, not only would you be the exception, but you'd be the exception most likely to be included in the next round of layoffs,' Lewis writes. 'The fact of the matter is it's a lot easier to get metrics wrong than right, and the damage done from getting them wrong usually exceeds the potential benefit from getting them right.' In other words, when it comes to IT metrics, you get what you measure — that's the risk you take."
This discussion has been archived. No new comments can be posted.

  • by stephencrane ( 771345 ) on Wednesday December 14, 2011 @09:30PM (#38378958)
    ..but I'm not so keen on /.'s article description here. "...the use of incidents resolved per analyst per week as a metric for assessing help-desk performance..." Having worked in this area for decades, I can tell you that I can't think of a single IT support org that uses this as a metric. It's a straw man, of which there are many when it comes to metrics. The three most common metrics are: Cost per Incident, Customer Satisfaction, and Resolution on First Contact (sometimes FC is defined as 'resolved at/within tier 1'). There are usually two more, but those tend to vary with your business and priorities, whether you have SLAs/OLAs, and what service channels you offer. Average Speed of Answer/Time to Respond to Client is usually next; Average Time to Resolution sometimes. People sometimes care about Abandon Rate, but only within the context of the customer satisfaction metric. A nice place may poll for employee satisfaction; a nicer place does it more than once or twice a year. I've never even seen 'resolved/analyst/week' come up in discussions, forums, or books going back to the early '90s.

    And seriously: NOBODY running anything but a penny-ante 100-call/week call center would ever try to regularly cook the stats by having friends and family call in to boost the customer contacts. It's too much work for too little bang, and it's too easily caught. Any place with a real ACD system will eventually notice that a not-insignificant number of calls/emails are coming from the same 10 addresses/numbers. It's just not worth it. The description implies the exact opposite.

    If you don't have a real ACD system and real incident-management/ticket-tracking software, you're not really measuring anything anyway, and you're probably working at a place that's not complicated enough to care about metrics in the first place.
  • by bfwebster ( 90513 ) on Wednesday December 14, 2011 @09:35PM (#38379004) Homepage

    The quote above is from Jerry Weinberg, and it is true.

    There's a brilliant short book about this problem: Measuring and Managing Performance in Organizations [amazon.com] by Robert Austin (1996). It's a fairly rigorous, somewhat philosophical work, but it is unrelenting in documenting that trying to manage by metrics almost always introduces distortions, which in turn are almost always counter-productive. The problem isn't just with IT; it's with any type of effort that seeks to reward or punish based on metrics.

    The only metrics I've found actually useful in IT are those that are predictive -- for example, helping to estimate the actual delivery date of a project under development. The metrics that seek to somehow measure "accomplishments to date" solely for the purpose of reward or punishment are always gamed and almost always useless. ..bruce..

  • Re:This makes me sad (Score:5, Informative)

    by Trepidity ( 597 ) <[gro.hsikcah] [ta] [todhsals-muiriled]> on Wednesday December 14, 2011 @09:46PM (#38379098)

    Sounds like academia, actually. It's all about impact factor, citation count, and grant dollars these days...

  • by crath ( 80215 ) on Wednesday December 14, 2011 @10:21PM (#38379400) Homepage

    I can tell you that I can't think of a single IT support org that uses this as a metric

    Some years ago, I had a help desk in my organisation that did use this metric as part of how its analysts kept tabs on their performance. It was one metric in an overall package, and the whole team (all the analysts) reviewed the package every week. As I recall, other metrics in the package included Customer Satisfaction, Average Call Length, Number of Calls Back to Users per Agent, Incidents Resolved on First Contact, Incidents Escalated to Second Level, and others.

    The help desk team very successfully used the overall metrics package as part of analyst self-motivation and peer motivation (as well as management oversight). Bob Lewis's piece is provocative journalism: devoid of concrete detail and full of high-level innuendo. It doesn't contain sufficient detail (say, by way of actual detailed examples) to allow a typical reader to apply the thoughts he has expressed.
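A metrics package like the one crath describes can be sketched as a simple aggregation over a week's ticket records. The field names below are hypothetical, invented for illustration rather than taken from any particular ticketing system:

```python
from dataclasses import dataclass

@dataclass
class Ticket:
    analyst: str
    resolved_first_contact: bool
    escalated: bool          # sent to second level
    call_minutes: float
    satisfaction: int        # 1-5 survey score

def weekly_package(tickets):
    """Aggregate a few of the metrics mentioned above over one week's tickets."""
    n = len(tickets)
    return {
        "first_contact_resolution": sum(t.resolved_first_contact for t in tickets) / n,
        "escalation_rate": sum(t.escalated for t in tickets) / n,
        "avg_call_minutes": sum(t.call_minutes for t in tickets) / n,
        "avg_satisfaction": sum(t.satisfaction for t in tickets) / n,
    }

# Toy week: four tickets across two analysts.
week = [
    Ticket("ann", True, False, 7.5, 5),
    Ticket("ann", False, True, 22.0, 3),
    Ticket("bob", True, False, 4.0, 4),
    Ticket("bob", True, False, 6.5, 5),
]
pkg = weekly_package(week)
print(pkg["first_contact_resolution"])  # → 0.75
```

The point of reviewing the whole package together, as the comment notes, is that no single number is gamed in isolation: padding call counts would drag down average satisfaction or first-contact resolution elsewhere in the same report.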
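stephencrane's point that padded call volumes are easy to catch can also be sketched in a few lines: any reporting layer that counts contacts per caller ID will surface the pattern. This toy example assumes a hypothetical log format and threshold, not any real ACD's reporting API:

```python
from collections import Counter

def flag_repeat_contacts(call_log, threshold=5):
    """Flag caller IDs whose contact volume is suspiciously high.

    call_log: iterable of (caller_id, ticket_id) tuples.
    threshold: contacts per caller allowed before flagging.
    """
    counts = Counter(caller for caller, _ in call_log)
    return {caller: n for caller, n in counts.items() if n > threshold}

# Toy data: one extension phoning in far more often than the rest.
log = [("x5501", t) for t in range(8)] + [("x1002", 100), ("x3309", 101)]
print(flag_repeat_contacts(log))  # → {'x5501': 8}
```

Real ACD reports are far richer, but this is the core of why cooking the stats with friends-and-family calls gets noticed.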
