Recently a colleague of mine (who shall remain anonymous) called me up and wanted to ‘pick my brain’ on a few things. They indicated that since they started their initial ITSM processes six months ago, senior leadership was now in a position to see the fruits of their labour from a metrics standpoint. They are currently using a popular ITSM product with good reporting capabilities, and have a team which performs the Service Request, Incident, Problem, and Change Management processes. For further context, they use ‘Incident records’ to track all requests and Incidents and manage them through their priorities, with Priority 1 being critical and 5 being a request, etc.
So far it sounded as though they had a pretty good start on things, and this is where the question came in.
“While we planned everything out, some of the metrics might not reflect what work is really going on,” I was told. They proceeded to tell me that when they went to their VP of IT with the preliminary stats, there was a high degree of confusion on the VP’s face. “This can’t be right,” the VP said abruptly. “We don’t have this many high priority incidents, and they don’t last this long. Your reporting tool must be spitting out bad data. Better check it again before I need to send this out to senior leadership.”
While my friend was telling me this I couldn’t help thinking of that part in A Few Good Men where Jack Nicholson says “You can’t handle the truth.”
The question I then posed back to my friend was “What are you trying to gain from your metrics?” In reality we all report on data with the intention that, when we review it, we will identify areas of improvement. At the very least, when senior leadership in this case reviews the numbers, it should be a place to start a discussion about improvement possibilities. Whatever the intentions were (or were believed to be) when they started with Service Management, they have already grown to a point where they can see some room for improvement. In this case they identified that the way they interact with the business, and how they prioritize their ‘incidents’, is the first place that needs tweaking. My friend went on to mention that some services they had labelled as critical may not be, and as a result should not drive the way priority is calculated within the tool; this may also need to be revisited.
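One common reason priority numbers inflate is the impact-and-urgency lookup that many ITSM tools use to calculate priority: if a service is classified as critical, every incident against it starts out high. A minimal sketch of such a matrix is below, assuming a typical impact × urgency scheme; the rating names and mappings are illustrative, not my friend’s actual tool configuration.

```python
# Hypothetical ITIL-style priority matrix: priority is derived from
# impact and urgency, each rated 1 (high) to 3 (low). Reclassifying a
# service's impact changes every priority calculated from it.

IMPACT = {"critical": 1, "major": 2, "minor": 3}
URGENCY = {"high": 1, "medium": 2, "low": 3}

# PRIORITY[impact][urgency] -> 1 (critical) .. 5 (request/low)
PRIORITY = {
    1: {1: 1, 2: 2, 3: 3},
    2: {1: 2, 2: 3, 3: 4},
    3: {1: 3, 2: 4, 3: 5},
}

def calculate_priority(impact: str, urgency: str) -> int:
    """Map an impact/urgency pair onto a 1-5 priority."""
    return PRIORITY[IMPACT[impact]][URGENCY[urgency]]

# A service reclassified from 'critical' to 'major' impact yields a
# lower priority for the same urgency:
print(calculate_priority("critical", "high"))  # 1 (P1)
print(calculate_priority("major", "high"))     # 2 (P2)
```

This is why revisiting which services are truly critical matters: the fix is usually in the classification feeding the matrix, not in the reporting that reads it back out.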
My suggestion was to deliver the metrics with a report which illustrates the areas where these numbers may not be an accurate representation of what the business wants to accomplish. There is really nothing wrong with the information that was captured; it simply tells us where we need to make some adjustments. Gaining this alignment early on will also allow them to allocate the right resources to truly critical issues. Otherwise it would be easy to blame the tool for miscalculating the activities going on.
While it would appear that there is no way to produce complete stats the way that was originally envisioned, it will be better to fine-tune these things before they further impact other ITSM processes which rely on the outputs from Incident and Request Fulfilment, as this will ultimately impact their numbers as well. My friend will keep me updated, I’m sure, but admits they won’t be able to get that Jack Nicholson bit out of their head when they review it with the VP.
Labels: Service Management Reporting