I’m reminded of my pre-ITSM days when I audited processes and, occasionally, how technology enables them. I once reviewed a relatively new corporate system and found it had only two reports - one that didn’t do what it was meant to do, and another that no one knew why it had been created.
For me, this was a classic example of reporting being an afterthought at best, or a case of “what can we report on?” rather than “what do we need to report on?” (because we need to know X, Y, and Z).
So, I guess my short answer is that people begin reporting without knowing what they actually need.
Service Desk Institute (SDI) research backs up your view too @BarclayRae - ITSM tool customers usually rank reporting as one of the top-three issues with their tool (although I appreciate that it might not always be the tool’s fault).
Yes, the SDI standards contain 44 reporting/metrics criteria - in simple terms, a lot of things that can or should be reported on. The real trick is then what is done with that data, of course - individual metrics on their own are pretty useless; in context, however, they are critical.
A focus on data quality and data structure up front is key - what data outputs do we need, and therefore what inputs and what level of information quality are needed to produce them?
To me there are four main types of data in ITSM systems - transactional, organisational, service, and reporting data.
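For illustration, here’s a minimal sketch of how those four categories might be tagged on records in an ITSM tool’s data model - the names and example items are hypothetical, not any specific tool’s schema:

```python
from dataclasses import dataclass
from enum import Enum

# Hypothetical illustration of the four ITSM data categories described above.
class DataCategory(Enum):
    TRANSACTIONAL = "transactional"    # e.g. individual incidents, changes, requests
    ORGANISATIONAL = "organisational"  # e.g. teams, roles, locations
    SERVICE = "service"                # e.g. service catalogue entries, SLA targets
    REPORTING = "reporting"            # e.g. aggregated metrics derived from the above

@dataclass
class DataItem:
    name: str
    category: DataCategory

# A reporting output is only as good as the inputs it is derived from,
# so it helps to trace each reporting item back to its source categories.
items = [
    DataItem("incident INC-1234", DataCategory.TRANSACTIONAL),
    DataItem("service desk team", DataCategory.ORGANISATIONAL),
    DataItem("email service SLA", DataCategory.SERVICE),
    DataItem("monthly SLA breach rate", DataCategory.REPORTING),
]

for item in items:
    print(f"{item.name}: {item.category.value}")
```

The point of separating the categories is that the quality questions differ: transactional data needs accuracy at capture, while reporting data inherits whatever quality the inputs had.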
I meant as a finding in their surveys - the From the Frontline report I think :)
There are many reasons organizations don’t have the reports they want or need.
- The organization doesn’t understand what it can do with the data when defining reports.
- They don’t differentiate (thanks @BarclayRae ) between transactional, organizational, service, and reporting data, and which decisions each can help them with.
- The reports were appropriate for the person who ‘signed off’, but they left before the project was finished, and no one else reviewed the reports.
- The data collected (maturity, integrity, accuracy) isn’t perfect - nor does it need to be, but you need to understand this to understand the risk (low or high) of using the data to make decisions.
- The organization is more mature now, but the reporting hasn’t matured with it, so it’s not as valuable as it was years ago.
- What the customer values has changed, yet reporting is based on previous strategy and goals.
- The business has changed, but the reporting is about what the old business model valued, not what is currently valued.
- No one uses the reports or provides feedback on how they use them. Or no one asks how reports are used and how they could add more value.
- Too often reports are specified as columns of data someone thinks they want to see - to make a decision or understand performance. However, most of the time, no one clarifies how the output will be used, so there’s no check on whether the data will actually support the desired decision.
- Some reports are designed for real-time, some for daily, weekly, monthly, quarterly, or annual reporting - but do we understand why frequency might be a factor?
- Too many reports - many are useful only when you need to find out what happened.
- Lack of data and data structures that allow easy reporting on what the user wants.
- Believing that your organization is so unique that you cannot use reporting similar to other organizations’.
- Lots of data, but no information that facilitates decision making.
There's likely a blog in this - thanks @JPCusty!