I hear this all the time - generally it's not down to a problem with the system in use (very occasionally it is, but not often).

 

So why are many organisations still struggling to get good MI data analysis/business intelligence out of their toolsets and data?

Maybe because of poor input data? A lack of focus on the expected outputs? No service definitions or reporting on services? Little information on user experience or outcomes? Poor presentation of information? Too much information? Too much transactional data? …

What do you think?

I’m reminded of my pre-ITSM days when I audited processes and, occasionally, how technology enabled them. I once reviewed a relatively new corporate system and found it had only two reports - one that didn’t do what it was meant to do, and another that no one knew why it had been created.

For me this was a classic example of reporting being an afterthought at best, or a case of “what can we report on?” rather than “what do we need to report on?” (because we need to know X, Y, and Z).

 

So, I guess my short answer is that people begin reporting without knowing what they actually need.


Service Desk Institute (SDI) research backs up your view too @BarclayRae - ITSM tool customers usually rank reporting as one of the top-three issues with their tool (although I appreciate that it might not always be the tool’s fault).


Yes, the SDI standards contain 44 reporting/metrics criteria - in simple terms, a lot of things that can or should be reported on. The real trick is then what is done with that data, of course - individual metrics on their own are pretty useless; in context, however, they are critical.
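To make that concrete, here’s a minimal sketch - with made-up numbers and field names of my own, not the SDI criteria - of how the same headline metric reads very differently once context is added:

```python
# Illustrative data only: the same headline number tells opposite
# stories depending on what sits alongside it.

months = [
    # (month, tickets_resolved, avg_resolution_hours, reopened)
    ("Jan", 400, 10.0, 12),
    ("Feb", 400, 8.0, 36),  # 20% faster on paper...
]

for month, resolved, avg_hours, reopened in months:
    reopen_rate = reopened / resolved
    print(f"{month}: avg resolution {avg_hours}h, "
          f"reopen rate {reopen_rate:.1%}")

# Feb's resolution time improved 20%, but the reopen rate tripled -
# tickets are being closed faster, not fixed faster. The metric only
# becomes information once it is read in context.
```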

 

The focus on data quality and data structure up front is key - what data outputs do we need, and therefore what inputs and what level of information quality are needed?

To me there are four main types of data in ITSM systems: transactional, organisational, service, and reporting data.
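As a rough illustration of those four types (the field names are my own assumptions, not any tool’s actual schema), reporting data is the one that gets derived from the other three:

```python
# A minimal sketch of the four data types and how reporting data is
# derived from the rest. Field names are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Transactional:          # individual records of work
    ticket_id: str
    service: str
    team: str
    resolved_within_sla: bool

@dataclass
class Organisational:         # who does the work
    team: str
    department: str

@dataclass
class ServiceRecord:          # what the work supports
    service: str
    sla_target: float         # e.g. 0.95 = 95% within SLA

def sla_report(tickets: list[Transactional],
               services: dict[str, ServiceRecord]) -> dict:
    """Reporting data: aggregated from transactions, judged against
    service targets."""
    tally: dict[str, tuple[int, int]] = {}
    for t in tickets:
        met, total = tally.get(t.service, (0, 0))
        tally[t.service] = (met + t.resolved_within_sla, total + 1)
    return {
        s: {"achieved": met / total, "target": services[s].sla_target}
        for s, (met, total) in tally.items()
    }

tickets = [Transactional("T1", "Email", "Desk", True),
           Transactional("T2", "Email", "Desk", False)]
services = {"Email": ServiceRecord("Email", 0.95)}
print(sla_report(tickets, services))
# {'Email': {'achieved': 0.5, 'target': 0.95}}
```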


I meant as a finding in their surveys - the From the Frontline report I think :)


There are many reasons organizations don’t have the reports they want or need.

  1. The organization doesn’t understand what it can do with the data when defining reports.
  2. They don’t differentiate (thanks @BarclayRae ) transactional, organizational, service data, and reporting data and what decisions this data can help them with.
  3. The reports were appropriate for the person who signed off, but they left before the project was finished, and no one else reviewed the reports.
  4. The data collected (maturity, integrity, accuracy) isn’t perfect - nor need it be, but you need to understand this to understand the risk (low or high) of using the data to make decisions.
  5. The organization is more mature now; however, the reporting hasn’t matured with it, so it’s not as valuable as it was years ago.
  6. What the customer values has changed, yet reporting is based on previous strategy and goals.
  7. The business has changed, but the reporting is about what the old business model valued, not what is currently valued.
  8. No one uses the reports or provides feedback on how they use them. And no one asks how reports are used or how they could add more value.
  9. Too often reports are specified as columns of data someone thinks they want to see - to make a decision or understand performance. However, most of the time, no one clarifies how the output will be used, to check that the data will actually support the desired decision (see the sketch after this list).
  10. Some reports are designed for real-time use, others for daily, weekly, monthly, quarterly, or annual reporting - but do we understand why frequency might be a factor?
  11. Too many reports - many are useful only when you need to find out what happened.
  12. Lack of data and data structures that allow easy reporting on what the user wants.
  13. Believing that your organization is so unique that you cannot use reporting similar to other organizations’.
  14. Lots of data, but no information that facilitates decision making.
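To illustrate a few of these (particularly 3, 4, 8, 9, and 10), here’s a minimal sketch of a decision-first report specification. The structure and names are purely illustrative, not any framework’s API - the point is that every report names the decision it supports before anyone argues about columns:

```python
# Illustrative only: a report spec that records the decision it
# supports, who makes it, how often, and how trustworthy the inputs are.

from dataclasses import dataclass

@dataclass
class ReportSpec:
    name: str
    decision_supported: str   # item 9: what will the reader decide?
    decision_maker: str       # item 3: who owns and reviews it?
    frequency: str            # item 10: why this cadence?
    data_quality_risk: str    # item 4: confidence in the inputs

    def is_still_needed(self, used_last_quarter: bool) -> bool:
        # item 8: a report nobody uses is a candidate for retirement
        return used_last_quarter

spec = ReportSpec(
    name="Major incident trend",
    decision_supported="Fund extra problem-management capacity?",
    decision_maker="Service delivery manager",
    frequency="monthly",
    data_quality_risk="medium - categorisation is inconsistent",
)
```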


There's likely a blog in this thanks @JPCusty 🙂

