One of the big challenges for vendors that build infrastructure management products is the scalability of those products. Many vendors develop and release products without much concern for scalability, and it can become a huge problem. I remember, way back, competing against Tivoli and the Tivoli Enterprise Console (TEC): it was a well-known limitation of TEC that it could only scale to a very small number of messages per second (so much so that IBM eventually bought Micromuse and replaced TEC with that product).
But I can't help thinking that even the vendors that are the absolute best in this area today have a whole new set of issues coming, with a much higher bar to clear.
Why do I say that?
In the past, the largest users of IT in the world topped out at perhaps 100,000 devices. Maybe it was an investment bank or another IT-intensive business, but essentially there was a natural limit on the number of IT devices in use, driven by how large an individual company could become.
But in the future (and already today) these numbers are being taken to a completely different level. Think about some of these businesses…
- Social/web-based businesses are serving hundreds of millions of users, all of whom are posting tweets/updates with increasing regularity
- Service providers are now hosting the infrastructures of thousands of companies
- Regular organizations (pharmaceutical companies etc.) can get access to huge numbers of devices and huge amounts of computing power on an ad hoc basis, enabling them to massively scale drug testing and cut the elapsed processing time
And of course, virtualization has enabled much of this. Can you believe that just a few years ago it would take an investment bank 3 months to procure and provision a new server?
Existing vendors need to re-architect and re-design their products to deal with this phenomenon, because it is only going to intensify.
Where are the bottlenecks in existing solutions?
A typical management solution needs to collect, transport, analyze, store, and display data. The bottlenecks I have most often witnessed are in the storing and displaying of that data, because vendors use traditional databases for storage and the associated tools/reporting for display. Clearly this is not going to work for these "power" users (and there will be more and more power users as the world moves to hosted and cloud-based infrastructures). We are going to need more modern methods for data storage, and more modern tools for the analysis of that data.
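To make the storage bottleneck concrete, here is a minimal sketch (using Python's built-in sqlite3 purely as a stand-in for a traditional relational database; table and function names are illustrative, not from any real product). It contrasts the naive pattern many management tools use, one insert-and-commit per monitoring event, with the batched, append-style writes that more modern data stores are built around:

```python
import sqlite3

def ingest_per_event(conn, events):
    """Naive pattern: one INSERT and one commit per monitoring event.
    The per-event commit (a disk sync / round-trip in a real RDBMS)
    is what caps throughput at a small number of messages per second."""
    for ts, device, value in events:
        conn.execute("INSERT INTO metrics VALUES (?, ?, ?)", (ts, device, value))
        conn.commit()

def ingest_batched(conn, events):
    """Batched pattern: buffer events and write them in one transaction,
    amortizing the commit cost over the whole batch."""
    conn.executemany("INSERT INTO metrics VALUES (?, ?, ?)", events)
    conn.commit()

# Hypothetical workload: 5,000 metric samples from 100 devices.
events = [(i, "device-%d" % (i % 100), float(i)) for i in range(5000)]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE metrics (ts INTEGER, device TEXT, value REAL)")
ingest_batched(conn, events)
count = conn.execute("SELECT COUNT(*) FROM metrics").fetchone()[0]
print(count)  # 5000
```

The point is not sqlite itself but the write pattern: at hundreds of thousands of devices, per-event synchronous writes to a traditional database are exactly the kind of bottleneck described above, and batched/append-oriented ingestion is the first step toward the more modern storage approaches.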
But for those that get this right, there is a much greater customer benefit: they can perform real-time analytics on a much larger set of data, leading to better results and better conclusions.
I'm going to look into this area more and see which vendors I can find addressing this issue. I can almost guarantee that it will be early-stage start-ups with some very smart people looking to solve this problem.
More later….this is a very interesting area for me as it’s going to be very disruptive.