
There is no universal method for measuring return on investment (ROI) in IT. CIOs try to develop ROI formulas based on their particular data center models; vendors willingly supply formulas that always show how investing in their solutions pencils out; and CFOs keep asking for hard numbers. Yet ROI remains elusive and ill-defined. Soon, the ability to use big data to determine IT ROI, and to ascertain risk along the way, could change this.
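For reference, the arithmetic of ROI has never been the problem; the classic formula is trivial, and the struggle lies in feeding it trustworthy inputs. A minimal sketch, with all dollar figures hypothetical:

```python
# Classic ROI: net gain divided by cost, expressed as a percentage.
# The formula is trivial; the hard part is sourcing accurate inputs.
def roi(gain: float, cost: float) -> float:
    """Return ROI as a percentage: (gain - cost) / cost * 100."""
    return (gain - cost) / cost * 100

# Hypothetical example: a $2M IT investment yielding $2.6M in benefits.
print(f"ROI: {roi(2_600_000, 2_000_000):.1f}%")  # ROI: 30.0%
```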
“The problem begins with the fact that 40% of the data and business decisions made on the data that IT uses is inaccurate,” said Gary Oliver, CEO of Blazent, which provides intelligent and actionable data for IT.
Oliver is referring to an August 2014 Gartner report in which the research firm attributed the failure of 40% of corporate initiatives to poor IT data quality. Gartner also states that “Information – including big data – without insight is an unrealized resource. Conversely, analytics without a solid information foundation is likely to lead to poor decisions. Big data analytics, therefore, is the application of analytic capabilities (descriptive, diagnostic, predictive and prescriptive) on enormous, varied or rapidly changing datasets.”
A major impediment to achieving a high-quality, holistic, and integrated view of ROI, and of risk, in data centers is the siloed nature of IT functions and data. Data pours in from many sources that are untied to one another: network and security operating statistics, CPU and storage utilization, Internet of Things (IoT) and web data, statistics on user access and data usage, and power and facilities consumption. For an “ultimate” ROI formula that evaluates IT in toto, the challenge becomes bringing all of this disparate structured and unstructured data together into a composite picture that can produce actionable results for data center and IT optimization.
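To make the integration challenge concrete, consider correlating what two siloed tools know about the same servers. The sketch below is purely illustrative; the sources and field names are invented, not any vendor's actual schema:

```python
# Hypothetical illustration: correlating siloed IT records by a shared key.
# The sources (a CMDB export and a monitoring feed) and fields are invented.
cmdb = [
    {"hostname": "db-prod-01", "owner": "finance", "location": "DC-East"},
    {"hostname": "web-prod-07", "owner": "ecommerce", "location": "DC-West"},
]
monitoring = [
    {"hostname": "db-prod-01", "cpu_util": 0.82, "storage_util": 0.67},
    {"hostname": "web-prod-07", "cpu_util": 0.35, "storage_util": 0.41},
]

# Build one composite record per asset so ROI and risk questions can be
# answered across silos instead of inside each tool.
composite: dict[str, dict] = {}
for source in (cmdb, monitoring):
    for record in source:
        composite.setdefault(record["hostname"], {}).update(record)

for host, record in composite.items():
    print(host, record)
```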
“To get at the problem, we use a five-step data evolution process that begins with data atomization, which breaks down IT data, regardless of its source, to a granular level,” said Oliver. “The data is then enriched with identity management, relationship analysis, purification, and historicity. We work with more than 230 siloed data sources such as ServiceNow (an IT workflow facilitator) to create a single data record or ‘master source of truth.’”
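Blazent has not published the internals of those steps, but a toy pipeline conveys the shape of the idea: break each source record into granular facts, then enrich and fold them into a single master record. Everything below is an invented illustration that borrows Oliver's step names, not Blazent's implementation:

```python
# Toy sketch of the consolidation idea; step names come from Oliver's
# description, but all logic, fields, and IDs here are invented.
from datetime import datetime, timezone

def atomize(record: dict, source: str) -> list[dict]:
    """Step 1, atomization: break a source record into granular facts."""
    return [{"source": source, "key": k, "value": v} for k, v in record.items()]

def enrich(facts: list[dict], asset_id: str) -> list[dict]:
    """Steps 2-5, greatly simplified: identity, relationships, purification, historicity."""
    cleaned = []
    for fact in facts:
        if fact["value"] in (None, "", "unknown"):    # purification: drop bad values
            continue
        fact["asset_id"] = asset_id                   # identity management
        fact["seen_at"] = datetime.now(timezone.utc)  # historicity: timestamp each fact
        cleaned.append(fact)
    return cleaned

# Fold enriched facts into a single record per asset: a "master source of truth".
master: dict[str, dict] = {}
for fact in enrich(atomize({"os": "RHEL 9", "owner": "unknown"}, "servicenow"), "srv-001"):
    master.setdefault(fact["asset_id"], {})[fact["key"]] = fact["value"]

print(master)  # {'srv-001': {'os': 'RHEL 9'}}
```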
How does this consolidated data work in practice? Oliver cites three real-world examples: a financial services company found that 300 of its data center servers were not being backed up regularly; a services provider discovered that an entire floor of its data center being used by a client was never billed; and a large financial services company learned that bad data in a change management process had led to an outage costing $60 million a minute.
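The first of those findings, servers missing from backup coverage, reduces to a set difference across two silos once the data is consolidated. A minimal sketch with invented hostnames:

```python
# Hypothetical: servers in the asset inventory that never appear in the
# backup system's job logs -- a set difference across two silos.
inventory = {"db-prod-01", "web-prod-07", "app-prod-12", "app-prod-13"}
backed_up = {"db-prod-01", "web-prod-07"}

unprotected = inventory - backed_up
print(f"{len(unprotected)} servers lack backups: {sorted(unprotected)}")
```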
Findings like these undoubtedly made it easier for the CIOs involved to sell new investments in IT asset management systems, backup software, and change management systems.
“When I started working with the technology years ago, big data wasn’t a buzzword,” said Colin Bottomley, Global Account General Manager at IT services provider CSC. “However, the problem was very real, and Blazent’s view across our entire IT environment enabled us to eliminate the limitations of existing IT service management tools, as well as maximize our revenue stream and minimize security exposure for our client.”
These new IT visibility tools can give CIOs and CFOs end-to-end views across hundreds of thousands of IT assets. They could mark a major advance in the pursuit of definitive ROI and better risk management, and data center managers would welcome them, because IT environments aren’t getting any simpler.