For years, enterprises have labored to quantify their IT costs of doing business, developing formulas that attempted to break down computing costs on a per-transaction basis. Others divided IT costs across the company, assigning a proportionate share of the cost to each business department on a chargeback basis.

Few enterprises have arrived at meaningful ways to evaluate their true costs of computing in a business context that the CEO and other C-level executives can comprehend. This has made life difficult for CIOs at the budget table, because new investments in IT infrastructure don’t sound compelling to those who must approve them if the approvers can’t see how the investments will benefit sales or the bottom line.

The good news is that breakthroughs in big data approaches and technologies in the data center might be changing this.

“Companies want new ways of capturing and stating data center costs that go beyond virtualization, initial energy savings from greening the data center, and other technologies that they have been deploying to eliminate or reduce sunk costs over the past decade,” said David Wagner, product manager at TeamQuest, a provider of IT optimization services. “What they want to do now is to look at the bigger picture.”

Getting to that bigger picture means combining two kinds of data: the usage data from computing resources that tells the story of how much CPU, disk, and other IT-related resources are being consumed, and the machine-generated data from the environmental side of the data center, such as HVAC and energy consumption. The latter systems generate continuous usage data that may or may not arrive in digestible form, so it falls to big data stream-analysis functions to weed out the noise and distill this machine-generated data into useful information that can contribute to the overall IT usage picture.
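As a minimal sketch of what "weeding out the noise" in a facilities data stream can look like, the snippet below drops statistical outliers (for example, a glitched sensor reading) from an hour of hypothetical HVAC power-draw samples and distills the rest into a single usable figure. The sensor values, the z-score threshold, and the function name are illustrative assumptions, not TeamQuest's actual method.

```python
from statistics import mean, stdev

def distill(readings, z_thresh=2.0):
    """Drop readings more than z_thresh standard deviations from the mean
    (assumed sensor noise), then aggregate what remains into one figure."""
    mu, sigma = mean(readings), stdev(readings)
    clean = [r for r in readings if abs(r - mu) <= z_thresh * sigma]
    return mean(clean)

# One hypothetical hour of HVAC power draw in kW, with one glitched reading.
hourly_kw = [41.2, 40.8, 41.5, 400.0, 41.1, 40.9]
print(round(distill(hourly_kw), 1))  # → 41.1
```

A production pipeline would do this continuously over a stream rather than a fixed list, but the principle is the same: filter first, then condense raw telemetry into numbers a cost model can consume.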

The next step is deploying a methodology capable of merging this machine-generated data from facilities systems with the CPU and disk usage data emanating from computing systems, so that a total cost of IT can be calculated that includes both computing and facilities.
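In its simplest form, that merge is just summing two cost ledgers that were historically kept apart. The category names and dollar figures below are hypothetical placeholders to show the shape of the calculation:

```python
# Hypothetical monthly costs, already distilled from usage data.
compute_costs = {"cpu": 120_000, "disk": 45_000, "network": 30_000}
facility_costs = {"hvac": 60_000, "power": 80_000, "building": 25_000}

# Total cost of IT = computing + facilities, per the methodology described.
total_it_cost = sum(compute_costs.values()) + sum(facility_costs.values())
print(total_it_cost)  # → 360000
```

The hard part in practice is not the addition but getting both feeds onto a common time window and attribution model before they are combined.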

“We call this bimodal IT because we are using analytics techniques that combine diverse data feeds into a composite picture of IT that doesn’t leave out vital cost elements,” said Wagner.

Just as importantly, this “complete” picture of IT costs is plugged into follow-on analytics that spell out the cost of IT in business contexts that decision makers can understand.

In one case, a national car rental agency added up its total IT costs (facilities plus computing) and then used that figure to determine its true IT cost per car rental. In another case, a major grocery chain computed its total cost of IT and then performed additional analytics to determine the IT cost of running an individual store and of running product promotions; it has gone even further, calculating the IT cost per customer acquired from those promotions.
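The car-rental calculation reduces to dividing the combined IT cost by transaction volume. Both numbers below are invented for illustration; the article does not report the agency's actual figures:

```python
total_it_cost = 360_000       # hypothetical monthly total: computing + facilities
rentals_per_month = 150_000   # hypothetical transaction volume

# True IT cost per car rental, in dollars.
it_cost_per_rental = total_it_cost / rentals_per_month
print(it_cost_per_rental)  # → 2.4
```

The grocery-chain cases work the same way, with stores, promotions, or acquired customers as the denominator instead of rentals.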

From the standpoint of non-IT executives, it becomes easier to understand the value of IT investments when the output of those investments can be viewed in a business context. This also aids the CIO, who may have to fight less fiercely for needed IT infrastructure improvements.

“Methodologies like this are still in beginning stages,” acknowledged Wagner. “But over time, we can first expect to see cost allocations emerge in data centers that are able to use big data to capture the true total cost of IT. The next logical extension will be follow-on analytics that plug these costs into business contexts that assist executives in understanding where their IT dollars are going — and how these IT investments contribute to profits in the business.”